PEPFAR's annual planning process takes place at either the country level (Country Operational Plan, COP) or the regional level (Regional Operational Plan, ROP).
PEPFAR's programs are implemented through implementing partners who apply for funding based on PEPFAR's published Requests for Applications.
Since 2010, PEPFAR COPs have grouped implementing partners according to organizational type. We have retroactively applied these classifications to earlier years in the database as well.
Also called "Strategic Areas", these are general areas of HIV programming. Each program area has several corresponding budget codes.
Specific areas of HIV programming. Budget Codes are the lowest level of spending data available.
Expenditure Program Areas track general areas of PEPFAR expenditure.
Expenditure Sub-Program Areas track more specific PEPFAR expenditures.
Object classes provide a highly specific view of how implementing partners spend PEPFAR funds on programming.
Cross-cutting attributions are areas of PEPFAR programming that contribute across several program areas. They contain limited indicative information related to aspects such as human resources, health infrastructure, or key populations programming. However, they represent only a small proportion of the total funds that PEPFAR allocates through the COP process. Additionally, they have changed significantly over the years. As such, analysis and interpretation of these data should be approached carefully.
Beneficiary Expenditure data identify how PEPFAR programming is targeted at reaching different populations.
Sub-Beneficiary Expenditure data highlight more specific populations targeted for HIV prevention and treatment interventions.
PEPFAR sets targets using the Monitoring, Evaluation, and Reporting (MER) System, documentation for which can be found on PEPFAR's website at https://www.pepfar.gov/reports/guidance/. As with most data on this website, the targets here have been extracted from the COP documents. Targets are for the fiscal year following each COP year, so that selecting 2016 will access targets for FY2017. This feature is experimental and should be used for exploratory purposes only.
Years of mechanism: 2010, 2011, 2012
The main objectives of the data quality assessment (DQA) project are to assess strengths and weaknesses in data collection, compilation, and reporting for a to-be-determined number of IPs and selected grantees; to increase the capacity of IPs to produce accurate and timely data for PEPFAR reporting; and to support health system strengthening through the use of high-quality service data for program decision-making. These objectives link to the sixth goal of the Partnership Framework (PF), which focuses on using strategic, evidence-based data for informed decision-making.
The project works nationally; its locations depend on where USAID and DOD partners work, including sites where services are offered by the Department of Social Welfare.
The DQA project has streamlined its processes in an effort to reduce the amount of time spent in the field. DQA staff have also worked to transfer skills, such as GIS, to local staff in order to reduce the number of TDY visits.
In terms of transitional efforts, the project is working with a local subcontractor and gradually building its capacity by adding more activities to its SOW year after year. For instance, within this fiscal year, the local subcontractor was responsible for leading one of the DQA teams. Training support is also being provided to improve writing skills, particularly report writing. In terms of OVC work, collaboration with the FHI-seconded M&E officers, as well as their DSW-attached counterparts, has focused on increasing M&E skills within the department.
The project conducts mini-DQAs of weaker-performing partners to monitor whether improvements have been made during the year. Improved capacity building plans will now include performance indicators to measure the influence of mentoring and coaching activities.
MEASURE Evaluation will conduct DQAs with two to three HBC partners, from headquarters down to service delivery points, to strengthen the IPs' systems and, in turn, improve the quality of data being reported into the NACP system. Data tracing and verification, along with M&E system assessments, will be conducted at IP headquarters, regional/district/sub-grantee offices, and service delivery points. A trace-and-verify exercise of HBC clients will also be conducted to ensure that the services being received are those outlined by the program. The DQA findings will be used to develop capacity building plans for each IP. MEASURE Evaluation staff will work with the IPs to implement the developed plans, providing mentoring where needed. Performance indicators will be developed and monitored for each IP to determine whether the capacity building efforts are improving the IPs' systems.
MEASURE Evaluation will work with NACP to determine whether any improvements are needed to the national HBC recording and reporting system. A plan will then be developed in conjunction with NACP to roll out the revised system, including working with partners on their use of the system and incorporating capacity building efforts.
MEASURE Evaluation will continue to provide M&E technical assistance to DSW in FY 2012. Activities will include mentoring the DSW-appointed M&E person, supporting the MVC M&E TWG, training on the national tools, rolling out participatory monitoring and evaluation approaches to the grassroots level, and training on data quality and data use. A mentoring plan will be developed in conjunction with the DSW M&E focal person and monitored over the course of the year; it will include performance indicators to track improvements. A training plan will be developed with DSW to train district-level staff on the national tools, data quality for the national system, and data use for decision-making at the district level. Through participatory M&E, MEASURE Evaluation will strengthen data collection, quality, and use at the grassroots level, and in turn improve the quality of the data being submitted to the MVC DMS. MEASURE Evaluation will roll out participatory M&E to sampled districts and wards. Afterwards, supportive supervision at three-month intervals post roll-out will be conducted to monitor whether communities are continuing to implement the activity. During supportive supervision, technical assistance will be provided, where needed, to ensure communities are collecting quality data and using the data to make informed decisions at the community level.
M&E technical assistance will continue to be provided to OVC IPs, including on data quality, data use, and program evaluation and assessment. MEASURE Evaluation will engage in dialogue with OVC partners to determine their M&E technical assistance needs, after which a plan will be developed with each partner to address those needs (some needs will be addressed directly by the partner and others by MEASURE Evaluation). Regarding the KIHUMBE (a home-based services provider group in Mbeya) job incubation model, MEASURE Evaluation will work with KIHUMBE, the KIHUMBE M&E officer, and DOD to develop a monitoring and assessment plan to supervise the implementation.
In terms of the DMS, an improved system will be developed and initially rolled out to districts in FY 2012. With FY 2012 funding, MEASURE Evaluation will continue the staggered roll-out of the improved system to districts and provide training and technical support where needed. MEASURE Evaluation will have staff who can troubleshoot issues that arise and support district staff through classroom-style and on-the-job training.
This overall activity works towards strengthening the national MVC system, including improving data quality and increasing data use at the national and subnational levels, through mentoring and training. MEASURE Evaluation will also work to move the national MVC M&E agenda forward through participation in the MVC M&E TWG and by assisting the DSW in implementing the MVC M&E TWG work plan.
MEASURE Evaluation will conduct Round 5 of the DQAs during this funding cycle. Between seven and ten USAID IPs will participate in full DQAs, while three to four partners will participate in mini-DQAs. MEASURE Evaluation will continue to work with OVC partner sub-grantees to strengthen their systems, ensuring that quality data are collected for decision-making purposes and reporting requirements.
Capacity building activities during this cycle will focus on the sampling process and fieldwork planning. JL Consultancy will lead at least two of the DQA teams (up from one in Round 4) and take more of a lead on report writing. This will require greater involvement in planning the DQAs and meeting with the IPs, including accompanying the DQA Project Manager to the initial meetings to discuss the DQA process and the IPs' M&E systems.
MEASURE Evaluation will use the findings from the DQAs to develop capacity building plans for IPs and organize workshops. Based on the experience of the last few years, MEASURE Evaluation will shift to more individualized mentoring and coaching, developing plans with clear objectives, activities, and anticipated outcomes, and maintaining activity logs to track mentoring sessions with partners. Performance indicators will be developed to track progress over time and will be monitored through either mini-DQAs or informant interviews.
An M&E 101 course will be conducted for sub-national partners with low levels of M&E experience and low DQA scores. MEASURE Evaluation will also continue to offer GIS 101 and 102 trainings and data demand and use trainings.
By strengthening the USAID IPs' M&E systems, MEASURE Evaluation indirectly strengthens the national M&E systems by ensuring that partners report quality data into those systems. In addition, since many partners work with government counterparts, skills in data quality and evidence-based decision-making can be transferred to those counterparts.