Impact Analysis Handout
PP 253 / ARE 253    Sadoulet / de Janvry    Fall 2004
Handout #2

Impact Evaluation

1. Evaluation systems

Project sequence:
Inputs → Activities → Outputs → Intermediate outcomes → Final outcomes (goals)
(Implementation covers inputs through outputs; results cover the intermediate and final outcomes.)

Types of evaluation:
- Programmatic evaluation: logframe.
  From activities to outputs and outcomes (indicators).
  Evaluate achieved against planned outputs and outcomes at given times (intermediate and final).
- Comprehensive expenditure analysis.
  Use of resources: observe and explain inconsistencies between actual and planned expenditures.
- Impact analysis:
  Changes in selected indicators of outcomes that can be attributed to a specific intervention.

To do an impact analysis:
- We need to clearly identify a specific intervention (what program, what expected objectives, at what time, at what place, applied to what unit of analysis).
- We need to identify a counterfactual with no intervention against which the change with intervention can be measured: before/after, with/without.
- We need to specify indicators of outcomes (endogenous variables) to be used to measure impact. Hence, the project objectives (goals, mandates) need to be clearly defined. These indicators must be observable before/after or with/without the intervention. They can be indicators of intermediate or final outcomes.
- If there is impact heterogeneity, we need to identify exogenous variables that may make the impact differ across units of analysis.
- We need reliable/credible/verifiable information.

Objectives of evaluation systems:
- Often required by law: yearly in Mexico, as required by Congress; in the U.S., the 1993 Government Performance and Results Act, fully implemented starting in 1997.
- Allows engaging in results-based management. Use results of evaluation to: assess the value of the program (ex post); adjust the program (feedback: minor adjustments, major adjustments, redesign, cancel); link to resource allocation, budgeting, and personnel management.
- Evaluation is a learning process (hence the role of participation and ownership).
- Improving evaluation = learning to learn (start simple, use pilots, and improve over time).
- Need incentives to learn, use results, and change programs.

Impact evaluation challenge and techniques:
• Selection bias:
  - program placement
  - self-selection
• Techniques for impact evaluation:
  - Experimental design, randomization: treatment and control groups
  - Quasi-experimental design: treatment and comparison groups; matching methods, double-difference techniques
  - Non-experimental design: instrumental variables, statistical methods
  - Qualitative methods

2. Experimental design – Randomization

Randomization allows the creation of identical treatment and control groups.

• Procedure and ethical issue
  Treatment group and control group.
  Example: rural education program Progresa in Mexico.

• Program impact from simple difference
  Among eligible units (e), randomly assigned to a treatment (T) group and a control (C) group:
  $\text{Impact} = \bar{y}^T_e - \bar{y}^C_e$
  that is,
  $\text{Impact} = \frac{1}{N_T}\sum_{i \in T} y_i - \frac{1}{N_C}\sum_{j \in C} y_j$,
  the average outcome in the treatment group minus the average outcome in the control group.

This can be done on subgroups to evaluate heterogeneity of the program effect.
Example: school subsidy in urban Pakistan (Quetta).

Need to check that the control and treatment groups have similar distributions of exogenous variables, of the outcome prior to the program (if available), and of behavior prior to the program (if available).
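The simple-difference estimator above is just a difference in sample means between the randomized treatment and control groups. A minimal sketch in Python, assuming a hypothetical data file and column names (these, and the use of pandas/scipy, are illustrative assumptions, not part of the handout):

```python
import pandas as pd
from scipy import stats

# Hypothetical evaluation data: one row per eligible unit, with a random
# treatment indicator and an outcome measured after the program.
df = pd.read_csv("progresa_eval.csv")  # assumed columns: 'treated', 'outcome', 'region'

y_t = df.loc[df["treated"] == 1, "outcome"]  # outcomes in the random treatment group (T)
y_c = df.loc[df["treated"] == 0, "outcome"]  # outcomes in the random control group (C)

# Impact = (1/N_T) * sum_{i in T} y_i  -  (1/N_C) * sum_{j in C} y_j
impact = y_t.mean() - y_c.mean()

# A two-sample t-test (Welch's version) gives a sense of whether the
# difference is distinguishable from sampling noise.
t_stat, p_value = stats.ttest_ind(y_t, y_c, equal_var=False)
print(f"Estimated impact: {impact:.3f} (t = {t_stat:.2f}, p = {p_value:.3f})")

# Heterogeneity of the program effect: the same difference in means,
# computed within subgroups (here an assumed 'region' column).
by_region = df.groupby("region").apply(
    lambda g: g.loc[g["treated"] == 1, "outcome"].mean()
              - g.loc[g["treated"] == 0, "outcome"].mean()
)
print(by_region)
```

In practice one would also run the balance checks the handout lists (distributions of exogenous variables and of pre-program outcomes) before trusting the simple difference.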
3. Matching method to construct comparison groups

Identify non-participants that are comparable in essential characteristics to the participants.
Possible for programs with partial coverage, i.e., when there exists a large population which, for exogenous reasons, has been excluded from the program.
Key: the selection of the participants is not related to the outcome of the program.
Examples: local programs.
By contrast to:
- credit programs placed where economic opportunities are highest
- health clinics placed where most needed
- self-selection for program participation

Comparing participants (P) with matched (m) non-participants (NP):
$\text{Impact} = \bar{y}_P - \bar{y}^{NP}_m$

• Data needed: a sample of participants (usually from a special survey designed for the program evaluation) and a large sample of non-participants (usually some other large existing survey, such as the LSMS for households) from which one can pick the comparison group. Both surveys must include variables X that are important determinants of program participation and of the outcome.

• Simple matching: construct a comparison group with non-participants whose characteristics are "similar" to those of the participants.
  Example: job training program for women, unemployed for less than 3 months, and without kids.

• Propensity score matching (individual matching):
  The variables X help predict program participation. Instead of matching on all the X, one matches on the probability of participation.
  a) Use both samples to estimate the probability of participation as a function of the variables X.
  b) For each participant i in the program, find the closest matches m(i) among the non-participants, i.e., the non-participants with the closest predicted probability of participation. One can choose 1 to 5 closest matches.
  c) For each participant i, compute the average outcome of the closest matches:
     $\bar{y}_{m(i)} = \frac{1}{n} \sum_{j \in m(i)} y_j$
  d) The impact of the program is:
     $\text{Impact} = \frac{1}{N_T} \sum_{i \in T} \left( y_i - \bar{y}_{m(i)} \right)$
  (A code sketch of steps a-d appears at the end of this section.)

There are many variations of this method (using just one match, a few matches, or many matches weighted according to how "close" they are to the treated person).

Example: Argentina's workfare program Trabajar.
Note: the income effect is less than the payment from Trabajar because of the foregone income. By subtracting the predicted income effect from each observed income, one can estimate the "without program" income.

Whatever the matching method, program impact can be computed for different subgroups of the population to assess heterogeneity in the impact of the program.
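A minimal sketch of propensity score matching steps (a)-(d) in Python. The data file, covariate names, logit specification, and the use of scikit-learn's nearest-neighbor search are illustrative assumptions, not the handout's prescription:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# Hypothetical pooled data: participants (from the evaluation survey) and
# non-participants (from a large household survey such as an LSMS), with
# covariates X that predict participation, a participation dummy, and outcome y.
df = pd.read_csv("pooled_survey.csv")  # assumed columns: 'participant', 'outcome', X...
X_cols = ["age", "education", "landholding", "distance_to_town"]  # assumed determinants

# (a) Estimate the probability of participation as a function of X on both samples.
logit = LogisticRegression(max_iter=1000).fit(df[X_cols], df["participant"])
df["pscore"] = logit.predict_proba(df[X_cols])[:, 1]

part = df[df["participant"] == 1]
nonpart = df[df["participant"] == 0]

# (b) For each participant, find the closest matches among non-participants,
#     i.e., those with the nearest predicted propensity score (here 5 matches).
nn = NearestNeighbors(n_neighbors=5).fit(nonpart[["pscore"]])
_, match_idx = nn.kneighbors(part[["pscore"]])

# (c) Average outcome of the matched non-participants for each participant i:
#     \bar{y}_{m(i)} = (1/n) * sum_{j in m(i)} y_j
y_np = nonpart["outcome"].to_numpy()
y_matched = y_np[match_idx].mean(axis=1)

# (d) Impact = (1/N_T) * sum_{i in T} ( y_i - \bar{y}_{m(i)} )
impact = (part["outcome"].to_numpy() - y_matched).mean()
print(f"Estimated program impact: {impact:.3f}")
```

The variations the handout mentions (a single match, distance-weighted matches, common-support restrictions) amount to changing how the matched average in step (c) is formed.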
4. Double difference method

When the control or comparison groups are not perfectly comparable:
- imperfect randomization, imperfect matching
- there is