Impact Evaluation

Video: How to design a randomised evaluation

Randomised evaluations generate rigorous evidence on the impact of a programme. In this video, Professor Howard White explains the different designs of a randomised evaluation and shows how to choose the right design for an intervention.

Video: How to build a theory of change for an impact evaluation

A comprehensive theory of change is integral to designing a high-quality, policy-relevant impact evaluation. In this video, Professor Howard White uses the example of a school feeding programme to illustrate the steps involved in building a theory of change for an impact evaluation.

Blog: Evaluations that make a difference. Evaluation stories from around the world

"Evaluations that make a difference" is a collection of eight evaluation stories from around the world. It is one of the first pieces of systematic research into the factors that contribute to high-quality evaluations, meaning evaluations that stakeholders actually use to improve programmes and improve people's lives.

FANTA: Sampling Guide (with 2012 Addendum)

Authors: Robert Magnani (1999) and Diane Stukel & Megan Deitchler (2012) (for FANTA)
Publication dates: Sampling Guide (1999); Addendum (2012)

These materials provide sampling guidance for baseline and final performance evaluation surveys in the context of USAID Food for Peace Title II development food assistance programs. The guide provides methods and instructions for designing a population-based household survey and explains how to randomly select samples of communities, households, and/or individuals for such surveys. It emphasizes the use of probability sampling methods, which are essential to ensure that the survey represents the target population. The addendum provides an updated approach to sample size calculation that yields a household sample size more likely to achieve the required number of children for child-level indicators.
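To make the two core tasks concrete, here is a minimal, illustrative Python sketch of (a) a standard sample-size calculation for estimating a proportion with a design effect and a non-response adjustment, and (b) simple random selection of households from a sampling frame. The formula shown is the generic one for proportions; the parameter values and the `select_households` helper are hypothetical examples, not the FANTA guide's own worksheets, which should be followed in practice.

```python
import math
import random

def sample_size_proportion(p, d, z=1.96, deff=2.0, nonresponse=0.10):
    """Illustrative sample-size calculation for estimating a proportion.

    p: expected proportion in the population
    d: desired precision (half-width of the confidence interval)
    z: z-score for the confidence level (1.96 for 95%)
    deff: design effect to account for cluster sampling
    nonresponse: anticipated non-response rate, inflating the sample
    """
    n = deff * (z ** 2) * p * (1 - p) / (d ** 2)
    # Round up after inflating for expected non-response.
    return math.ceil(n / (1 - nonresponse))

def select_households(household_ids, n, seed=42):
    """Simple random sample of n households, without replacement.

    A fixed seed makes the selection reproducible and auditable.
    """
    rng = random.Random(seed)
    return rng.sample(household_ids, n)

if __name__ == "__main__":
    # Hypothetical inputs: expected prevalence 30%, precision +/- 5 points.
    n = sample_size_proportion(p=0.30, d=0.05)
    print(n)  # required number of households

    frame = list(range(1, 5001))  # hypothetical household sampling frame
    sample = select_households(frame, n)
    print(len(sample))
```

In a real survey, the sampling frame would come from a census or community listing exercise, and the design effect would be informed by prior surveys of similar indicators rather than the default of 2.0 assumed here.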

ILAC Brief 26: Making causal claims

Author: John Mayne
Publication date: 2012

An ongoing challenge in evaluation is the need to make credible causal claims linking observed results to the actions of interventions. This ILAC Brief argues for a different perspective on causality, in which interventions are seen as contributory causes of certain results.

Making causal claims workshop 2013

Author: Mayne, J.
Publication date: 2013

This PowerPoint presentation on causal contribution was part of the Workshop on Impact, Learning and Innovation, held at the Institute of Development Studies in March 2013. The presentation focuses on intervention causality, drawing attention to the notion that an intervention is not a single cause but a contributory cause of certain results. Theories of change are offered as a model for seeing the intervention as a contributing cause.

Trends in impact evaluation: Did we ever learn?

In 2006, the Evaluation Gap Working Group asked, "When will we ever learn?" This week, 3ie's Drew Cameron, Anjini Mishra, and Annette Brown (hereafter CMB) published a paper in the Journal of Development Effectiveness that uses data on more than thirty years of published impact evaluations from 3ie's Impact Evaluation Repository (IER) to answer that question.

Video of IEG Event: Why Focus on Results When No One Uses Them?

This is a video of the IEG event "Why Focus on Results When No One Uses Them?", which took place on April 17, 2015.

Who cares about getting results in development? Everyone! But how many know how to use evidence to drive and show results? Probably not as many.

The growth of impact evaluation for international development: how much have we learned?

Authors: Drew B. Cameron, Anjini Mishra & Annette N. Brown

Published: 28 April 2015.

This article examines the content of a web-based repository of published impact evaluations of international development interventions, populated through a systematic search and screening process. We find 2,259 studies published from 1981 to 2012, with annual publication output increasing dramatically after 2008.