Program evaluation and spillover effects

Authors: Angelucci, M. & Di Maro, V.

Publication date: 2015

This paper is a practical guide for researchers and practitioners seeking to understand spillover effects in program evaluation. It defines spillover effects and discusses why measuring them is important.

(Breaking) The Iron Triangle of Evaluation

Author: Reynolds, M.
Publication date: 2015

Ideas from complexity science and systems thinking are demonstrably helpful in shifting from exploring the (systematic) linear net effects of an intervention towards exploring the wider (systemic) effects occurring elsewhere.

But where these ideas of ‘impact’ are coupled with a narrow use of the contingency approach, some less helpful ‘triangulated’ relationships might be evident. These relationships might be regarded in terms of an ‘iron triangle’, a metaphor used frequently to exemplify pernicious relations of power.

Aiming for Utility in ‘Systems-based Evaluation’: A Research-based Framework for Practitioners

Author: Grove, J.T.
Publication date: 2015

System dynamics modelling (SDM) was used and its process researched as a case to investigate its utility as a systems-based evaluation (SBE) approach. A system dynamics (SD) model was developed to evaluate the potential requirements of, and implications for, the health system of the ambitious antiretroviral therapy scale-up strategy in Lusaka, Zambia.

Research on SDM for strategic evaluation provided insights and principles for future application of SBE.

Systems Dynamics Modelling in Industrial Development Evaluation

Authors: Derwisch, S. & Löwe, P.
Publication date: 2015

The complexity of development processes makes it difficult to observe and interpret the impacts of policies. The authors demonstrate the use and benefits of system dynamics modelling (SDM) in impact evaluation of private sector development programmes.

Going Beyond Mixed Methods to Mixed Approaches: A Systems Perspective for Asking the Right Questions

Authors: Garcia, J.R. & Zazueta, A.
Publication date: 2015

An impact evaluation’s primary task is to determine which impacts were caused by an intervention, distinguishing them from those produced by other causes.

However, in complex systems, interventions may contribute towards less apparent forms of impact (such as negative, unintended, indirect and secondary) that are no less significant, but which require a different way of asking questions.

Learning, Systems Concepts and Values in Evaluation: Proposal for an Exploratory Framework to Improve Coherence

Author: Hummelbrunner, R.
Publication date: 2015

The three core systems concepts – interrelationships, perspectives and boundaries – can be used for framing an impact evaluation (see Williams, this IDS Bulletin). But their use also has implications for the type of learning that an impact evaluation is likely to generate. Moreover, they can help to make the value base of evaluations more explicit.

Prosaic or Profound? The Adoption of Systems Ideas by Impact Evaluation

Author: Williams, B.
Publication date: 2015

All evaluation approaches have to address questions about their legitimacy, validity, relevance and usefulness. As the complexity of interventions is more widely acknowledged, impact evaluation appears to be especially vulnerable to these challenges. This article explores the potential of the systems field to address these vulnerabilities. 

IDS Bulletin: Introduction – Towards Systemic Approaches to Evaluation and Impact

Authors: Befani, B., Ramalingam, B. & Stern, E.
Publication date: 2015

This IDS Bulletin focuses on exploring the potential of systems ideas and complexity concepts to meet the increasingly complex challenges of an increasingly ambitious development agenda.

IDS Bulletin: Introduction – Rethinking Impact Evaluation for Development

Authors: Befani, B., Barnett, C. & Stern, E.
Publication date: 2014

This IDS Bulletin is the first of two special issues presenting contributions from the event ‘Impact Innovation and Learning: Towards a Research and Practice Agenda for the Future’, organised by IDS in March 2013. 

This introduction articulates first what these challenges are, and then goes on to summarise how the contributors propose to meet these challenges in terms of methodological and institutional innovation.

Using case studies to explore the external validity of ‘complex’ development interventions

Author: Woolcock, M.
Publication date: July 2013

Rising standards for accurately inferring the impact of development projects have not been matched by equivalently rigorous procedures for guiding decisions about whether and how similar results might be expected elsewhere.