
Presenting findings at staff forums and conferences

Aim of the tool
To share your key findings and experiences to stimulate discussion and further develop your methodologies

When to use it?
This tool can be used at different stages of an intervention process (i.e. beginning, mid-way and end). It is particularly useful for communicating the findings of the evaluation.

How difficult is it to use it?
Easy to moderate; best suited to experienced users/facilitators

Tool for thought or tool for action?
Thought

Benefits
It can be a good way to increase the visibility and impact of your programme and to share your experiences with a wide audience. It also creates the opportunity to exchange views, look critically at your own data and further develop your practice.

Issues to be aware of
It is important to know your audience, to define your key message and then to choose the most appropriate means of communication. Share your findings at a strategic moment, when people are likely to need the information.

Description of the tool
The findings of your evaluation will need to be communicated to your primary intended users (identified as part of framing the evaluation); you may also need to communicate your findings to other people for different reasons. For example, lessons learned from the evaluation can be helpful to other evaluators or project staff working in the same field – this is sometimes done at staff forums and subject matter conferences.

Agree with the primary intended users of the evaluation what findings need to be communicated to them, how, when and where they want them presented, and how they intend to use these findings.

Evaluation findings should be presented clearly and in the local language. Findings are best explained when they are illustrated with examples. Develop a communication plan: consider the various audiences for the evaluation findings and develop an appropriate strategy for reaching each one. Consult your organisation's internal and external sharing policies for guidance.

Example: Overview of the kind of discussions that take place at a workshop
Impact evaluation (IE) and its utility were at the heart of a two-day conference, organised on 25-26 March 2013 by the Centre for Development Innovation (CDI), Wageningen UR, in collaboration with Learning by Design and the Agricultural Economics Institute (LEI), Wageningen UR. Impact evaluation has been at the centre of attention within the evaluation field for several years. The demand for IE is increasing, but "the utility of IE remains to be proven" and there is "very little evidence of the use of evidence", according to keynote speaker Dr Irene Guijt. Questions were framed around what influences the design and communication of IE findings, and how these in turn influence the utilisation of IE. Keynote speaker Professor Elliot Stern indicated that the "IE 'brand' (as if it is new) is now too narrowly focused, methods-led rather than content-led, ignoring major developments in policy and practice in international development".

These topics were elaborated further during the conference in group sessions dealing with cases, and in additional working-group sessions around the core questions of the conference.