Competence model articles
Designing Educational Programmes: Design an evaluation process and impact assessment
Ensure that outcomes are based on the content of the evaluation and impact assessment: skills to connect evaluation and impact assessments with relevant conclusions for further learning.

How can the findings be reported and their use supported?

Make the outcomes of any evaluation and impact study more visible and clear, so they can be better used in future training.

Introduction:

The evaluation report should be structured in a manner that reflects the purpose and questions of the evaluation, and should clearly propose changes and adaptations to the future model of the training.

Specific evaluative rubrics should be used to ‘interpret’ the evidence and determine which considerations are critically important or urgent. Evidence on multiple dimensions should subsequently be synthesized to generate answers to the high-level evaluative questions.
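As a purely illustrative sketch (not part of the original guidance), the following Python snippet shows what rubric-based synthesis might look like if expressed programmatically. The dimension names, the 1–4 scale and the ‘weakest link’ synthesis rule are all invented assumptions; real rubrics are qualitative and agreed with stakeholders.

```python
# Illustrative only: the dimensions, the 1-4 scale and the synthesis
# rule below are assumptions invented for this sketch.

# Hypothetical rubric ratings on several dimensions of merit
ratings = {
    "relevance": 4,
    "effectiveness": 3,
    "participant engagement": 2,
}

labels = {1: "poor", 2: "adequate", 3: "good", 4: "excellent"}

# Assumed 'weakest link' rule: one critical weakness caps the overall
# judgement, so the overall rating is the minimum dimension rating.
overall = min(ratings.values())

print(f"Overall performance: {labels[overall]}")
for dimension, score in ratings.items():
    print(f"  {dimension}: {labels[score]}")
```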

Content:

The structure of an evaluation report can do a great deal to encourage the succinct reporting of direct answers to evaluative questions, backed up by enough detail about the evaluative reasoning and methodology to allow the reader to follow the logic and clearly see the evidence base.

A facilitator should be able to design and plan the evaluation with future steps already in mind, rather than focusing only on the present or on a short-term vision of how the evaluation data will be used.

The following recommendations will help to set clear expectations for evaluation reports that are strong on evaluative reasoning:

  1. The executive summary must contain direct and explicitly evaluative answers to the questions used to guide the whole evaluation.
  2. Explicitly evaluative language must be used when presenting findings, rather than value-neutral language that merely describes them. Examples should be provided; for instance, ‘the training performed well on relevance’ rather than ‘participants rated relevance 4.2 out of 5’.
  3. Clear and simple data visualization should be used to present easy-to-understand ‘snapshots’ of how the intervention has performed on the various dimensions of merit (see the sketch after this list).
  4. The findings section should be structured using the evaluation questions as subheadings (rather than the types and sources of evidence, as is frequently done).
  5. There must be clarity and transparency about the evaluative reasoning used, with the explanations clearly understandable to both non-evaluators and readers without deep content expertise in the subject matter. These explanations should be broad and brief in the main body of the report, with more detail available in annexes.
  6. If the evaluative rubrics are relatively small, they should be included in the main body of the report. If they are large, a brief summary of at least one or two should appear in the main body, with all rubrics included in full in an annex.
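To illustrate point 3 above, here is a minimal sketch of a performance ‘snapshot’ chart, assuming Python with matplotlib is available; the dimensions, ratings and output file name are invented for the example.

```python
# A minimal sketch of a performance 'snapshot' chart; the dimensions,
# ratings and output file name are invented for illustration.
import matplotlib.pyplot as plt

dimensions = ["Relevance", "Effectiveness", "Engagement", "Sustainability"]
ratings = [4, 3, 2, 3]  # hypothetical rubric ratings (1 = poor, 4 = excellent)

fig, ax = plt.subplots(figsize=(6, 3))
ax.barh(dimensions, ratings)
ax.set_xlim(0, 4)
ax.set_xlabel("Rubric rating (1 = poor, 4 = excellent)")
ax.set_title("Performance snapshot by dimension of merit")
fig.tight_layout()
fig.savefig("performance_snapshot.png")
```

One such chart per high-level evaluative question keeps the snapshot readable for non-evaluators.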

Exercises:

How to apply it in everyday work?

While preparing your next evaluation, could you already start thinking about the final format of the reporting?

When preparing your next training, it is important to address the following points while planning the evaluation:

  • How does the audience prefer to receive information – text, graphics, numbers, written, visual or a mixture of all of these?
  • What is the preferred length (or duration if an audio/visual presentation)?
  • What access does the audience have to information technology (this may inform whether you use web-based formats)?
  • What is the purpose of the report and how does this inform the choice of format? Purposes may include:
    • keeping stakeholders engaged during an evaluation
    • providing feedback to and maintaining the commitment of people collecting data during implementation
    • flagging emerging findings and implications for ongoing program development and for the evaluation
    • presenting interim recommendations
    • seeking feedback on draft reports to assist in identifying causal factors
    • informing planning, funding or policy decisions
    • broader dissemination of findings to support their use

Reflection Questions:

  • How often do I think in advance about the final use I will make of a training evaluation?
  • Do I plan strategically, or only one activity ahead?
  • Am I able to prepare a report that connects all of the above elements in one short and simple document?
Federica de Micheli


Source: betterevaluation.org

Federica Demicheli

Master in Intercultural Mediation, trainer and researcher for Erasmus+ National Agencies, GIZ and other international institutions on Intercultural Learning, Youth Work, Youth Policies and Community Development. She is part of the pool of trainers on the recognition of non-formal education (“Recognise it”) of the German National Agency Erasmus+ and SALTO EuroMed RC, and she coordinated the Conference on the Recognition of Non-Formal Education in Naples in 2018. Member of the working group of the “Time of Show Off” publication on the role and methodologies of youth work and non-formal education. Initiator of the online professional discussion group in Italy on “Youth work in the Covid crisis”. Founder of the association “NINFEA” for the recognition of youth work in Italy. Professional youth worker and expert in youth policy at national and European level (author of the Youth Wiki page for Italy).
