Making evaluation a process, not just a product, can enhance the likelihood of your findings being used. But at some point you’re also going to have to deliver the product – the report.
By this point, particularly in a large-scale, long-term and/or mixed-methods evaluation, you’ll have a lot of data. It can be hard to know what your story is and where to begin with structuring it into a coherent report.
Some writers work by constructing the frame first; others find the story through the writing. Both approaches are fine – you’ve got to go with what works for you. But if you don’t start with the framework, you’ll have to retrospectively construct it. This means re-ordering what you’ve written so your message is clear.
Know your message
A clear structure comes from a clear message. As Max Rixe recently wrote in an article on writing capability in the public sector for The Mandarin, ‘Good writing is clear thinking made visible.’
You can’t work out how to order information without knowing what it is you are trying to say. It would be like building a house in the dark. Are you telling a story of triumph, tragedy or transformation?
If it’s a tragedy – that is, the program didn’t work out as expected – you’ll also need to think about how to make this palatable. Can you create a positive sandwich?
Don’t cut it by data source
There is no one best way to structure an evaluation report – because the focus and findings always differ. But this doesn’t mean anything goes.
Don’t structure your report by data source.
This doesn’t tell a story and leaves the hard work of piecing it together to your audience. It’s an evaluator’s job to synthesise, not just describe, the data.
Better options are structuring by:
- levels of the program logic
- key evaluation questions
- program components
- streams of beneficiaries
Which of these will work best depends on the kind of content you are working with and the composition and interests of your audience. (We spoke about knowing your audience in the second blog of this series.)
Learn from journalism
The concept of telescoping speaks to my journalistic instinct to make it easy for readers to jump in, get what they want, then drop out. Each level of your report telescopes outwards, empowering each reader to choose the level of detail they need – as they can in a traditional news story.
- The key findings one-pager gives all audiences the low-down at a glance.
- The executive summary provides key information for everyone, including busy executives.
- The report body provides primary analysis and evidence – the most important content to support the key findings and recommendations in the executive summary.
- The appendices provide detailed information, detailed processes and analysis and background research – the content that the technical specialists may want to check or that only particular audiences need to know.
As a journalism student, I had ‘Don’t bury the lead’ hammered into me. But evaluators have taken on the mores of academic research, putting long introduction and methods sections ahead of the good stuff. While we’re not journalists – and we shouldn’t take up all of their tricks – in some cases, we might better meet the needs of our audiences if we tried flipping our reports and starting with the outcomes.
Test it to make sure it’s sound
When drafting and reviewing your structure, ask yourself:
- Will it help your audiences quickly grasp what they need to know and avoid the detail they don’t?
- Does it give emphasis to what’s most important?
- Does the structure support the argument – the story?
- Does each chapter flow logically from the one before? Or should some chapters be switched around, because you need the information in one to understand another – or its implications?
- Is it clear what each chapter is about?
- Are chapters balanced in length?
- Are there repetitions and overlaps between chapters and sections?
- Are there gaps?
The next in our communication for evaluation series will cover telling stories for evaluation.