Strengthening evaluation through a community of practice

Evaluation can be a tough gig. While everyone talks about evidence-based policy, no-one really wants to hear that the policy or program they designed, or have been working hard to implement, is not delivering what they’d hoped.

Marking the first anniversary of the Department of Finance, Services and Innovation’s (DFSI) Evaluation Community of Practice, Secretary Martin Hoffman recognised this challenge. To overcome it, he emphasised the need to build evaluation into organisational culture and systems – so it’s part of how agencies do business, not an add-on.

As a member of the Australasian Evaluation Society’s NSW Regional Committee and co-convenor of next year’s International Evaluation Conference in Sydney, I’m a big believer in the potential of communities of practice to strengthen the culture of evaluation – through sharing stories and building skills. This is critical given the history of evaluation reports gathering dust on shelves.

At the DFSI event on 6 December, participants worked on the rapid development of a program logic for a Service NSW initiative, and heard about how a recent evaluation had ended a policy that wasn’t achieving its objectives.

I presented with Emma Bedwin from NSW Fair Trading on the learnings from our evaluation of their Consumer Awareness Protection Initiative – a consumer engagement program with people with disability. Here’s what we found.

  1. A program can be more than a program – it can change systems – and logic models can capture this. Traditional logic models recognise external factors that can detract from or enhance a program’s outcomes. But if you think big, you can incorporate ways to transform systems within your program model.
  2. If you’re intentional, you can leverage a one-off evaluation to build monitoring and evaluation into systems. When Fair Trading asked us to evaluate the program, they asked for a transferable monitoring and evaluation framework that they could apply to other initiatives. This meant that when the evaluation ended, they had more than a report.
  3. External evaluators can complement internal capacity. With a focus on capacity building and refining data systems, core data collection can be done in-house, freeing external evaluators to focus on the collection and analysis that requires specialist skills or is better done by someone with some distance from the program.
  4. When you’re starting out, it helps to build on existing systems. They may not perfectly suit your purpose, but it can be easier to tweak something that is already there than to start everything from scratch.
  5. Pay attention to the situation analysis. We were informed by Michael Quinn Patton’s Utilisation-Focused Evaluation – engaging end users in formulating the evaluation questions and design, and building methods into the program itself. But it takes time to ensure staff are comfortable with new outcomes data collection.
  6. Qualitative data can be powerful, but you need to persuade people of its potential. In the program, Fair Trading staff collected consumer stories through two-way conversations, which helped to identify issues to be addressed in the system. In the evaluation, we paired observations of engagement sessions with interviews across all key stakeholder groups, and triangulated findings across these sources to identify what worked well and what could be improved.
  7. Peers can strengthen evaluation. In our evaluation of the next iteration of Fair Trading’s engagement with people with disability, we are working with peer researchers – who can tell us which approaches and questions will work when seeking feedback from program participants.

Not everything went according to plan – evaluations hardly ever do – but we learned a lot, produced information that has been used, and left behind a framework that is being transferred to other initiatives.

The DFSI Evaluation Community of Practice was a great opportunity to share these learnings and we’re looking forward to swapping more stories in future.
