Evaluation for place-based, community-led programs

Several sessions at this year’s Australian Evaluation Society conference drew together reflections from funders, community organisations and evaluators collaborating to deliver community-led change initiatives. Many of these initiatives adopt developmental approaches, creating space for communities to co-create the solutions that respond to their local contexts.

We’ve written about approaches to developmental evaluation here. In this blog, we’ll share some practical tips for planning and conducting evaluations of place-based, community-led programs, drawing on the principles of developmental evaluation.

Shared principles as a foundation for success

Image source: Stronger Communities for Children

When partnering with communities, the outcomes they want to achieve, and the activities that will lead to those outcomes, are often not yet known. Many communities are, however, clear on what they stand for. Evaluators can facilitate a conversation to draw out the principles or values that will underpin the partnership between community, funders and evaluators.

These principles are most powerful when they are taken off the page and activated through decision-making. Members of the Stronger Communities for Children program team, including evaluators, funders and facilitating partner organisations, told the AES audience about how the shared principles help the team negotiate difficult decisions together.

Program logics can help

We are strong proponents of program logic. It was interesting to hear evaluators at the AES discuss two contrasting approaches to logic building for place-based initiatives. Representatives from the Our Town project, a ten-year place-based collaboration between the Fay Fuller Foundation, TACSI, Clear Horizon and regional South Australian communities, described a process of building logic models with each community and then coalescing these into a broader, over-arching logic. Other evaluations, including one of our own, take the opposite approach: creating an overall logic that identifies broad goals and pathways to achieve them, then working with community to develop mini logics for each initiative.

Regardless of whether the logics are developed ‘up’ or ‘down’, it’s best to take a flexible approach and adapt the overarching program logic together as the work progresses, so that it remains responsive to community voices and to what’s happening on the ground.

Embracing the role of community in monitoring and evaluation

With the right opportunity and support, community members can play a role in every stage of the evaluation. They can have input into the design of program goals and logics, identify what success looks like, advise on the most culturally appropriate methods for capturing data, and even shape the questions to ask in an evaluation interview and how best to ask them. Training local community members to undertake interviews or co-facilitate focus groups can also yield deeper insights, given that they often already have rapport with interviewees. Of course, there are also cases where it’s more appropriate to have someone independent collect the data.

We loved hearing from members of the Our Town initiative – these two community leads had become staunch advocates of logic models!

Take a ‘learn as you go’ approach

Finally, but crucially, we heard a range of presenters acknowledge just how difficult it is to work in a space of complexity. Community-led, place-based initiatives are often trying to solve multifaceted and intractable issues that are held in place by complex systems, relationships and social norms. It’s unlikely that a single service model or program can resolve all of these issues, particularly not at the outset. This can feel overwhelming, but it can also be liberating, as Judy Oakden reflected. Evaluators and the communities we partner with don’t need to know all the answers to get started: we just need to get started and be open to learning (and to being wrong!).

Things to be aware of as an evaluator of place-based community-led projects
  • Be ready to switch. An evaluator’s role is more likely to be ‘part of the team’ than ‘independent assessor’.
  • Stay flexible. Your thinking, your processes, your measures of success: all of these may need to change in response to local shifts and the community’s growing capacity.
  • Take time. Trust underpins the relationships between community, funders and evaluators. This doesn’t happen overnight, but it will happen if you approach it slowly and with intent.
  • Watch the gap. Communities are full of people who step up. They are also full of people who step back. It’s important to focus on including all voices, established leaders and emerging leaders alike, in designing the initiative and evaluating it.

Read more about our work and publications on community-development-related approaches to evaluation here.

  Thanks to Louise Stanley, from Nama Jalu Consulting, who co-authored this piece.

