What role can and should evaluators play in reducing the inequalities that perpetuate social disadvantage?
The 2018 Evaluation for Change: Change for Evaluation ANZEA conference, held in Auckland from 16–18 July, tackled this question and asked attendees to consider what changes they could make in their practice to contribute to societal improvement.
Keynote speakers focused on the power of stories and the value of recent and emerging areas of evaluation and program practice, including co-design. Marcus Akuhata-Brown, a highly experienced educator, speaker and current University of Melbourne Atlantic Fellow, delivered a particularly resonant message. He offered thought-provoking reflections on the importance and value of connection and place for New Zealand Indigenous communities when engaging in the evaluation process, and on how this process can reinforce efforts to change the social structures that exacerbate inequality and entrench power hierarchies.
Lifting the lid on social inequalities and hierarchies of power
Marcus called on evaluators to facilitate opportunities for communities to connect through place and culture, to use their strengths and identify what is needed to reduce social inequalities. Providing an outline of his whakapapa, Marcus showed how a process of tracing family lineages can not only be empowering, but also a practice to connect to place and culture. In the context of evaluands targeting social inequalities for Indigenous communities, this could be a valuable method to incorporate in the engagement process and could also support program design and development.
He noted that ‘evaluators should be opening up conversations, rather than finding ways to narrow and limit them’.
The keynote and indeed the conference theme itself reflected the close relationship between the practice of evaluation and the subject of the evaluation, where evaluators can implement practices that reinforce the aims of the program, policy or intervention and assist in the design process. Recent social change practice frameworks, including co-design and Collective Impact, also illustrate this.
Evaluating ‘lifting the lid’ efforts
In my presentation with Ruth Aston (University of Melbourne) and Robbie Francis, Director of The Lucy Foundation and a social change practitioner in Pluma Hidalgo, Mexico, we tackled questions associated with impact measurement in evaluations of social change efforts.
A variety of approaches to understanding social impact and social change are growing in use and application. Collective Impact, co-design, human-centred design (or design thinking) and principles-based evaluation can all reinforce, and provide a foundation for, social impact measurement.
Alongside these, social science perspectives and fields including intervention design, Implementation Science, Behavioural Insights, Realist Evaluation, and the process evaluations that accompany Randomised Controlled Trials also reinforce the importance of implementation monitoring and quality intervention design for the impact of social change efforts.
Indicators for measuring progress towards social impact
In Ruth’s review of 7,123 interventions taking action on social inequalities in health outcomes, variables associated with implementation and program design—such as fidelity, dosage, implementation quality, reach and sustainability—were found to be moderators of intervention impact.
When these variables were present, the magnitude of impact on social outcomes increased. One such variable was cultural relevance in the content, delivery and intentions underpinning the intervention: the intervention content (what it contained), how it was delivered (communication, language used, mode of delivery), and what participants did (behaviour change, session attendance) all needed to be relevant to the culture (ethnicity, local community) participants identify with.
All identified variables were factor analysed, and two clustered factors emerged:
- intervention design
- intervention implementation.
The findings illustrated that despite the complexity of social change efforts, intervention design and implementation are related to long-term impact. Monitoring and evaluation therefore need to focus on these areas, at least in the early stages of social change efforts.
Practical implications for evaluating ‘lifting the lid’ efforts
Robbie reflected on the findings of the research, noting that it has been challenging for the Foundation to demonstrate its progress to commissioners, funders and supporters. Measures associated with intervention design and implementation would support in-field adaptation and continuous improvement. However, this would require advocating the worth of such measures, as those interested in funding the Foundation's work can hold an unrealistic or Westernised perspective on what social change outcomes are feasible within a given timeframe, and on which social change outcomes matter to the communities the Foundation works with. For instance, standardised tools for measuring quality of life, such as the QALY, tend to be irrelevant to the communities Robbie works with, yet are favoured by funders.
She also shared that the biggest enabler of social impact measurement was ensuring that the evaluation process mirrors the principles of the social change effort itself, again highlighting the relationship between evaluation and the subject of the evaluation, and how evaluation can support change. For the communities The Lucy Foundation works with, and for people with disabilities, Robbie advised that evaluation should be underpinned by the ‘nothing about us without us’ principle.
This principle could be applied to work with other communities, and it highlights the critical foundation of social impact measurement: the involvement of everyone engaged in the social change endeavour in the evaluation, including those who fund and commission it.
Continuing the conversation
Many questions remain about social impact measurement and evaluation, and we will be continuing the conversation at the AES conference. Come along to ‘The Promise Design-thinking and Implementation Science Holds for Social Impact Evaluation: Views from Practitioners and Evaluators’ at Chancellor 6, Hotel Grand Chancellor Launceston, on Wednesday 19 September from 1:30–2:00 pm.