On Leading Change and Impactful Evaluations

Liana Downey recently spoke to our team about her experiences driving change and impact in her roles as a consultant, and in government as Deputy Secretary, Strategy and Delivery for the NSW Department of Education and Expert Advisor to the Department of Prime Minister and Cabinet.

Her session was instructive for us, and in this blog we’re delighted to share her perspectives on what’s needed to create an enabling environment for change, and on communicating for change and impact.

The role and responsibility of evaluation consultants

Q: Tell me about your experiences of supporting organisations to develop a mature understanding of evaluation’s purpose and learning potential, and to embed a virtuous learning cycle.

As consultants, I believe we’re responsible for helping drive impact. And that starts by making sure our clients are asking the right questions. The first step is seeking to understand their purpose and intent in doing an evaluation. Some agencies may not have worked with a consultant or done an evaluation before and may need a ‘critical friend’ to help tease out what they really need.

If the conceived purpose of the evaluation is quite narrow (for example, ‘We think it’s a great project and we want more funding’), I’d suggest it’s the consultant’s role to ask: ‘How ready are you for positive and negative findings? How do you intend to embed the lessons from this evaluation into your other work?’

One way to make that conversation easier is to share our passion for the work. We can explain that by helping an organisation better understand what’s working, we can help direct resources into programs that work for citizens, and away from those that don’t.

Evaluation consultants also have a role in contextualising findings, particularly where there has been some sort of negative finding or project ‘failure’. One of the challenges in the public sector is that even minor hiccups in project delivery can be picked up and brandished as political weapons. That can create an environment where it’s really difficult to innovate — because with innovation comes the risk of failure — and where the backlash may result in overcorrections that can be resource intensive and not necessarily impactful.

An alternative framing is that if everything has gone right, it likely means nothing new has been tried. Through their communications, evaluators can play an educative role, helping to elevate a more nuanced public understanding that negative findings and things that haven’t gone as planned are often the greatest source of valuable lessons to build on next time.

Creating an enabling environment for change

Q: There’s a lot of overlap between evaluation and change management — what’s key in creating the environment where both can support optimal impact?

In my role in senior leadership in government, I did my best to create an environment in which people felt safe to try new things and fail. I tried to own my mistakes and those of my team. I kept the outcomes we wanted for students front and centre of our decision-making and made sure our project approach incorporated the need to test, iterate and adapt right from the outset.

We also worked closely on project design with the people most impacted by the work. When you do this, I’ve learned it’s important not to hold too tightly to an agenda – while it’s fine to have a hypothesis about how things might be and what might work, you need to remain genuinely open to what comes out of engagements with stakeholders. You should expect and hope your plan will be challenged, remain humble enough to dig deeper with curiosity about why, and be prepared to adapt your plan. That’s easier said than done, but it is powerful when it happens.

Reaching key audiences with creative communication and engagement strategies

Q: You’ve said that senior public servants experience an avalanche of reading, and many other demands on their time, to make the point that communications need to work hard to ‘cut through’ and capture attention. What are the most effective communications you’ve seen that have achieved ‘cut through’? Why do you think they’ve been so effective?

Many people think about evaluation communications in the form of a written report, but effective communication requires more. The evaluator’s job is not just to do a review, but rather to ensure people understand the findings, their implications, and what to do next.

People absorb information in different ways, so it’s always best to plan for three or more communication modes, for example written, visual and auditory. Think about who the audience is, the bigger issue your client is trying to solve, and how best to tap into shared motivations.

In my experience, the old adage ‘less is more’ holds true for achieving cut-through with communications: the pithier the presentation, the more powerful it is. For reports, a short executive summary is key, with a coherent structure that makes good narrative sense. For presentations, it’s about having one message per slide, and providing analogies and illustrative quotes to help people make sense of quantitative information.

For some projects, the usual modes of communication won’t be enough to create the necessary collaboration for change. This is where you need to get creative about how you will structure engagements and communications to overcome barriers and reach the desired goals.

A good example is a project I led within the Department of Education. We had two groups of people with very different experiences of a system – both of whom held tightly to their version of the facts. We could see that the findings from the data we’d collected were going to meet some resistance because of this. Our approach was to bring the two groups together for a trivia contest. We asked for their answers, before revealing what the data showed in reality.

We also worked to produce archetypes of different service users, which created a shared perspective for both groups (who had previously struggled to see things through the same lens). We focused discussions on what would help each user most, rather than on which viewpoint was the more accurate.

Video is an incredible tool for communication and change. When we can see the emotions of people who are invested in change or creating an outcome, or who are experiencing positive or negative interactions with a system, our emotional brain gets much more involved in processing and retaining information.

It’s not enough just to deliver in different modes. Even if you’ve written the clearest, most concise report or presentation, there’s no guarantee that it will be understood. You need to test that people understand by asking them what they think, whether they have remaining questions, and so on. The failure rate of a report, in terms of a reader’s understanding and engagement, is probably quite high, whereas the failure rate of a conversation is likely much lower.

I’ve rarely had a client who specifically asked for these things. For me, it’s not so much about responding to what you’re directly asked for, as it is thinking about what will help the client or team reach their goal and drawing creatively on whatever tools are at your disposal to get them there. 

 

Liana is the author of Mission Control: How Nonprofits and Governments Can Focus, Achieve More, and Change the World; co-founder of Common Ground on Climate, a bipartisan national strategy and research project; and led Liana Downey & Associates, a boutique advisory firm advising governments and nonprofits across the US and Australia. She has also taught at NYU’s Robert F. Wagner School of Public Service in New York.