Transforming evaluation: what we’re taking from #aes18LST

This year’s Australasian Evaluation Society (AES) international conference challenged us to transform evaluation practice to address complex social and environmental issues in a changing world and to ensure cultural safety and respect in our work with Indigenous communities.

The crowd was lively – with a lot of newcomers to the AES (though just how many was a topic for debate – did the online polling really capture a representative sample of conference goers?), and Launceston provided a lovely backdrop for considered conversations about professionalisation, innovation, advocacy and the responsibilities of evaluators.

Our team enjoyed learning from peers and presenting on a range of subjects – including codesign, participatory approaches, strengthening program impacts on systems and building evaluation systems, evolving the evaluation deliverable, leveraging public datasets, and campaign evaluation.

Here’s what our team is taking from #aes18LST.

Sue Leahy, Managing Director

Penny Hagen’s keynote and workshop highlighted really different ways of working with community to achieve social change, bringing together shared skillsets of design and evaluation. We need to change the way we think about resourcing this work, taking long-term approaches – with evaluators involved at the community level as partners and critical friends or mentors. Sharon Gollan and Kathleen Stacey’s keynote also provided an important and clear call to action to name practices that exclude Aboriginal people from having voice in how evaluation is done.

Andrew Hawkins, Partner

My highlight was meeting up with lots of old friends for our annual catch-up on new developments in the field. It was also great to hear that Gill Westhorp – an intellectual giant and major contributor to evaluation – was inducted into the hall of fame as an AES Fellow. It should make all of us in the evaluation society proud!

Jade Maloney, Partner

I enjoyed being challenged by Michael Quinn Patton to transform evaluation. I’ve been using his principles-focused evaluation and drawing on systems thinking, and am now working through how to integrate this thinking into daily practice. I also took Sharon Gollan and Kathleen Stacey’s call to action on ensuring cultural safety and respect in evaluation to heart. And it was great to engage in conversations about an advocacy strategy and pathways to professionalisation. These are important conversations for the future of evaluation and the AES, so I hope the conference continues to provide a forum for engagement.

Ken Fullerton, Consultant

Attending my first AES conference was an inspiring experience and opened my eyes to new approaches and innovative evaluators. My favourite speakers included Michael Quinn Patton, who provided the opening keynote and spoke of the butterfly as a real-world example of transformation and a model for evaluators to aspire to in their professional work. Anne Markiewicz’s interactive session on ethical dilemmas in evaluation practice challenged me to think about the role of evaluators – when, and to what extent, an evaluator can or should step in and act when they believe something to be unethical.

Kerry Hart, Senior Consultant

I loved the Ignite sessions – succinct summaries of good work and lessons learnt. I enjoyed hearing from colleagues working in domestic violence and mental health, and those working with peer researchers. I’m also going to check out techniques like chatterbox, and using photos and drawings to feed back transcripts to people who have literacy issues.

David Wakelin, Senior Consultant

Gerard Atkinson reminded us that leveraging available open data will be an increasingly powerful tool for evaluators. Kristy Hornby’s views on machine learning in evaluation resonated with my own studies: machine learning has a role to play, but before we jump in we must consider the implications of letting predictive modelling algorithms make decisions that affect people who may be vulnerable. It’s also important that we monitor the outputs of these algorithms to ensure they meet our ethical responsibilities as evaluators. And we can never afford to forget people’s experience in and of the programs we are evaluating, even if it’s easy to get lost in quantitative data from time to time.

The conference also highlighted that truth is integral to data visualisation. Making visualisations easy to understand and immediately actionable is extremely beneficial, especially for real-time monitoring feedback. Jenny Riley’s session on Outcomes, dashboards and cupcakes was a nice example: https://aes18.sched.com/event/Er90/outcomes-dashboards-and-cupcakes.

Rachel, Senior Consultant

The conference was a great opportunity to listen – to the ideas colleagues and clients are exploring, and to what they’re struggling with – and to get the lay of the land.

Gill Westhorp introduced realist axiology – the philosophy of value and valuing – which she believes the next development in realist evaluation needs to be based on. She laid out her grounding perspectives on defining value in realist ontology as her next area of focus.

Gerard, Manager

My highlight was seeing Nan Wehipeihana, Judy Oakden, Julian King and Kate McKegg’s presentation on evaluative rubrics. The seminar room was so packed that people spilled out the door. Those who made it into the session got a clear and usable introduction to a rubric-based approach to evaluation, followed by a lively and honest discussion of implementing rubrics in practice. Judging by the interest and positive response, I suspect we’ll be seeing more of this approach in future.

Jasper, Senior Consultant

The highlight for me was learning about Feminist Participatory Action Research from Tracy McDiarmid and Alejandra Pineda from the International Women’s Development Agency. I liked how it explored methods focused on empowering participants in evaluation and balancing power dynamics, including through role play.

AES 2019

We’re all looking forward to next year’s conference in Sydney, exploring the theme Evaluation: un-boxed. We’ll be cracking open evaluation to end users and opening up evaluators to what can be learned from the communities they work with.
