Join us to un-box evaluation #aes19SYD

#aes19SYD is all about un-boxing evaluation so it can fulfil its potential. Not sure what this means? Don’t fret.

Our staff, along with colleagues in the field, have put together a range of presentations – from 5-minute ignite sessions and 30-minute short papers to 1-hour interactive sessions – to un-box the conference theme.

What’s in the box? – Using theory, creating value

The Consolations of Evaluation Theory – Brad Astbury and Andrew Hawkins: this short paper will argue for the value of evaluation theory (and inspire you to rush home and read Foundations of Program Evaluation: Theories of Practice among others) and present a conceptual map for drawing on different theorists.

The role of evaluation in social impact bonds – Sue Leahy, Ruby Leahy Gatfield and Claudia Lennon (The Benevolent Society): this short paper reflects on a five-year evaluation of the first social impact bond to mature in Australia, describing some of the challenges for evaluation in a bond context and the key benefits of evaluation in identifying lessons and improvements for both the program and the bond mechanism itself.

Using Program Design Logic to manage the risk of program failure – Andrew Hawkins: this short paper is about identifying, managing and mitigating the risk that a program will not produce its intended effects by using a program design logic approach.

Who should hold the box? – Questioning power, exploring diversity

Beyond co-design to co-evaluation: Reflections on collaborating with consumer researchers – Rachael Aston, Amber Provenzano and Amelia Walters (consumer researcher): this short paper discusses how to practically support consumer researchers in evaluation to contribute their lived experience, to further develop their professional skills, and to foster greater ownership of evaluation for the community.

Harnessing the power of co – practical tips – Jade Maloney and Alexandra Lorigan: this short paper draws on projects with organisations working with people with autism, dementia, psychosocial disability and intellectual disability to provide practical ideas for meaningfully engaging people with lived experience in all phases of an evaluation (including options for when you have years versus weeks or days).

Disrupting power dynamics and bringing diverse voices to evaluation – Jade Maloney: this interactive session – drawing on a case study evaluation of a co-designed, co-delivered community engagement program – will use creative techniques to enable evaluators to first embody the power dynamics involved in several evaluation scenarios, and then explore ways to disrupt these (to influence who is at the table, welcome and give space to diverse voices, and balance competing perspectives).

Unpacking the competencies – in theory and practice – Sue Leahy, Amy Gullickson (University of Melbourne – Centre for Program Evaluation) and Delyth Lloyd (Department of Health and Human Services): this short paper will report on findings from recent research with the AES community that was undertaken to update the competency set outlined in the AES Professional Learning Competency Framework, as well as recent theoretical work in this area, and discuss next steps.

What’s beyond the box? – Welcoming innovation, embracing disruption

A Primer on Using Qualitative Comparative Analysis (QCA) in Evaluation – Brad Astbury: this short paper reports on the use, benefits and challenges of QCA in the context of a study that sought to identify different pathways of conditions leading to sustainability of demonstration projects, and provides advice on using this approach in the context of theoretical and case-specific knowledge.

What the arts can teach evaluators – Gerard Atkinson: this interactive presentation invites evaluators to engage with a series of artistic provocations designed to promote discussion and reflection on the practice of evaluation – challenging assumptions about what we value, opening up new ways of looking at problems, and highlighting the diversity of perspectives that we and those we work with bring.

Integrating Behavioural Insights into Evaluation – Jack Cassidy and Georgia Marett: this short paper shares insights into how behavioural economics and Behavioural Insights (BI) are used in program and service design and explores ways in which evaluation can and should take BI into account.

Machine-assisted qualitative analysis in Evaluation – Georgia Marett, Jasper Odgers and David Wakelin: this short paper explains how Natural Language Processing (NLP) (a form of machine-assisted qualitative analysis) can be used to reduce time and costs associated with qualitative analysis, and discusses the ethics, limitations and future directions for this approach.

Lessons from the dark side: How corporates do client experience – Emily Verstege: this ignite presentation un-boxes client experience for evaluators, with anecdotes from the “dark side” (aka the corporate sector).

How do we stack up? – Building skills, growing the profession

Making the numbers count: Being evaluation ready for administrative data analysis – Fiona Christian and David Wakelin: this short paper provides practical advice for evaluators and evaluation commissioners on what’s needed for administrative data to more effectively and efficiently support evaluations and contribute to stronger findings and recommendations.

Design tips for visualising your data – David Wakelin: this ignite presentation will share simple design tips to instil clarity in the visualisations you design to help your audience see what you see, know what you know, understand your message and turn evidence into action.

AES sessions

Bringing the voice and knowledge of Indigenous people and communities to evidence building and evaluation that empowers – Simon Jordan: this panel will examine what evaluation and evidence means in an Aboriginal and Torres Strait Islander context and how culture and knowledge systems can inform concepts of evaluation, and discuss real-world suggestions for supporting communities and those who work with them to own their evaluations.

Peer Assessment as a step toward professionalisation – Sue Leahy, Helen Simons FAcSS FRSA (University of Southampton) and Delyth Lloyd (Department of Health and Human Services): reflecting the growing conversation around the professionalisation of the evaluation sector, this interactive session showcases the United Kingdom Evaluation Society’s experience of piloting its Voluntary Evaluator Peer Review System. It will feature a pre-recorded Q&A with the convenor of the Professionalisation subgroup, followed by a facilitated Q&A in which participants can explore the implementation of the UK peer assessment process.

Finding your voice: sharing your knowledge and elevating evaluation through social media, blogging and the Evaluation Journal of Australasia – Jade Maloney, Liz Gould (NSW Department of Education), Carol Quadrelli (Self-employed), Bronwyn Rossingh (Tiwi Island Training and Employment) and Eunice Sotelo (Australian Institute for Teaching and School Leadership): this interactive session delivered by editors of the EJA and the AES blog aims to support evaluators to find and amplify their voice to elevate evaluation. The editors will share tips and then turn it over to participants to ask questions, pitch ideas and find partners to collaborate with.

Many staff at ARTD will attempt to un-box evaluation in a number of exciting ways. Personally, as someone who is passionate about elevating the voice of lived experience in evaluation, I am particularly looking forward to presentations that explore the potential for diverse perspectives in evaluation, including David Fetterman’s workshop on empowerment evaluation.

For more detail on other presentations, check out the detailed draft conference program. We hope to see you all there!
