Our highlights from the aes19 conference
Swept up in the busyness of everyday life, we can find it difficult to integrate what we learn at conferences into our regular practice. The AES NSW Professional Learning session, held in late October, was an opportunity for members to reflect on what they learnt at the aes19 International Evaluation Conference (aes19) and how they might integrate these learnings into their everyday practice. For me, what stood out was how different people and communities engage with one another: in particular, how we can best engage with evaluation participants, vulnerable communities and other evaluation professionals.
Having a cultural understanding of communities
Tracy Westerman’s keynote address, ‘Without measurability there is no accountability’, was a big highlight for many who attended aes19. What resonated with many of us, including Sophie Duxson, was Tracy’s emphasis on evaluators having a deep cultural understanding of the communities they wish to engage with, as well as being highly competent in their work.
Members shared some great examples of this in practice. Holly Kovac, from ARTD, explained how she was inspired to use Tracy’s saying ‘Let’s PREVENT the gap, rather than CLOSE the Gap – with optimism’ in a recent university presentation when discussing the design of an early intervention program for alcohol abuse among Indigenous people.
Engaging and understanding vulnerable populations
One of the conference themes that struck a chord with many members was ensuring vulnerable people and communities are fully engaged in designing and conducting evaluations.
Members mentioned Kylie-Evans Locke’s presentation, for example, which described an innovative body mapping approach in which children draw an outline of themselves and write feedback or draw pictures inside it. Dr Suzanne Evas and Antoinette Bonaguro’s presentation described how they have used Jenga blocks with key messages or questions written on them to facilitate discussions and enable the voices of vulnerable children to be heard.
AES members also heard the message about the need for trauma-informed practice, discussed by Claire Grealy and Christina Bagot in their presentation. We discussed how a person’s position on the survival and recovery spectrum has implications for their mental health state, and how evaluators may get different answers to the same questions depending on when they ask them. For example, a person recently affected by a bushfire who has received some immediate care and support might respond differently from a person who was affected a year earlier and has got through the immediate survival stage but is still trying to rebuild their destroyed home.
Of special interest to me was hearing about Takara Tsuzaki’s presentation and how she used a case study of the Rural Generalist Program in Japan to highlight different understandings and interpretations of evaluation across cultures. She pointed out that the word ‘evaluate’ in Japanese literally translates to ‘judgement’, and that there is no other Japanese word that captures all the subtleties the word ‘evaluate’ does in English. It was fascinating to learn of the lengths Takara went to in order to engage participants and gather qualitative data. To help gain their trust and respect, she made use of personalised YouTube videos delivered in both English and Japanese, regular Skype calls, and regular updates and sharing of interim results. A ‘data party’, a concept promoted by Kylie Hutchinson, is being arranged for 2020 to celebrate the evaluation and give all participants an opportunity to receive and discuss the findings together.
Using rubrics in evaluation
The use of rubrics for evaluating complex initiatives with multiple programs, communities and/or stakeholders was heavily promoted at aes19. Some members really engaged with Jane Davidson’s keynote address and the ‘Rubrics – a tool for unboxing evaluative reasoning’ presentation by members of the Kinnect Group in New Zealand.
AES members are keen to use rubrics more for a range of reasons. It was suggested that rubrics could be a very powerful tool for people new to the evaluation field, while aes19 co-convenor Ben Barnes felt there was an increased level of comfort amongst evaluators with showing their workings for how decisions or recommendations are made and allowing themselves to be questioned. Others suggested that rubrics are beneficial because they make evaluation criteria and standards explicit, are set out in plain English, and can make use of a wide range of data sources (both qualitative and quantitative).
Engaging conference participants
aes19 adopted some great approaches to driving participant engagement. Sli.do, a tool for crowdsourcing audience questions, was integrated into the aes19SYD program, and there were differing views about its use. AES members noted some challenges in using the tool, including whether users should be anonymous, how submitted questions might be filtered so that appropriate and relevant questions are displayed on screen, and how having to engage with technology can distract from listening to the presenter(s). Others felt it was a useful tool for engaging with presenters, and for ensuring that questions asked were short and straight to the point.
Another approach to engagement was the ‘un-conference’ agenda. According to aes19 co-convenor Jade Maloney, its purpose was to enable participants to have deep conversations about topics of interest to them. It will be great to see how these and other mechanisms for engaging conference participants are included in the aes20 conference in Brisbane.