Welcome to the AES Blog
AES FestEVAL - Celebrating and Challenging Evaluation!
by Jade Maloney and Lia Oliver
This has been a challenging year, with Covid-19 impacting our home, social and working lives. In a recent survey run by the AES, three-quarters of respondents said their work has been impacted at least moderately – by changing scope and timelines and, in some cases, cancelled contracts. And that’s before we get to the impact Covid-19 has had on our connections and wellbeing.
The Australian Evaluation Society’s (AES) FestEVAL provided an opportunity to take time out, celebrate evaluation and connect. It began with four provocations for evaluation in our times, designed to seed conversations that would continue throughout the week.
- Simon Kuestenmacher: Pay attention to our changing demographics
Australia’s demographics are changing.
Simon Kuestenmacher noted that the Australian workforce has been ‘hollowed out’: skill level three jobs (broadly, middle-class jobs) grew by only 1% from 2011 to 2016, leaving the workforce in a ‘U shape’ with the wealthy and the poor at opposite ends. There is also a ‘primary carer gap’: women who choose to become mothers earn less than their male counterparts across the working life cycle.
More broadly, there is potential for the recent pandemic to drive population movements from cities to regional centres. This would be underpinned by increasing opportunities to work from home and build on regional cities’ strengths of agriculture, lower land prices and local manufacturing.
These changing Australian demographics have implications for public policy and evaluation. Every policy and program we evaluate is dealing with inequity. This means we need to be conscious of how a program’s target audience is conceptualised and how the program is meeting the needs of diverse clients.
- Nicole Tujague: Cede the power
Nicole Tujague, picking up the threads of inequity Simon identified, challenged us to cede the power – to enable Indigenous people to lead evaluations of programs and policies for Indigenous communities, to ensure they are telling the story, and to question who decides what counts as evidence. As Nicole pointed out, Indigenous people ‘have a knowledge system that has used sophisticated evaluation for over 50,000 years’.
Don Bemrose held a space for evaluators to reflect on this challenge, as well as to consider the related issues of data sovereignty, colonialism and de-colonisation, and strengths versus deficit approaches. In these sessions, evaluators recognised ways we can do better.
The Indigenous Evaluation Strategy roundtable offered practical insights into how we, as a profession, can act on these challenges.
- Eleanor Williams: Speed up to stay relevant
Our standard evaluation approaches often take a long time to deliver findings.
Eleanor Williams argued that increasing the speed of evaluation makes findings more relevant and gives them a real chance to influence policy or program design. Speed and utilisation are linked, because ‘just-in-time’ decisions can be critical. Quicker evaluations are particularly useful in the Covid-19 context, when interventions are expected to have an impact in the short term, and when you need to see whether a policy or program is on track to deliver outcomes.
To achieve speed, consider running parallel processes, using shorter surveys and fewer data sources and synthesising data while data collection is underway. Think about what questions need to be answered through reports, and consider shorter, more frequent reporting. Have a team of experienced qualitative and quantitative analysts as there isn’t time to test and refine approaches.
On the other hand, Sean Chung, in the debate session, argued there are reasons to go slower:
- some initiatives, like collaborative ones, take longer to establish and demonstrate change
- time is needed to build relationships, for example when working with Indigenous communities and culturally diverse communities
- when an evaluation is going to inform a large investment or one that will affect a lot of people, credibility is important, and this may be compromised if we move too fast
- with time to mull over the data, we can deepen the insights from our analysis.
The participant poll showed we generally think there is room to go faster: 79% of respondents agreed. In the chat, however, most comments reflected a need to balance fast and slow, and to find the right kind of evaluation for the challenges we are facing.
- Michael Quinn Patton: Think in systems
Lastly, as Michael Quinn Patton put it, we need to adapt to our new normal: to recognise the influence of the global on the local, be adaptive as conditions and initiatives change, and move beyond evaluating projects to evaluating systems.
This new direction will include evaluating mission fulfillment, strategies, advocacy campaigns and systems change. This will require real-time and developmental approaches to evaluation.
Brian Keogh followed this up with a deep dive into systems thinking, in which he suggested we need to think about our purpose before we draw our systems maps, be conscious about where we draw our boundaries, and identify levers of social change that can have the most impact.
Underlying Michael’s provocation was also a challenge to acknowledge our position as evaluators. We have skin in the game. We are not bystanders, particularly in the context of Covid-19, but also the climate emergency and Black Lives Matter. We need to acknowledge this.
To make these changes will take different skills. The new evaluator competency self-assessment tool (based on the AES evaluator competency framework) will let us check how we are tracking. The AES also has a new mentoring initiative that will enable people to tap into the wisdom of AES fellows and learn with peers.
And there will be plenty of opportunities to keep connections and conversations going with evaluation colleagues through AES networking events and plans to extend the FestEVAL club into a regular event.
FestEVAL provided us with the opportunity to celebrate evaluation and to challenge ourselves. As we consider these discussions and their implications for our work, we need to remain connected. Being connected will allow us to share our ideas and remain an innovative evaluation community.
Jade is a Partner and Managing Director of ARTD. She works with government agencies, not-for-profits and citizens to co-design, communicate and evaluate social policies, regulatory systems and programs.
Lia is a Consultant at ARTD. She enjoys working with government and the community to understand how they can improve the outcomes of services and programs.