
This is the AES Blog, where we regularly post articles by the Australasian evaluation community on the subjects that matter to us. If you have an idea, please contact us by email. Blog guidelines can be found here.



COVID-19 statement

June 2020
by AES Relationships Committee

The changing context
The global scale and speed of disruption caused by the COVID-19 pandemic is unprecedented in our lifetimes. The pathway to recovery and management of COVID-19 is expected to be complex and challenging, with significant long-term implications for individuals, organisations, governments and the country.

The coordinated national response in Australia has so far been successful because the best available data and evidence have significantly influenced decision-making. The evidence-informed approach that has served us well to date remains equally critical going forward.

During the pandemic, many public sector initiatives and supports have been designed, adjusted or expanded to assist individuals, households and businesses to survive and adapt. Some services have been interrupted or halted. As restrictions lift, consideration will need to be given to which adjustments are maintained.

The AES considers that sound data collection and analysis should be built into the establishment of any new or adapted initiatives to maximise the value of evaluation. Evaluation can also support the development of new initiatives and support service redesign activities.

Evidence and evaluation play an important role
Evaluation – and evaluative thinking – remains central in offering systematic review of new and changing initiatives and in pre-empting potential unintended consequences. It can be undertaken across the policy and program life-cycle to:

  • Ensure clarity of purpose, objectives and alignment of values
  • Assist with monitoring progress and meeting reporting requirements
  • Identify immediate improvement opportunities
  • Understand impact and its drivers, including for different cohorts
  • Understand how design and operation influence impact in different contexts
  • Support good governance, sound decision-making and smart resource allocation
  • Promote knowledge transfer and capability development.

Evaluators are adapting their approaches
To effectively deliver on existing work, evaluators have adapted their approaches to meet physical distancing requirements. Although service clients and stakeholders may be harder to reach, digital platforms are enabling connections across traditional geographic and social boundaries.

Evaluators are able to continue their work by:

  • Reassessing objectives: Updating evaluation objectives to ensure they remain useful
  • Shifting phasing: Changing delivery timeframes and milestones
  • Adapting design: Shifting design, methods and data collection to achieve the evaluation’s objectives
  • Appropriately engaging stakeholders: Considering how COVID-19 is affecting key stakeholders and adapting engagement methods appropriately
  • Contextualising findings: Interpreting data and forming findings based on contextualised information across different phases of the crisis (e.g. the response and recovery phases).

The AES recommends that monitoring, evaluation and evidence continue wherever possible to support post-pandemic recovery and review.

This statement has been prepared by AES members for AES members to support discussions about why evaluation has particular relevance and value during the pandemic, and how evaluations may be adapted.

Download as a PDF.

Fellows: Zita Unger

May 2020
by Anthea Rutter

Zita has been in the evaluation profession for over 26 years and has held a number of roles in that time: evaluation lead, consultant and lecturer in evaluation. She was introduced as a Fellow in 2013 at the International Evaluation Conference held in Brisbane.

People come into the evaluation profession through a number of routes, so I was interested to find out from Zita what brought her into the field of evaluation. Her answer echoed the experiences of a large number of evaluators.

I fell into it! Happenstance played a role. I was contracted by Griffith University to write a critique of their environmental educational materials following my PhD dissertation. While also undertaking an instructional design course (incorporating program evaluation), I wrote a module for a Griffith University teacher education project, Teaching for a Sustainable World, which included an evaluation instrument for resource materials. In a roundabout way, this led to lecturing in evaluation to graduate Instructional Design students at Deakin University.

All the evaluators I have talked to have a wide range of evaluation expertise and interests. Zita's answer was straight to the point!

Governance and survey design. In all of that work, the front-end design was always my interest. A lot of the work you do at the back end – the reporting end – is dependent upon how you developed the front end. Another of my interests is organisational development and 360-degree feedback. They all feed into each other.

Most of us have been challenged during our careers: indeed, you could say that overcoming challenges helps us to grow as professionals. Zita shares her own lessons, plus sound advice on pricing evaluations.

The first one is that people often do not know what evaluation is. Capacity building has been a large part of any consultancy I do, so that people not only understand the process but also come to champion evaluation.

Another challenge is with organisational development. When undertaking a strategic review for a medical institution, I was struck by the fact that consultants within organisational development were using similar techniques to evaluation techniques. 

Pricing is always a big challenge. When I had my evaluation consultancy, I was asked to give a presentation at the AEA [American Evaluation Association]. My presentation was on pricing evaluation, “the tail wagging the dog”. I talked about program budgets, workflow, the project activities pipeline and cost-benefit (plus other areas). Attention to pricing can position evaluation and our services.

Another challenge is having the confidence to talk about cost in the early stages of a career. Of course, clients really expect that conversation. Towards the end, I put in several line items; one of them covered meta-evaluation, review, thinking time and so on. You need the confidence and experience to do this. If younger evaluators can shadow older, more experienced ones, it would give them some confidence. You need to have the budget talk. It is important to match each other’s expectations – it makes for a happier evaluation than overpricing and not delivering.

Alongside the challenges, a long successful career has its highlights.

On the work front, Zita talked about a major evaluation in the tertiary sector. For her, this was particularly enjoyable as it covered a large number of issues and areas: evaluation of their project writing; implementation; and an institutional review of each of the universities. 

Another highlight was being awarded the Evaluation Training and Service Award – a co-award – as well as the Evaluation Development Award, for an online survey management system. An outstanding highlight was becoming a Fellow. It says so much about peer recognition and the huge amount of work we put in as evaluators. This recognition brings it all together.

In common with many of the Fellows’ interviews, there are a number of influences defining her practice, and Zita cited a few.

John Owen’s methods book [Program Evaluation, Forms and Approaches] was very influential. I was asked to write a critical review of it, so I became familiar with it. Then I was very struck by Patricia Rogers when I heard her speak on logic modelling. It was at the AEA. The theme was truth, beauty and justice, based on Ernie House’s book [Evaluating with Validity]. I was very impressed, talking about logic modelling in terms of truth, beauty, justice etc. so then I looked at logic modelling in a different way. Also, the AEA big names were very influential, especially in capacity building, the GAO [US Government Accountability Office], and Ray Rist from World Bank – he talked about evaluation capacity to strengthen governance. He referred to a supply and demand model which talked about institutional, financial, human, and technical capital. I felt it was very important for my practice.

Like many professions, evaluation has gone through many changes over the years, and these changes mean different things to different people. I asked Zita what she saw as the major changes to the profession.

Zita responded that there is now more focus on cultural sensitivity and Indigenous evaluation, and more focus on evaluative thinking. She also felt that there is a greater emphasis on process than there used to be: the focus has shifted over time, with less attention to performance and more to process matters. As well, evaluations now have a greater focus on impact. However, she feels that we should always take into account the latest trends, as they change your own thinking over time.

To keep pace with the emerging trends in evaluation practice, I was curious to find out what Zita felt were the main skills or competencies that evaluators need to have or develop.

People need to be open to a range of methodologies. You might get comfortable with a range of methods, but you need to be open to different ways of doing things. You need variety, and to try to think about something new and see what that means for your toolbox. Be reflective as you make it more diverse.

We also discussed what she saw as the main social issues that evaluators ought to be thinking about and seeking to resolve in the next decade.

The conversation pointed to a desire for accreditation. The professional development the AES runs has no standard. The government needs to say that we are a professional body which provides professional training and competency which is recognised by someone – we need to be an accredited society.

Zita has been involved with the society in a number of roles: member of the awards, ethics and standards committees, and member and presenter for the AES Victorian branch. She was also on the AES Conference Committee for the Melbourne conference (twice). I felt she would be in a good position to ponder the direction which the AES should take in the future.

Zita reiterated her desire for an accredited AES. She also felt that the AES should be the go-to source for any media comment on evaluation. She would also like to see a change in university departments that teach evaluation within their own discipline without reference to the AES.

--------------------------

Zita Unger is an independent director on various boards and has several governance roles. Her main interests are in the areas of governance, strategy and evaluation.


 

Evaluation adaptation through COVID-19

June 2020
by Eleanor Williams

COVID-19 has, for many, been a time of adaptation and creation of a new sense of normality. As we move away, gratefully, from local crisis management, we have the opportunity to reflect not only on our own resilience through this time, but also on what we have learned and how we have adapted through adversity.

Eleanor Williams from the Centre for Evaluation and Research Evidence, Victorian Department of Health and Human Services, and the Australian Public Sector Evaluation Network shares her reflections on evaluation adaptation through COVID-19.

COVID-19 has brought about strange new ways of working.  Evaluation teams across Victoria, including ours, are adapting to working remotely, away from colleagues, comfort zones and familiar professional places and spaces.  As we adjust to navigating these new evaluation environments and experiences, I have really valued the opportunity to share perspectives with others through blogs, online forums and webinars. 

I’ve been reflecting on a number of challenging questions, many of which centre on how evaluation can provide maximum value when key systemic and organisational decisions must be made quickly and reactively in response to rapidly emerging and changing situations.

COVID-19 forces big policy questions onto the table as we all try to identify how this crisis impacts our services; how it can be best mitigated; what can stay the same and what needs to swiftly evolve to meet projected changes in patterns of need across health and human services.

In our context, evaluators are trying to work out how best to adapt to the changes happening around us. Should we be holding steady and proceeding with existing project work? Can we feasibly do this? Are the agencies we are working with able to focus on evaluation at the moment? Or do we adapt our focus instead to new knowledge priorities and make space for emerging demands?

Our team has had to cope with losing staff to emergency response secondments, bringing about a sudden re-prioritisation of projects. At the same time, new COVID-related evaluation requests are coming in daily, and our remaining team members are running a series of rapid evaluations to provide fast evidence to decision-makers about the impact of service and practice changes that have emerged during COVID-19.

It has been a balancing act of trying to retain the integrity of longer-term and larger evaluations while downscaling to manage additional demands and depleted team numbers. And we ask ourselves, can we really deliver robust and insightful findings within extra-tight timeframes and in a rapidly changing landscape? And cutting across it all, how can we hold onto our reflective practice principles and use these times of challenge and change as learning opportunities?

Adapting our practice

Rapid but thoughtful adaptation has been key to the resilience required to navigate these times. Not only have we had to find ways to work without the face-to-face engagement that has always been central to evaluation capacity building, data collection and delivery of findings, we have also had to adapt individually to fast changes in our team members’ roles and availability.

It is testimony to the versatile skillsets of evaluators that our team members have been deployed not only into rapid evidence reviews for public health emergency responses, but also into front-line data collection, working phone lines to provide emergency information to the public, and even acting as concierges for hotels being used for quarantine purposes.

In parallel, team members learnt how to run program logic and investment logic mapping sessions through Microsoft Teams, becoming, like so many, overnight experts in video-conferencing and telephone interviews. And alongside this, we shared the challenges of the world’s professionals now forced to work in shared spaces with partners, children and pets, with laptops propped up on books at kitchen tables or in cramped bedrooms in share houses.

What will we take forward?

As life returns to “new normal” in the coming months we will be alongside the rest of our communities in dealing with the backlog of work and life that had to be put aside during the crisis – at the same time as rising to the new demands of 2020-21’s recovery phase. 

Breaks in data collection and unforeseen changes in the ways that human services are used in the community will pose challenges for reliable evaluation findings. In particular, it will be difficult to separate the internal effectiveness of programs and projects from the impact of COVID-19.

While there will be no easy answers, there are some emerging upsides as well. The world has discovered that it is possible to be less reliant on face-to-face engagement, which opens up the opportunity for major efficiencies in the time and resources devoted to logistical coordination of face-to-face focus groups, workshops and interviews. As evaluators, we can start to imagine a life with less time and money spent on function rooms, hire cars and accommodation. Information could be only a click away through effective use of Zoom, Skype, Teams or any number of online platforms. These experiences can show us that distance need not be an obstacle to engagement with stakeholders across state, country and even the world.

At the newly formed Australian Public Sector Evaluation Network (APSEN), a recently endorsed Special Interest Group of the Australasian Evaluation Society, we look forward to continuing to provide a space for these discussions. An online presence and official email address are coming soon, but anyone who would like more information about the group in the meantime can contact the AES.

It is these conversations, whether facilitated by AES or in our daily lives, that will support our profession to adapt and deliver the best possible evidence and insights in the emerging new environment.


 

May 2020
by Jade Maloney

Over the last couple of months, evaluators around the world have been grappling with the question of whether and how we evaluate in the COVID-19 context. What can and should be done now, and what should wait? How can we be most useful?

For a recent online session with AES members, which Keren Winterford, Greg Masters and I hosted on behalf of the NSW Committee, I rounded up a range of reflections on these questions to prompt discussion.

We need to consider carefully whether to pause or press on

Evaluation is the oxygen that powers decision making. Too little and we are likely to make poor decisions. And when faced with big challenges, we need more than usual. Too much evaluation without action leads to hyperventilation. Analysis paralysis. As an evaluator, it is your responsibility to keep the breathing steady. [Chris Lysy]

To decide whether to pause or press on with our existing evaluations, we need to ask ourselves a series of questions.

Can it be done without undue stress to an organisation responding to COVID-19? Even at the best of times, evaluation can be anxiety-inducing; does the organisation or team have the bandwidth to engage?

Can the evaluation still be useful now? Can you adapt your questions? Can you assess how COVID-19 adaptations are working? Can you help to identify what adjustments should continue post COVID-19?

Can you adapt your methods to comply with physical distancing? Will the people you are trying to engage, engage online? Can you draw on existing data sources?

The World Bank’s Adapting Evaluation Designs sets out four questions that you can adapt to work through whether to press on. BetterEvaluation has also begun a series on Adapting evaluation in the time of COVID-19. Part 1: MANAGE has a range of useful prompts to help you work through changes to stakeholder context, engagement, decision-making protocols, information needs, Terms of Reference and budgeting.

Think beyond individual “evaluations” to tap into our value

I think one of the key gaps or aspects I don’t see addressed much is around utility of evaluation in this space. A lot of the discussion online is around the ‘how’ – how do we adapt evaluation? But I feel a deeper question is around the ‘why’ of evaluation. Why is it still important to do evaluation in this context? Is it actually important to making a difference? This is quite a tricky question and one that can make an evaluator really uncomfortable as it forces us to reconsider our work. But, on the contrary, I see this as an opportunity to reinforce our conviction, sense of purpose and clarity. Evaluation was already often an after-thought and now urgent customer-facing delivery initiatives are definitely taking priority. The case for evaluation will be harder to make. We need to genuinely think about the value evaluation can bring in these times and more broadly. [Florent Gomez, NSW Department of Customer Service]

As Michael Quinn Patton has said, we need to be prepared to make a case for the value of evaluation now. We can do this by proactively demonstrating the ongoing relevance of evaluative thinking, supporting real-time sensemaking of data, engaging in systems thinking (identifying the interconnections and their implications), enabling decision-making based on “good enough” data, and identifying the potential for negative unintended consequences so they can be prevented. In other words, “All evaluators must now become developmental evaluators, capable of adapting to complex dynamic systems, preparing for the unknown, for uncertainties, turbulence, lack of control, nonlinearities, and for emergence of the unexpected.”

For guidance on sense-making in real time, check out Canadian facilitator Chris Corrigan’s blog. First, observe the situation. Then look for patterns and inquire into these. What do you notice in general? What are the exceptions to these generalisations? The contradictions? The surprises? What are you curious about? Then, using complexity concepts, look at what is keeping the patterns you have identified in place and the actionable insights that could enable change.

Sense making in real time 

My team at ARTD have also developed the 3 R Framework as a tool for using evaluative thinking under pressure. It is based around questions because, in our experience, being an evaluator is about asking effective questions at the right time, not about having all the answers. You can use the framework to direct responses at an organisational, team, program or individual level. If you’re applying it within your organisation, team or to a program, we suggest getting a diverse group of people together to reflect, drawing on existing data, stories and experiences to ensure you are not missing critical insights as you make decisions.

3 R Framework 

While being useful right now, we can also keep our eye on the long game – what data needs to be collected now to enable evaluation of pandemic responses?

Think through the implications of your choices

Among evaluators I have spoken to around Australia and overseas, there is strong concern about the equity implications of changes. It is important we recognise the differential impacts of the crisis, consider accessibility when adapting our methods, and consider whose voices are missed if we draw only on existing data.

We also need to be as purposeful in choosing our online methods as we are in planning methods generally. Not everything has to become a Zoom session. Asynchronous online methods (contributing at different times) bring different benefits and drawbacks compared with synchronous online methods (contributing at the same time).

Remember: not everything is changing and some things should

One of the things I have found most useful in this time is my colleague Emily Verstege’s reminder (with reference to Kieran Flanagan and Dan Gregory’s Forever Skills) that, while many things are changing, including how we evaluate, what is at the core of evaluation is not. We can take comfort in this, as well as in the potential to change things that need changing.

One of the benefits of taking our regular AES session online was the ability to engage with members in regional areas and other jurisdictions. It’s something the committee is already thinking about continuing when physical distancing rules are relaxed.

I have been most struck by the validity of that old maxim that necessity is the mother of invention. In many areas of work and life, more fundamental change has occurred in the last few weeks than in previous years, despite the relentless urging for innovation. Witness working from home arrangements, expansion of telehealth services, online delivery of educational programs.

Hopefully, one of the legacies of this awful crisis is that some of these new practices become ingrained and that we become more comfortable challenging the status quo and traditional modes of operation. Returning to normal is neither feasible nor desirable. Evaluators have a large role to play in leading that campaign but we also need to challenge our existing practices. [Greg Masters, Nexus Management Consulting and NSW AES Committee member]

If you have ideas for further AES blogs, the AES Blog Working Group would be keen to hear them. Please complete the online form below.

 

-------------------------- 

Jade Maloney is a Partner and Managing Director of ARTD Consultants, which specialises in program evaluation.