
This is the AES Blog, where we regularly post articles by the Australasian evaluation community on the subjects that matter to us. If you have an idea, please contact us. Blog guidelines can be found here.




November 2019
by Florent Gomez

Have you ever tried to grow evaluation capacity across your organisation? And to do it with very limited resources?

At the recent AES International Evaluation Conference in Sydney, I shared some learnings from our successful Evaluation Community of Practice in the NSW Department of Customer Service (previously NSW Department of Finance) and other soft approaches to evaluation capacity building we are using in our department.

When I started as an internal evaluator with the department in February 2017, I quickly realised that, unlike other government departments such as those in community services, health or education, we didn’t have an established central evaluation unit, only pockets of evaluation capacity here and there. However, there was definitely a need and appetite to learn more about evaluation across the department!

This is why we decided to put together an Evaluation Community of Practice at the department level. The Community of Practice is an open and informal forum allowing staff from different roles and with varying levels of evaluation experience to share and learn about good evaluation practices. Quarterly events are the main component, supported by a Yammer group (corporate social media platform) that we use as a blog to keep the community engaged between events, and an Intranet page with key resources, templates and presentations from the events.

After one-and-a-half years of existence, we decided to evaluate ourselves and see how we were travelling in terms of building evaluation capacity across the department. We refined our intended outcomes by developing a detailed program logic. The evidence we gathered for the different outcome levels showed that this low-cost approach effectively contributed to raising evaluation awareness and capability across the department. Our evaluation showed that:

  • A wide range of people participate in the community and continue to do so (an average of 56 participants attend each event and there are 127 members of the Yammer group).
  • Participants learn some good evaluation practices and report applying these back in their workplace (42% already do and a further 56% probably will).
  • There is also some evidence that the quality of evaluation deliverables produced within the department increased with, for instance, a more frequent use of program logic.
  • For the first time, the Department was represented at the AES conference!

As one key stakeholder summed up: the Evaluation Community of Practice collectively played the role of the evaluation centre of excellence the department didn’t have.

The key success factors we identified could be applied in other organisations facing similar challenges:

  • Keeping this community open and informal increases people’s confidence and contributes to reducing ‘imposter syndrome’ – this is, I believe, a core mechanism that made it work in our particular context.
  • Leadership support and licence to innovate – this is critical in the context of limited resources. In our case, we had to streamline processes and constantly innovate to deliver successful events within existing resources.
  • Having a rotating chair – helps to reach out to all parts of the organisation and also shares the responsibility for organising the events.
  • Having a few staff dedicated to coordinating the community on top of their daily priorities – helps ensure continuity and efficient delivery of the events, although this is a risk factor at the same time because of the reliance on a few individuals.

This is definitely a low-cost evaluation capacity building approach I’d recommend, in particular in organisations with less established evaluation capacity. What tips can you share from successful approaches to evaluation capacity building in your organisation?

-------------------------- 

Florent is a Manager for planning, evaluation and reporting at the NSW Department of Customer Service. Before that, he worked as an external evaluator for over 10 years, in Europe and then Australia.


 


September 2019
by Anthea Rutter

All of us in the AES were greatly shocked and saddened by the sudden death of Jenny Neale. Jenny had been a member of the Australasian Evaluation Society for over 20 years and was an active contributor to the society, both through her local Regional Network Committee in Wellington and at AES International Conferences.

Jenny was a Senior Research Fellow at the Health Services Research Centre, Faculty of Health, Victoria University of Wellington, New Zealand.

I interviewed Jenny last year and was rewarded by a frank discussion of life in the field of evaluation, its ups and downs and its frustrations!

The first question I asked her to reflect on was, what brought her into the field of evaluation? 

I have swapped between being a researcher and working in the evaluation field. I first got into evaluation through the Wellington Evaluation Group. I think it would be correct to say that in those days they were a fairly loose group of people. At that stage I was teaching an applied research degree, and I guess that would have been the ’90s.

The Fellows have a diverse range of evaluation or research interests, which keep them involved in the profession. I was, therefore, interested to find out what Jenny’s main areas of interest had been.

My role for the last 8 years has been that of evaluator in the health services research unit at the University. We evaluate a range of health services initiatives. My main interest has been the social justice area. So, I think I am in the right space, as working in the health services field fits well into that.

For most of us, a career spanning a number of years brings a few challenges – some more than others. So I asked Jenny what the major challenges to her practice had been.

I think it is something that we are facing again at the moment – people’s understanding of evaluation, what it is and what it does, as well as what it is not. Sometimes people ask you for an evaluation but at the end of the project they decide they want something different.  So, it’s that whole issue around what evaluation means. Then there is an idea out there that you can evaluate impact immediately.

Another challenge is that people underestimate the amount of time it takes to actually do an evaluation as people want things yesterday.

The other sort of challenge for us is that the main evaluators in NZ are contractors or work for government departments. So, people move in and out of evaluation and research. If you are a more mature person, you sort of drifted into evaluation (as it was a new field in the 80s). So new evaluators, who possibly have been trained in evaluation, talk about new methods and some of them are things we knew years ago. I try very hard to not say “In my day”!

Apart from challenges to practice, careers also have a number of highlights. It was pleasing to note that highlights for Jenny involved the bringing together of evaluators from Australia and New Zealand.

One of my early highlights was teaching with John (Owen) on a course in Wellington. The second time he brought his course over to NZ he also brought Ros Hurworth with him. Then we had some feedback which suggested that it would be good to have local content and, subsequently, I provided that. And John ran a course with my master’s students. Another highlight was joining the AES and then getting to know a lot of people in the field. Ralph Straton was over here as a visiting scholar at the time of 9/11. He was due to head to the US, but the insurance companies would not cover US travel. So, Ralph stayed in Wellington and we all learnt a lot. It was a very good interchange of ideas and skills.

Evaluation practice is defined by many factors, including people, evaluation models and cultures. For Jenny, her practice was influenced a lot by Māori and Pasifika.

I think a lot comes from where I sit with Māori and Pasifika; there is also a strong social justice factor, particularly because we are bi-cultural (the Treaty) but becoming multi-cultural. So, treaties of understanding and friendship with other Pacific nations. We have to be clear that it’s important for them as well as Māori. We don’t want evaluations to just be a ‘tick box’, which leaves people worse off than before.

I am not a theorist; my research was always applied. There have been other influences which have shaped my practice: listening to people talk at AES conferences. Michael Quinn Patton was influential, and John Owen’s teaching and book were very influential and useful. Also, the Community of Practice within the AES was very important.

I wanted to find out from Jenny in what ways the field of evaluation had changed during her career.

Things in evaluation become fashionable and then go out of fashion. But I think that people are realising that it is important. I think it has changed. We now have a number of different theories and practices, for example, realist evaluation and developmental evaluation. The profession has changed from being a broad field where it had practitioners who had a research background to people looking to apply what they know.

With these changes, what skills and competencies are required to keep pace with emerging trends in evaluation practice? Jenny’s response was spot on!

I think it’s a bit like life skills: you need to be open to different ideas. Listen to people and ideas, then craft it in the field you are working in. I remember a short debate a while ago about RCTs being the gold standard; other countries still talk about them as the only method. In social services it does not always work. You need theoreticians and practitioners and the people in between. So, you need both ends of the spectrum. Most of us are in the middle. Both sides ensure that it is a lively debate. You need to be a good listener and adapt key ideas. Problems are the same in several countries, but the context differs. Making sure there is open debate.

Jenny was very definite when asked about the key issues that evaluators ought to be thinking about and seeking to resolve in the next decade.

The main one, which the AES is tackling, is the professionalism aspect; quite a bit of work has been done in this area already, in particular on what we need to make it a recognised profession. In government jobs, both in NZ and Australia, people move between policy, research and evaluation.

The other thing which we must do is to undertake an educative job in explaining what evaluation and monitoring means.

Finally, I asked Jenny to comment on what she wished she had known before she set out on the evaluation path.

Certainly, wishing I had known more… But my comment is really a wish – wishing other people understood what evaluation could do and what it was! I guess I assumed that if someone wanted an evaluation, they knew what they would get. Certainly, I think that moving between research and evaluation was advantageous in terms of methodologies and seeing what others were doing.

 

This piece is a tribute to a Fellow of the Australasian Evaluation Society who made her mark on the work of the Society as well as on the profession of evaluation. I personally regarded her as a friend, and she will be missed.


 


October 2019
by Anthea Rutter

Chris Milne was an early pioneer in the use of program logic.  As a founding partner of ARTD Consultants, he has designed and delivered numerous evaluations across diverse sectors and built the evaluation capacity of government and non-government organisations. In recent years, he worked with another AES Fellow, Patricia Rogers, on the NSW Government evaluation toolkit.

I enjoyed speaking with Chris. He struck me as a man with a high degree of humility, as well as someone who considers his answers in a balanced way. He is obviously committed to the environment and the world in which we live, and passionate about making it a good place for the generations that follow.

How did you fall into evaluation and establishing ARTD?

I was working in adult education with Aboriginal people at Tranby College in Sydney. Then, in 1989, two of us set up ARTD as a training consultancy. I became more interested in evaluating training rather than doing it, which led me to the work of Donald Kirkpatrick and Robert Brinkerhoff in the US. Then I saw the program logic approach developed by Sue Funnell and others for the NSW Government. I found program logic a great tool for monitoring and evaluation, and I began to use it with all kinds of programs.

What have been your particular interests in evaluation?

Program theory and program logic. The spread of program logic has been a highlight, especially the approach developed by Sue Funnell and others in the 1990s. I’ve seen it go from a pioneering concept to a fundamental tool of evaluation. In 1995 at the first international evaluation conference in Vancouver, Sue and I ran a workshop on program logic – attended by a lot of experienced American evaluators who gave us very positive feedback. At ARTD we developed a computer training package on program logic in the 1990s and sold hundreds of copies around Australia and internationally – a couple of years ago I heard from a woman in Alaska who had been using it for years.

I’ve also enjoyed working out evaluation strategies, questions, designs and plans and advising organisations on all aspects of evaluation including overall approaches, capacity and use. I am very interested in meta-evaluation, whether assessing the quality of an individual evaluation, or more rarely, reviewing a collection of evaluations. 

I have enjoyed supporting people to be sound and informed evaluators, whether clients or our staff at ARTD. I particularly liked coaching people to write executive summaries that are succinct, balanced and evidence-based – getting it all down to a couple of pages is an art!

I would imagine that anyone who has been involved in a profession for over 30 years would have faced a number of challenges along the way. What was interesting was the wide-ranging nature of Chris’s reply, bringing in issues of practice, culture, methods and the political landscape.

Well, the world of evaluation itself is one big challenge; that’s why we love it!

At the practice level we need to deal with all the constraints in doing good evaluation work, especially in organisations where people have limited experience. Take costs for example – some people have no idea of the likely cost of the evaluation that they want.

A more recent challenge is the increased complexity of interventions and, therefore, the complexity of the evaluation.

Another is the clash of cultures around evidence and methods across different policy fields, so expectations vary across health, education, environment, human services, economics and so on. Similarly, governments go through different fads around evidence, so that requirements change; for example, managerialist approaches tried to make decisions on a few metrics (KPIs), rather than the full story of contexts, strengths and weaknesses.

How has evaluation changed over the past 30 years?

Organisations involved in public policy are always going through changes in how they use evidence and make decisions. I’ve seen two or three cycles of evaluative approaches come and go. Each earlier approach is retained somewhere, so it seems that evaluation will always be multi-faceted and contested. 

Another change is the greater and greater influence of technology. In evaluation, we have more access to big data, sophisticated tools for qualitative and quantitative analysis, social media and the prospect of artificial intelligence. But, as far as I can tell, evaluative arguments and executive summaries will remain human endeavours for some time.

What are the main skills or competencies that evaluators need to keep pace with emerging trends?

I think that evaluators need a broad base in evaluation theory and practice, in addition to their specialist skills. You need to keep up with literature in evaluation and related fields, such as public policy, management and systems. I believe that scepticism is an important attitude. You also should approach evaluation with curiosity and mindfulness and be able to live with ambiguity and uncertainty.

We live in an uncertain world in which goal-posts change at a rapid rate. So what do you see as the main social issues that we should be thinking about and seeking to resolve in the next decade?

In recent times, a major issue is the less certain role of democratic institutions and governments in our society, and their lack of capacity to deal with the important problems. Everything becomes short term and often ideological. There is less focus on evidence in decision-making. Governments are becoming more populist, with less capacity for, or even interest in, rational and balanced decisions about the big problems that we face. 

Reconciliation with Aboriginal people is far from complete in Australia and this has many implications for how we do evaluation. The AES has had a good record in recent years, but we need to continue our focus on Aboriginal issues and the involvement of Aboriginal people. 

More broadly, I believe that the biggest issue we face is addressing climate change and its impact on all aspects of our lives. For evaluators, this includes how we deal with special interests and the unbalanced use of data. We also need to address the increasing inequity within our society, whereby my generation is way better off than younger people. Another issue is the control and use of data collected by Google and social media companies. Data that are not transparent may breach our ethical standards, and are essentially used for commercial and political purposes. The challenge for evaluation is to be able to access and use the growing amount of data for reasoned inquiry, balanced decisions and ultimately the public good.

How can the AES position itself to remain relevant into the future?

It’s so important we stay inclusive of people with very different interests, approaches, backgrounds and experiences with evaluation. We should be a platform to communicate the trends and challenges for evidence and evaluation in public policy.

The Society needs a high profile – it should stand out as the key authority on evaluation in Australia.

--------------------------

Chris Milne is a founding partner of ARTD Consultants, a public policy consulting firm specialising in evaluation established in 1983. While he is mostly retired, Chris continues to chair the ARTD Board and act as a sounding board for ARTD Directors.


 

September 2019
by Aneta Cram, Francesca Demetriou and Eunice Sotelo

We’ve heard it time and again: people don’t necessarily set out to be evaluators, but fall into the field. For those of us who are relatively new or emerging, the field can be confusing to navigate.

As three self-identified early career evaluators (ECEs), who also grapple with what it means to be ‘early career’ or ‘emerging’, we were interested to learn more about how ECEs orient themselves, set their career pathway, and build their evaluation capacity.

For the past eight months we’ve been working on a research project exploring the experiences that current self-identified ECEs have had entering the field and developing their careers, across the diverse range of entry pathways and work contexts in Australia and, in part, New Zealand.

Our project

We chose to take an exploratory approach to this research for a number of reasons. For one, we wanted to hear people’s lived experiences and be able to share them without the confines of a set analytical framework. For another, we didn’t know what would emerge or what we would find. From our own experiences as ECEs working in different sectors in Australia and abroad, we knew what interested us about entering the field, but – because of the variety of individuals and experiences – we didn’t want to make any assumptions about who our research participants might be or what their experiences have been.

Our overarching research questions were: What are early career evaluator experiences in entering and developing careers in the evaluation profession? What facilitating factors, opportunities and challenges emerge as important to early career evaluators as they enter and develop a career in the evaluation profession?

We decided to contact ECEs through evaluation associations and our own professional networks, and asked them to support our work by sharing project information with their networks. From this, we received responses from 49 self-identified ECEs.

Even though we would have liked to interview them all, as this is a voluntary project we only had the capacity to interview 14. The 14 were ECEs from five different states in Australia and New Zealand. We wanted to include a diverse range of individuals, and so chose our participants based on age range, geographical location, sector and cultural identity.

Findings

We are excited to share with you some of our emerging findings and see if they align or differ from your own experiences entering the evaluation field.

From the preliminary analysis, some of the stand-out findings are:

  • There is ambiguity around what it means to be in the ‘early career’ or ‘emerging’ stages of evaluation work.
  • Peer support and mentorship, access to training and resources, and evaluation associations all play important roles in supporting early career evaluators.
  • Early career evaluators experience different and unique enablers and challenges across the variety of workplace contexts.
  • Individuals have faced challenges around age discrimination, cultural representation in the field, and how identity plays out in the way that individuals approach evaluation practice.
  • Early career evaluators bring unique and diverse values, experiences and lenses to evaluation from their prior professional experience, life experiences and identities.

You can read the emerging findings report here. We will be presenting these early findings and conducting a participatory sensemaking session on Wednesday, 18 September, at the Australasian Evaluation Society’s conference. We will be incorporating feedback from the conference session into the final report. Come along and help us make sense of the Australian and New Zealand ECE experience/s.

 -------------------------- 

The research team includes:

Francesca Demetriou

 

Francesca Demetriou (Project Lead) works as a freelance evaluation consultant. She also volunteers her Monitoring and Evaluation skills to the Asylum Seeker Resource Centre’s Professional Mentoring Program.

 

  

Aneta Katarina Raiha Cram

Aneta Katarina Raiha Cram is a Māori and Pākehā (Caucasian) evaluator from Aotearoa New Zealand. Her primary evaluation experience has been grounded in Kaupapa Māori, a culturally responsive indigenous way of approaching evaluation and research that prioritises Māori knowledge and ways of being, working with Māori communities in New Zealand. Aneta identifies as an ECE. She is currently working as a sole contractor and living in the United States, and will return to New Zealand early next year to begin her next journey as a Doctoral Candidate.


Eunice Sotelo

Eunice Sotelo is an educator and evaluator, with a particular interest in evaluation capacity building. Before moving to Australia over three years ago, she worked as a high school teacher, copywriter and copy editor. Her experience as a migrant – moving to Canada from the Philippines as a teenager, and working in China for two years – has shaped the way she sees her role in the evaluation space. She recently volunteered as mentor and trainer for the Lived Experience Evaluator Project (LEEP) at the Asylum Seeker Resource Centre.