Welcome to the AES Blog
Reflections from a seasoned evaluator, Chris Milne
by Anthea Rutter
Chris Milne was an early pioneer in the use of program logic. As a founding partner of ARTD Consultants, he has designed and delivered numerous evaluations across diverse sectors and built the evaluation capacity of government and non-government organisations. In recent years, he worked with another AES Fellow, Patricia Rogers, on the NSW Government evaluation toolkit.
I enjoyed speaking with Chris. He struck me as a man with a high degree of humility, as well as someone who considers his answers in a balanced way. He is obviously committed to the environment and the world in which we live, and passionate about making it a good place for the generations that follow.
How did you fall into evaluation and establishing ARTD?
I was working in adult education with Aboriginal people at Tranby College in Sydney. Then, in 1989, two of us set up ARTD as a training consultancy. I became more interested in evaluating training rather than doing it, which led me to the work of Donald Kirkpatrick and Robert Brinkerhoff in the US. Then I saw the program logic approach developed by Sue Funnell and others for the NSW Government. I found program logic a great tool for monitoring and evaluation, and I began to use it with all kinds of programs.
What have been your particular interests in evaluation?
Program theory and program logic. The spread of program logic has been a highlight, especially the approach developed by Sue Funnell and others in the 1990s. I’ve seen it go from a pioneering concept to a fundamental tool of evaluation. In 1995, at the first international evaluation conference in Vancouver, Sue and I ran a workshop on program logic – attended by a lot of experienced American evaluators who gave us very positive feedback. At ARTD we developed a computer training package on program logic in the 1990s and sold hundreds of copies around Australia and internationally – a couple of years ago I heard from a woman in Alaska who had been using it for years.
I’ve also enjoyed working out evaluation strategies, questions, designs and plans and advising organisations on all aspects of evaluation including overall approaches, capacity and use. I am very interested in meta-evaluation, whether assessing the quality of an individual evaluation, or more rarely, reviewing a collection of evaluations.
I have enjoyed supporting people to be sound and informed evaluators, whether clients or our staff at ARTD. I particularly liked coaching people to write executive summaries that are succinct, balanced and evidence-based – getting it all down to a couple of pages is an art!
I would imagine that anyone who has been involved in a profession for over 30 years would have faced a number of challenges along the way. What was interesting was the wide-ranging nature of Chris’s reply, bringing in issues of practice, culture, methods and the political landscape.
Well, the world of evaluation itself is one big challenge – that’s why we love it!
At the practice level we need to deal with all the constraints in doing good evaluation work, especially in organisations where people have limited experience. Take costs for example – some people have no idea of the likely cost of the evaluation that they want.
A more recent challenge is the increased complexity of interventions and, therefore, the complexity of the evaluation.
Another is the clash of cultures around evidence and methods across different policy fields, so expectations vary across health, education, environment, human services, economics and so on. Similarly, governments go through different fads around evidence, so that requirements change; for example, managerialist approaches tried to make decisions on a few metrics (KPIs) rather than the full story of contexts, strengths and weaknesses.
How has evaluation changed over the past 30 years?
Organisations involved in public policy are always going through changes in how they use evidence and make decisions. I’ve seen two or three cycles of evaluative approaches come and go. Each earlier approach is retained somewhere, so it seems that evaluation will always be multi-faceted and contested.
Another change is the greater and greater influence of technology. In evaluation, we have more access to big data, sophisticated tools for qualitative and quantitative analysis, social media and the prospect of artificial intelligence. But, as far as I can tell, evaluative arguments and executive summaries will remain human endeavours for some time.
What are the main skills or competencies that evaluators need to keep pace with emerging trends?
I think that evaluators need a broad base in evaluation theory and practice, in addition to their specialist skills. You need to keep up with literature in evaluation and related fields, such as public policy, management and systems. I believe that scepticism is an important attitude. You also should approach evaluation with curiosity and mindfulness and be able to live with ambiguity and uncertainty.
We live in an uncertain world in which goal-posts change at a rapid rate. So what do you see as the main social issues that we should be thinking about and seeking to resolve in the next decade?
In recent times, a major issue is the less certain role of democratic institutions and governments in our society, and their lack of capacity to deal with the important problems. Everything becomes short term and often ideological. There is less focus on evidence in decision-making. Governments are becoming more populist, with less capacity for, or even interest in, rational and balanced decisions about the big problems that we face.
Reconciliation with Aboriginal people is far from complete in Australia and this has many implications for how we do evaluation. The AES has had a good record in recent years, but we need to continue our focus on Aboriginal issues and the involvement of Aboriginal people.
More broadly, I believe that the biggest issue we face is addressing climate change and its impact on all aspects of our lives. For evaluators, this includes how we deal with special interests and the unbalanced use of data. We also need to address the increasing inequity within our society, whereby my generation is far better off than younger people. Another issue is the control and use of data collected by Google and social media companies. Data that are not transparent may breach our ethical standards, and are essentially used for commercial and political purposes. The challenge for evaluation is to be able to access and use the growing amount of data for reasoned inquiry, balanced decisions and ultimately the public good.
How can the AES position itself to remain relevant into the future?
It’s so important we stay inclusive of people with very different interests, approaches, backgrounds and experiences with evaluation. We should be a platform to communicate the trends and challenges for evidence and evaluation in public policy.
The Society needs a high profile – it should stand out as the key authority on evaluation in Australia.
Chris Milne is a founding partner of ARTD Consultants, a public policy consulting firm specialising in evaluation established in 1983. While he is mostly retired, Chris continues to chair the ARTD Board and act as a sounding board for ARTD Directors.