The AES Fellows: More than the sum of their individual contributions
by Anthea Rutter
Over the past two years I have written a monthly blog on our AES Fellows, including Jenny Neale, whom we sadly lost in 2019.
Looking at the demographics, we are a good mix of men and women (with nine male and eleven female Fellows) and cover most states in Australia. However, there are still some gaps in our representation – with no current Fellows from Queensland or the Northern Territory and no Indigenous Fellows.
The process of becoming a Fellow of the AES is very thorough. Apart from needing to be nominated by two people, prospective Fellows have to demonstrate knowledge and experience in a number of areas, including practical evaluation and teaching, as well as holding office in the AES. For myself, I regard being a Fellow as an honour as well as a responsibility.
As a group, the Fellows have amassed an abundance of skills and expertise. It was a real privilege to interview them to understand their hopes and their disappointments, as well as their career highlights. This final piece on our Fellows sums up their insights and my own. As professionals in their field, they have honed their craft and have given back to their profession in spades.
What were the major influences in the careers of our Fellows?
What was gratifying was that most of the Fellows spoke about other Fellows as influencing their decision-making and career journeys. In my own case, both Jerry Winston and Anona Armstrong encouraged and supported me in the early days of my evaluation career. I caught up with Jerry at Box Hill TAFE in the early 90s, as I was wondering how to put together a manual to help staff prepare for evaluation, research and monitoring projects. It was through Jerry that I heard about the fledgling AES, run at that stage by Anona Armstrong.
The early evaluation theorists also played a big role for the Fellows. Carol Weiss, Michael Scriven, Joe Wholey, Michael Patton and Ernie House were some of the theorists mentioned.
Policy makers and other evaluation practitioners were also instrumental in the Fellows’ growth as evaluators. Evaluation conferences were a useful source of knowledge, networks and information on the profession, and the AES was a major influence in the Fellows’ career trajectories.
I would suspect that young evaluators today use pretty much the same networks and models – including the AES and conferences – but of course social media plays a greater role for them.
How has the field of evaluation changed?
In the early days of evaluation in Australasia, the profession was trying to define itself or, as Yoland Wadsworth put it (quoting Myles Horton): we were initially ‘making the road by walking’.
Although in those early days there were a number of evaluation approaches to choose from, these have increased considerably – Participatory, Realist, Most Significant Change and Developmental, to name a few. Technological skills, as well as sophisticated software, have assisted us in manipulating big data.
A number of Fellows pointed to how the profession has been strengthened by being multi-disciplinary, and we now recognise the importance of drawing on many fields.
The AES has also changed – early on, the membership was mainly academics and government department personnel. Indeed, when I joined there was a large number of accountants, auditors and finance people. (Admittedly, this could have been because the Victorian Branch of the AES was sponsored by the Department of Finance in the 80s and 90s.) Nowadays the society has more consultants and more people from NGOs.
What we evaluate has also changed and expanded to include environmental initiatives, community development, Indigenous programs and policy advice.
A number of Fellows commented that evaluation is seen now as a profession which has come of age, and there is a greater demand for evaluators’ skills and expertise.
What are the major skills evaluators need to have or develop to keep pace with emerging trends in evaluation practice?
Fellows agreed that the core skills required have remained similar. The idea of the toolbox, with multiple methods, tools and skills to draw on, seemed to be the key. Both quantitative and qualitative skills are required. Another must is to be able to get along with people and be open to different ideas. We also need to be flexible – often, half-way through a project, you need to re-group and do things differently. On top of this, a few of the Fellows identified the need to understand both governance and strategy, and to be able to understand the context of an evaluation in these terms.
It was also agreed that evaluators need to be clear about what is needed and ensure that the evaluation design is going to meet those needs. It is important to ensure that the evaluation results serve the needs of those who have to use the evaluation for decision-making or planning.
My thoughts certainly echo those above, and I’d also add that it’s important to keep checking with the funder or sponsor that what you are doing is still on the right track, and be prepared to change if it is not.
What do you see as main social issues/ problems evaluators ought to be thinking about and seeking to resolve in the next decade?
This turned out to be an interesting topic, eliciting many different thoughts. For some Fellows, decades of working in the government sector had produced an obvious cynicism about decision-making at higher levels. The general sense from these Fellows was that governments are more populist than ever and don't seem too keen on making big decisions to address some of the problems our society faces.
One of the issues wrestled with was "how to give a voice to the clients and consumers of human services" so that they can get their message across. The catch-cry here is "how to speak truth to power" – the theme of the American Evaluation Association conference in 2018, and long a concern of evaluators and public servants. It is so important not only to include the consumers, but to actually listen to them, and to make decisions and changes in the areas which affect them.
Other issues for us are the condition of our environment and inequality in our society and what our role should be. We provide evidence to decision-makers but have to leave it to them to make the decisions. How can we have a role in ensuring that things get done?
Some Fellows feel that they have not fulfilled what they set out to do in the 60s and 70s, in terms of influence on policy and program improvements. There is talk about transparency, which should produce better program results, but our world does not always work in the way we would like. As evaluators we rely on a supply-driven model, which is focussed on delivering reports rather than building demand for a feedback mechanism.
Another of the difficulties is that we do not make the decisions about what is to be evaluated. As was pointed out by a few Fellows, evaluators are usually responding to tenders and are generally not called on to suggest topics for evaluation.
Some of the Fellows felt that, as evaluators, we should be more involved at the front-end of program planning and design. I think we are seeing more of this, so we are probably having more of a say than we had in the past.
How can the society best position itself to still be relevant in the future?
The Fellows had a lot of ideas about the future of the AES.
When the interviews began a couple of years ago, the Fellows identified the need for the AES to engage more with government, and we have begun to see this happen, for example, through our engagement with major government inquiries and the Productivity Commission’s Indigenous Evaluation Strategy.
Another idea was credentialling of the training courses offered by the AES. One Fellow took this further and suggested the establishment of a school or an institute for education in the various approaches to evaluation, which could provide basic entry-level qualifications right across Australia. The AES would then be represented on the advisory committee for the courses.
A few Fellows commented on the need for greater links with other societies, in particular those with cognate interests, such as auditing and market research. This would spread the word about evaluation, and partnerships could be beneficial. I should add here that the AES Relationships Committee is working on this issue at the moment.
Most importantly, we need to encourage people with different interests, approaches, backgrounds and experience with evaluation. This rich mix of people would help us become a platform for communicating the trends and challenges for evidence and evaluation in public policy. We need to be seen as being at the forefront of evaluation, and be there to take part in the discussion on “where next”.
Continuing to tap into the wisdom of the Fellows
This summary can only skim the surface of the wisdom the Fellows have to offer. If you haven’t already, you can tap into the wisdom of each Fellow through the individual blogs. And the learning doesn’t stop here – the AES has been discussing ways that we can better use the expertise of the Fellows. I’m heartened to see that the Fellows’ insights will be used in mentoring emerging evaluators.