Jerome Winston: 45 years of Evaluation Insights

by Anthea Rutter

Jerome Winston’s career spans more than 45 years. His fascinating insights into how evaluation was viewed in the 70s reminded me that, back then, evaluation was regarded not as a separate profession but as part of other disciplines.

I started teaching at Preston Institute of Technology (which, following two mergers, became Phillip Institute of Technology and then RMIT University). At first, I was teaching both diploma and degree courses in engineering and applied chemistry. When the School of Social Work opened, they were looking for staff who would teach computing and statistics. As an applied scientist, I proposed building monitoring and evaluation into practice, so I recommended that computing and statistics be taught as just one aspect of program planning, monitoring and evaluation. This suggestion, first adopted in social work, was later included in a number of other disciplines such as nursing, chiropractic and leisure studies.

Jerome then talked about the 80s and the advent of program budgeting in the Victorian – and later, federal – government, and what this meant for the next stage of his career.

Although program budgeting was intended to incorporate evaluation, Jerome believed that reporting simple, aggregated, numerical data as ‘performance indicators’ would not provide the depth of information needed about most government programs.  The use – and misuse – of ‘performance indicators’ became a main focus of Jerome’s research.

In 1978, Jerome designed postgraduate programs in data collection and analysis for research, monitoring and evaluation. These programs started at Phillip Institute of Technology (PIT) at about the same time that John Owen’s program in evaluation was starting at The University of Melbourne. Most of Jerome’s career was spent as a senior lecturer in multi-method research, monitoring and evaluation at PIT (later, RMIT).

The AES Fellows’ reasons for coming into the field of evaluation have been eclectic and Jerome presented yet another pathway.

I wouldn’t have gone into evaluation had I not started with an interest in both science and government. When I met social work academics at PIT, I found they shared a broad sense of systems theory, research methods, and data collection and analysis. I ended up as an applied scientist teaching multi-method approaches to evaluation in the human services.

My main interest is in applying systems theory to the planning and evaluation of human services. My other interest is integrating multiple methods of data collection and analysis, and their use in building practice knowledge. I don’t expect any method, on its own, to be particularly useful.


As an evaluation practitioner, he pointed to the challenges of bringing together multiple disciplines.

Most of the challenges I have encountered have to do with responding to the splitting of disciplines from each other – finding ways to bridge gaps among disciplines – gaps between public administration, applied science, planning, budgeting, evaluation and management.

The main highlights of his career have been building networks and embracing opportunities.

In the 70s and early 80s, colleagues supported me to set up two different networks: the Australian Evaluation Network and its occasional newsletter were intended to link people across Australia, while in Victoria, Colin Sharp and I set up the Evaluation Training Network so that our colleagues could run low-cost evaluation workshops. Then came meeting Anona Armstrong, being invited by her to contribute to planning the first evaluation conferences, becoming a foundation member of the AES, and then a Board member.

Towards the end of the 80s, I was encouraged by colleagues in Canberra to apply for an executive interchange into the Australian Public Service. I was selected to work for six months in the evaluation section of the Commonwealth Department of Finance at the time they were introducing program budgeting – and performance indicators – across the public service. 

About the same time, I started to speak on evaluation and performance indicators at conferences on public administration and accounting in Australia and New Zealand. This led in 1994 to co-leading conference workshops in Kuala Lumpur with Dr. Arunaselam Rasappan – then an evaluation trainer and consultant at the Malaysian government’s public administration training college and later the head of an evaluation research, training and development centre that a few of us established in Malaysia.

It was no surprise that the influences on his career have been practice based.

The first influence was the philosophy of social work to which I was introduced at PIT.  Their approach saw evaluation as an ‘intervention for change’ integral to professional practice. Another influence was having the opportunity to work within the Department of Finance in Canberra on evaluation and what it meant within that department.

I also asked him what changes he had seen during his career. Jerome’s perception is that formative evaluation has disappeared as a concept in some organisations that promote evaluation. He thinks the emphasis has shifted to summative and impact evaluation, with little work on theory, without which summative evaluation provides limited information.

In Australia and New Zealand, evaluation was typically understood as a team activity. We did not expect one person – ‘the evaluator’ – to carry out an evaluation largely on their own, so we did not use the term ‘evaluator’ as frequently as it is used now, referring instead to ‘evaluation teams’ and ‘evaluation practitioners’.

I was also keen to find out what skills and competencies the evaluators of today need to have to keep up with emerging trends in evaluation practice.

I think most of the members of the AES come from a narrow professional or academic background. In the 80s, the AES conferences included more auditors, public health, public administration and public finance professionals, economists, etc. We need to return to our multi-profession roots, which were evident in evaluation journals in the 1970s and early 1980s.  

If you let society get unjust enough – and I think we are right there now – then the situation becomes a state of dangerous unrest. Those are my driving forces, and that is where I think the field of evaluation can make its best contribution.

When I asked Jerome about what he saw as the major social issues evaluators ought to be thinking about as well as seeking to resolve, his answers were very perceptive.

We need to understand that Indigenous cultures have different approaches to using knowledge in their community from what is common in the dominant Aussie culture. We sometimes have quite naïve approaches to Indigenous cultures. 

Another issue is including the ‘value’ in ‘evaluation’. Some evaluation practitioners do what they are told is wanted, rather than insist on reporting how other ‘values’ may influence findings, conclusions and recommendations.

I asked Jerome how he saw the AES maintaining its relevance. His answer was focused and direct.

Build those bridges between professional disciplines that share an interest in evaluation. Take advantage of individuals’ different sources of knowledge and skills. Increase the relevance of evaluation at the practice level, and keep doing research about the practice of monitoring and evaluation.


Jerome Winston continues to work with the research centre in Malaysia – the Centre for Development and Research in Evaluation. He does a range of work for government and aid programs on how well new evaluation models and frameworks work, and why. He also runs a small consultancy in Australia.


