Welcome to the AES Blog
Keeping it real with Gill Westhorp
by Anthea Rutter
Gill was named an AES Fellow in 2018, and I was pleased to introduce her at the AES conference in Launceston that year. We started with what brought her into the field of evaluation, and what it was about realist methodology that not only piqued her interest but now defines her as a practitioner.
I came into evaluation from a background in human services and managing human services. I’d always been concerned about how we could tell whether we were doing any good or not. I was introduced to realist evaluation through some work I was doing in crime prevention, and it provided a way to work out why some things work for some people but not for others. I found out through reading evaluations that there’s quite a common pattern – that programs often don’t work for those who are most disadvantaged, and some actually cause harm to them. I wanted to know why.
The realist approach assumes that outcomes will be different for different people. The more I worked with it, the more I realised that it’s not just how I approach evaluation, it’s actually how I see the world. I am a realist. It has shaped my life and my thinking in general. People who use it often don’t understand it and get it wrong. It’s a methodological approach rather than a method, [that is] a philosophy for method.
It was clear from our conversation that Gill is committed to realist philosophies and methodologies. I was intrigued by her passion.
I describe myself as a realist methodologist. Within that I think my real area of expertise is developing methods or strategies for the application of realist methods in things which are hard to evaluate, for example, prevention programs. How do you evaluate things which haven’t happened? More recently I have looked at how to use realist methods in very large scale, very complex programs.
The other area of interest is in grappling with the implications of the fundamental philosophy of realism. Others have done a lot of work on realist ontology. My two current interests are realist axiology – how you think about valuing from a realist perspective – and what does that mean for evaluation? The other is realist epistemology. Some people have argued that realists are constructivists, epistemologically. But I think there are points of difference and I’m interested in what that means for practice.
All of us have experienced challenges along the way, and I was keen to explore these with Gill.
It’s not a single thing but a range of things. Some commissioners have asked for realist evaluation, but it turned out they didn’t understand it or what it can do. There are challenges in other projects where people who have been taken on as part of the team look as though they will be comfortable using a realist lens, but it turns out they’re not.
There are also challenges in terms of the usual constraints on evaluation – money and time. I do pick difficult things to evaluate and there can be challenges with that. Generally it’s the interaction of a number of factors in particular programs. The skill is being able to think through and negotiate the different factors in an evaluation.
She also pointed out some highlights.
A particular one is Nick inviting me to do the PhD – this was in a sense a starting point and an influential moment which changed my direction. I had decided to move into evaluation in some way, but this changed everything.
Writing the standards for realist evaluation was another one – that was an honour – but also working deeply and closely with those who really understood realist approaches. I enjoyed thinking about what really matters if you want to use this approach coherently and consistently.
A number of people and methodologies had a great influence on Gill’s practice.
Nick Tilley and Ray Pawson, of course. Bhaskar’s work, including his model thinking about levels of reality, the empirical, the actual and the real. Patricia Rogers. I’ve done a lot of training in other methods too, and probably each of them has had some influence.
I’ve also adapted other methods to suit realist evaluation. One example is Most Significant Change stories. To do that, you have to look back at what the developers of a particular theory or method were trying to achieve, and the strengths and weaknesses of that for realist work. So for MSC stories, I looked at what Rick Davies intended, but then recognised that selecting the ‘most significant’ changes hides all the variation that realist analysis depends on. So I worked with a project to develop other strategies to maintain that variation while still identifying what it was that mattered to people, and why.
Gill had some definite ideas on how evaluation had changed over the years.
The pendulum swings back and forth in relation to methodologies and methods. At the moment there are parts of government here, and some overseas, that are swinging towards positivist approaches such as Randomised Controlled Trials. I worry about that and think it could be a danger, because RCTs don’t give all the information you need to make some kinds of evidence-informed judgments.
I see a lot of younger people coming into the profession, which I think is great. The courses at University of Melbourne (CPE) and our own in Darwin do help to bring in younger people. I see the influence of technology, for example, the ability to manipulate big data.
I think there are some challenges too. For example, the use of social media in evaluation is fraught with dangers, but the ability to record data via iPad in the international development context is great. There are lots of implications in regard to new technologies.
Gill’s response to the issue of skills and competencies for the evaluator of today reinforced some of the fundamental qualities evaluators need in order to be successful practitioners.
The two biggest competencies for evaluators, I think, are the ability to think hard and well, because our job is to make judgments. Your judgments are supposed to be well informed. The skill of the evaluator lies in the analytic ability to think through the implications of what people are doing, but also the implications of the data you’ve collected, and work out what it all means.
The other competency is that you have to be able to engage with people, even though it can be difficult because people often feel uncomfortable with being evaluated, and with some of the findings. The relationship with the client is important.
She was definite about some of the social issues she thinks evaluators should be thinking about as well as helping to resolve in the next decade.
I choose to work in areas that are grappling with things which are threats to humanity – environment and climate issues, or international development issues, which have big implications for the balance of power.
The other priority for me relates to social justice, for example, women’s issues, youth, domestic violence, sexual assault, employment/unemployment – anything to do with social disadvantage, which is underpinned by injustice.
If you let society get unjust enough, and I think we are right there now, then the situation becomes a state of dangerous unrest. Those are my driving forces, and that’s where I think the field of evaluation can make its best contribution.
Gill has been involved with the society in a number of roles: as a committee member, a Board member (twice) and convening a conference committee, so I felt she would be in a good position to ponder the direction which the AES should take in the future.
The AES has gone through a necessary stage of being inward focused, looking at the constitution, the strategic plan and so on. Now it needs to be more outwardly focused. At this exact moment, it needs to think about the implications of the proposal for an Evaluator General.
The society should have a stronger policy advocacy focus, which should be manifested at both a national and a state level. The members live in states and territories, and for many of us, our working lives are framed by state and territory legislation.
The third way in which it can look outward is engaging with other professions, because the things they are doing are informing policy and practice. We need stronger bridges with other fields. The AES needs to begin a conversation which can inform practice both ways; otherwise we will become irrelevant.
The fourth way is to build some knowledge of the implications of new technologies. There are people within the field with specialist knowledge but many of us don’t know enough, and haven’t thought hard enough, about them as yet. Myself included.
Gill Westhorp is a Professorial Research Fellow at Charles Darwin University, Darwin, where she leads the Realist Research Evaluation and Learning Initiative (RREALI). She is also Director of Community Matters Pty Ltd, a research and evaluation consultancy based in South Australia.