
This is the AES Blog, where we regularly post articles by the Australasian evaluation community on the subjects that matter to us. If you have an idea, please contact us by email. Blog guidelines can be found here.

September 2019
by Aneta Cram, Francesca Demetriou and Eunice Sotelo

We’ve heard it time and again: people don’t necessarily set out to be evaluators, but fall into the field. For those of us relatively new or emerging, this can be confusing to navigate.

As three self-identified early career evaluators (ECEs), who also grapple with what it means to be ‘early career’ or ‘emerging’, we were interested to learn more about how ECEs orient themselves, set their career pathway, and build their evaluation capacity.

For the past eight months we've been working on a research project exploring how self-identified ECEs have entered the field and developed their careers across the diverse range of entry pathways and work contexts in Australia and, in part, New Zealand.

Our project

We chose to take an exploratory approach to this research for a number of reasons. For one, we wanted to hear people's lived experiences and be able to share them without the confines of a set analytical framework. Secondly, we didn't know what would emerge or what we would find. From our own experiences as ECEs working in different sectors in Australia and abroad, we knew what interested us about entering the field, but – because of the variety of individuals and experiences – we didn't want to make any assumptions about who our research participants might be or what their experiences have been.

Our overarching research questions were: What are early career evaluator experiences in entering and developing careers in the evaluation profession? What facilitating factors, opportunities and challenges emerge as important to early career evaluators in their experience entering and developing a career in the evaluation profession?
We decided to contact ECEs through evaluation associations and our own professional networks and asked them to support our work by sharing project information with their networks. From this, we received responses from 49 self-identified ECEs.
While we would have liked to interview them all, this is a voluntary project, so we only had the capacity to interview 14. These 14 ECEs came from five different states across Australia and New Zealand. We wanted to include a diverse range of individuals, so we chose our participants based on age range, geographical location, sector and cultural identity.


We are excited to share with you some of our emerging findings and see if they align or differ from your own experiences entering the evaluation field.

From the preliminary analysis, some of the stand-out findings are:

  • There is ambiguity around what it means to be in the ‘early career’ or ‘emerging’ stages of evaluation work.
  • Peer support and mentorship, access to training and resources, and the role of evaluation associations have important roles in facilitating support for early career evaluators.
  • Early career evaluators experience different and unique enablers and challenges across the variety of workplace contexts.
  • Individuals have faced challenges around age discrimination, cultural representation in the field, and how identity plays out in the way that individuals approach evaluation practice.
  • Early career evaluators bring unique and diverse values, experiences and lenses to evaluation from their prior professional experience, life experiences and identities.

You can read the emerging findings report here. We will be presenting these early findings and conducting a participatory sensemaking session on Wednesday, 18 September, at the Australian Evaluation Society’s conference. We will be incorporating feedback from the conference session into the final report. Come along and help us make sense of the Australian and New Zealand ECE experience/s.


The research team includes:

Francesca Demetriou


Francesca Demetriou (Project Lead) works as a freelance evaluation consultant. She also volunteers her Monitoring and Evaluation skills to the Asylum Seeker Resource Centre’s Professional Mentoring Program.



Aneta Katarina Raiha Cram

Aneta Katarina Raiha Cram is a Māori and Pākeha (Caucasian) evaluator from Aotearoa New Zealand. Her primary evaluation experience has been grounded in Kaupapa Māori, a culturally responsive indigenous way of approaching evaluation and research that prioritises Māori knowledge and ways of being, working with Māori communities in New Zealand. Aneta identifies as an ECE. She is currently working as a sole contractor and has been living and working in the United States; she will return to New Zealand early next year to begin her next journey as a Doctoral Candidate.

Eunice Sotelo

Eunice Sotelo is an educator and evaluator, with a particular interest in evaluation capacity building. Before moving to Australia over three years ago, she worked as a high school teacher, copywriter and copy editor. Her experience as a migrant – moving to Canada from the Philippines as a teenager, and working in China for two years – has shaped the way she sees her role in the evaluation space. She recently volunteered as mentor and trainer for the Lived Experience Evaluator Project (LEEP) at the Asylum Seeker Resource Centre.




September 2019
by Gerard Atkinson

There are less than two weeks to go until the International Evaluation Conference #aes19SYD, taking place on 15–19 September here in Sydney. For those presenting at the conference, it's time to polish up your presentation skills and get your materials ready. In the theme of "unboxing evaluation", we've unboxed the art of developing effective and engaging presentations and put together an easy guide you can use not just at conferences but in any presentation.

Here are our top tips:

Prepare your thinking.

Preparing for a presentation is entirely different to rehearsal and takes place before you even start making your slides. Effective preparation is about identifying what you want to talk about, doing your research, and building a framework for delivering your presentation. Rehearsal, though important, comes much later.

Create an objective statement.

An objective statement is a single sentence that frames your rationale and scope. A good objective statement clearly articulates the given time period, what the presentation will achieve, and what you want your audience to do as a result. For example, when I teach my one-hour presentations seminar, my objective statement is: “Over the next 50 minutes, I want to cover the key elements of creating and delivering a compelling presentation to inspire you to go make your own.” It’s not setting out to change the world, but it sets out the scope of what to create.

Do your research.

This goes beyond just topic research (which is crucial, of course), and includes understanding such things as:

  • the level of knowledge of the audience
  • the number of people
  • the level of seniority
  • the venue size and layout
  • available technology
  • the time of day.

Develop a presentation framework.

Start building the structure of your presentation as a list or (my favourite) a storyboard. There are many different frameworks and formats out there, and you’ll see quite a few at AES 2019, including the rapid-fire Ignite presentations. My personal favourite framework adapts traditional storytelling techniques by following a format of “Open-Body-Close”. It’s a simple framework but can be adapted to presentations of nearly all formats and lengths. Here’s how it works:

  • The opening section is designed to engage an audience and preview the talk.
  • The body section, which can be repeated for each key point of your presentation, states the point, supports it, then links it logically with the next key point.
  • The closing section reinforces engagement, reviews the topics covered, and gives a call to action to the audience.

By using this framework you can tell many different kinds of stories, for example chronologically or starting broadly and delving deeper into a topic as you go along. You can adapt it to fit the narrative you want to tell.

Kill the deck (if you can).

This is always a controversial tip, but there’s a good reason for it. Slides distract the audience. If you can remove a slide from a presentation, do it. If you need to use slides, remember that they should always be used to underline the point you are trying to make. Photos and (well-designed) charts do this best, followed by diagrams. If you need to use bullet points or text, keep it short and avoid reading them out verbatim.

Use speaker notes.

Scripts can be useful in laying out in exact terms what you want to say in a presentation, but they make it hard to be engaging. Actors train for years to be able to take a script and make it look natural. Instead use speaker notes, which are a shorthand version of a script that outline in abbreviated form the content of each key point. They act as prompts for what you want to say, but allow you to deliver a more natural style of speaking.

Develop useful handouts.

Your slides will not convey the full content of your talk on their own (see above). This means that they shouldn’t be used as handouts. Instead, a handout should be a practical resource that turns the key points of your talk into tools that the audience can use afterwards. Most importantly, distribute handouts after the talk to avoid having distractions during the presentation. 

Rehearse, rehearse, rehearse.

Rehearsal is about replicating your presentation environment as closely as possible. Find a room, set it up as you will on the day, and rehearse the talk as if it were the real thing. It’ll help you get a feel for your timing and flow, and boost your confidence. If you can get some sympathetic co-workers to sit in and give feedback, even better. Repeat this process. The more times you can run through the presentation ahead of time, the more comfortable you will be with the material.

Present with credibility.

Credibility is a combination of confidence, character, and charisma. Confidence comes from research and rehearsal. Character and charisma come from the way you deliver your presentation. Some quick ways to build credibility are to use open body language to engage with the audience, and to vary the way you use your voice (tone, volume, tempo). Both go a long way in engaging the audience and carrying them along with you throughout your presentation.

Handle Q&As at the end.

Question and answer sessions are seen by some people as the trickiest part of a presentation because they can be hard to predict. Prior research can help you anticipate and prepare for some of the questions you might be asked. It's best to keep questions until the end of the presentation, as this helps keep things on track. Let the audience know at the start of the presentation so that they can note down their questions for later. To handle Q&As, here's a five-step process:

  • Ask: Take a step forward while asking the audience if they have any questions.
  • Select: Select questioners by gesturing to them with an open palm (rather than pointing) or by using their name, if you know it.
  • Listen: Give questioners total concentration, eye contact, and actively listen to their question.
  • Repeat: Pause, then repeat or rephrase the question to the whole group to show you understand what they’re asking. This also helps when there’s no roving microphone.
  • Answer: Make eye contact with members of the audience while answering.

An alternative (and compatible) approach to managing Q&As effectively comes from Eve Tuck.

  • Ask a neutral person to facilitate the Q&A.
  • At the end of the presentation, invite the audience to talk to each other for a few minutes and share the questions they are thinking of asking.
  • Have the facilitator encourage the audience to consider whether those questions are useful to the broader discussion and best asked during the session, or in another context (e.g. the coffee break).

See Eve’s Twitter feed for the full list of suggestions for Q&As.

I hope these tips can help you prepare, construct and deliver your own presentations with confidence. Looking forward to seeing a lot of great presentations #aes19SYD.


Gerard is a Manager at ARTD Consultants.



September 2019
by Jade Maloney

Ever found yourself more engaged in the coffee break than the conference agenda?

Ahead of the International Evaluation Conference #aes19SYD unconference day, Ruth McCausland, Kath Vaughan-Davies and I trialled an approach for the Australian Evaluation Society NSW meet-up that combined the best of both worlds – purposeful encounters with a coffee break vibe.

We adapted Open Space Technology, established by Harrison Owen in the 1980s, with the aim of finding “a way towards meetings that have the energy of a good coffee break combined with the substance of a carefully prepared agenda.” The approach has since been used around the world as a way of enabling people to self-organise around purpose.

At the NSW meet-up, about 30 evaluators braved the wind and cold to talk about evaluation topics that keep them up at night. For those of you in evaluation, it will be no surprise that these were many and varied:

  • managing your involvement in participatory action research
  • communicating findings effectively, particularly the negative
  • scoping evaluations effectively to meet and manage expectations
  • identifying value and dealing with attribution in an education context
  • planning for the data required for statistical analyses and the ethics of analysis
  • crafting useful and useable recommendations.

And that was before we got to our back-pocket topics.

Working in what we dubbed the "East Wing", the "West Wing" and "next door", groups took their discussions in different directions.

The group discussing evaluation in an education context shared references: the four levels in Kirkpatrick's Evaluating Training Programs (reaction, learning, behaviour and results), Guskey's additional fifth level (although organisational support isn't a level in the same way), as well as Michie, van Stralen and West's COM-B system (thinking about behaviour change in terms of capability, opportunity and motivation).

The participatory action research group was prompted by a question from one evaluator about whether he'd become too involved. They segued into how an evaluator’s participation can shape what is being evaluated and questioned whether this matters. The many lines between questions and the “really??” underneath the word "objective" in their record capture the connecting threads of their conversation, but you had to be there for the depth.

Instead of a traditional report back, we came together as we began – in a circle. The energy was palpably different, shifting from hesitant suggestions to each person sharing something they’d take forward and participants building on each other’s thoughts.

Some focused on practical tips, such as taking the time to clearly scope evaluations upfront and having findings meetings before delivering reports; some on tools (like the COM-B system); others on the process. One participant described it as bringing to life a community of practice in the AES. A few said the problem they’d started with might still keep them up at night, but they felt less alone in it. While we all came from diverse backgrounds, we found common ground among our experiences in NGO, government and private sector evaluations.

Not having set questions to answer gave people the freedom to discuss what they wanted and to go deep on a subject, and the process enabled all to have a voice.

Want to experience the process for yourself? Come along to the #aes19SYD unconference on Tuesday, September 17, to discuss how we might un-box evaluation to better contribute to reconciliation, social justice and a healthy planet. You don’t have to have the answers – just a question and the passion to hold a discussion with others on the subject.

If you'd like to learn more about Open Space Technology, there is a wealth of resources online. Chris Corrigan's website has an easy-to-navigate collation. Or you could go back to the source: Harrison Owen's Open Space Technology.


Jade is a Partner & Managing Director at ARTD Consultants.


Fellows anneM

June 2019
by Anthea Rutter

Anne and I have been colleagues and friends for many years. I have long been an admirer of her ability as a practical evaluator and I refer to Anne and Ian’s book frequently for my own practice. I caught up with Anne at the AES International Conference in Launceston, Tasmania, where we found time to share some lunch and some great conversation.

I am always intrigued by the many routes which professionals follow to bring them into the field of evaluation. Although I have known Anne for many years, I was unsure of how she came into the field.

When I was an academic in social work, we were starting to pick up contracts in evaluation. I liked project work, so I always put my hand up. I liked the organising aspect, as well as adding new knowledge and improving practice, rather than service delivery. Eventually, I began sub-contracting for some small evaluation companies before starting my own business.

As all of us are aware, we are influenced by a number of elements which eventually shape what and who we are. I asked Anne about the influences which have helped define her practice.

Being part of the evaluation community of practice has been an important part of my career. Being a lone evaluator would be tough without opportunities to engage with other evaluators through conferences and AES Network meetings. You need to interact with others to see different people’s take on things and test your ideas. This is essential for informed practice. Being in a relationship with another evaluator also has its benefits for testing your ideas out.

Anne's last comment made me reflect that I don't know a partnership where both parties are evaluators. Over the course of a career, all of us, AES Fellows included, have faced challenges, and we all have an opportunity to learn from those experiences. I was keen to find out from Anne about the challenges she has faced during her career.

A major challenge in evaluation is managing the political aspect, negotiating the report and findings. People often challenge the findings, so you need all the skills you can muster in terms of negotiation to advocate for and justify your conclusions.

Also, some clients do not fully understand what an evaluation can and can’t do. Expectations that were not part of the original Terms of Reference and outside the remit of the evaluation’s scope and focus often come up when the draft report is delivered.

After more than 20 years in a career, there are bound to be a few changes along the way. I was interested to find out from Anne what had changed and how she sees the field of evaluation today.

When I was a newbie, I wasn’t sure that the field of evaluation was a good fit for me. At that time, over 20 years ago, the field of evaluation had a more quantitative and positivist focus, with a strong public sector performance and financial management leaning. I was not sure whether I fitted. But evaluation has evolved so much since then. There has been a huge paradigm shift from the quant/qual debates to the evolution of a range of evaluation-specific methods: Realist, Most Significant Change, participatory, developmental etc. Evaluators are also more diverse in their professional backgrounds and methodological leanings. The field is so much richer. It will be interesting to see what happens in the next 10 years!

Anne and I discussed the fact that, with all of the changes in the profession over the years, the skills and competencies evaluators need may have changed too. Anne was very specific in her answer, and her ideas covered the whole gamut of an evaluation.

Evaluators need foundation skills, including formulating theories of change and evaluation questions, identifying mixed methods data sources and matching data to questions, data collection, analysis and reporting. Evaluators also need foundation skills in how to build organisational systems for both monitoring and evaluation functions. More and more evaluators are being asked to build capacity within organisations for the above competencies and this may be a new skillset for evaluators.

Evaluators also need facilitation skills and conflict resolution skills. And everyone needs an understanding of ethics – you need to know when ethical standards are being upheld and when they are being compromised.

What do you wish you had known before starting out as an evaluator?

I wish I had known how to better predict and manage my workload as an evaluation consultant… particularly to enjoy the lean periods and just relax into them. The peaks and troughs always evened out over the long term but in retrospect I feel that the troughs were not well used to relax and recuperate from the demanding peaks.

The final question to Anne was a bit of crystal ball gazing. I asked what she saw in AES’s future. Again, in true Anne fashion, she was very clear about where and what the AES should be doing.

I think we should attempt to develop a closer role with government bodies. There are a number of opportunities for building a stronger link between the AES and both levels of government. An AES sub-committee once undertook an exercise mapping government bodies across the states, territories and nationally; though a big job, it points to an opportunity for the AES to provide an avenue for evaluation capacity building in government.

In terms of training, the AES training program could also cater better to advanced evaluators by identifying specialist areas that could be developed and delivered by experienced trainers in those areas. The AES also needs to make sure its partnerships are robust and, not least, consult its members regularly.


Through her company, Anne Markiewicz and Associates, Anne assists organisations to establish Monitoring and Evaluation (M&E) systems and regularly conducts workshops on developing M&E frameworks for AES members. In 2016, she co-authored Developing Monitoring and Evaluation Frameworks with Ian Patrick.