Dates and time: Monday 30 April, Tuesday 1 May, and Wednesday 2 May 2018, 9am to 5pm (registration from 8.30am) each day
Day 1, April 30: Stream 1: Fundamentals of Program Evaluation / Stream 2: Theories of Evaluation
Day 2, May 1: Stream 1: How to deal with squeaky wheels and engagement fatigue: Evaluating community engagement / Stream 2: Introduction to Quantitative Methods in Evaluation
Day 3, May 2: Stream 1: Facilitation and Participatory Approaches for Evaluators / Stream 2: Introduction to Social Network Analysis
Please note that each stream is a one-day workshop, so there are two workshops to choose from each day. Registrants can attend one, two or three days.
Stream 1: John Owen; Jess Dart; Vanessa Hood and Natalie Moxham
Stream 2: Brad Astbury; Sarah Mason; Matt Healey and Dan Healy
Target audience: Emerging practitioners, project managers and supervisors, drawn from the private, government, NFP/NGO and community sectors.
Location: Melbourne Metropole Central Hotel, 44 Brunswick Street, Fitzroy
Situated in the heart of Melbourne's cafe district, the Metropole is also adjacent to Melbourne's CBD. Corporate accommodation rates are available. More details on the venue: http://www.metropolecentral.com.au/
Registrations close: Early bird: 15 April. Final registrations: 26 April
Fees (GST inclusive):
| Type | Early bird | After 15 April |
| --- | --- | --- |
| AES members | | |
| Student day rate* | $250.00 | $250.00 |
Workshop Stream 1: Fundamentals of Program Evaluation - presented by John Owen
This workshop aims to provide participants with knowledge about the fundamentals of program evaluation and the translation of these fundamentals into practice. The workshop provides information about basic evaluation theory including the logic of evaluation. In addition, participants will be introduced to the key elements of evaluation: negotiation, data management and dissemination of findings. These elements are fundamental to evaluation planning and practice. The workshop will be based on principles of adult learning. Participants will be engaged in several group tasks designed to show how evaluation fundamentals relate to real-world practices.
Outcomes and Benefits
By the end of the workshop, participants will be aware of fundamental aspects of evaluation practice. The workshop will provide additional readings designed to extend the knowledge gained during the sessions. In addition, participants will be encouraged to consider how the knowledge they have gained has relevance to their workplace.
About the Facilitator: John Owen
John Owen has extensive experience in teaching about evaluation. He has provided a range of professional workshops for the Australasian Evaluation Society over the past two decades. In his role as Director of the Centre for Program Evaluation at Melbourne University, he oversaw the development of a graduate evaluation program, which is now also available online. He is a Fellow of the Australasian Evaluation Society.
Workshop Stream 2: Theories of Evaluation - presented by Brad Astbury
This workshop provides an overview of the origins and evolution of evaluation theory. Attention to theory in evaluation has focused predominantly on program theory and few evaluation practitioners have received formal training in evaluation theory. This workshop seeks to remedy this by introducing a framework for conceptualising different theories of evaluation and a set of criteria to support critical thinking about the practice-theory relationship in evaluation.
Outcomes and Benefits
Participants will learn about:
- the nature and role of evaluation theory;
- major theorists and their contributions;
- approaches to classifying evaluation theories;
- key ways in which evaluation theorists differ and what this means for practice;
- dangers involved in relying too heavily on any one particular theory; and
- techniques for selecting and combining theories based on situational analysis.
Case examples will be used extensively to illustrate why evaluation theory matters and how different theoretical perspectives can inform, shape and guide the design and conduct of evaluations in different practice settings.
About the Facilitator: Brad Astbury
Brad Astbury is a senior manager at ARTD Consulting and works out of the Melbourne office. He has over 17 years' experience in evaluation and applied social research, and considerable expertise in combining diverse forms of evidence to improve both the quality and utility of evaluation. He has managed and conducted needs assessments, process and impact studies, and theory-driven evaluations across a wide range of policy areas for industry, government, community and not-for-profit clients. Prior to joining ARTD in 2018, Brad worked for over a decade at the University of Melbourne, where he taught and mentored postgraduate evaluation students.
Workshop Stream 1: How to deal with squeaky wheels and engagement fatigue: Evaluating community engagement - presented by Jess Dart
Increasingly, public and private sector organisations are seeing the importance of reaching out to communities to engage, consult, co-design and collaborate on policy, programs and other decisions. Engagement is a pivotal part of many programs, and evaluation helps organisations realise its full benefit.
While the logic of evaluating engagement is consistent with that of program evaluation, there are some unique challenges to overcome to effectively evaluate engagement. The challenges include dealing with the influence of squeaky wheels, engagement fatigue and challenges around attribution.
Outcomes and Benefits
- Understand the basics of engagement evaluation
- Understand how to scope an engagement evaluation
- Learn some useful criteria for evaluating engagement
- Learn a set of practical steps to select appropriate methods for evaluation
- Create a skeleton evaluation plan for a case study engagement project
About the Facilitator: Jess Dart
A highly sought-after facilitator and inventor of practical methodologies, Jess Dart has over 25 years' experience evaluating and designing social change programs in Australia and overseas.
Jess has a long history of evaluating engagement, and in 2015 she, together with Anne Patlillo, was invited to develop this tailored training module for the International Association for Public Participation (IAP2). Jess is the founder of Clear Horizon Consulting, where she works as a principal consultant and chairs the Board of Directors. She is passionate about developing and designing real-world methods and processes for both program design and evaluation. After completing her PhD she co-authored the Most Significant Change (MSC) guide alongside Dr Rick Davies, which has since been translated into 12 languages.
Workshop Stream 2: Introduction to Quantitative Methods in Evaluation - presented by Sarah Mason
Quantitative methods can seem daunting if you have come to evaluation without a quantitative background. Some evaluators may describe themselves as “just not a numbers person,” or simply shy away from situations that call for quantitative approaches. Yet there is growing consensus that evaluation approaches must be appropriate for their contexts. This means that in everyday practice, evaluators may be called upon to know when and what types of quantitative methods are appropriate for a given situation, or to design tools that will later be used in quantitative analyses.
This workshop is designed to provide an introduction to quantitative methods so those without a quantitative background can build their understanding of (1) when quantitative methods might be appropriate, (2) common quantitative methods to choose from, and (3) how to structure data collection so that it best supports high quality quantitative analyses. It aims to begin demystifying quantitative approaches and to build evaluators’ confidence in foundational quantitative methods.
Outcomes and Benefits
By the end of this workshop, participants will be able to:
- Discuss different types of data and how these affect the types of quantitative methods that can be used
- Explain the different ‘families’ or types of quantitative analysis
- Identify appropriate quantitative analyses for a given situation
- Apply these concepts to the design of data collection tools
- Begin interpreting findings from basic quantitative analyses.
About the Facilitator: Sarah Mason
Sarah Mason is a Research Fellow based at the Centre for Program Evaluation, The University of Melbourne where she has taught classes in quantitative methods and mixed methods for evaluation. Over the past 15 years, Sarah has conducted research and evaluation projects across a wide range of contexts, including Australia, the United States, Afghanistan, East Timor, Myanmar, Thailand and Cambodia. She recently led the design and implementation of an international survey of more than 1,000 schools across the globe. She has post-graduate training in experimental, quasi-experimental and non-experimental research designs, qualitative and quantitative data analysis, program theory-driven evaluation and evaluation theory.
Sarah has an MA in Evaluation and Applied Research Methods from Claremont Graduate University (CGU) and is currently pursuing a Ph.D. in the same field. She regularly presents at international conferences and was recently awarded a Faster Forward Fund scholarship for innovative ideas in evaluation. Before becoming an evaluator Sarah worked as a kindergarten teacher and ran a children’s library in Myanmar. In her spare time she still likes to write children’s stories. Before beginning her graduate studies, she would never have described herself as a ‘numbers person’ but she now enjoys finding meaning (and stories!) in numbers and loves to share this with others.
Workshop Stream 1: Facilitation and participatory approaches for evaluators - presented by Vanessa Hood and Natalie Moxham
Stakeholders are more likely to feel ownership of an evaluation and adopt recommendations if they are engaged throughout the process. Therefore, using participatory approaches and having strong facilitation skills is vitally important for evaluators. This is becoming increasingly apparent as projects become more complex and budgets shrink.
Facilitation can occur before or during an evaluation, or after an evaluation has been completed when stakeholders are considering how to use the findings. It refers to the skills needed to run a productive and impartial meeting during an evaluation, whether making a decision about the terms of reference, key questions, evaluation design, preferred methods and techniques for data collection and analysis, interpreting the data, considering the findings, or simply exchanging ideas and information about the evaluation.
Stakeholder engagement is supported by theory about utilisation and developmental evaluation. It’s also the key to building evaluative thinking and using human centred design approaches.
This workshop is for evaluators who use participatory approaches (beginners to advanced). It is an interactive workshop based on adult learning principles. The structure allows people to experientially learn new skills and relate their insights to their work. Participants are supported to apply their new knowledge. They leave with skills they can apply immediately to their work.
Outcomes and Benefits
Participants will learn and practice facilitation skills during the workshop. They will also leave with an understanding of participatory approaches in evaluation.
About the Facilitators: Vanessa Hood and Natalie Moxham
Vanessa Hood is a skilled facilitator and evaluator with over 15 years' experience in a range of contexts, particularly around behaviour change for sustainability. She works as a facilitator and evaluator with Rooftop Social and also has many years’ experience working in government organisations. Vanessa is passionate about working with people and uses a range of creative facilitation techniques to help participants engage deeply with technical content and, importantly, with each other. Vanessa delivers training, mentors others in evaluation and regularly facilitates group workshops with a range of government and non-government clients across Australia.
Natalie Moxham is a facilitator and consultant in organisational processes, participatory program design, monitoring and evaluation. She has designed and delivered training in facilitation, program design, monitoring and evaluation. Natalie's participatory approach uses strengths-based engagement processes that build agency and collaboration with communities, networks, organisations and programs. Natalie lives on DjaDjaWurrung Country in central Victoria and has a long-term commitment to, and extensive experience in, working with Indigenous groups, as well as in the Asia-Pacific region and Australia. She holds a master's degree in international development and works across the community, NGO and government sectors.
Workshop Stream 2: Introduction to Social Network Analysis - presented by Matt Healey and Dan Healy
Social network analysis (SNA) is an approach to capturing and visualising the structure and interactions of the networks and connections that exist between individuals, groups and organisations. Predominantly focused on the social elements that operate within a network (e.g. information flows), it provides the framework for visualising complex and dynamic interplays between different stakeholders. Network analysis can reveal features and insights of networks that are otherwise difficult to see, and importantly can be used in communications, workshops and other engagement activities.
Network analysis can be applied in any context that involves different actors or stakeholders – whether it’s about understanding partnerships among grant recipients, who communities trust and access information from during emergencies or identifying the key influencers or brokers in an organisation.
This workshop will introduce SNA as an evaluative technique, the theory that underpins SNA and the ways it is implemented. The focus is on practical application – you will walk away with the knowledge and skills to design and undertake your own network analysis. This is an introductory level workshop with no prior knowledge needed. The skills learned here will be useful for a range of responsibilities - from evaluation or program design through to stakeholder engagement. You will need to bring a laptop to the workshop as we will step through the network analysis process using the cloud-based platform we prefer to use.
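To give a flavour of the kind of analysis involved (this is not the workshop's own material, and the brochure does not name the platform it uses), degree centrality, one of the simplest SNA measures, can be computed in a few lines of Python. The stakeholder names below are purely hypothetical.

```python
from collections import defaultdict

# Hypothetical stakeholder network: who shares information with whom.
# (Names are illustrative only, not drawn from any real project.)
edges = [
    ("Council", "Health NGO"),
    ("Council", "School"),
    ("Health NGO", "School"),
    ("Health NGO", "Residents' group"),
    ("Residents' group", "Local business"),
]

# Count each actor's direct connections (its "degree").
degree = defaultdict(int)
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

# Degree centrality: the share of other actors each actor is directly
# connected to. High-centrality actors are candidate brokers/influencers.
n = len(degree)  # number of actors in the network
centrality = {actor: d / (n - 1) for actor, d in degree.items()}
broker = max(centrality, key=centrality.get)
print(broker)  # the best-connected actor in this toy network
```

Dedicated tools, such as the open-source networkx library or the cloud-based platforms used in practice, add many further measures (betweenness, clustering, community detection) and the network maps discussed above.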
Outcomes and Benefits
By the end of the workshop, participants will be able to:
- Understand the role and value SNA can play as part of an evaluator's toolkit
- Critically assess whether SNA is appropriate in different contexts
- Design and implement data collection
- Understand different types of analysis
- Interpret results and network maps
- Explore the ways in which SNA results can be used and communicated to stakeholders.
About the Facilitators: Matt Healey and Dan Healy
Dan Healy is a Senior Consultant and co-founder of First Person Consulting. His work includes a range of evaluation and social research projects in the areas of community engagement, public health, sustainability and natural resource management. This has included work with local, state and Commonwealth government, industry and research bodies, and non-government organisations. Professionally, Dan has led teams in all stages of evaluation, including planning, design, qualitative and quantitative data collection and analysis, reporting and presentation of findings.
Dan has implemented SNA over the last six years across a variety of content areas, such as public health, emergency management and natural resource management. SNA has been used as a key tool in many of these evaluation and research projects, with network maps and analysis used for planning, engagement, reporting and sharing results. Dan has training in SNA and is continually engaged in extending his knowledge via participating in conferences and special interest groups. Dan has a keen interest in systems-based approaches to complex issues and making the best use of data to produce insights that are useful and presented in an understandable way.
Matt Healey is a Consultant and Co-Founder at First Person Consulting (FPC). He has worked on social research and evaluation projects across a variety of areas including sustainability, climate change, natural resource management, public health, health promotion, waste management, emergency management and innovation.
His clients span all levels of government, not-for-profits, and private and social enterprises. Matt has a reputation as an energetic and adaptable consultant and presenter, known for consistently delivering products and services that are both engaging and useful. He has implemented SNA in a variety of contexts and has run SNA training with a variety of groups.
Matt is Secretary of the Australasian Evaluation Society (AES) Special Interest Group (SIG) on Design and Evaluation, which recognises the links between design and evaluation and the ways the two fields can complement each other. His current interests include design processes, innovation and the role of evaluation in supporting these, complexity, and the role of different stakeholders in such areas.