Self-guided historical walking tours: these tours are accessed via the Sydney Culture Walks app and highlight Aboriginal history, heritage and culture:
https://www.sydneyculturewalksapp.com/barani-redfern
https://www.sydneyculturewalksapp.com/barani-warrane


Monday, September 16
 

9:00am AEST

Opening plenary: Welcome to Country, followed by Tracy Westerman "Without measurability there is no accountability. Why we are failing to gather evidence of what works"
Welcome to Country: Uncle Charles Madden, Gadigal Elder
Opening address: John Stoney, President, Australian Evaluation Society

Keynote address: "Without measurability there is no accountability. Why we are failing to gather evidence of what works"

Tracy Westerman (A/Prof, Managing Director Indigenous Psychological Services, 2018 WA Australian of the Year)

Australia is currently amid a spate of Indigenous child suicides and is now considered to have one of the highest rates of child suicide in the world. Despite this, and as a country facing this growing tragedy, we still have no nationally accepted evidence-based programs across the spectrum of early intervention and prevention activities. Staggeringly, funded programs are not required to demonstrate evidence of impact, nor are they required to demonstrate a measurable reduction in suicide and mental health risk factors. So, given this, can governments truly claim they are funding prevention? If you aren’t measuring risk, you can’t claim prevention. It is that simple.

In an area as complex as Indigenous suicide, it is crucial that funding decisions unsupported by clinical and cultural expertise are challenged and redirected in the best way possible. Toward the evidence. Report after report has pointed to the need for ‘evidence-based approaches’ but has anyone questioned why this continues to remain elusive?

Perhaps we need to start with what constitutes evidence. It doesn’t mean attendance. This is output. Not evidence of impact. It means measurable, outcome-based evidence – a reduction in risk factors attributable to the intervention provided. Without measurability there is no accountability. Without measurability we are failing to gather crucial evidence of what works to better inform current and future practitioners struggling to halt the intergenerational transmission of suicide risk.

A/Professor Westerman will discuss the impact of determining evidence-based approaches to the complexity of Indigenous suicide and mental health. Her body of work includes the development of nine unique psychometric tests, the value of which has been to address significant gaps in this vital area. The session provides an opportunity to discuss how we can take work on Indigenous suicide and mental health to the cutting edge.

Presenters
Uncle Charles (Chicka) Madden

Gadigal Elder
Uncle Chicka is a respected Sydney Elder. He has lived in and around the Redfern and inner city area most of his life, serving the Aboriginal community as Director of the Aboriginal Medical Service, member & representative of the Metropolitan Local Aboriginal Land Council, Director...

Tracy Westerman

“Never let go of your dreams” – Let’s cultivate environments which encourage expectations of success rather than failure. A/Professor Tracy Westerman is a trailblazer in Aboriginal mental health, having been named the 2018 Australian of the Year (WA); inducted into...


Monday September 16, 2019 9:00am - 10:30am AEST
Pyrmont Theatre
  Plenary

11:00am AEST

Evaluation governance: creating fertile ground
Julian Thomas (Urbis), Ariane Hermann (Australian Government Attorney-General's Department), Amanda Shipway (Queensland Department of Justice and Attorney-General), Adam Nott (Australian Government Attorney-General's Department), Kay Hackworth (Victorian Department of Justice and Community Safety), Alison Wallace (Urbis), Frances McMurtrie (Urbis)

This joint presentation by clients (the Commonwealth Attorney-General's Department and the Queensland Department of Justice and Attorney-General) and commissioned evaluators (Urbis) will show how inclusive evaluation governance arrangements for complex, large scale investments can create a strong foundation for collaborative future action.

This presentation presents a case study of the client and evaluator experiences during a national review of the funding for legal assistance services, which involved Australian federal, state and territory governments as joint-commissioning clients. The work focused on the extent to which the $1.3b National Partnership Agreement on Legal Assistance Services 2015-2020 (the NPA) supports shared interests among governments to "improve access to justice and resolve legal problems for the most disadvantaged people in Australia and maximise service delivery through the effective and efficient use of available resources".

For the commissioning governments, the review was an important precursor to inter-governmental negotiations over the future shape of national collaboration on legal assistance services, negotiations which can often develop into contested and sometimes protracted processes. Planning for the review was a catalyst for Commonwealth and State/Territory governments to establish a Steering Committee to formulate terms of reference and guide a procurement process. A by-product of this early collaborative work was the establishment of productive, multi-lateral relationships and a shared ownership of process and purpose. Following the appointment of the evaluator, the Steering Committee structure sustained a high level of engagement, with the participation of the evaluator introducing a new and constructive dynamic.

Our take-away observation is that well designed governance arrangements for large, multi-stakeholder evaluation projects addressing contested issues have significant benefits that extend beyond delivery of the evaluation. In facilitating relationship building around a collective purpose, effective evaluation governance promotes broader, post-evaluation collaboration.


Chairs
Larissa Brisbane

Snr Project Officer, CCF Strategic Evaluation Services, Dept of Planning Industry and Environment
It was only a short step from training in environmental science, and a background in cross-disciplinary problem-solving, to evaluation where I still ask 'why' and 'how do you know that'. I love hearing your stories of what you've done and what you've learned, especially in the areas...

Presenters
Julian Thomas

Director, Urbis
I'm an evaluator with a particular interest in policy/programs tackling complex social issues, and my work at Urbis is concentrated on health, education and justice portfolios. My favourite evaluations have focused on how large scale/national investments are planned, delivered and...

Ariane Hermann

Assistant Secretary, Attorney-General's Department
Starting my career as an organisational and counselling psychologist, I joined the Commonwealth Public Service in 2009. I have had a range of positions across the Attorney-General's Department, including leading the development of a National Impact Assessment Framework and a National...

Amanda Shipway

Director, Legal Assistance, Strategy and Funding (LASF), Department of Justice and Attorney-General (DJAG)
Amanda Shipway has worked for the Department of Justice and Attorney-General since 2009. Her first role with the DJAG was at Victim Assist Queensland, where she worked for five years as the Victim Services Coordinator, before moving to her current role as Director, Legal Assistance...


Monday September 16, 2019 11:00am - 11:30am AEST
C2.6

11:00am AEST

Logic is the beginning of wisdom, not the end of it
Kale Dyer (Family & Community Services)

Program logics provide a framework for a systematic, integrated approach to program planning, implementation, and evaluation. They foster a shared understanding of how a program operates by clearly articulating program activities and desired outcomes, and clearly illustrating the change processes underlying an intervention.

This presentation will demonstrate an extension of program logic focusing on better integrating evidence, making explicit the mechanism of change, and embedding the NSW Human Services Outcomes Framework into program design and evaluation. A distinguishing feature of the approach is the inclusion of sections that articulate the research evidence and mechanisms of change for the program. The approach includes the evidence base for how and why the core components and flexible activities that make up the program are expected to achieve the proposed outcomes. The ability to generalise program findings is improved by identifying core components and flexible activities. These evidence extensions highlight why components of the program are likely to be effective, and link client needs to intended outcomes. This clarification facilitates improved commissioning of research and evaluation, embedding evidence in programs, explicit discussion of mechanisms of change, and a client-centred approach to achieving outcomes.
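
As an illustration only (the structure and field names below are hypothetical, not the presenter's actual template), an extended logic of this kind can be thought of as a conventional logic model carrying two extra sections, one for the evidence base and one for mechanisms of change:

from dataclasses import dataclass, field

@dataclass
class ExtendedProgramLogic:
    # Conventional logic model sections
    inputs: list[str] = field(default_factory=list)
    core_components: list[str] = field(default_factory=list)      # fixed, generalisable elements
    flexible_activities: list[str] = field(default_factory=list)  # adaptable to local context
    outputs: list[str] = field(default_factory=list)
    outcomes: list[str] = field(default_factory=list)             # e.g. mapped to the NSW Human Services Outcomes Framework
    # The two extensions described above (illustrative names)
    research_evidence: list[str] = field(default_factory=list)    # why components are expected to work
    mechanisms_of_change: list[str] = field(default_factory=list) # how activities are expected to produce outcomes

logic = ExtendedProgramLogic(
    core_components=["structured home visits"],
    flexible_activities=["locally tailored referral pathways"],
    outcomes=["improved family functioning"],
    research_evidence=["trials linking visit frequency to family functioning"],
    mechanisms_of_change=["trusting worker-family relationship supports skill uptake"],
)

Separating core components from flexible activities in this way is what supports the generalisation claim above: evidence and mechanisms attach to the stable core, while delivery detail can vary by site.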

The presentation will discuss the benefits and challenges of implementing this extended program logic model in a government agency. Benefits include more effective program evaluations, achieved by identifying areas of focus, informing the development of meaningful evaluation questions and identifying relevant client-centred measures to address those questions.


Chairs
Squirrel Main

Research and Evaluation Manager, The Ian Potter Foundation
Dr Squirrel Main is The Ian Potter Foundation's first Research and Evaluation Manager and she co-chairs the Philanthropic Evaluation and Data Analysis network. Squirrel completed her Masters at Stanford University in Evaluation and Policy Analysis (with David Fetterman--hello David...

Presenters
Caroline Anderson

Senior Evaluation Officer, NSW Department of Communities and Justice
Caroline is a Research and Evaluation specialist with expertise in evaluations of system-level reform, as well as program and project level evaluations. For the past fifteen years, Caroline has worked across State Government, not-for-profit organisations and academia. Caroline has a...

Alice Knight

Manager of Evaluation, Department of Communities and Justice
Alice Knight has over ten years' experience working in the government, academic and not-for-profit sectors, with academic and policy expertise in the health and human services sectors. Alice currently works at the NSW Department of Communities and Justice (DCJ) where her priority is...


Monday September 16, 2019 11:00am - 11:30am AEST
C2.5

11:00am AEST

Integrating Behavioural Insights into Evaluation
Georgia Marett (ARTD Consultants), Jack Cassidy (ARTD Consultants)

This presentation shares insights into how behavioural economics and Behavioural Insights (BI) are used in program and service design and explores ways in which evaluation can and should take BI into account. A critical concept discussed in this paper is cognitive load. Research shows that cognitive overload can negatively impact decision-making and lead to shallower processing of information and poor information retention. One way BI improves program decision-making and evaluation quality is by increasing the cognitive capacity of individuals.

We illustrate how service design can take cognitive load and BI into account and what might happen if BI are ignored when designing programs. Then, we examine and explain how to evaluate programs which have incorporated BI (including how cognitive load can be incorporated into a logic model, monitoring and evaluation frameworks and/or key evaluation questions). Finally, we conclude with a discussion about whether evaluation effectively uses the cognitive capacity of its stakeholders and practitioners.

This subject is important because, while BI is a hot topic both in general and in evaluation, there is a lack of understanding about the ways in which it can be applied and how to evaluate those applications. Cognitive capacity is less well understood but is vital to understanding how to craft effective services, evaluate these services and conduct evaluations, regardless of whether BI are included in the target of the evaluation. We will tie this into a realist perspective of evaluation through a discussion of how BI differ in effectiveness between people and situations.



Chairs
Eleanor Williams

Assistant Director, Centre for Evaluation and Research, Department of Health and Human Services Victoria
In my current role as Assistant Director of the Centre for Evaluation and Research at the Department of Health and Human Services, I am working to build an evaluation culture across a large government department of over 12,000 employees. My team and I aim to improve the use of data...

Presenters
Georgia Marett

Consultant, ARTD Consultants
I have many and varied interests when it comes to evaluation. I work across a variety of sectors including education, health and disability. I have a masters in Behavioural Economics so I am always keen to talk about the latest innovations in this space. I also have an interest in...

Jack Cassidy

Consultant, ARTD
I mostly work on evaluations of large and complex programs across a range of human services sectors. As a former psychology major, I'm interested in policy informed by the behavioural sciences, particularly behavioural economics, as well as drug & alcohol and mental health responses...


Monday September 16, 2019 11:00am - 11:30am AEST
C2.2

11:00am AEST

Applying Systems Evaluation Theory
Ralph Renger (Just Evaluation Services, US), Lewis Atkinson (Haines Centre for Strategic Management), Brian Keogh (Cobalt59)

This interactive session will use Systems Evaluation Theory (SET) applied to a case study to explore the limitations of logic models in capturing context and evaluating complexity.

Systems Evaluation Theory (SET) was developed out of frustration with logic models in evaluation being linear, isolated and removed from context (Renger, 2015; Reynolds, 2016). SET is advanced as a model that is closer to the true workings of a program. SET looks at a program as a series of systems (rather than a linear cause and effect process) and develops an understanding of the various interactions. Using SET, an evaluation also develops an understanding of the influence of the surrounding environment.

SET incorporates all the principles for effective use of systems thinking in evaluation released at the end of 2018 by the American Evaluation Association.

A case study will illustrate the benefits of using the systems thinking concepts of elements, relationships & boundaries to guide adaptation to emergent and dynamic realities in complex environments. The study will also show the important link between effectiveness and efficiency, a link often completely overlooked when using program logic. This link is explored through:
  1. Achieving, maintaining and streamlining standard operating procedures
  2. The use of system feedback loops
  3. Reworks and reflex arcs
  4. Subsystem interactions

Chairs
Gill Westhorp

Professorial Research Fellow, Charles Darwin University
Gill leads the Realist Research Evaluation and Learning Initiative (RREALI) at Charles Darwin University. RREALI develops new methods and tools within a realist framework, supports development of competency in realist approaches and provides realist evaluation and research services...

Presenters
Brian Keogh

Principal, Cobalt 59
Brian has over 20 years' experience in evaluating programs and business models. His consultancy work ranges from facilitating the creation of strategic plans and business cases to carrying out detailed impact and efficiency evaluations. He is particularly interested in the integration...

Ralph Renger

President, Just Evaluation Services LLC
I spent the first part of my career advancing program evaluation methods. I am currently focused on refining a framework for guiding the evaluation of systems.

Lewis Atkinson

Global Partner, Haines Centre for Strategic Management Ltd
Lewe is an architect of strategic and social transformation using the systems thinking approach. He is also a member of the Ian Potter Foundation evaluation pool. The Haines Centre for Strategic Management Ltd is a Global Network of facilitators, consultants & trainers using the...


Monday September 16, 2019 11:00am - 12:00pm AEST
C2.4

11:00am AEST

The un-boxed game: Snakes and Ladders for illustrating the variability of evaluation projects over the career of the evaluator
Anne Markiewicz (Anne Markiewicz and Associates Pty Ltd), Susan Garner (Garner Willisson)

We are going to un-box an interactive game designed by two experienced presenters. The game will be Snakes and Ladders, adapted to illustrate the ups and downs in the trajectory and life of the evaluator. Well-designed assignments with realistic terms of reference and expectations and good stakeholder engagement will push the evaluator upwards in the game, whereas ill-conceived, unrealistically scoped and politically challenged projects with hidden agendas and questionable stakeholder engagement will push the player downwards.

Topic for discussion: the fluctuating trajectory and experiences of the evaluator in conducting evaluation projects. This presentation has been influenced by the AES 2018 keynote speaker Karol Olejniczak's 'Transforming evaluation practice with serious games'.

Purpose: To use game theory to illustrate how some evaluation projects go well and the success factors involved, while other projects do not.
The session should be an enjoyable, interactive forum for participants to reflect on their experiences with evaluation projects. It will highlight success factors and the factors that get in the way of successful outcomes in evaluation projects.


Chairs
Sarah Renals

Senior Consultant Evaluation + Research, Allen + Clarke
Sarah is a senior consultant based in Brisbane working for Allen + Clarke. Sarah shares the role of aes20 Conference Co-Convenor and is currently the acting Queensland Regional Committee Convenor and Secretary. Come and speak to Sarah about: Allen + Clarke, aes20 Brisbane and the...

Presenters
Anne Markiewicz

Director, Anne Markiewicz and Associates
Anne Markiewicz is a leading monitoring, evaluation, and training specialist with over 20 years’ experience designing monitoring and evaluation frameworks and developing and implementing a range of evaluation methodologies. She has designed and delivered M&E capacity development...

Susan Garner

Director, Garner Willisson
I've been involved in evaluation work since 1996, when I was asked to evaluate a health policy about funding of public hospitals. I got 'hooked' from that point onwards, and even after 30 years in the field of evaluation, there's always another challenge to embrace, something interesting...


Monday September 16, 2019 11:00am - 12:00pm AEST
C2.3

11:00am AEST

Bringing the voice and knowledge of Indigenous people and communities to evidence building and evaluation in a way that empowers
This panel will explore key ideas about how evaluation can improve and be more useful in meeting the needs of Indigenous people now and into the future. We will examine what evaluation and evidence mean in an Aboriginal and Torres Strait Islander context and how culture and knowledge systems can inform concepts of evaluation. The panel will also discuss real-world suggestions for how commissioners of evaluations and evaluators can support communities and those who work with them to own their evaluations, by fostering a culture of empowerment and strengths-based, collaborative design.

Presenters
Emma Walke

Academic Lead Aboriginal Health - Co-Lead Community Engagement, University Centre for Rural Health
I'm a Bundjalung woman; my family are from the Cabbage Tree Island/Ballina area of Northern NSW. My role involves working with medical and allied health students, while they are away from their home universities, to help them understand and work better with Aboriginal and/or Torres Strait Islander peoples. I...

Doyen Radcliffe

Regional Manager, Indigenous Community Volunteers
Doyen Radcliffe is a Yamatji Naaguja Wajarri man from the Midwest Region of Western Australia. Doyen is a community minded individual with a passion for empowering Indigenous communities to reach their real potential to improve quality of life, health, social and economic wellbeing...

Simon Jordan

Director, Aboriginal Projects and Partnerships, ARTD
Simon is a highly experienced Aboriginal practitioner and policy analyst with expertise in service design, evaluation, program development and management and change management. Simon has over 20 years’ experience in the human services field, leading major change initiatives across...

Megan Williams

Senior Lecturer and Head, Girra Maa Indigenous Health Discipline, University of Technology Sydney
Dr Megan Williams is Senior Lecturer and Head of the Indigenous Health Discipline at the Graduate School of Health and a Wiradjuri descendent through her father’s family. She has over 20 years’ experience combining health service delivery, evaluation and research, particularly...

Nattlie Smith

Manager Paths to Positive Client Outcomes, Aboriginal Housing Office
Natt is a proud Wiradjuri woman, whose mob is originally from Condobolin. She has over 20 years of experience in both policy and operational roles across many human service areas including Aboriginal health, Aboriginal Home Care Service, education, disability, aged care, child protection...


Monday September 16, 2019 11:00am - 12:00pm AEST
Pyrmont Theatre
  Special session

11:00am AEST

Using evidence for impact: the client-consultant relationship
Brendan Rigby (Victorian Government), Zoe Enticott (Department of Education and Training, Government of Victoria)

The client-consultant relationship has evolved to become an integral part of the delivery of evaluation services for government. There is an increasing need for the expertise, objectivity and capacity that consultancies can bring, and for their contribution to building the evaluation evidence base of what works and what doesn't.

This panel discussion will explore the changing nature of the client-consultant relationship, unpacking the crucial elements of a quality relationship and providing practical examples of how two Victorian Department of Education and Training (DET) staff and two external evaluation consultants have worked together to address the anticipated and the unanticipated challenges throughout the delivery of an evaluation.

The moderator will introduce the panellists and guide the discussion with key questions, followed by questions from the audience.
  • How do you define 'evaluation'? How do you work in partnership to develop a shared understanding of the purpose and value-creation of conducting an evaluation?
  • What are the most important factors underpinning a strong client-consultant relationship?
  • What power-dynamics are prevalent in the client-consultant relationship and how might these dynamics shift throughout the process of conducting an evaluation?

Moderator: Jess Dart, CEO & Founding Director, Clear Horizon

Chairs

Donna Stephens

Research and Project Manager Wellbeing and Preventable Chronic Diseases Division, Menzies School of Health Research

Presenters
Brendan Rigby

Manager, Department of Education and Training
In my current role as Manager of the Integration and Evidence Unit, I lead the strategic and operational implementation of initiatives to support the improvement journeys of Victorian government schools. My team acts as an interface between policy and practice, brokering the generation...

Matt Wright

Partner, Deloitte Access Economics
Matt is a Partner in Deloitte Access Economics’ National Education Practice, responsible for leading our significant contributions to School Improvement and Vocational Education and Training policies and programs. Matt is recognised for a decade of leadership in the design, implementation...

Ruth Aston

Postdoctoral Research Fellow, University of Melbourne
Dr Ruth Aston is a Research Fellow at the Centre for Program Evaluation. Ruth has project managed several large-scale evaluations across Australia and internationally. She has a background in public health, including working on a three-year national project investigating the workforce...

Megan Kerr

Manager, Evaluation, Victorian Department of Education and Training
Megan is a public policy professional with over 15 years' experience in policy and program design, implementation, evaluation and research. Megan has worked across education, health, and community development settings in the government and non-government sectors and is currently the...

Jess Dart

CEO & Principal Consultant, Clear Horizon Consulting
Dr Jess Dart is the founder and CEO of Clear Horizon, an Australian-based specialist evaluation company. Having received the 2018 Outstanding Contribution to Evaluation Award from the Australian Evaluation Society (AES), Jess is a recognised leader with over 25 years of experience...


Monday September 16, 2019 11:00am - 12:00pm AEST
C2.1

11:30am AEST

Early insights from evaluating post-disaster community recovery
Claire Grealy (Urbis), Christina Bagot (Urbis)

This presentation draws upon our experience over the last decade in undertaking evaluations of disaster recovery efforts in Victoria and Queensland. Key themes explored in this presentation include the need for evaluation-informed program design; accessible platforms for communities to participate in the evaluation; consultations that are tailored and trauma-informed; and evaluation methods that consider the dynamic, ongoing recovery context.

Evaluating recovery efforts presents unique challenges for evaluators and our work in this area emphasises the importance of careful planning and consideration around the logistics of the consultation (data collection) phase. In particular, communities recovering from disaster face additional barriers to traditional consultation methods and there is a need for evaluators to create accessible platforms for a range of people to provide their input.

Evaluators also need to be aware of, and equipped for, consulting with communities and service users who have experienced acute and recent trauma. Our experience has shown that trauma takes many different forms for individuals, and this affects their interaction with services and participation in consultations. Ethical conduct is paramount: using trauma-informed research methods and consultation processes that enable the collection of a range of perspectives while still safeguarding informants from re-traumatisation.

In addition, our evaluations have found that consultations need to be conscious of a prevailing culture of 'don't speak up' and its impact on data collection activities, particularly in rural communities where stoicism is the norm. Stigma around mental health symptoms and attitudes towards help-seekers can hinder the willingness of community members to acknowledge the range of consequences arising from the disaster.


Chairs
Larissa Brisbane

Snr Project Officer, CCF Strategic Evaluation Services, Dept of Planning Industry and Environment
It was only a short step from training in environmental science, and a background in cross-disciplinary problem-solving, to evaluation where I still ask 'why' and 'how do you know that'. I love hearing your stories of what you've done and what you've learned, especially in the areas...

Presenters
Claire Grealy

Partner, Urbis
Motivated to influence - better experiences, better systems and better places, for everyone. Evaluation + is how I think about it - always with an eye to translating the learning into knowledge and change for the better. Passionate about working with our partners to influence change...

Christina Bagot

Senior Consultant, Urbis


Monday September 16, 2019 11:30am - 12:00pm AEST
C2.6

11:30am AEST

Contribution analysis: Evaluating the impact of intensive family services, applying theory in a real-world context.
Jane Howard (Department of Health and Human Services), Ms Gina Mancuso (Department of Health and Human Services)

It can be a challenge to demonstrate causality between intervention activities and desired outcomes, especially when multiple factors, contexts and players influence outcomes. Traditionally, causality is determined using experimental approaches. However, for many interventions it is not practical, feasible or ethical to conduct such research to measure an intervention's societal-level impacts. Contribution Analysis (CA) is an alternative methodology evaluators can use to build credible and plausible evidence-based arguments about whether intervention activities contribute to observed outcomes when available data are limited.

The Intensive Family Preservation Services (IFPS) model is widely used to improve family functioning, to reduce children's entry into Out of Home Care and to facilitate family reunification. The Centre for Evaluation and Research conducted an evaluation of the 200 Hours Intense Family Support Service, an example of the IFPS model.

The program's evaluation sought to examine the impact of the program in terms of family functioning, and rates of family preservation and reunification. Using a quasi-experimental design, families receiving the intervention were compared with those who did not receive it. To improve the credibility and quality of the data collected to judge the extent to which the program contributed to its desired outcomes, CA theory was applied. The evaluators developed a theory of change identifying the program's aims and underlying assumptions.
Appropriate application of theory to practice is an important skill for evaluators. This paper will discuss why CA was chosen and how:
  • CA was integral to articulating key research questions and a reasoned theory of change
  • CA informed the analytical and data collection process - what data was collected, and the methods used
  • applying theory to support evaluative conclusions proved valuable when working with a small sample size to determine impact
  • CA complemented the evaluation methods, and what lessons were learnt
  • evaluative theory can be applied in a real-world setting, acknowledging that this process can be challenging.


Chairs
Squirrel Main

Research and Evaluation Manager, The Ian Potter Foundation
Dr Squirrel Main is The Ian Potter Foundation's first Research and Evaluation Manager and she co-chairs the Philanthropic Evaluation and Data Analysis network. Squirrel completed her Masters at Stanford University in Evaluation and Policy Analysis (with David Fetterman--hello David...

Presenters
Jane Howard

Evaluation and Research Officer, Department of Health and Human Services
I have a passion for advancing evidence-based best practice so that we learn from the past and avoid re-inventing the wheel. Through high-quality qualitative and quantitative data management, evaluation makes a significant contribution to informing program funding opportunities...


Monday September 16, 2019 11:30am - 12:00pm AEST
C2.5

11:30am AEST

Evaluating creatively: Capturing the diverse voices of children and young people involved in early intervention programs
Kylie Evans-Locke (CareSouth)

Using the voices of children and young people to understand impacts and outcomes in evaluations of child protection programs has generally been viewed unfavourably, not least by parents and guardians. This is understandable, as there are specific variables that require addressing, including gaining informed consent, ensuring age-appropriate activities and providing adequate supervision.

Traditional quantitative methods place the focus of activities on parents and guardians. This is not without its own issues, including motivation, time, and the demands placed on program staff.
Even though traditional methods are valuable in capturing the program impacts on parents and the wider family unit, they provide a minimal understanding of the child or young person's direct experiences. In seeking to gain a clearer understanding of important program experiences through the eyes of children, we drew on other developmental and social science disciplines that have successfully evaluated effects for cohorts with lived experiences of trauma comparable to those of CareSouth's clients. This required interactive activities, such as body-mapping, that combine conversation and drawing with trained professionals to gather more nuanced experiences of children and young people.

This paper will examine how we used this methodology to better understand the impacts of mentoring on children and young people involved in early intervention programs. It will detail how art and conversation were used to meaningfully capture the role of adult mentors in the development of self-confidence and social skills in children and families in early intervention programs. It will also comment on the utility of different methodologies for understanding the experiences of children and families with lived experiences of trauma participating in early intervention programs.


Chairs
Eleanor Williams

Assistant Director, Centre for Evaluation and Research, Department of Health and Human Services Victoria
In my current role as Assistant Director of the Centre for Evaluation and Research at the Department of Health and Human Services, I am working to build an evaluation culture across a large government department of over 12,000 employees. My team and I aim to improve the use of data...

Presenters
Kylie Evans-Locke

Research Coordinator, CareSouth
I am the Research Coordinator at CareSouth, which involves responsibilities including undertaking and managing program evaluations, along with designing program logics and service design models. Previously working in areas of higher education, tertiary program evaluation and international...


Monday September 16, 2019 11:30am - 12:00pm AEST
C2.2

12:00pm AEST

Empathy mapping - Discovering what they value
Andrew Moore (NZ Defence), Victoria Carling (NZ Defence)

Empathy mapping is an emerging collaborative approach that focuses on the results of a programme. Used to gain the perspective of different stakeholders, from the commissioner to the programme participants, it seeks to define what they truly value from a programme. Empathy mapping requires participants to reflect on what success looks like, according to them, by considering what they would see, hear, do, say, think, or feel during and after a programme. The results can then be used as the building blocks of evaluation rubrics to define measurable criteria. The collaborative approach ensures a shared understanding is achieved on the quality, value, and effectiveness of a programme.
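
As a rough sketch only (the stakeholders, prompts and responses below are invented, and this is not NZ Defence's actual tooling), empathy-map responses can be tallied by prompt so that recurring statements of what success looks like become candidate rubric criteria:

from collections import Counter

# Each record: (stakeholder, prompt, response) from an empathy-mapping workshop.
# Prompts follow the see / hear / do / say / think / feel structure described above.
responses = [
    ("commissioner", "see", "participants applying skills on the job"),
    ("participant", "feel", "confident to use new skills"),
    ("participant", "feel", "confident to use new skills"),
    ("instructor", "hear", "participants asking deeper questions"),
]

# Responses raised more than once become candidate rubric criteria; performance
# levels (e.g. poor / adequate / excellent) would then be defined collaboratively.
counts = Counter((prompt, text) for _, prompt, text in responses)
candidate_criteria = [text for (prompt, text), n in counts.items() if n > 1]
print(candidate_criteria)  # ['confident to use new skills']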

Drawing from their experience, the presenters will demonstrate how empathy mapping has been used to build the foundations for successful evaluation within NZ Defence, highlighting how empathy mapping can maximise contact time with key stakeholders, document the shared understanding of programme results and subsequently promote a collective interpretation of evaluation reports.

The session will allow participants to gain an insight into: What is empathy mapping? Where did it come from? What are the components of an empathy map? Why are they useful as building blocks for evaluation practice? How can they be used to build evaluation rubrics?


Chairs
Larissa Brisbane

Snr Project Officer, CCF Strategic Evaluation Services, Dept of Planning Industry and Environment
It was only a short step from training in environmental science, and a background in cross-disciplinary problem-solving, to evaluation where I still ask 'why' and 'how do you know that'. I love hearing your stories of what you've done and what you've learned, especially in the areas...

Presenters
Victoria Carling

Regional Evaluator, NZ Defence Force

Andy Moore

Senior Advisor Performance and Evaluation, NZDF
I have had thirty years of experience in conducting, designing, and evaluating vocational training within NZ Defence. In that time, I have deployed overseas, predominantly in the role of coordinating aid programmes on behalf of NZ Defence. My current role is the Senior Adviser Performance...


Monday September 16, 2019 12:00pm - 12:30pm AEST
C2.6

12:00pm AEST

Unpacking the skills required for an evaluator - Learning from the past to prepare us for the future
Anthea Rutter (The University of Melbourne)

What does it mean to call ourselves an evaluator? How do we define our craft? Let me pose another question: what do you put on your departure card when leaving Australia? - Evaluator?

I once put evaluator on my departure card before a flight to the States. I then spent the best part of an hour in Los Angeles airport trying to explain to a customs official what exactly an evaluator is. I felt it would have been so much easier to be a Plumber, an Electrician or a Nurse. We can easily conjure up the visual - somehow, it's not the same for an evaluator. However, defining our craft is important so that others, whether they are emerging evaluators or clients, will understand what we are about, as well as what we are not about.

The AES Fellows are an important resource for understanding the history of evaluation and how it has evolved, as well as for looking towards the future. During the last eight months or so, I have interviewed the majority of the AES Fellows to get their take on what it means to be an evaluator today. I was rewarded by an honest and reflective look at their careers and gleaned some ideas for emerging evaluators. A number of those early pioneers came to evaluation when it was a fledgling field, still in the throes of trying to define itself. It has since emerged as a profession and has been strengthened by becoming multi-disciplinary, recognising that it needs to draw on many fields.

In this short paper, I will present some of those thoughts and experiences of the AES Fellows, to illuminate the path, if possible, for new evaluators. I would hope to pass on some ideas which can assist in skill building as well as identifying the qualities needed for the evaluator of today. This paper should add to the knowledge base in terms of providing some valuable information on the perceptions of those evaluators who have gone before.


Chairs
Sarah Renals

Senior Consultant Evaluation + Research, Allen + Clarke
Sarah is a senior consultant based in Brisbane working for Allen + Clarke. Sarah shares the role of aes20 Conference Co-Convenor and is currently the acting Queensland Regional Committee Convenor and Secretary. Come and speak to Sarah about: Allen + Clarke, aes20 Brisbane and the...

Presenters

Monday September 16, 2019 12:00pm - 12:30pm AEST
C2.3

12:00pm AEST

The Consolations of Evaluation Theory
Brad Astbury (ARTD), Andrew Hawkins (ARTD)

Conducting an evaluation is never easy - it must be rigorous, practical, useful, real-world, and participatory. It is almost always the case that there is insufficient time to do it all in one study. But how do we determine what is the best approach right now? Rather than follow the latest fashion or treat everything as a nail because all we have is a hammer, we believe the best place to find answers is in the consolations of evaluation theory - that is, in examination and reflection on the fundamental questions that have occupied key evaluation thinkers over the last 60 years.

In this paper we present a series of ideas and conceptual maps that have been developed and used to focus any given evaluation. Each map or diagram considers similar fundamental issues and theorists but for slightly different uses.

The first speaker will articulate the nature and components of evaluation theory and distil insights from the 'big seven' theorists as identified in Shadish, Cook and Leviton's (1991) seminal text Foundations of Program Evaluation: Theories of Practice. A schematic of the practice-theory relationship in evaluation is offered to highlight ways in which various kinds of theory can be integrated to support and guide the design and conduct of evaluation.

The second speaker will present two maps: one on the 'information to cost ratio', designed for decisions about the most appropriate method for impact evaluation, and one on 'navigating uncertainty', which is broader and focused on the use of evaluation for piloting a path from a current problem to a desired future state. The second map integrates some newer, post-1991 theorists, particularly realist and complexity theorists.


Chairs
Squirrel Main

Research and Evaluation Manager, The Ian Potter Foundation
Dr Squirrel Main is The Ian Potter Foundation's first Research and Evaluation Manager and she co-chairs the Philanthropic Evaluation and Data Analysis network. Squirrel completed her Masters at Stanford University in Evaluation and Policy Analysis (with David Fetterman--hello David...

Presenters
Brad Astbury

Director, ARTD Consultants
Brad Astbury is a Director at ARTD Consultants, based in the Melbourne office. He has over 18 years’ experience in evaluation and applied social research and considerable expertise in combining diverse forms of evidence to improve both the quality and utility of evaluation. He has...

Andrew Hawkins

Partner, ARTD Consultants
Andrew works as a trusted advisor and strategic evaluator for public policy professionals, generating insight and evidence for decision-making. Andrew has worked for a wide range of Australian and NSW public sector agencies and not-for-profits on the design, monitoring and evaluation...


Monday September 16, 2019 12:00pm - 12:30pm AEST
C2.5

12:00pm AEST

The role of evaluation in social impact bonds
Sue Leahy (ARTD Consultants), Ruby Leahy Gatfield (ARTD Consultants), Claudia Lennon (ARTD Consultants)

Social Impact Bonds (SIBs) are spreading worldwide, receiving bipartisan political support as an innovative financial instrument that can align public and private interests while addressing complex social problems (Fraser et al., 2016).

In a SIB, a non-government investor supplies the capital for a new social program and, if this program is deemed successful according to agreed measures, the government repays the initial investment plus an agreed amount of interest. The return on investment is dependent on the degree of improvement in social outcomes, and the precise structure of the bond. Outcomes measurement for the bond is conducted through a financial lens, linked closely to repayments.
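
As a stylised worked example (all figures and the payout structure here are hypothetical; real bonds vary considerably), an outcome-linked repayment might be computed like this:

def sib_repayment(principal: float, base_rate: float, max_rate: float,
                  improvement: float, threshold: float, target: float) -> float:
    """Stylised social impact bond payout: nothing is repaid below the outcome
    threshold; interest scales linearly from base_rate to max_rate as measured
    improvement moves from the threshold to the target (threshold < target)."""
    if improvement < threshold:
        return 0.0  # outcome threshold missed: the investor bears the loss
    scale = min(1.0, (improvement - threshold) / (target - threshold))
    rate = base_rate + scale * (max_rate - base_rate)
    return principal * (1 + rate)

# A $10m bond repaying between 2% and 8% total interest, given a measured 12%
# improvement against a 5% threshold and 15% target: returns 10,620,000.0.
print(sib_repayment(10e6, 0.02, 0.08, improvement=0.12, threshold=0.05, target=0.15))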

So, what is the role for evaluation? In this paper, evaluators and program staff reflect on a five-year evaluation of the first SIB to mature in Australia. They describe some of the challenges for evaluation in a bond context. They also highlight the key benefits of evaluation in identifying learnings and improvement for both the program and the bond mechanism itself.


Chairs
Gill Westhorp

Professorial Research Fellow, Charles Darwin University
Gill leads the Realist Research Evaluation and Learning Initiative (RREALI) at Charles Darwin University. RREALI develops new methods and tools within a realist framework, supports development of competency in realist approaches and provides realist evaluation and research services...

Presenters
Ruby Leahy Gatfield

Manager, ARTD Consultants
Ruby is passionate about working with communities to ensure they have a voice in the services that affect their lives. She uses participatory, empowerment and developmental approaches to evaluation and building evaluation capacity, particularly in the Aboriginal and community development...

Sue Leahy

Managing Director, ARTD Consultants
Sue is an accomplished evaluator, policy analyst and facilitator and Managing Director at ARTD, a leading evaluation and public policy company based in NSW. She joined ARTD in 2009 from the NSW Department of Family and Community Services, where she managed a wide-ranging program...

Claudia Lennon

The Benevolent Society
Claudia Lennon is a qualified social worker with 20 years' experience. She has worked for the health system, not-for-profit and non-government agencies. She has 14 years of management experience across a variety of areas including family preservation, youth homelessness, asylum seeker...


Monday September 16, 2019 12:00pm - 12:30pm AEST
C2.4

12:00pm AEST

A trauma informed approach to capturing the voices of vulnerable children in Out-of-Home-Care evaluation
Suzanne Evas (Department of Health and Human Services Victoria), Antoniette Bonaguro (Department of Health and Human Services Victoria)

The Department of Health and Human Services in Victoria is committed to including the voice of vulnerable children in program evaluation. However, accomplishing this is fraught with complications, including privacy and ethical risks, the difficulty of recruiting and interviewing very young children, and guardian consent for interviews or surveys. This session describes a bespoke approach to capturing the voice of the child in the monitoring and evaluation of an out-of-home care program aimed at keeping sibling groups together and, where sibling groups are separated, ensuring they have meaningful contact.

The approach was developed by the staff of partner agencies delivering the program. It leveraged knowledge of the children and took a trauma-informed approach to developing ways of gathering data and information that ensure the child's safety. A mix of survey and play techniques was developed for the children, as well as processes to capture data from foster carers, family, program staff and clinicians, allowing triangulation of evidence measuring the children's experiences and progress. This collaborative approach to evaluative thinking supported agencies to deliver the program. The strategies have been embedded in program protocols, improving the inclusion of the voice of the child in out-of-home care practices. The approach also enabled the data to be gathered as part of quality assurance monitoring and collectively evaluated. Early results of the strategies will be presented, along with staff reflections on the collaborative evaluative process in the program.

Chairs
Eleanor Williams

Assistant Director, Centre for Evaluation and Research, Department of Health and Human Services Victoria
In my current role as Assistant Director of the Centre for Evaluation and Research at the Department of Health and Human Services, I am working to build an evaluation culture across a large government department of over 12,000 employees. My team and I aim to improve the use of data...

Presenters
Dr. Suzanne Evas

Senior Evaluation and Research Officer, Victorian Department of Health and Human Services
Suzanne Evas has been working broadly in the social services sector for over 25 years. She began her professional life as an allied health practitioner and program coordinator, working across clinical and community settings in the US and Canada. A PhD scholarship at the University...

Antoniette Bonaguro

Principal Project Officer, Reform and Service Development, Department of Health and Human Services, Victoria
Antoniette Bonaguro has been working in the Human Services field since 1993. She has 20 years of experience working in the Community Services sector. She has been working for the Department of Health and Human Services since 2018, where she has been involved in managing a range of innovative...


Monday September 16, 2019 12:00pm - 12:30pm AEST
C2.2

12:00pm AEST

Yuwaya Ngarra-li: Evaluating an Aboriginal community-led partnership working towards systemic change in Walgett, NSW
Ruth McCausland (UNSW)

In this paper, I reflect on the process and lessons of developing an evaluation framework for a unique place-based partnership between an Aboriginal community-controlled organisation and a university. The Yuwaya Ngarra-li partnership between the Dharriwaa Elders Group and UNSW grew from collaboration on qualitative research projects over many years, and was formalised after the Dharriwaa Elders invited UNSW to work with them on their vision for change in Walgett, a remote town in north-west NSW. The long-term aim of Yuwaya Ngarra-li is to improve the wellbeing, social, built and physical environment and life pathways of Aboriginal people in Walgett through capacity building, research and evidence-based initiatives. The partnership’s approach is community-led, culturally connected, strengths-focused and holistic, and the evaluation framework is informed and underpinned by these principles. Local Indigenous knowledges, community data gathering and community-defined metrics of success have primacy in the evaluation of Yuwaya Ngarra-li. Taking a developmental evaluation approach in this early phase of the partnership has proven useful. Embedding participatory and reflective processes has enabled the team to adjust and respond as the focus and role of the partnership evolves. The evaluation is seeking to document impact and change at individual, community and systems levels. I will discuss Yuwaya Ngarra-li initiatives focused on youth justice, water and food security in Walgett to illustrate how we are evaluating the conditions, elements and processes that are enabling change.

Chairs

Donna Stephens

Research and Project Manager Wellbeing and Preventable Chronic Diseases Division, Menzies School of Health Research

Presenters
Ruth McCausland

Senior Research Fellow, School of Social Sciences, UNSW
Dr Ruth McCausland is Director of Research and Evaluation for the Yuwaya Ngarra-li partnership between the Dharriwaa Elders Group and UNSW, and Senior Research Fellow in the School of Social Sciences, UNSW. Her research focuses on women, young people, people with disabilities and...


Monday September 16, 2019 12:00pm - 12:30pm AEST
C2.1

1:30pm AEST

'Games of Firsts': The evaluation and monitoring framework for Gold Coast 2018 Commonwealth Games Legacy program
Robert Grimshaw (QLD Dept of Innovation, Tourism Industry Development and the Commonwealth Games)

The Gold Coast 2018 Commonwealth Games™ (GC2018) were held from 4 to 15 April 2018 – the largest sporting event Australia will see this decade and the biggest sporting spectacular the Gold Coast has ever seen. But GC2018 was about more than a spectacular sporting event. In what has now been coined the ‘Games of Firsts’, the Gold Coast was the first regional Australian city ever to host a Commonwealth Games, and GC2018 was the first major event of its kind to commit to a Reconciliation Action Plan for First Nations peoples, the first to have an equal number of medal events for men and women, and the host of the largest ever fully-integrated para-sports program seen in Commonwealth or world sport. GC2018 was also about the opportunities and benefits that hosting the Commonwealth Games brings to the Gold Coast and all of Queensland before, during and after the event.

The presentation will cover the innovative design and implementation of the Evaluation and Monitoring Framework and how data visualisation technology has been used to track progress towards realising and maximising positive legacy benefits from GC2018 for Queensland communities. This will include a summary of the particular elements of the evaluation framework that sought to engage the broad range of participants and stakeholders and capture the economic, social and cultural benefits. These approaches ranged from highly technical economic modelling to face-to-face consultations.

The Evaluation and Monitoring Framework for the Embracing 2018 Legacy Program was awarded the 2018 AES Award for Excellence in Evaluation – project or study.

Chairs
Matt Healey

Senior Consultant, First Person Consulting
I've been working as an evaluation consultant since early 2014. In that time I've led over 50 projects spanning sustainability, climate change, public health, health promotion, emergency management, energy efficiency, renewable energy, waste management, innovation... There's probably...

Presenters
Robert Grimshaw

Manager, Evaluation, Queensland Department of Innovation, Tourism Industry Development and the Commonwealth Games
Robert Grimshaw is an experienced project and evaluation manager with a demonstrated history of working in various social and economic subject areas across government. He works effectively with all stakeholders to deliver outcomes by bridging the gaps between government, delivery...


Monday September 16, 2019 1:30pm - 2:00pm AEST
C2.4

1:30pm AEST

A Practical Application of a Realist Synthesis Method
Jo Hall (Australian National University)

There are a number of different methods for synthesising information across multiple evaluations. The emphasis of one of these, realist synthesis (Pawson and Tilley), is on identifying theory (context-mechanism-outcome configurations) to answer the question ‘what works for whom in what circumstances, in what respects and how?’ There are relatively few examples of realist synthesis, and those that exist sometimes struggle to articulate mechanisms and theory in ways that are helpful for policy makers. In particular, they tend to be insufficiently focused on explanation and to develop separate lists of contexts, mechanisms and outcomes. More examples of realist synthesis are important to grow the practical experience of using and refining the method. It is also important to demonstrate a viable and potentially more useful alternative to systematic reviews based on randomised control trials, for which there is a growing appetite.

I will share with you my PhD work, which adopted a realist synthesis methodology for the Australian Department of Foreign Affairs and Trade's Review of Program Evaluations, to see what could be learned from the evaluation reports across two topic areas: policy influence and promoting gender equality.

I will briefly present the findings but spend most of this session reflecting on the methodology. The primary sources of information for the review were the 37 evaluation reports completed by program areas in 2017 and 14 interviews with program evaluators and DFAT staff. The method focused on coding explanatory text in evaluation reports and interview transcripts and analysing the coded text with the help of NVivo software, drawing on substantive theory.
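
As a toy illustration only (the codes and data below are invented, not the author's NVivo workflow), grouping coded excerpts by report lets co-occurring codes be read together as candidate context-mechanism-outcome configurations, rather than as the separate lists criticised above:

from collections import defaultdict

# Each coded excerpt: (report_id, code_type, code). In practice the coding was
# done in NVivo; this plain list stands in for an export of the coded text.
coded = [
    ("eval01", "context", "supportive partner ministry"),
    ("eval01", "mechanism", "local ownership of reform"),
    ("eval01", "outcome", "policy change adopted"),
    ("eval02", "mechanism", "local ownership of reform"),
]

# Group codes by report so contexts, mechanisms and outcomes that co-occur can
# be examined together as candidate CMO configurations.
by_report = defaultdict(lambda: defaultdict(list))
for report, code_type, code in coded:
    by_report[report][code_type].append(code)

for report, codes in by_report.items():
    print(report, dict(codes))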

In a 20-minute presentation I will highlight key aspects of the process and my reflections on mid-range theory, mechanisms and explanation. The remaining 5 minutes will be for questions and discussion.

The learning papers are available at:
https://dfat.gov.au/aid/how-we-measure-performance/ode/strategic-evaluations/Documents/review-of-2017-program-evaluations-policy-influence-learning-paper.pdf

https://dfat.gov.au/aid/how-we-measure-performance/ode/strategic-evaluations/Documents/review-of-2017-program-evaluations-gender-learning-paper.pdf

Chairs

Ian Patrick

Director, Ian Patrick & Associates
Dr. Ian Patrick has wide experience in the evaluation, design and management of social sector programs. This includes a strong background in design and implementation of M&E systems, conduct of program evaluations, strategic planning, analysis and research, and training and academic... Read More →

Presenters

Jo Hall

PhD student, ANU
Jo is now a part-time student and part-time evaluation consultant, having spent 30 years in international development, with NGOs and with Government, including 7 years with the Office of Development Effectiveness in the Department of Foreign Affairs and Trade. Jo is very interested... Read More →


Monday September 16, 2019 1:30pm - 2:00pm AEST
C2.5

1:30pm AEST

Beyond co-design to co-evaluation: Reflections on collaborating with consumer researchers
Rachel Aston (ARTD Consultants), Amelia Walters (ARTD Consultants), Amber Provenzano (ARTD Consultants)

There is increasing recognition that consumers of mental health services and consumer researchers play an essential role in creating quality and effective research (Lammers & Happell, 2004; Hancock et al., 2012). However, little evidence exists around the engagement of consumer researchers in research and even less in evaluation (Lammers & Happell, 2004). Consumer researcher inclusion can enhance the utility, relevance, and validity of the evaluation process, conclusions, and judgements of programs, policies and initiatives that directly involve and impact on the lives of end-users.

A Victorian Primary Health Network has introduced an innovative Mental Health Stepped Care Model designed to match services with individuals' and local population needs. Using this as an evaluation case example, we show how collaboration with a consumer researcher was critical to the success of the evaluation, given the design of the methodology and, in particular, its emphasis on qualitative data gathering and case studies of primary health services.

Supporting the emergent literature and challenging the historical view of consumers as passive potential beneficiaries of the research and evaluation process, we propose that the active involvement of a consumer researcher in all stages of the evaluation process creates powerful mutual learning (Brosnan, 2012).

We will discuss how to practically support consumer researchers in evaluation to contribute their lived experience, to further develop their professional skills, and to foster greater ownership of evaluation for the community. We suggest minimising potential power disparities between the evaluation team and the consumer researcher, through a mentoring and allyship model (Happell et al., 2018).

Finally, we will raise important implications for the practice and wider discipline of evaluation. Progressing beyond co-design to co-evaluation, we will elucidate how embedding consumer researchers' values and lived experience in evaluation maximises the utility, relevance and accuracy of findings.


Chairs

Christina Kadmos

Principal, Kalico Consulting

Presenters

Rachel Aston

Manager, ARTD Consultants
Rachel is an experienced social researcher and evaluator at ARTD Consultants. She brings eight years’ experience conducting research and evaluation for government, NGOs and in the higher education sector. Rachel has a high level of expertise in qualitative and mixed-methods research... Read More →

Amber Provenzano

Analyst, ARTD Consultants
Amber joined ARTD in 2018. She supports evaluation and research in areas of complex social policy, particularly in the health, mental health and disability sectors. She has experience with undertaking document and literature reviews, conducting interviews, survey design and administration... Read More →

Amelia Walters

Lived Experience Researcher, ARTD Consultants
Amelia is a mental health advocate, consultant, researcher, and peer worker. Amelia is the Lived Experience Researcher on the ARTD Consultants and University of Melbourne's independent evaluation of the Eastern Melbourne Primary Health Network Mental Health Stepped Care model. She... Read More →


Monday September 16, 2019 1:30pm - 2:00pm AEST
C2.6

1:30pm AEST

Introduction to Evaluation
Charlie Tulloch (Policy Performance)

This session is targeted towards new, inexperienced or emerging evaluators who feel that they have fallen into the deep end of the field. This can be overwhelming, with theoretical, methodological, logistical and ethical challenges to consider. This presentation will provide an introductory overview of evaluation, opening the box on key concepts, definitions, approaches and resources. Those attending this session will have a better understanding ahead of several days of evaluation presentations. This session is supported by the AES Emerging Evaluators Special Interest Group.

Chairs

Rae Fry

Senior Evaluation Analyst, NSW Agency for Clinical Innovation
I’m an evaluator with experience in public health, health services and road safety.

Presenters

Charlie Tulloch

Director, Policy Performance
Policy Performance supports public sector leaders to plan, implement and evaluate their policies and programs. I enjoy supporting emerging evaluators to better understand the theories, practices and skills that help them to thrive in the evaluation field. I have been a tutor in Impact... Read More →


Monday September 16, 2019 1:30pm - 2:30pm AEST
C2.2

1:30pm AEST

Digital Disruption - the next industrial revolution is here. What does this all mean for evaluators?
Jenny Riley (Clear Horizon), Jess Dart (Clear Horizon), Kristi Mansfield (Seer Data and Analytics), Reuben Stanton (Paper Giant), Chris Newman (ArcBlue Asia Pacific)

Digital, Cloud, Data Science, AI and Machine Learning, Robots... what does all this mean for the field of evaluation? Award-winning evaluator Jess Dart will host a panel of experts to explore current and emerging trends in what is hailed as the fourth industrial revolution. We will explore how new technologies are being used for social change (phone apps for finding free food, wearables for tracking in aged-care facilities, social media for building resilience amongst farmers, apps for streamlining fines applications), what evaluators need in order to be equipped to evaluate these technological interventions, and how digital can be leveraged to enhance the practice of evaluation.

The panel will reflect on real-world examples of how technical fixes can fail, but also of how new technology and design approaches can be more democratic, participatory, transparent and, importantly, useful at potentially much lower cost than before. The panel will share what they have seen work well and how they evaluate success. We will also explore the ethics, risks and challenges of digital data collection, storage and reporting. We will discuss big data and small data, as well as open and closed data, and how we can leverage digital.


Chairs

George Argyrous

Research and evaluation manager, Institute for Public Policy and Governance, UTS

Presenters

Jenny Riley

Chief Innovation Officer, Clear Horizon
Jen is one of the leading digital disrupters in the evaluation space, having developed and commercialised a digital data collection, storage and reporting tool Track2Change and most recently has developed and launched Clear Horizon's Learning Academy www.clearhorizonacademy.com... Read More →

Kristi Mansfield

Co-founder & Director, Seer Data & Analytics
Kristi is an influential social innovator who has a long track record working with government, philanthropists and not-for-profit leaders. Named one of Australia's 100 Women of Influence by the Australian Financial Review in 2015, Kristi is Director of CX and Strategy at Oracle, and... Read More →

Reuben Stanton

Director, Paper Giant
Reuben is a strategic designer and researcher with over 15 years experience in the design industry in Australia and Japan. He is a co-founder and the Design Director at Paper Giant.He has a background in communication design and software development, and a PhD in interaction design... Read More →

Chris Newman

Managing Director, ArcBlue Consulting
Managing Director and co-founder of ArcBlue, a leading procurement consulting, training, and analytics company, Chris Newman is a specialist in delivering social and sustainable outcomes through procurement. Chris has led work across Asia-Pacific, working with Government, Industry... Read More →

Jess Dart

CEO & Principal Consultant, Clear Horizon Consulting
Dr Jess Dart is the founder and CEO of Clear Horizon, an Australian-based specialist evaluation company. Having received the 2018 Outstanding Contribution to Evaluation Award from the Australian Evaluation Society (AES), Jess is a recognised leader with over 25 years of experience... Read More →


Monday September 16, 2019 1:30pm - 2:30pm AEST
Pyrmont Theatre

1:30pm AEST

What the arts can teach evaluators
Gerard Atkinson (ARTD Consultants)

"The arts are fundamental resources through which the world is viewed, meaning is created and the mind developed" - Elliot Eisner

It's time to do some out-of-the-box thinking.

In this interactive session, you and your peers will engage with a series of artistic provocations that will promote discussion and reflection on the practice of evaluation. The aim is to challenge assumptions about what we value, open up new ways of looking at problems, and highlight the diversity of perspectives that we and those we work with bring.

This session also aims to remind us of the value that the arts have as a means of expression and engagement and identify how the arts can have a more prominent place in evaluative discourse.


Chairs

Josephine Norman

Manager, Centre for Evaluation and Research, DHHS
I’m endlessly optimistic about our profession & the ways in which we can contribute to better outcomes, particularly in my context, for vulnerable Victorians. I’m always interested in finding new ways to give better advice, based on increasingly good evidence. I am fortunate to... Read More →

Presenters

Gerard Atkinson

Manager, ARTD Consultants
I joined ARTD in 2017, bringing experience as an evaluator and administrator in the arts and culture sector in Australia, Europe, and the United States. I design and deliver evaluation strategies and frameworks for program portfolios, and manage complex projects. I also like to get... Read More →


Monday September 16, 2019 1:30pm - 2:30pm AEST
C2.3

1:30pm AEST

Logic and Creative Evaluation that Embraces our Young People - Measuring Personal Growth, Aspirations, Dreams and Commitment?
Rossingh, B., Puruntatameri, E., Tipuamantumirri, S., Perry, M., Nasir, T. (Tiwi Islands Training and Employment Board)

The Women's Centre in the Tiwi Islands is introducing a program for young Tiwi women to 'find themselves' so they may commence their own unique journey of aspirational development and self-belief and build foundational life skills. Evaluation of this program requires a balance of logic and creativity: logic to give structure, and creativity to measure the almost unmeasurable - progressions in one's thinking and the realisation of the possibility and opportunity to achieve.

These young women need support to grow and develop as leaders of change and potentially follow in the footsteps of their senior and strong cultural leaders. The issue is that being a community leader is not necessarily an aspiration for many young people. Changes relating to westernisation are coming at a fast rate, and the attitudes of young people are not as focused on the retention of culture for future generations and the sustainability of one's community. We need to embrace this contemporary moment to understand what it is that young people want so they can develop as leaders of change and achieve in their own way. The 'Rise Up 2 Lead Program' is aimed at adding value to existing employment-based programs for young Tiwi women to build knowledge, skills, values and confidence as well as strengthening relationships, trust and friendship. These program outcomes are geared towards young Tiwi women seeing themselves as leaders and change makers for their family and the community. The embedded evaluation framework for this program is premised on a mixed-methods approach that is structured in an informal way and, more importantly, inclusive of young women to share and grow and senior 'strong' women to guide and advise.


Chairs

James Smith

Father Frank Flynn Fellow and Professor of Harm Minimisation, Menzies School of Health Research
James is the Father Frank Flynn Fellow and Professor of Harm Minimisation at Menzies School of Health Research - with much of his work sitting at the health/education nexus. Previous to this role he was a 2017 Equity Fellow with the National Centre for Student Equity in Higher Education... Read More →

Presenters

Evita Puruntatameri

Tiwi Islands Training and Employment Board
Evita has been working as a supervisor at the Women’s centre for the past four years. Both her mother and grandmother work in education, her grandmother is a teacher at TITEB, the same company where Evita works so there is a strong family history of valuing learning. Evita is well... Read More →

Sophia Tipuamatumirri

Tiwi Islands Training and Employment Board
Sophia has been working as a supervisor at the Women’s Centre for the past three and a half years. Sophia has a strong interest in cultural activities and enjoys spending time out bush with her friends and family hunting and fishing. Sophia is well respected by staff and clients... Read More →

Bronwyn Rossingh

Chief Financial Officer, Tiwi Island Training and Employment
Bronwyn is passionate about supporting the vision of Aboriginal communities and organisations. She has worked extensively in remote Aboriginal Communities in the NT and WA in the areas of financial management, governance, community engagement, enterprise development, financial capability... Read More →

Moya Perry

Community Development Program Manager, TITEB
Moya is passionate about community development and has been working with Indigenous people in remote Australia for over 20 years. Currently Moya is working for the Tiwi Island Training and Employment Board managing the Community Development Program which focuses on creating a skilled... Read More →


Monday September 16, 2019 1:30pm - 2:30pm AEST
C2.1

2:00pm AEST

Frameworks for program evaluation: considerations on research, practice and institutions
Ghislain Arbour (University of Melbourne)

Evaluation frameworks are currently an important concern in evaluation practice, especially for organisations that want to organise their evaluation activities. But reflection and decision-making in this domain are plagued by imprecision and ambiguity regarding the constitutive dimensions of frameworks, which makes it harder to identify needs and potential answers when selecting or developing a framework.

In response, this paper provides a model to analyse frameworks for program evaluation organised around four dimensions. The model states that a framework for evaluation is an intellectual framework, made of concepts and/or theories (first dimension: types of ideas) about an object related to evaluation (second dimension: object), where the said concepts and theories can be positive and/or normative (third dimension: analytical perspective). These three dimensions provide the means to describe, explain or judge an evaluation-related matter. A fourth and optional dimension, the institutional character of a framework, allows an evaluation framework to become a form of regulation for behaviours related to program evaluation (fourth dimension: institutional dimension).

In essence, this paper will raise our awareness about the kinds of theoretical "boxes" we encounter in evaluation so we can get better at relying on them, and even turn them into influential policies when it counts.


Chairs

Ian Patrick

Director, Ian Patrick & Associates
Dr. Ian Patrick has wide experience in the evaluation, design and management of social sector programs. This includes a strong background in design and implementation of M&E systems, conduct of program evaluations, strategic planning, analysis and research, and training and academic... Read More →

Presenters

Ghislain Arbour

Senior Lecturer, The University of Melbourne
Ghislain Arbour is a Senior Lecturer at the Centre for Program Evaluation at the University of Melbourne in Australia. He has a hard time not talking about terminology in evaluation (see what happens if you ask about his dictionary project), the nature of evaluation theory and models... Read More →


Monday September 16, 2019 2:00pm - 2:30pm AEST
C2.5

2:00pm AEST

Travel Behaviour Change Evaluation: Embracing ticketing data insights and moving beyond the box of self-reports
Zarin Salter (Active Transport and Safety, Urban Mobility, Department of Transport - WA), Dr Kim Carter (Data Analysis Australia Pty Ltd)

Implemented by the Government of Western Australia, Your Move delivers a suite of tailored travel behaviour change (TBC) programs that provide participants with localised, personalised information, coaching conversations and ongoing feedback to encourage them to walk, ride a bike and use public transport more often for their daily trips.

In 2018, de-identified, residentially coded SmartRider ticketing data made it possible to analyse the public transport patronage habits of residents who lived in two previous Your Move project areas and statistically compare their travel with that of residents of areas of greater Perth that received no Your Move projects. The data source was representative of the whole metropolitan area and remained sufficiently large for analysis even after a thorough data-cleaning process was applied.

The resulting figures for the two previous Your Move projects were impressive and are the most reliable estimate of public transport mode shift that Your Move has been able to obtain in its 20-year history. Having robust figures for public transport mode shift made it possible to extrapolate the shift in other modes and model the overall benefits of a Your Move project to the whole community.
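To make the style of comparison described above concrete, here is a minimal sketch in Python with pandas: mean change in boardings for residents of Your Move project areas versus control areas. The column names, cleaning rule and figures are invented for illustration and are not the presenters' actual SmartRider pipeline.

```python
# Minimal illustrative sketch (not the presenters' code): compare the change
# in public transport boardings between Your Move areas and control areas.
import pandas as pd

# Hypothetical de-identified, residentially coded ticketing extract:
# one row per smartcard, with boardings before and after the projects ran.
trips = pd.DataFrame({
    "card_id": [101, 102, 103, 104, 105, 106],
    "area": ["your_move", "your_move", "your_move",
             "control", "control", "control"],
    "boardings_before": [20, 35, 12, 22, 30, 15],
    "boardings_after":  [28, 41, 18, 23, 29, 16],
})

# Basic cleaning: drop implausibly heavy usage (e.g. shared or pooled cards).
trips = trips[trips["boardings_before"] < 500]

# Average change in boardings per group; the gap between the two group means
# is a simple estimate of the mode-shift effect in the project areas.
trips["change"] = trips["boardings_after"] - trips["boardings_before"]
print(trips.groupby("area")["change"].mean())
```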

Traditionally, TBC programs have been evaluated using self-report data collection techniques, which are expensive and prone to risks associated with data reliability, survey length and respondent burden, small sample sizes, inconsistent sampling between interviewers, control group selection, panel recruitment loss, and weather variability.

This presentation will discuss the need for practitioners to innovate in the TBC evaluation space, specifically with respect to data source accuracy, and will share insights learned from un-packing the box of treasures hidden within ticketing data.


Chairs

Matt Healey

Senior Consultant, First Person Consulting
I've been working as an evaluation consultant since early 2014. In that time I've led over 50 projects spanning sustainability, climate change, public health, health promotion, emergency management, energy efficiency, renewable energy, waste management, innovation... There's probably... Read More →

Presenters

Zarin Salter

I lead all aspects of Program Evaluation for the Active Transport and Safety branch of the Department of Transport's Urban Mobility policy and planning directorate. In this role I am responsible for: the management and leadership of evaluation projects; informing strategic cycling... Read More →


Monday September 16, 2019 2:00pm - 2:30pm AEST
C2.4

2:00pm AEST

Does empowerment evaluation work? Findings from a case-study
Kerrie Ikin  (University of New England)

End users running their own evaluations!
End users owning the evaluation results!
End users influenced by the evaluation processes!
This paper is all about empowerment: values, capacity building, ownership, power.
Curious?

Come and find out about how an entire staff became involved in their school's three-year journey in an empowerment evaluation process and what the research about this process revealed.

In the New South Wales government education system in Australia, reviewing schools has undergone a sea change. Community-of-practice approaches to school planning and evaluation, followed by external but peer-led validation, have become the norm. This model presumes a high level of competence in collaborative strategic planning and evaluation, as well as a high level of evaluation capacity, on the part of school principals and staff. One school principal, realising the challenges that the new model posed, engaged an evaluator to develop and implement a process (empowerED) that would help his school rise to these challenges.

EmpowerED was specifically designed to strengthen the school's learning community by building, in partnership across the school, stronger and better professional practice. Who held the power in the evaluation was challenged as traditional evaluation roles were turned on their heads: the staff became the evaluators; the evaluator became their critical friend. Through this process, it was envisaged that staff would build capacity for change, be empowered as whole-of-school evaluators, and embrace ownership of their school's plan. The ultimate goal was to improve student learning outcomes. And the approach paid off. Findings from the concurrent research show how, as staff developed transparency, openness, and trust in the process and with each other, their understanding of and input into the school's plan and directions increased, and their evaluation capacity was built. Early indications also suggest that improved student learning outcomes may be in part attributable to empowerED.


Chairs

Christina Kadmos

Principal, Kalico Consulting

Presenters

Kerrie Ikin

Adjunct Senior Lecturer, UNE Business School, University of New England
Dr Kerrie Ikin FACE. Kerrie began her career as a teacher and executive member in government schools across New South Wales, Australia. For over 25 years she worked at the systems level of the education department as a director of schools, senior policy adviser, and senior administrator... Read More →


Monday September 16, 2019 2:00pm - 2:30pm AEST
C2.6

2:30pm AEST

Evaluation for enlightenment: Creating value through process evaluation
Rory Sudfelt (Education Review Office), Sankar Ramasamy (Education Review Office, NZ), Barbie Mavor (Education Review Office, NZ), Tess Livingstone (Education Review Office, NZ)

For many evaluations, the primary purpose is the judgement of value. However, value can also be created through interactions between the evaluators and the stakeholders during an evaluation. Patton (2008) said that process use of evaluation enhances achievement of program outcomes while also meeting evaluation information needs. This presentation will focus on how process use of evaluation helped to create value for both evaluation stakeholders and evaluators during a two-phase, mixed-methodology evaluation.

The focus will be on a formative evaluation, which used a survey for the first phase, and case studies on selected schools for the second phase. The evaluation looked at how New Zealand schools were progressing with implementing new curriculum content.

Value will be discussed as an exchange of knowledge between evaluators and stakeholders that fostered enlightenment as both unpacked 'what's in the box'. Stakeholders' enlightenment came through discovering and unpacking, with evaluators, their journey of implementing the new curriculum content. Evaluators' enlightenment came through 'unpacking the black box' of a theory of change through case studies. The case studies tested whether the conditions for effective implementation, inferred from the initial survey, reflected how schools implemented the curriculum content.

The presentation will discuss the benefits for evaluators and stakeholders of process use evaluation. The presentation will be useful for anyone starting to, or wanting a different perspective on, working with formative evaluation and mixed-method methodologies.

Value will also be discussed in the context of developing new evaluators' capacity in formative and process-use evaluations and in mixed-method methodologies.


Chairs

Rae Fry

Senior Evaluation Analyst, NSW Agency for Clinical Innovation
I’m an evaluator with experience in public health, health services and road safety.

Presenters

Rory Sudfelt

Analyst, Education Review Office
Rory has worked at the Education Review Office in the Evaluation and Policy team since mid-2016. He works across a range of evaluation projects, and has a particular interest in the power of mixed-methods evaluations. Rory has recently completed the Graduate Certificate in Evaluation... Read More →


Monday September 16, 2019 2:30pm - 3:00pm AEST
C2.3

2:30pm AEST

Necessary components of a theory of change for system level interventions
Nerida Rixon (The University of Melbourne)

This presentation discusses research into the necessary components of a theory of change for system-level interventions. This research focuses on theories of change at the 'whole of government response' or 'package' level (i.e. programs or initiatives managed by multiple agencies that are put together, funded and announced as a package), or at the 'system' level (e.g. the mental health system). It is relevant to any institution designing system-level responses.

A conceptual framework will be proposed outlining the necessary components of a theory of change and, more broadly, of good theory. This will enable governments to effectively plan, monitor and evaluate outcomes. By translating the framework into an analysis grid, a formula is provided for crafting and analysing theories of change. Analysis of at least one case study using this analysis grid will be presented.

This presentation draws on research into what makes both good theory and a good program level theory of change, critiquing and translating this research, where appropriate, to the package or system level.

A central assumption in my work is that governments develop theories of change quickly. This research would provide evaluation practitioners, government program managers and policy officers with the 'must haves' for such a theory. As organisations transition to outcomes-based planning and design, and grapple with complexity, a strong system-level theory of change is essential.


Chairs

Ian Patrick

Director, Ian Patrick & Associates
Dr. Ian Patrick has wide experience in the evaluation, design and management of social sector programs. This includes a strong background in design and implementation of M&E systems, conduct of program evaluations, strategic planning, analysis and research, and training and academic... Read More →

Presenters

Nerida Rixon

Masters Student, The University of Melbourne
Nerida is a Master of Evaluation (Research) student at The University of Melbourne. After knocking off her coursework in record time, Nerida took maternity leave to first worry/then enjoy hanging out with her son (now 2). She has been slowly but surely progressing her thesis since... Read More →


Monday September 16, 2019 2:30pm - 3:00pm AEST
C2.5

2:30pm AEST

Machine-assisted qualitative analysis in Evaluation
Jasper Odgers (ARTD Consultants), Klas Johansson (ARTD Consultants)

We will tell you how Natural Language Processing (NLP) can be used to reduce time and costs associated with qualitative analysis by up to 75%. Our experience with this technology will allow for a vibrant discussion about the real benefits of machine-assisted qualitative analysis. The ethics, limitations and future directions of the technology will also be discussed.

This technology can be used to analyse large amounts of unstructured text data in a way that reduces the resource burden of analysing large qualitative datasets. By using techniques such as topic modelling and keyword identification, analysts can interpret the contents of large datasets in a fraction of the time it would take to do manually. Improvements in this technology will have profound impacts on the practice of evaluation as the use of the technology becomes more widespread. Much of the analysis work that was a large part of an evaluator’s job will be able to be done quickly and easily by machine-assisted technology; however, we focus on the continued need for humans to be involved throughout the analysis process. NLP is also adept at identifying themes from data which may not be apparent to human analysts. Integrating this technology with ongoing monitoring data means that evaluators don’t need to constantly analyse incoming data but can easily keep up to date and concentrate on interpretation and innovative reporting.
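As a flavour of what machine-assisted analysis can look like, the sketch below uses topic modelling to propose candidate themes from free-text responses, leaving interpretation to a human analyst. It is an illustration only, not ARTD's pipeline: the responses are invented and scikit-learn is assumed as the NLP library.

```python
# Illustrative sketch: propose candidate themes from free text with LDA.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

responses = [
    "The program helped me feel more confident applying for jobs.",
    "Travel to the service was difficult and sessions were often cancelled.",
    "Staff were supportive and the group sessions built real friendships.",
    # ...a real dataset would have hundreds or thousands of responses...
]

# Convert free text to a document-term matrix, dropping common stop words.
vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(responses)

# Fit a small topic model; real datasets would warrant more topics.
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(dtm)

# Print the top keywords per topic: the machine proposes themes, but a
# human analyst still has to name, check and interpret them.
terms = vectorizer.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top = [terms[j] for j in topic.argsort()[-5:][::-1]]
    print(f"Topic {i}: {', '.join(top)}")
```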

As the technology improves and becomes more widespread, it is inevitable that it will have an impact on how evaluations are designed and, therefore, on the theory which underpins them.

Chairs

George Argyrous

Research and evaluation manager, Institute for Public Policy and Governance, UTS

Presenters

Georgia Marett

Consultant, ARTD Consultants
I have many and varied interests when it comes to evaluation. I work across a variety of sectors including education, health and disability. I have a masters in Behavioural Economics so I am always keen to talk about the latest innovations in this space. I also have an interest in... Read More →

Jasper Odgers

Manager, ARTD Consultants
Jasper has been studying and working in quantitative research and data analysis for the past eight years. He manages online surveys, quantitative data analysis and data visualisation for all of ARTD’s reporting. He has recently managed several stakeholder surveys for NSW Government... Read More →

David Wakelin

Senior Consultant, ARTD Consultants
I am a keen data analyst with a passion for data visualisation. I've been working on a wide range of projects lately and have seen immense value in being able to tell stories with data I am working with.


Monday September 16, 2019 2:30pm - 3:00pm AEST
Pyrmont Theatre

2:30pm AEST

Movies, art and virtual reality - Innovative evaluation story methods for participatory approaches
Samantha Abbato (Visual Insights People), Margi MacGregor (CatholicCare NT), Jayne Lloyd (CatholicCare NT)

Story is a valuable tool for evaluation that receives cursory attention in the evaluation literature compared to other qualitative methodologies. Krueger (2010) calls attention to the value of evaluation stories because they make information easier to remember and more believable, and can convey emotion to elicit action. Many organisations in the community and health sectors are regularly required to provide participant stories as a component of regular reporting, but scant attention has been given to how to build rigour and credibility into this evaluation approach. In addition, the last decade has seen rapid innovation in technology for telling stories in engaging visual ways through film and virtual reality, technology that is becoming ever more accessible to all of us: evaluators, commissioning organisations and staff, and the people our programs are designed to serve.

Through a multidisciplinary partnership bringing film, art, graphic design and virtual reality to evaluation, we disrupt the traditional way of developing evaluation story. We present examples of evaluation story developed through using three approaches beyond the box of evaluation.

  1. Film story based on accessible technologies (iPads, iPhones and smartphones), in-depth interviews, and a participant-led approach;
  2. Aboriginal art telling the stories of client participation in programs painted in partnership with clients;
  3. Virtual reality animation based on storyboards codesigned with program participants.

These different modes of evaluation storytelling, facilitated by the transdisciplinary team, have been combined into one approach and used for a range of evaluation projects. A major advantage of the approach is that the visual media enable a diversity of participants to engage, create, narrate, shape, communicate and validate their own stories to the audience of evaluation without the limitations of language and literacy. We discuss how, no matter how innovative and creative the storytelling medium, the rigour and credibility of story as data can be maintained and the risks mitigated.


Chairs

Josephine Norman

Manager, Centre for Evaluation and Research, DHHS
I’m endlessly optimistic about our profession & the ways in which we can contribute to better outcomes, particularly in my context, for vulnerable Victorians. I’m always interested in finding new ways to give better advice, based on increasingly good evidence. I am fortunate to... Read More →

Presenters

Samantha Abbato

Samantha Abbato and Associates
Samantha Abbato is an evaluation consultant and director of Visual Insights, a pictures and stories approach to evaluation. Sam has completed more than 100 evaluation and research reports and papers for a range of government, non-government organisations and community stakeholders... Read More →

Margi MacGregor

Evaluation Systems Manager, CatholicCare NT
CatholicCare NT is currently in the final stages of a cultural shift in relation to evaluation. We have been guided and supported by Samantha Abbato from Visual Insights People, who has made the journey engaging for staff at all levels. As we move towards refining our qualitative... Read More →


Monday September 16, 2019 2:30pm - 3:00pm AEST
C2.2

2:30pm AEST

Harnessing the power of co - practical tips
Jade Maloney (ARTD)

In the disability sector, there is growing advocacy for the philosophy of 'nothing about us without us', while in the mental health sector, peer delivery and peer research are important. Recognising the rights of people with lived experience to influence the policies and programs that affect their lives, organisations have turned to co-design, co-production and co-delivery.

As evaluators, we need not only to evolve our methodologies to appropriately assess these ways of working, but to ensure our approaches uphold the philosophy. That is, to ensure we recognise the expertise of people with lived experience and engage them in our processes.

To do this, we need to challenge traditional power dynamics that come with the concept of evaluator as expert outsider. We can draw from the toolboxes of collaborative, participatory and empowerment evaluation, as well as design. But when is it right to use each of these approaches? What do they look like in practice across the stages of an evaluation? And what can you do to engage genuinely when you have limited time and are working with geographical, cultural and communications differences?

This presentation provides practical ideas for harnessing the power of co in different contexts, drawing on projects with organisations working with people with autism, dementia, psychosocial disability and intellectual disability, across locations and cultures. Our ideas cover the design, data collection, analysis and interpretation, and reporting phases, with options for when you have years versus weeks or days. We also identify considerations for accessibility and inclusion, and lessons we have learned the hard way.

The examples illustrate the value that lived experience has brought to our practice, and what we have had to bring to make this possible. For evaluation to be a gift, the exchange must be two-way: we must receive as well as give.



Chairs

James Smith

Father Frank Flynn Fellow and Professor of Harm Minimisation, Menzies School of Health Research
James is the Father Frank Flynn Fellow and Professor of Harm Minimisation at Menzies School of Health Research - with much of his work sitting at the health/education nexus. Previous to this role he was a 2017 Equity Fellow with the National Centre for Student Equity in Higher Education... Read More →

Presenters

Jade Maloney

Partner & Managing Director, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research... Read More →

Alex Lorigan

Consultant, ARTD Consultants
Alexandra supports evaluations and reviews in areas of complex social policy, most commonly in the mental health and disability sectors. She is passionate about building social inclusion through initiatives that support or build the capacity of people with lived experience, their... Read More →


Monday September 16, 2019 2:30pm - 3:00pm AEST
C2.1

2:30pm AEST

The Perpetrator Perspective: Breaking down the barriers in family violence research and evaluation
Luke Condon (Deloitte), Kate Palmer (Deloitte Access Economics), Sasha Zegenhagen (Deloitte Access Economics), Karen Kellard (Social Research Centre), Scott Pennay (Social Research Centre), Jenny Anderson (Department of Health and Human Services), Sally Finlay (Family Safety Victoria), Ilana Jaffe (Family Safety Victoria)

The Victorian Royal Commission into Family Violence placed a strong emphasis on the need to better understand who is experiencing family violence, their circumstances, and how they can be supported. The unique experiences of both the victim and the perpetrator are critical to measuring the impact of family violence programs, and contributing to best practice for changing the behaviour of people who use violence. However, engaging with perpetrators and victims presents an ethical minefield. It requires us to 're-evaluate' our approach to evaluation, view risks from a different lens, and think outside the box, all whilst meeting ethical standards.

In Victoria, interventions to address perpetrator behaviour are being redefined to be both broader and better integrated into wider family violence responses. This includes improving the inclusivity of these programs to better target the diverse needs and circumstances of perpetrators of family violence. Evaluation of these new programs will inform policy and drive system improvement, making it more responsive to the needs of our diverse community. As such, it is important to understand the perspective of the 'service users' and how their experience is contributing to evidence of outcomes. Inclusion of the perpetrator and victim voice within the evaluation design requires complex consideration of the potential risks involved for both victim and researcher, balanced with the anticipated benefits of the research at both an individual and community-wide level.

Drawing on the perspective and expertise of program service providers is key to understanding and addressing the broad range of considerations and sensitivities involved in engaging with this typically complex population. From recruitment strategies to participant incentives and discussion guides, the standard methods do not apply, and a 'one-size-fits-all' approach does not work. We discuss how a collaborative approach to evaluation design is key to ensuring research is centred on the needs of participants, thus maximising the positive impact of perpetrator programs in the future.


Chairs

Christina Kadmos

Principal, Kalico Consulting

Presenters

Luke Condon

Partner, Deloitte Access Economics
I've been an evaluator for around 12 years after originally starting my career in the public sector. My clients are primarily state and federal government and while my main areas of focus are health, justice and community services I've done evaluations on all sorts of topics. I enjoy... Read More →

Karen Kellard

Director, Social Research Centre (SRC)
Karen Kellard is the Exec Director of the Qualitative Research Unit at the Social Research Centre (owned by the Australian National University) in Melbourne, Australia. Her interests are in the use of qualitative approaches in evaluation, and in conducting research on sensitive topics... Read More →

Jenny Anderson

Director, Monitoring, Research and Evaluation, Movember
I have recently changed roles - from DHHS - to Movember. My new organisation is dedicated to changing the face of men's health. We are a leading global fundraiser in the three key men's health areas: prostate cancer, testicular cancer, and mental health and suicide prevention. Funds... Read More →


Monday September 16, 2019 2:30pm - 3:00pm AEST
C2.6

2:35pm AEST

So, you're an evaluation consultant - what's that?
Vanessa Hood (Rooftop  Social)

You're at a BBQ. Someone asks you what you do. You say, 'I'm an evaluation consultant'. They look blankly at you. There's awkward silence. They avert eye contact. What do you say next? How do you explain what you do and how you make the world a better place?

If you'd said, 'I'm a firefighter or nurse or builder', you may not have been greeted with glazed eyes.

Through a series of images and anecdotes, participants will learn about how the presenter has tried to describe what she does - some attempts have been greeted with enthusiasm, others have not!


Chairs

Matt Healey

Senior Consultant, First Person Consulting
I've been working as an evaluation consultant since early 2014. In that time I've led over 50 projects spanning sustainability, climate change, public health, health promotion, emergency management, energy efficiency, renewable energy, waste management, innovation... There's probably... Read More →

Presenters

Vanessa Hood

Associate Director, Rooftop Social
I've been working as a facilitator and evaluator for over 15 years, in a wide range of contexts, including horticulture, sustainability and financial literacy. I work at Rooftop Social with Duncan Rintoul and a team of associates from around Australia, on evaluation capability building... Read More →


Monday September 16, 2019 2:35pm - 2:40pm AEST
C2.4

2:40pm AEST

Using a template to collect interview notes for rapid upload and autocoding in NVivo
Carolyn Hooper (Allen and Clarke Policy and Regulatory Specialists)

We have all kinds of tools at our fingertips, yet many of us under-utilise them. If you have NVivo and want to get more out of it, learning how to develop a template in MS Word is a good way forward. In five minutes, I will show you how to do it, and you will never look back.

Chairs

Matt Healey

Senior Consultant, First Person Consulting
I've been working as an evaluation consultant since early 2014. In that time I've led over 50 projects spanning sustainability, climate change, public health, health promotion, emergency management, energy efficiency, renewable energy, waste management, innovation... There's probably... Read More →

Presenters

Carolyn Hooper

Evaluation + Research Consultant, Allen + Clarke
I'm a social scientist, with a preference for qualitative methods. I've been working in evaluation for four years and I love the variety of opportunities I get to make a difference. My doctorate in Public Health means I often work on health-related evaluations, examining interventions... Read More →


Monday September 16, 2019 2:40pm - 2:45pm AEST
C2.4

2:45pm AEST

"You seriously need to play more - Let's go! (Participatory design and facilitation with Lego Serious Play)"
Kahiwa Sebire

It might look like just fun and games, but Lego Serious Play (LSP) is a powerful facilitation tool to enable groups to surface deeper-level assumptions about a topic or program. By supporting participants to think metaphorically to build and then communicate their idea or viewpoint, groups can achieve stronger and clearer communication.

Let me share an example of how I used LSP to help a team build a shared vision of success while uncovering competing assumptions in a safe and structured manner, along with ideas for how you could use it to construct program theories, define success criteria, or gather participant insights.


Chairs

Matt Healey

Senior Consultant, First Person Consulting
I've been working as an evaluation consultant since early 2014. In that time I've led over 50 projects spanning sustainability, climate change, public health, health promotion, emergency management, energy efficiency, renewable energy, waste management, innovation... There's probably... Read More →

Presenters

Kahiwa Sebire

Director / MEval Student, eLumen/ University of Melbourne
Enthusiastic solution finder and life-long learner. Big fan of thorny questions, sticky notes and whiteboards. In my work, I'm exploring the intersection between learning and technology, and particularly how educational institutions can be purposeful about how they design for student... Read More →


Monday September 16, 2019 2:45pm - 2:50pm AEST
C2.4

3:30pm AEST

The evaluation box needs more pictures: A multidisciplinary approach to reducing words and numbers for evaluation capacity building (ECB)
Samantha Abbato (Visual Insights People), Margi MacGregor (CatholicCare NT), Jayne Lloyd (CatholicCare NT)

Effective communication channels are an essential part of successful ECB. Written materials about evaluation processes and learning are components of Preskill and Boyle's multidisciplinary model of ECB (2008). But ten years on from the publication of this model, mainstream communication methods have changed rapidly, with an increasing dominance of digital images and videos. For example, most of us now regularly use images with short captions to share information and experiences, and watch online videos to learn how to do something, from making a meal to fixing a broken appliance. Furthermore, recent reports by the Australian Bureau of Statistics show that 43 per cent of Australians have low levels of literacy and that the diversity of languages other than English is increasing. So it's high time we swapped some of our written words for out-of-the-box strategies if we are to be serious about building evaluation capacity in organisations.

Borrowing multidisciplinary tools from graphic communication, videography and systems thinking, we present the development of a pictorial ECB toolkit for a large state-wide community organisation. One-third of the organisation's staff are from Aboriginal and Torres Strait Islander communities and a number are from CALD backgrounds. The co-designed toolkit is the major communication tool for learning and "doing" the evaluation for all staff of the organisation.

The central component of this toolkit is a colour A3 poster explaining the entire organisational monitoring, evaluation and learning process with pictures and symbols. Evaluation of the poster shows that staff across the organisation:
  • Engage with organisational colours and images of their locations of work and activities;
  • Understand the evaluation process and method, and how these link to their work practice, through simple symbols and icons;
  • Can explain the evaluation process and method to each other using the poster;
  • Value staff ownership of images over perfect pictures.
Key steps for increasing ECB success through pictorial communication are discussed.


Chairs

Margaret Moon

Senior Project Officer, SafeWork NSW
I manage the evaluation program at SafeWork NSW. This involves commissioning evaluations of programs designed to improve safety in NSW and building evaluation capacity across the organisation. I have previously worked as a film editor at the Australian Broadcasting Corporation, as... Read More →

Presenters

Samantha Abbato

Samantha Abbato and Associates
Samantha Abbato is an evaluation consultant and director of Visual Insights, a pictures and stories approach to evaluation. Sam has completed more than 100 evaluation and research reports and papers for a range of government, non-government organisations and community stakeholders... Read More →

Margi MacGregor

Evaluation Systems Manager, CatholicCare NT
CatholicCare NT is currently in the final stages of a cultural shift in relation to evaluation. We have been guided and supported by Samantha Abbato from Visual Insights People, who has made the journey engaging for staff at all levels. As we move towards refining our qualitative... Read More →


Monday September 16, 2019 3:30pm - 4:00pm AEST
C2.3

3:30pm AEST

Maximising the Effectiveness of the "Evaluation to Policy Making" Process
Rini Mowson (Clear Horizon)

Translating evaluation into policy making remains a big challenge in international development. The uptake of findings from evaluation into policy making is a complex and non-linear process.
Based on my experience working as an internal and external evaluator, this paper presents some key considerations for maximising the effectiveness of the 'evaluation to policy making' process. These include:
  • The first step is to identify the knowledge roles and functions of the evaluator, which define their role in evidence-based policy making. For example, evaluators can play a role in providing sound evidence or in leading the process of knowledge brokering and translation. Clarifying this role will make it easier to monitor contributions to policy decisions.
  • To ensure utilisation of the evaluation results in policy making, the evaluation should be of high quality, as the credibility of the evidence and of those conveying the messages is extremely important in influencing policy decisions.
  • There are different types of relationship between evaluator and policy maker which influence the utilisation of evaluation results. For example, an evaluator who has a trusted relationship with policy makers can apply an 'inside-track' approach in using the evidence to influence policy (Start and Hovland, 2004).
  • To achieve significant outcomes, evaluators can capitalise on similar initiatives and use cross-sector engagement when applying their evaluation findings. When collaborating, they should use the most appropriate modalities to deliver the best outcomes and avoid overlapping roles in supporting evidence-based policy making.
  • The last strategy is to engage policy makers during the design, development and implementation of the evaluation and the communication of its findings. The evaluation topic should be informed by the needs of policy makers, while the development and implementation of the evaluation should closely engage relevant stakeholders throughout the process. Communicating the evaluation findings to stakeholders helps to support policy making.


Chairs

Leanne Kelly

Research & Development Coordinator, Windermere Child & Family Services
I am the research and development coordinator at Windermere Child & Family Services in Melbourne. I have recently submitted my PhD on the effectiveness of evaluation in small community development non-profits.

Presenters

Rini Mowson

Consultant, Clear Horizon
Rini has been working in the international development and Australian community development sectors for more than thirteen years. She has worked with a broad range of social change initiatives and businesses globally. Rini has extensive experience in designing and implementing monitoring... Read More →


Monday September 16, 2019 3:30pm - 4:00pm AEST
C2.2

3:30pm AEST

Innovation in government program evaluation
Bridgette Hargreave (Evaluation Unit), Angelina Bruno (Insights and Evaluation, Department of Industry, Innovation and Science)

Impact evaluations of government programs are becoming tougher, particularly when the program involves a broad spectrum of effects (social, economic and environmental), and the data is held by many different agencies. Australian Government departments are now experimenting with linking data sets to undertake innovative data analysis, research and evaluation.

This year, the Evaluation Unit in the Department of Industry, Innovation and Science trialled a new mixed-methods methodology for impact evaluations. To evaluate the impact of government programs at a regional level, we combined the analysis of cross-portfolio datasets with other regional data and qualitative research.

We employed out-of-the-box thinking to explore how we could use data from sources such as the Business Longitudinal Analysis Data Environment, the Household, Income and Labour Dynamics in Australia Survey, and the Australian Census. We then integrated insights from interviews, program reports and relevant academic research.

Presenters will explain why we trialled this methodology on an evaluation of Tasmanian Innovation and Investment Funds, and how it was applied. We will also demonstrate how a more complete picture of program outcomes and broader effects can be obtained through combining innovative data sources and techniques with a different way of thinking. We will discuss our approach to the complex issue of developing a counterfactual for the evaluation. And, as all of this was experimental and none of it was easy, we will also highlight the challenges faced by those involved, the solutions attempted, and the many lessons learned along the way.
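The abstract does not spell out the counterfactual approach, so purely as an illustration of the general idea, the sketch below estimates a program effect with a simple difference-in-differences regression on hypothetical linked firm-level data; all column names and figures are invented, and pandas and statsmodels are assumed.

```python
# Illustrative difference-in-differences sketch on invented linked data:
# one common way to approximate a counterfactual, not the Unit's method.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical linked dataset: one row per firm per year, with an indicator
# for program participation and an outcome such as employment headcount.
df = pd.DataFrame({
    "firm_id": [1, 1, 2, 2, 3, 3, 4, 4],
    "year": [2015, 2018] * 4,
    "treated": [1, 1, 1, 1, 0, 0, 0, 0],  # received the program funding
    "post": [0, 1] * 4,                   # observation after program start
    "employment": [10, 14, 8, 11, 9, 10, 12, 13],
})

# The coefficient on treated:post estimates the program effect over and
# above the trend seen in comparable non-participant firms.
model = smf.ols("employment ~ treated * post", data=df).fit()
print(model.params["treated:post"])
```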

Attending this presentation will give you insights into how major cross-portfolio datasets can be used to enrich impact analysis, and ideas on how such approaches could be applied to your own evaluations.


Chairs

Sean Chung

Director, Paxton Partners
Sean Chung is a Director of Paxton Partners, a specialist management consulting firm, focused exclusively on clients within the health and human services sectors. Sean has assisted clients across both Canada and Australia at multiple levels of the healthcare system - from federal... Read More →

Presenters

Katherine Barnes

Manager, Evaluation Unit, Department of Industry, Innovation and Science
Katherine Barnes is the manager of the Evaluation Unit in the Office of the Chief Economist, Department of Industry, Innovation and Science. She has a strong background in policy including five years with the former Australian Workforce and Productivity Agency. Katherine is also an... Read More →

Angelina Bruno

Economist, Department of Industry, Innovation and Science
Angelina is an Economist at the Department of Industry, Innovation and Science. In her role in a research team, she uses firm-level administrative tax data on the population of Australian businesses, known as the Business Longitudinal Analysis Data Environment (BLADE), to inform policy... Read More →


Monday September 16, 2019 3:30pm - 4:00pm AEST
C2.5

3:30pm AEST

When the West Meets the East: Collaborative design, analysis and delivery of program evaluation in rural generalist training program in Japan
Takara Tsuzaki (Western Michigan University)

This presentation demonstrates a case study of a mixed-methods, bilingual program evaluation conducted on a newly launched rural medicine/rural generalist program in Japan, with a focus on collaborative and iterative learning processes. The client, GENEPRO LLC, and the evaluator will share the challenges in designing and implementing the evaluation, and how we have been successful in building trust among stakeholders, integrating evaluation into practice, and fostering iterative learning within the organization.

The model - Rural Generalist Program Japan (RGPJ) - is based on the Australian model, which is regarded as the most comprehensive and mature rural generalist medicine training scheme in the world. To meet the specific needs of rural generalist medicine in Japan, the provision of rural healthcare needed to be tailored to regional and local contexts. Exporting this medical training scheme from Australia to Japan also meant a new collaborative endeavor to develop a unique program evaluation model and approach in Japan.

This presentation will highlight the contextual differences between the East and the West in terms of philosophies and cultural values, and how these manifest in evaluation practice. Both the theory and the practice of evaluation have developed differently in Japan over the past 50 years compared to the West. Furthermore, evaluation in the Japanese medical and healthcare sector has been conducted predominantly using quantitative data. However, rural generalist medicine requires a distinctly broad scope of practice, as well as a unique combination of abilities and aptitudes, to respond to the community needs of rural and remote areas of Japan. As a result, the evaluation approach, including its underlying values, philosophies and methodologies, had to be thoroughly examined and openly discussed to bring all the stakeholders on board.

We will share the lessons from the collaborative evaluation process by discussing: what evaluative thinking and collaborative evaluation design mean in Japanese rural and medical settings; how we came up with innovative approaches to communicate with stakeholders who have evaluation anxiety and a fear of modernist undertakings; how we acknowledged and overcame (in)translatability issues in the languages, embedded values and social contexts of each stakeholder group; and how the collaborative evaluation processes impacted the organizational culture during and after the evaluation.

Chairs
avatar for Rebecca Arnold

Rebecca Arnold

Senior Project Officer - MERI, Department of Environment and Water Resources (SA)

Presenters
avatar for Takara Tsuzaki

Takara Tsuzaki

Interdisciplinary Ph.D. in Evaluation, Western Michigan University
Takara Tsuzaki is a specialist in public relations, social policy research and evaluation. She has worked as researcher, consultant and evaluator for 15 years in the private, public, academic and not-for-profit sectors in Japan and the United States. Working extensively in the fields... Read More →


Monday September 16, 2019 3:30pm - 4:00pm AEST
C2.6

3:30pm AEST

Finding your voice: sharing your knowledge and elevating evaluation through social media, blogging and the Evaluation Journal of Australasia
Evaluation is an integral part of the policy making and service delivery ecosystem, with many government agencies and funding bodies requiring evaluations of initiatives that meet certain criteria. But evaluation is diverse and isn’t widely understood. (Ever got blank stares or questions about property valuation when you tell people you’re an evaluator?). Behavioural Insights and Consumer Experience get more traction. So what can we as evaluators do to elevate the discipline? Use social media, blogging and the Evaluation Journal of Australasia to un-box evaluation.

In this session, the editors of the EJA and the AES blog will share their tips on identifying a theme or subject, structuring journal articles and blogs, repurposing content of one type into another, finding your voice, and amplifying it through social media (#EvalTwitter anyone?). We’ll then throw it over to participants to ask questions, pitch ideas and find partners to collaborate with.

This session will provide emerging authors with the opportunity to network with editors and established authors, and to access support and resources on the ‘how to’ of finding your voice and navigating across platforms. It will also provide existing authors with tips for translating content across platforms.

You have a voice and a story to tell, so be strategic in being heard. Participants in previous EJA conference sessions have gone on to contribute journal articles and book reviews, and to peer review for the journal.


Chairs
Presenters
avatar for Liz Gould

Liz Gould

Associate Director, NSW Department of Premier and Cabinet
I've been evaluating and using evaluation methodologies to manage and conduct public health, community, and social services evaluation and review projects for commonwealth and state government for the better part of a decade. I am an Editor of the Evaluation Journal of Australasia... Read More →
CQ

Carol Quadrelli

Consultant, University of Queensland
I have over 25 years of experience in the higher education sector with additional time served in local and state government roles. I have worn, and continue to wear many hats! Academic roles include: qualitative research, unit coordinator/lecturer/tutor in Law (Criminology... Read More →
avatar for Bronwyn Rossingh

Bronwyn Rossingh

Chief Financial Officer, Tiwi Island Training and Employment
Bronwyn is passionate about supporting the vision of Aboriginal communities and organisations. She has worked extensively in remote Aboriginal Communities in the NT and WA in the areas of financial management, governance, community engagement, enterprise development, financial capability... Read More →
avatar for Jade Maloney

Jade Maloney

Partner & Managing Director, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research... Read More →
avatar for Eunice Sotelo

Eunice Sotelo

Project Officer, Australian Institute for Teaching and School Leadership
Interested in evaluation capacity building in the policy influence space. As an embedded evaluator, I'm open to trading stories about what it's like and how you've faced your own challenges. Message me on LinkedIn or tap me on the shoulder at the conference.


Monday September 16, 2019 3:30pm - 4:30pm AEST
C2.1

3:30pm AEST

How do we know? Implications of epistemology for evaluation practice
Gill Westhorp  (Charles Darwin University)

What do we know? What can we know, and how do we know that we know it? These are philosophical questions with real implications for the practice of evaluation. Epistemology is the branch of philosophy dealing with the nature of knowledge. Different epistemologies underpin different approaches in research and evaluation. They have implications for what data is considered to be 'valid', how data can or should be collected, how data is analysed and interpreted, and under what conditions findings are portable to other contexts.

This paper deals with two epistemologies - realist and constructivist - from a realist viewpoint. Some authors have claimed that realists 'are realists ontologically, but constructivists epistemologically'. That is, realists believe that there is a real world, which exists independently of our interpretations of it ("realist ontology"). However, we all construct our own interpretations of it. Knowledge is not a direct representation of reality, but an interpretation of it, constructed in our own heads, and shaped by language, culture, personal experience, and previous learning ("constructivist epistemology"). Knowledge does not exist independently of 'the person who knows'. In radical constructivism, we cannot even be sure that there is a real world. Perhaps we are all just avatars in some giant computer game.

This paper argues that there are areas of overlap, but also areas of distinction, between realist and constructivist epistemology. These distinctions have implications for evaluation practice. It will briefly describe the key assumptions of constructivism and contrast these with key assumptions in realism. It will use a hypothetical evaluation as an example to discuss differences in: the purposes of constructivist and realist investigation; the nature of the data that is collected; the ways that analysis is undertaken; how 'valuing' is approached and how evaluation adds value; the nature of findings; and the portability of findings.


Chairs
MT

Mardi Trompf

M&E Lead Fiji and Tuvalu Facility, Tetratech Coffey International
Consolidation of disparate programs to determine value. Outcome based monitoring, value for money.

Presenters
avatar for Gill Westhorp

Gill Westhorp

Professorial Research Fellow, Charles Darwin University
Gill leads the Realist Research Evaluation and Learning Initiative (RREALI) at Charles Darwin University. RREALI develops new methods and tools within a realist framework, supports development of competency in realist approaches and provides realist evaluation and research services... Read More →


Monday September 16, 2019 3:30pm - 4:30pm AEST
C2.4

3:30pm AEST

Out of the box and in country: Tracking stories to collaboratively develop and evaluate an Indigenous-led wellbeing innovation in remote Australia
Samantha Togni (S2 Consulting), Rene Kulitja (Ngaanyatjarra Pitjantjatjara Yankunytjatjara Women's Council), Margaret Smith (Ngaanyatjarra Pitjantjatjara Yankunytjatjara Women's Council), Nyunmiti Burton (Ngaanyatjarra Pitjantjatjara Yankunytjatjara Women's Council), Maimie Butler (Ngaanyatjarra Pitjantjatjara Yankunytjatjara Women's Council), Anawari Mitchell (Ngaanyatjarra Pitjantjatjara Yankunytjatjara Women's Council), Ilawanti Ken (Ngaanyatjarra Pitjantjatjara Yankunytjatjara Women's Council), Pantjiti McKenzie (Ngaanyatjarra Pitjantjatjara Yankunytjatjara Women's Council), Angela Lynch (Ngaanyatjarra Pitjantjatjara Yankunytjatjara Women's Council)

Evaluation conducted in Indigenous Australian contexts rarely incorporates Indigenous ways of knowing and valuing; Western worldviews predominate. To be grounded in and guided by different worldviews requires the disruption of power and privilege inherent in evaluation. Developmental evaluation is an approach that offers this potential and its practice continues to evolve.

Developmental evaluation relies on social innovators' knowledge and skills to effectively evaluate and support innovation development. It de-centres the evaluator 'expert'; instead, situating the evaluator within the development team which co-creates the innovation and the evaluation. Understanding how developmental evaluation operates in practice at the interface of different knowledge systems is important. Senior Indigenous people, who are leading an innovation to strengthen wellbeing in their communities, and the evaluator will share our story of using developmental evaluation over several years to support our Indigenous-led social innovation in remote Australia.

Our evaluation design draws on local Indigenous ways of knowing and learning incorporating drawing and storytelling from a range of perspectives to understand the innovation's nature and its effectiveness over time. This design facilitates a meaningful and integrated evaluation process that privileges all team members' knowledge. The Indigenous leaders' visual stories enable us to follow the tracks of the innovation across communities. We are tracking stories at multiple levels within multiple systems and for individuals over time, revealing changes and connections that inform the innovation development. Recently the team presented the evaluation findings to the funders, demonstrating the harnessing of an evaluation approach that supported Indigenous people to lead the telling of their own innovation story.

Developmental evaluation can build on the strengths of Indigenous culture and knowledge to support Indigenous voices, values and aspirations. In our experience, developmental evaluation opened up the value of evaluation to the whole team, effectively addressing issues of power and privilege and promoting cultural validity.


Chairs
avatar for Farida Fleming

Farida Fleming

Principal Evaluator, Assai
I evaluate social justice programs, mostly in the fields of education and women's empowerment. I'm focused on getting people and organisations to better use evaluations to improve their practice. I facilitate learning processes, through evaluation activities, to help people and organisations... Read More →

Presenters
avatar for Samantha Togni

Samantha Togni

Evaluation & Social Research Consultant, S2 Consulting
Samantha Togni is an evaluation and social research consultant based in Alice Springs. She has more than 20 years’ experience in Indigenous health and wellbeing research and evaluation, working with rural and remote Aboriginal organisations in northern and central Australia. Her... Read More →
avatar for Margaret Smith

Margaret Smith

Ngaanyatjarra Pitjantjatjara Yankunytjatjara Women’s Council
A Yankunytjatjara woman from Imanpa community, Northern Territory, Margaret is Vice Chairperson of Ngaanyatjarra Pitjantjatjara Yankunytjatjara (NPY) Women’s Council. She is a founding member of NPY Women’s Council’s multi-award winning Uti Kulintjaku Project, an Anangu-led... Read More →
avatar for Rene Kulitja

Rene Kulitja

Ngaanyatjarra Pitjantjatjara Yankunytjatjara Women’s Council
A Pitjantjatjara woman from Mutitjulu community, Northern Territory, Rene Kulitja is a Director of Ngaanyatjarra Pitjantjatjara Yankunytjatjara (NPY) Women’s Council. She is a member of NPY Women’s Council’s multi-award winning Uti Kulintjaku Project, an Anangu-led innovation... Read More →
avatar for Nyunmiti Burton

Nyunmiti Burton

Ngaanyatjarra Pitjantjatjara Yankunytjatjara Women’s Council
A Pitjantjatjara woman from Amata community in the Anangu Pitjantjatjara Yankunytjatjara (APY) Lands in South Australia, Nyunmiti is a Director of Ngaanyatjarra Pitjantjatjara Yankunytjatjara (NPY) Women’s Council. She is a member of NPY Women’s Council’s multi-award winning... Read More →


Monday September 16, 2019 3:30pm - 4:30pm AEST
Pyrmont Theatre

4:00pm AEST

Making the numbers count: Being evaluation ready for administrative data analysis
Fiona Christian (ARTD Consultants), David Wakelin (ARTD Consultants)

Service providers are generating and collecting more data than ever before, and analysis of this data has become a standard feature of many evaluations. While these data sets are an important source of information for evaluation, they are not always in the most appropriate format. When evaluation teams and evaluation commissioners are not sufficiently prepared for administrative data analysis, evaluation time is lost and the quality of insights that could be gained about participants, their profiles, program engagement and outcomes is reduced. Being prepared for administrative data analysis is critical, especially if there are tight timeframes or deadlines in place.

This presentation will help evaluators and evaluation commissioners to better prepare for the administrative data component of evaluation. It will provide practical advice on what's needed for administrative data to more effectively and efficiently support evaluations and contribute to stronger findings and recommendations. To do this, we will present six key areas of preparation. Considering these areas ahead of your evaluation will make you better prepared for sharing information and handling questions from the evaluators and will give you an insight into how your program is working internally.
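To make the idea of preparation concrete, here is a minimal, hypothetical sketch of the kind of readiness checks an evaluation team might script over an administrative extract before analysis begins. The field names and the specific checks are our assumptions for illustration; they are not the presenters' six areas of preparation.

```python
# Hypothetical sketch of basic readiness checks over an administrative
# extract. Field names (client_id, enrol_date, exit_date) are illustrative.
import pandas as pd

def readiness_report(df: pd.DataFrame, id_col: str, date_cols: list) -> dict:
    """Summarise common data problems that cost evaluation time later."""
    report = {
        "rows": len(df),
        "duplicate_ids": int(df[id_col].duplicated().sum()),
        "missing_by_column": {c: int(n) for c, n in df.isna().sum().items()},
    }
    for col in date_cols:
        parsed = pd.to_datetime(df[col], errors="coerce")
        # Values present in the raw data but not parseable as dates.
        report[f"unparseable_{col}"] = int(parsed.isna().sum() - df[col].isna().sum())
    return report

records = pd.DataFrame({
    "client_id": [101, 102, 102, 104],
    "enrol_date": ["2019-01-10", "2019-02-31", "2019-03-05", None],
    "exit_date": ["2019-06-30", "2019-07-15", None, "2019-08-01"],
})
print(readiness_report(records, "client_id", ["enrol_date", "exit_date"]))
```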

The presentation will also enable evaluators to conduct evaluability assessments at any stage of their program - from design to implementation - by recognising the strengths and weaknesses of their data when an evaluation commences.


Chairs
avatar for Margaret Moon

Margaret Moon

Senior Project Officer, SafeWork NSW
I manage the evaluation program at SafeWork NSW. This involves commissioning evaluations of programs designed to improve safety in NSW and building evaluation capacity across the organisation. I have previously worked as a film editor at the Australian Broadcasting Corporation, as... Read More →

Presenters
avatar for Fiona Christian

Fiona Christian

Director, ARTD
Fiona is a skilled evaluator, researcher and analyst who helps clients better understand their programs and policies by drawing insights from quantitative data. She is ARTD’s organisational lead for quantitative research and is experienced at analysing administrative data sets such... Read More →
avatar for David Wakelin

David Wakelin

Senior Consultant, ARTD Consultants
I am a keen data analyst with a passion for data visualisation. I've been working on a wide range of projects lately and have seen immense value in being able to tell stories with data I am working with.


Monday September 16, 2019 4:00pm - 4:30pm AEST
C2.3

4:00pm AEST

Knowing the value of knowledge: emerging approaches to evaluating research through end user perspectives.
Mohammad Alatoom (Office of Environment and Heritage), Emily Prentice (Office of Environment and Heritage), Larissa Brisbane (Office of Environment and Heritage)

Research and knowledge generation is often in the 'too hard' basket for evaluation, being viewed as a public good, a foundational activity, difficult to value economically, or a combination of these. Historically, academia has valued 'research' through metrics such as impact factors, publishing records, citations and successful funding applications. While these indicators reflect academic interest in the research, they do not reveal much about the fulfilment of other end users' needs.

If evaluation judges the 'merit, worth or value' of a thing, then evaluation of targeted research activities should fully consider how the outputs and outcomes advance and enrich our knowledge, enabling more informed decision-making. The evaluation should ideally demonstrate to what extent the research provides value to end users, as well as capture any distant outcomes for peripheral end users. How, then, do we evaluate research beyond traditional academic indicators? How do we evaluate the impact and effectiveness of research programs that are in progress or yet to report findings? And how do we best engage and involve end users in research evaluation, from planning to monitoring and final execution?

We present a best practice review and its application through a case study to examine these questions in a practical context. We describe evaluation planning for a targeted research program that is designed to generate insights into a complex problem, while satisfying the needs of a diverse range of end users. We discuss integrating evaluation planning into program design, engaging end users in developing the evaluation framework, the challenges of establishing KPIs for research evaluation, and reflect briefly on capturing the longer-term outcomes and options to apply economic valuation methods.


Chairs
avatar for Sean Chung

Sean Chung

Director, Paxton Partners
Sean Chung is a Director of Paxton Partners, a specialist management consulting firm, focused exclusively on clients within the health and human services sectors. Sean has assisted clients across both Canada and Australia at multiple levels of the healthcare system - from federal... Read More →

Presenters
avatar for Larissa Brisbane

Larissa Brisbane

Snr Project Officer, CCF Strategic Evaluation Services, Dept of Planning Industry and Environment
It was only a short step from training in environmental science, and a background in cross-disciplinary problem-solving, to evaluation where I still ask 'why' and 'how do you know that'. I love hearing your stories of what you've done and what you've learned, especially in the areas... Read More →
avatar for Emily Prentice

Emily Prentice

Senior Project Officer, Strategic Evaluation and Statistics, Department of Planning, Industry and Environment (DPIE)
I'm an environmental scientist by training and spent my early career investigating the impact of contaminants in aquatic environments. I later moved into sustainability consulting where I discovered evaluation, which has proven to be the perfect match for someone accustomed to research... Read More →


Monday September 16, 2019 4:00pm - 4:30pm AEST
C2.5

4:00pm AEST

How can implementation quality be evaluated? An example from a pilot initiative in Victorian child and family services.
Jessica Hateley-Browne (Centre for Evidence and Implementation), Tom Steele (Centre for Evidence and Implementation), Vanessa Rose (Centre for Evidence and Implementation), Bianca Albers (Centre for Evidence and Implementation), Robyn Mildon (Centre for Evidence and Implementation)

Background and aim: High-quality program implementation is a pre-condition to program effectiveness. However, evaluation of the implementation process is rare, resulting in uncertainty around interpretation of impact evaluations with null effects (i.e. was the program ineffective, or implemented poorly?). We report on an implementation evaluation of the Victorian Government's pilot of five manualised therapeutic programs for vulnerable families (four developed in the USA) across seven service provider agencies; the first evaluation of this nature and scope in Australia. The aim was to provide an indication of the comprehensiveness, pace and quality of program implementation to inform government decisions about if/how such programs should be funded, implemented, supported and scaled.

Method: A real-world mixed-methods observational study design was used. The Stages of Implementation Completion checklist assessed implementation pace and comprehensiveness. Theory-based structured interviews were conducted with agency staff (N=29) to explore program appropriateness, acceptability and feasibility. Fidelity data were extracted from agency databases.
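For readers unfamiliar with stage-based instruments, the sketch below shows, in schematic form, how pace and comprehensiveness can be summarised from stage-completion dates. The stage names, dates and scoring are invented for illustration; this is not the Stages of Implementation Completion instrument's actual scoring method.

```python
# Illustrative sketch only: summarising implementation pace and
# comprehensiveness from stage-completion dates. Stage names and dates
# are invented; this is not the SIC instrument's actual scoring.
from datetime import date

stages = {
    "engagement":        (date(2018, 2, 1), date(2018, 3, 15)),
    "staff training":    (date(2018, 4, 1), date(2018, 6, 30)),
    "first client seen": (date(2018, 7, 1), None),  # not yet completed
}

completed = {name: span for name, span in stages.items() if span[1] is not None}
comprehensiveness = len(completed) / len(stages)           # share of stages done
pace_days = [(end - start).days for start, end in completed.values()]

print(f"Stages completed: {comprehensiveness:.0%}")
print(f"Mean days per completed stage: {sum(pace_days) / len(pace_days):.0f}")
```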

Results: Most (n=6) agencies were still in early implementation, having not yet achieved sustainability. Highly-concentrated and overlapping implementation activity was observed, reflective of funding pressures, putting implementation quality at risk. The programs were generally well-accepted, perceived as high-quality and a good fit. While most agency staff 'believed in' the programs, perceived appropriateness was compromised by the lack of adaptability for Aboriginal and Torres Strait Islander communities. Threats to feasibility included high demands on practitioners and lack of Australian-based implementation support (trainers, consultants). It was too early for valid fidelity assessments.

Conclusions: Policy-makers should afford agencies more time/resources to incorporate initiatives into 'business as usual'. Ongoing monitoring of implementation outcomes is highly recommended to facilitate data-driven decisions about when to commence impact evaluation (i.e. when sustainability is achieved, and fidelity has been demonstrated).


Chairs
LK

Leanne Kelly

Research & Development Coordinator, Windermere Child & Family Services
I am the research and development coordinator at Windermere Child & Family Services in Melbourne. I have recently submitted my PhD on the effectiveness of evaluation in small community development non-profits.

Presenters
avatar for Dr Jessica Hateley-Browne

Dr Jessica Hateley-Browne

Senior Advisor, Centre for Evidence and Implementation
Jessica is a Senior Advisor at the Centre for Evidence and Implementation. She has a PhD in health psychology, and has expertise in behavioural science and implementation science. She has authored more than 40 journal articles that span the fields of health services, population health... Read More →


Monday September 16, 2019 4:00pm - 4:30pm AEST
C2.2

4:00pm AEST

Evaluating a place-based partnership program: Can Get Health in Canterbury
Amy Bestman (Health Equity Research & Development Unit (HERDU), Sydney Local Health District), Jane Lloyd (Health Equity Research & Development Unit (HERDU),  Sydney Local Health District), David Lilley (Health Equity Research & Development Unit (HERDU), Sydney Local Health District), Barbara Hawkshaw (Central and Eastern Primary Health Network)

This presentation wrestles with the balance between ensuring a robust community-led, inter-sectoral, public health program in a culturally and linguistically diverse (CALD) location and effectively providing sufficient monitoring, evaluation, reflection and improvement opportunities while the intervention is in situ.

Can Get Health in Canterbury (CGHiC) is a unique inter-sectoral program with three key partners (the University of New South Wales, Sydney Local Health District and Central Eastern Primary Health Network) and many local partnerships with community organisations. It was established in 2013 to address high health needs among CALD population groups within Canterbury, NSW.

CGHiC's partnership with the community is supported by the employment of community networkers and the establishment of collective control projects. Bengali and Arabic networkers link the community with the health system, and also provide insight to the health system on the unique needs of the community. The collective control projects enable the community to have greater power over decision making, priority setting and allocation of resources. These projects aim to improve the capacity of both community groups and the health system and encourage bi-directional learning and reflection.

Two external evaluations have previously been conducted, which provide point-in-time reflections on the impact of the project. Now that CGHiC is in its sixth year of operation, we are evaluating the program in-house with the following foci: the external impact of the program; the governance structure, priority setting and decision making of the program; and the activities of the program. While this process is ongoing, the program team have implemented monitoring tools and processes to measure recent activities. The CGHiC evaluation will contribute to the field of evaluation through the development of novel methodologies, approaches and insights for evaluating complex place-based, multi-sectoral, population-level programs in situ.


Chairs
avatar for Rebecca Arnold

Rebecca Arnold

Senior Project Officer - MERI, Department of Environment and Water Resources (SA)

Presenters
avatar for Amy Bestman

Amy Bestman

Community Partnerships Fellow, UNSW
Dr Bestman’s work has been driven by a strong public health approach and has focused on the translation of research to practice and policy. Her research has focused on public health qualitative studies that address inequity in vulnerable populations such as children, disadvantaged... Read More →


Monday September 16, 2019 4:00pm - 4:30pm AEST
C2.6

4:30pm AEST

Plenary 2: David Fetterman "Empowerment Evaluation: a powerful stakeholder involvement approach fit for the times"
David Fetterman (President and CEO, Fetterman & Associates)

As we enter a new era of social consciousness, awareness, and transformation, empowerment evaluation provides a useful approach. Since it was first introduced twenty-six years ago, it has been the subject of critique from theorists such as Michael Scriven. But the approach has been used in over 16 countries around the world and in contexts as varied as Native American reservations, Google, smoking cessation initiatives, and fourth- and fifth-grade school inclusion programs. There is little doubt that it is “an approach that has literally altered the landscape of evaluation”.

Empowerment evaluation is a self-evaluation approach designed to help people help themselves. Community and program staff build evaluation capacity by conducting their own evaluation with the guidance of empowerment evaluation coaches (or critical friends). This presentation will situate empowerment evaluation within the broader landscape of stakeholder involvement approaches and equip you with the guiding principles, key concepts and specific steps to apply the approach yourself.

It will leave you questioning the boundaries of evaluation and the role of the evaluator in the context of conversations about professionalization. In un-boxing evaluation, empowerment evaluation can open up tremendous potential. It shifts the playing field from one of exclusivity to inclusivity. It allows us to reach more people, to help more people think more evaluatively, and to improve their own lives.

Reflecting the openness to dialogue and reflective practice that is the hallmark of empowerment evaluation, this presentation will conclude with a Q&A session.

Presenters
avatar for David Fetterman

David Fetterman

President & CEO, Fetterman & Associates
David Fetterman is President and CEO of Fetterman & Associates, an international evaluation consulting firm. He has 25 years of experience at Stanford University, serving as a School of Education faculty member, School of Medicine director of evaluation, and senior member of Stanford... Read More →


Monday September 16, 2019 4:30pm - 5:30pm AEST
Pyrmont Theatre
  Plenary
  • Modality Keynote address
  • Level All
  • about David Fetterman is President and CEO of Fetterman & Associates, an international evaluation consulting firm. He has 25 years of experience at Stanford University, serving as a School of Education faculty member, School of Medicine director of evaluation, and senior member of Stanford administration. Fetterman concurrently serves as a faculty member at Pacifica Graduate Institute, the University of Charleston, and San Jose State University. He is also a co-director of the Arkansas Evaluation Center. Previously, Dr. Fetterman was a professor and research director at the California Institute of Integral Studies, Principal Research Scientist at the American Institutes for Research, and a senior associate at RMC Research Corporation. David is a past president of the American Evaluation Association. He received both the Paul Lazarsfeld Award for Outstanding Contributions to Evaluation Theory and the Myrdal Award for Cumulative Contributions to Evaluation Practice. Fetterman also received the American Educational Research Association Research on Evaluation Distinguished Scholar Award and the Mensa Award for Research Excellence. Fetterman is the founder of empowerment evaluation. He has published 17 books, including Collaborative, Participatory, and Empowerment Evaluation: Stakeholder Involvement Approaches (with Rodríguez-Campos and Ann Zukoski), Empowerment Evaluation: Knowledge and Tools for Self-assessment, Evaluation Capacity Building, and Accountability (with Kaftarian and Wandersman), Empowerment Evaluation in the Digital Villages: Hewlett-Packard’s $15 Million Race Toward Social Justice, Empowerment Evaluation Principles in Practice (with Abraham Wandersman), Foundations of Empowerment Evaluation, and Ethnography: Step by Step.

5:30pm AEST

Australian Evaluation Society 2019 AGM & 2019-2022 Strategy launch
Join the Australian Evaluation Society (AES) Board as we celebrate another year’s achievements by members of the AES, introduce the 2019-2020 Board, and launch the new AES Strategy.

Chairs
avatar for John Stoney

John Stoney

President, Australian Evaluation Society
An internal evaluation practitioner within the Australian Government for nearly 15 years, which he describes as his 'day job'; in his 'evening job' John is the current AES President. Prior to that he has been (also effectively part-time) at varying stages a student and later... Read More →

Monday September 16, 2019 5:30pm - 7:00pm AEST
Pyrmont Theatre
  Special session
 
Tuesday, September 17
 

9:00am AEST

Plenary 3: What’s beyond the box of program evaluation and what does this mean for us?
How can evaluators contribute towards the sustainable development goals? How might evaluators support people to move beyond measuring, to think through whether we are doing the right things and whether we are really making a difference across systems? This panel of five will explore how we might move out of the traditional box of program evaluation, to make a bigger difference.

The panel consists of designers and evaluators from Australia and New Zealand: Jess Dart, Kate McKegg, Adrian Field, Jen Riley and Jacqueline (Jax) Wechsler, facilitated by Emily Verstege.

Chairs
avatar for Emily Verstege

Emily Verstege

Senior Manager, ARTD Consultants

Presenters
avatar for Jacqueline (Jax) Wechsler

Jacqueline (Jax) Wechsler

Sticky Design Studio
I am passionate about enabling better futures for individuals, society and the planet and have been practicing design for close to 20 years. While human-centred design is a useful approach, it is not the silver bullet for sustainable systems change. Recently I have been building expertise... Read More →
avatar for Jenny Riley

Jenny Riley

Chief Innovation Officer, Clear Horizon
Jen is one of the leading digital disrupters in the evaluation space, having developed and commercialised a digital data collection, storage and reporting tool Track2Change and most recently has developed and launched Clear Horizon's Learning Academy www.clearhorizonacademy.com... Read More →
avatar for Jess Dart

Jess Dart

CEO & Principal Consultant, Clear Horizon Consulting
Dr Jess Dart is the founder and CEO of Clear Horizon, an Australian-based specialist evaluation company. Having received the 2018 Outstanding Contribution to Evaluation Award from the Australian Evaluation Society (AES), Jess is a recognised leader with over 25 years of experience... Read More →
KM

Kate McKegg

Director, The Kinnect Group
Kate has specialist skills in supporting evaluative thinking and practice in complex settings where people are innovating to create systems change. She has been applying these skills for over 25 years in government, non-government, philanthropic and community contexts, including many... Read More →
avatar for Adrian Field

Adrian Field

Director, Dovetail Consulting Ltd
Adrian is the director of Dovetail, an Auckland-based evaluation consultancy, and a member of the Kinnect Group. Adrian has worked in evaluation in different capacities for some 20 years and doesn't really like how old that makes him feel. Adrian's experience traverses health, social... Read More →


Tuesday September 17, 2019 9:00am - 10:30am AEST
Pyrmont Theatre
  Plenary

11:00am AEST

Stories of strength: using educators' reflections on implementing a strength-based approach to Aboriginal and Torres Strait Islander education to understand mechanisms for change
Cathy Jackson (Stronger Smarter Institute), John Davis (Stronger Smarter Institute), Jana Andrade (Stronger Smarter Institute)

The Stronger Smarter Leadership Program (SSLP) is a professional development program that promotes a strength-based approach to Aboriginal and Torres Strait Islander education, challenging educators to examine their personal beliefs. Evaluation for the SSLP covers all levels of the Kirkpatrick model including participant satisfaction, behavioural change, and use of ideas, tools and strategies in the workplace. In this presentation, we focus on one aspect of the evaluation around how participants examine their underlying beliefs and challenge deficit thinking with regard to Aboriginal and Torres Strait Islander education. We describe how a Realist Evaluation approach helps understand the mechanisms occurring both during the professional development program and afterwards in the workplace that are resulting in participants changing their thinking and behaviours.

We describe the results of a series of semi-structured interviews with 50 program participants who had undertaken the SSLP between 6 months and 10 years prior to the interview taking place. Participants included both Indigenous and non-Indigenous principals, teachers and teacher aides. The interviews were conducted with open questions to allow participants to steer the interview and choose the stories they tell. We look at how participants describe having 'opened their eyes' with respect to Aboriginal and Torres Strait Islander education and how they are actively challenging deficit thinking. This in turn leads to building school-wide understandings of high expectations for all students and working together with school communities. Collecting these stories of strength is an ongoing process that helps refine the evaluation, allowing a gradual deepening of the questions as logic models and mechanisms for change become apparent.


Chairs
avatar for Linda Klein

Linda Klein

Deputy Director Research & Evaluation, GP Synergy
Linda Klein, BSc MSc PhD. I have an academic background in psychology and public health, with over 30 years of practical experience in evaluation in sectors spanning health, education/training and business. At GP Synergy, I take primary responsibility for the evaluation of educational... Read More →

Presenters
avatar for Cathy Jackson

Cathy Jackson

Head of Research, Stronger Smarter Institute
Cathy joined the Stronger Smarter Institute seven years ago following on from a 20-year career at the Queensland University of Technology. At the Institute, she has responsibilities in designing and managing the overall evaluation program for the Institute’s professional development... Read More →


Tuesday September 17, 2019 11:00am - 11:30am AEST
C2.5

11:00am AEST

The challenges of establishing and growing an internal evaluation unit: Experiences from two large state government departments
Eleanor Williams (Department of Health and Human Services), Josephine Norman (Department of Health and Human Services)

A number of government departments and agencies across Australia have established new evaluation units of varying sizes and function within the past decade, all with some objective of unboxing evaluation and evidence for use in policy design and implementation.

In Victoria, two large state government departments have made significant commitments to new internal evaluation units with functions that extend beyond traditional capacity building and oversight roles to direct delivery of evaluations and cross-portfolio evidence reviews. The two presenters have played a leading role in the establishment and growth of these units.

While there has been significant research into what constitutes effective and efficient evaluation capacity building activities, less attention has been given to what is required to establish and grow an internal unit.

In this presentation, these two Victorian departments reflect and share practice examples on the challenges and successes of developing and maintaining an internal delivery function. The moderator will highlight and contrast experiences including:
  • Determining the unit's value proposition. Will key stakeholders get excited about your value proposition and believe in what you are doing?
  • Getting the right people (and the right mix of people). Are new staff skills and competencies needed?
  • Delivering proof of concept early to key stakeholders. How do you gain your stakeholders' confidence and become seen as the 'fuel not the brakes'?
  • Designing fit-for-purpose products and answering difficult questions. How will you know if the unit's work is independent, high-quality and fit-for-purpose?
This presentation aims to advance the national discussion about strategies for pragmatic implementation of increased in-house evaluation based on theory and practice.

The session will feature a strong participatory element where attendees will be invited to share lessons learned, success stories and examples and challenges from their own organisations.


Chairs
avatar for Florent Gomez

Florent Gomez

Manager, Planning and Evaluation, NSW Department of Customer Service

Presenters
avatar for Eleanor Williams

Eleanor Williams

Assistant Director, Centre for Evaluation and Research, Department of Health and Human Services Victoria
In my current role as Assistant Director of the Centre for Evaluation and Research at the Department of Health and Human Services, I am working to build an evaluation culture across a large government department of over 12,000 employees. My team and I aim to improve the use of data... Read More →
avatar for Josephine Norman

Josephine Norman

Manager, Centre for Evaluation and Research, DHHS
I’m endlessly optimistic about our profession & the ways in which we can contribute to better outcomes, particularly in my context, for vulnerable Victorians. I’m always interested in finding new ways to give better advice, based on increasingly good evidence. I am fortunate to... Read More →
avatar for Amanda Reeves

Amanda Reeves

A/Manager, Performance and Evaluation Division, Department of Education and Training
Amanda is an experienced evaluation practitioner and policy analyst at the Department of Education Victoria. Amanda has led evaluation projects in a variety of roles in government, the not-for-profit sector and as an external consultant in education, youth mental health and industry... Read More →


Tuesday September 17, 2019 11:00am - 12:00pm AEST
C2.6

11:00am AEST

#aes19SYD unconference: evaluation for a better world
The unconference provides the time and space to discuss what matters to you about the future of evaluation for a better world.

We’re using Open Space. Developed to “find a way towards meetings that have the energy of a good coffee break combined with the substance of a carefully prepared agenda” (Owen, 2018), it has been used in thousands of gatherings around the world over the past few decades.

If you’ve experienced it before, you understand the possibilities. If you haven’t, be prepared to be surprised.

Come and share what you’re passionate about. All ideas and forms of contribution are welcome – you might bring a topic you want to convene a group on, move from group to group, or take a pause and find yourself in a conversation you didn’t expect to have.



Tuesday September 17, 2019 11:00am - 12:00pm AEST
C2.2-C2.3

11:00am AEST

Bringing values into evaluation: A tool for practitioners
Mathea Roorda (Allen + Clarke)

Values are fundamental to evaluation as they provide the basis against which evaluative judgments are made. Yet evaluators often overlook them. In this skill building session, participants will be introduced to a framework intended to unbox dimensions of value for publicly-funded programs. As the overall conference theme states: evaluation can be a gift - it has the potential to strengthen people's lives. Evaluation also comes with responsibilities, one of which is that the evaluator's judgments need to be based on all relevant values, not just those of the evaluation commissioner. The framework draws on two approaches to valuing: the first comes from a branch of philosophy focused on value (how we understand concepts such as good and bad); the second describes value as understood by different program stakeholders. We will step through the framework's components and then discuss its applicability for evaluation practice. A handbook for using the framework will be made available to participants.

Chairs
avatar for Anne Markiewicz

Anne Markiewicz

Director, Anne Markiewicz and Associates
Anne Markiewicz is a leading monitoring, evaluation, and training specialist with over 20 years’ experience designing monitoring and evaluation frameworks and developing and implementing a range of evaluation methodologies. She has designed and delivered M&E capacity development... Read More →

Presenters
avatar for Mathea Roorda

Mathea Roorda

Senior consultant, Allen + Clarke Consulting
Let's get to the heart of that word 'evaluation'. What do we mean by value and how do we know we've included all relevant values (criteria) in our assessment of a programme? Questions that keep me awake at night...


Tuesday September 17, 2019 11:00am - 12:00pm AEST
C2.4

11:00am AEST

Rubrics - a tool for unboxing evaluative reasoning
Julian King (Kinnect Group), Kate McKegg (Kinnect Group, NZ), Judy Oakden (Kinnect Group, NZ), Nan Wehipeihana (Kinnect Group, NZ), Adrian Field (Kinnect Group, NZ)

Rubrics are an intuitive way of implementing evaluation-specific methodology. They can be used in a wide variety of evaluation contexts to unbox, demystify and democratise evaluative reasoning, by facilitating a clear, shared understanding of how quality, value and effectiveness are defined.

This panel presentation will share case examples of rubrics from different contexts, illustrating how rubrics support not only explicit evaluative reasoning but also stakeholder engagement and participation, innovation, adaptive strategy, evaluation validity, communication of results, and evaluation use.
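For readers who think in data structures, a minimal sketch of a rubric and a synthesis step appears below. The criteria, descriptors and 'weakest link' synthesis rule are invented for illustration; real rubric synthesis, as the panel will discuss, is typically deliberative rather than purely mechanical.

```python
# Minimal, hypothetical sketch of a rubric as a data structure: criteria,
# ordered performance levels, and a transparent synthesis step. The criteria
# and descriptors are invented, not drawn from the panel's case examples.
LEVELS = ["poor", "adequate", "good", "excellent"]

rubric = {
    "reach":    "Extent to which the program engages its intended cohort",
    "quality":  "Extent to which delivery meets agreed practice standards",
    "outcomes": "Extent of change observed against intended outcomes",
}

# Evaluative judgements per criterion, agreed with stakeholders from evidence.
ratings = {"reach": "good", "quality": "excellent", "outcomes": "adequate"}

def synthesise(ratings: dict) -> str:
    """'Weakest link' rule: the overall rating is the lowest criterion rating."""
    return min(ratings.values(), key=LEVELS.index)

for criterion, descriptor in rubric.items():
    print(f"{criterion:>8}: {ratings[criterion]:<9} ({descriptor})")
print(" overall:", synthesise(ratings))
```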

The moderator will introduce the panelists and give a brief introduction to rubrics.

Presenters 1 and 2 will discuss the use of rubrics as a tool for supporting emergent strategy and innovation, focusing on an example of a developmental evaluation. This discussion will highlight the flexibility of rubrics to support ongoing iteration and adaptation, as well as multiple stakeholder perspectives.

Presenters 3 and 4 will illustrate the use of rubrics to support the synthesis of evidence and sound evaluative reasoning. This presentation will also highlight the ability of rubrics to increase the credibility and validity of evaluation, as well as the benefits of stakeholder participation.

Presenters 5 and 6 will explore and deliberate on the use of rubrics in the communication of evaluation results. In particular, this presentation will highlight the ways rubrics can support engaging reporting and visualisation of findings that support use.

The moderator will facilitate a discussion between the panelists, to respond to questions from the audience.


Chairs
RH

Ronelle Hutchinson

Director, PwC Economics & Policy

Presenters
avatar for Julian King

Julian King

Director, Julian King and Associates
Julian specialises in evaluation and value for money. He advises, teaches, presents, and writes on these topics globally, with a particular focus on combining evaluative reasoning with economic methods of evaluation. Julian is a member of the Kinnect Group, an Associate of Oxford... Read More →
avatar for Nan Wehipeihana

Nan Wehipeihana

Ms, Kinnect Group
Nan Wehipeihana has more than 20 years' experience designing, leading and managing evaluations. Nan's work is characterised by a commitment to protecting, evidencing and growing the space to be Maori in Aotearoa New Zealand and offering insights into Maori worldviews and values. Internationally... Read More →
JO

Judy Oakden

Director, The Kinnect Group
Judy has held management roles in evaluation, market research and management consulting, and also worked in public relations. Judy shares a passion for finding better ways to help people navigate complexity and deal with the frustrating and seemingly intractable issues they face on... Read More →
KM

Kate McKegg

Director, The Kinnect Group
Kate has specialist skills in supporting evaluative thinking and practice in complex settings where people are innovating to create systems change. She has been applying these skills for over 25 years in government, non-government, philanthropic and community contexts, including many... Read More →
avatar for Adrian Field

Adrian Field

Director, Dovetail Consulting Ltd
Adrian is the director of Dovetail, an Auckland-based evaluation consultancy, and a member of the Kinnect Group. Adrian has worked in evaluation in different capacities for some 20 years and doesn't really like how old that makes him feel. Adrian's experience traverses health, social... Read More →


Tuesday September 17, 2019 11:00am - 12:00pm AEST
Pyrmont Theatre

11:00am AEST

Unpacking Rainbow Boxes: Exploring multiculturalism and interculturality in evaluation practice.
Erin Blake (Independent Consultant), Eva Sarr (Center for Multicultural Program Evaluation)

We need to talk.

About racism, xenophobia, privilege and cultural value differences. And their implications for evaluation practice in and around Australia.

Populism, xenophobia and nationalism are on the rise globally. Fuelled by a scepticism towards multiculturalism, globalisation and human mobility; increasingly polarised politics; click-bait news media; and, a ubiquitous social media presence. In extreme instances, these social attitudes and structural barriers have had lethal consequences.

Racism, xenophobia, privilege and a lack of understanding of diverse cultural values are problematic for evaluators working on community-oriented projects, particularly those that seek to promote social cohesion and inclusion. They are also critical for evaluators working to support international aid and development processes.

This session will facilitate a respectful and honest intercultural dialogue on race, xenophobia, cultural value differences and privilege. In doing so the session will ‘unpack the rainbow boxes’ and start a conversation on issues that many find confronting and difficult to discuss, but which can affect our day-to-day work, profession and communities in a multitude of ways. Through this dialogue, we will better understand the issues at hand for contemporary evaluation practice – including unconscious bias, understanding our own values and structural discrimination – and begin developing useful strategies to better recognise and address these issues in the Australian context.

The dialogue will elicit the views and experiences of participants. Facilitators will draw on extensive Australian and international experience, literature on culturally responsive and equitable evaluation, case studies, and feminist critiques to elicit the points at which power and cultural values intersect with our evaluation practice (i.e. at the funding, motivation, design, data collection, analysis, interpretation, dissemination and communication stages) and how we can become more culturally responsive in our own work.

This is an important and topical issue that will interest practitioners, commissioners and consumers of evaluation.

Chairs
avatar for Kathryn Dinh

Kathryn Dinh

MEL Consultant/PhD Candidate, UNSW
Kathryn is a monitoring, evaluation and learning consultant with more than 20 years of experience across Asia, the Pacific (including Australia) and the Commonwealth of Independent States. She specialises in evaluation for health and international development programs, advocacy evaluation... Read More →

Presenters
avatar for Eva Sarr

Eva Sarr

Founding Director, The Center for Multicultural Program Evaluation
Eva Sarr is an indigenous Wolof woman of Serer ancestry from Sene-Gambia, in West Africa. She is also an indigenous Celtic-Scottish and Irish and 6th-generation Australian woman with a multi-denominational background. Her father was Muslim and her mother, Catholic. Eva’s career... Read More →
avatar for Erin Blake

Erin Blake

Monitoring, Evaluation and Learning Consultant, Erin Blake Consulting
Erin is an independent international development Monitoring, Evaluation and Learning (MEL) consultant with 12 years experience. He has a passion for ‘working with people do MEL better’ and working on complex social change programs that seek to bring about long-term positive change... Read More →
avatar for Stafford Hood

Stafford Hood

Stafford Hood is the Sheila M. Miller Professor of Education and Founding Director of the Center for Culturally Responsive Evaluation and Assessment (http://crea.education.illinois.edu) in the College of Education at the University of Illinois at Urbana-Champaign where he also holds... Read More →
avatar for Rodney Hopson

Rodney Hopson

Professor Rodney Hopson serves as Professor of Evaluation in the Department of Educational Psychology, College of Education, University of Illinois-Urbana Champaign. He has received numerous research and evaluation awards, including the Marcia Guttentag Early Career Award (2000... Read More →


Tuesday September 17, 2019 11:00am - 12:00pm AEST
C2.1

11:30am AEST

Sharing perspectives and creating meaning through insider/outsider evaluation of an Aboriginal transfer of care program from hospital to community
Liz Norsa (Western Sydney University), Nathan Jones (Aboriginal Health Unit SWSLHD), Karen Beetson (Aboriginal Health Unit SWSLHD), An Speizer (Aboriginal Health Unit SWSLHD), Raylene Blackburn (Camden & Campbelltown Hospitals SWSLHD), Ilse Blignault (School of Medicine Western Sydney University)

Aboriginal people with chronic conditions are more likely to leave hospital with incomplete transfer of care arrangements and more likely to be readmitted after a recent hospitalisation. The Aboriginal Transfer of Care (ATOC) Program at South Western Sydney Local Health District (SWSLHD), in which Aboriginal Liaison Officers and Transfer of Care nurses work as a team to deliver a holistic patient-centred model of care, was designed to address this problem by ensuring consideration of an Aboriginal patient’s medical, cultural and psychosocial needs. Promising early results led to a formal evaluation funded by NSW Health under its Translational Research Grant Scheme. SWSLHD, Western Sydney University and the Ministry of Health are partners in this mixed-methods evaluation. The qualitative evaluation component aimed to: document the program model; describe what ‘successful’ transfer of care means for patients, their families and service providers; and identify opportunities for program enhancement and extension. The evaluation employed participatory methods, which involved over 40 interviews, participant observation and workshops at two hospitals. SWSLHD and the university members of the evaluation team brought insider and outsider perspectives: Aboriginal and non-Aboriginal; service manager or provider, and evaluator. This short presentation describes how the evaluation approach and ways of working were shaped by these different perspectives.


Chairs
avatar for Linda Klein

Linda Klein

Deputy Director Research & Evaluation, GP Synergy
Linda Klein, BSc MSc PhD. I have an academic background in psychology and public health, with over 30 years of practical experience in evaluation in sectors spanning health, education/training and business. At GP Synergy, I take primary responsibility for the evaluation of educational... Read More →

Presenters
LN

Liz Norsa

Research Officer, Western Sydney University
Liz Norsa is employed as a Research Officer at the Translational Health Research Institute at Western Sydney University. As a social and cultural anthropologist Liz has a particular interest in patient agency, wellbeing, the production of health/medical knowledge and ethnography within... Read More →
RB

Raylene Blackburn

Aboriginal Liaison Officer, NSW Health
I am a proud Anaiwin & Dungutti woman and have lived in the area for 30+ years. Employed by Campbelltown Hospital for 10 years, ALO for 7 years. The role of ALO is very rewarding, and knowing I am helping and supporting our community is the best feeling.
avatar for Karen Beetson

Karen Beetson

Deputy Director Aboriginal Health, South Western Sydney Local Health District
Karen Beetson is a Manadandanji woman from Roma QLD and has lived and worked in the Dharawal community for most of her life. Karen has worked for over 35 years in Aboriginal Community Development and capacity building, beginning her career in Aboriginal Employment and Education... Read More →


Tuesday September 17, 2019 11:30am - 12:00pm AEST
C2.5

12:00pm AEST

Internal Evaluation Capacity Building: Unpacking what works in a (very) large government department
Liam Downing (Centre for Education Statistics and Evaluation), Rydr Tracy (Department of Education)

While evaluation capacity building is not an exact science, practitioners can benefit from understanding what has worked in other settings. This session will provide insight for evaluators at all levels into the factors underlying a successful and growing evaluation capacity building strategy within a large, state-level education department, with lessons applicable across different sectors.

Strengthening evaluation capacity is a key component of evaluative practice within large sectors (or - more specifically - very large sectors). This is particularly apparent in spaces where practice and outcomes are constantly under scrutiny, and where stakes - for beneficiaries, policymakers and practitioners - are high. The early childhood, primary and secondary education sector is a perfect example of this high-stakes space, and a space where evaluation capacity building can be of benefit.

The NSW Department of Education is home to a small but influential team that focuses on building evaluation capacity among school leaders, teachers and corporate personnel. Established in 2016, the Evaluation Capacity Building (ECB) project is well regarded within the Department and has been identified by the Department of Premier and Cabinet as an example of effective service delivery in the NSW public sector. This presentation will outline key activities undertaken in this space over the last three years, and identify five key enabling factors that have been instrumental in the project's success so far:
  1. Leveraging existing structures and reforms
  2. Establishing and maintaining a strong authorising environment
  3. Effective collaboration at multiple levels
  4. Operating with the right mix of skills and support
  5. Engaging in a disciplined design process

The session will detail how each factor influenced the impact of evaluation capacity building efforts, and provide practitioners with a potential roadmap for what might work in their own sectors.


Chairs
Florent Gomez

Manager, Planning and Evaluation, NSW Department of Customer Service

Presenters
Liam Downing

Evaluation Capacity Building Lead, NSW Department of Education
Rydr Tracy

R/CEO School Engagement, CESE
Evaluation capacity building in education. Funny jokes.


Tuesday September 17, 2019 12:00pm - 12:30pm AEST
C2.6

12:00pm AEST

Using Program Design Logic to manage the risk of program failure
Andrew Hawkins (ARTD Consultants)

This paper is about identifying, managing and mitigating the risk that a program will not produce its intended effects. A principle of this approach is that a program at its core is simply a proposition that a certain course of action will lead to a certain set of outcomes. It is about putting the logic back in program logic.

Program Design Logic (PDL) is a tool for developing evidence-based policy and programs. Through the language of 'necessary' and 'sufficient' conditions in place of 'outputs' and 'outcomes', it provides a framework to determine whether a program or course of action makes sense 'on paper' before we attempt to determine, through monitoring and evaluation, whether it makes sense in 'reality'.

The five types of risk, illustrated across the six failure scenarios below, are:
  1. It doesn't make sense on paper - logical risk
  2. It makes sense on paper, but assumptions don't hold - assumption risk
  3. It makes sense on paper, but we didn't do what we said we would do - performance risk
  4. It makes sense on paper, assumptions hold, we do what we said we would do, but outputs don't materialise - theoretical risk
  5. It makes sense on paper, assumptions hold, we do what we said we would do, outputs materialise, but intended outcomes don't follow, so the array of outputs was not actually sufficient to bring about a desired future state - logical risk
  6. It makes sense on paper, assumptions hold, we do what we said we would do, outputs materialise, intended outcomes follow, but longer term outcomes don't materialise - external factor risk
This paper will discuss how a PDL approach can provide a comprehensive risk management framework before the first participant is even enrolled; the risks identified may then be managed and mitigated through program re-design as well as adaptive monitoring and evaluation.


Chairs
Ronelle Hutchinson

Director, PwC Economics & Policy

Presenters
Andrew Hawkins

Partner, ARTD Consultants
Andrew works as a trusted advisor and strategic evaluator for public policy professionals, generating insight and evidence for decision-making. Andrew has worked for a wide range of Australian and NSW public sector agencies and not-for-profits on the design, monitoring and evaluation... Read More →


Tuesday September 17, 2019 12:00pm - 12:30pm AEST
Pyrmont Theatre

12:00pm AEST

Aboriginal Family Planning Circle evaluation: empowering Aboriginal and Torres Strait Islander communities in evaluating and future-proofing Aboriginal-led community programs
Amy Lawton (WESTIR Ltd), Olivia Hamilton (WESTIR Ltd), Cheryl Jackson (Marrin Weejali Aboriginal Corporation)

This presentation outlines an evaluation of the Aboriginal Family Planning Circle program by WESTIR Limited (Western Sydney Regional Information and Research Service), with the primary evaluation undertaken in 2015 and a follow-up evaluation in 2017.

The Aboriginal Family Planning Circle (AFPC) is a community-led program which works with Aboriginal families in Greater Western Sydney to address their complex needs and reduce the risk of their children being assumed into out-of-home care. The program is supported by the Marrin Weejali Aboriginal Corporation and is located in the region with the highest Aboriginal population in NSW. The evaluation was important given the ongoing over-representation of Aboriginal children in out-of-home care and the constant threat of the program losing government funding.

A range of qualitative and quantitative techniques were used during the AFPC evaluation process to capture feedback from the program's clients, service providers and community members. Interviews and focus groups found that the AFPC program was effective in addressing clients' complex needs, building better relationships between clients and government services, and ultimately helping clients retain or resume custody of their children. A return on investment analysis in the follow-up evaluation also highlighted the significant savings and returns the AFPC program generated for the government through its prevention and restoration activities.

The AFPC program is a case study of how culturally responsive evaluations can empower Aboriginal communities to advocate for the continued funding of effective Aboriginal-led programs operating in resource-constrained environments. The presentation also reflects on the challenges evaluators experience when undertaking a culturally responsive Indigenous evaluation, including a lack of funding for evaluations; Aboriginal communities' historical mistrust of government institutions; addressing power imbalances between Aboriginal and non-Aboriginal participants; managing political agendas; and the increasing expectation to demonstrate the economic value of community-based programs.


Chairs
Linda Klein

Deputy Director Research & Evaluation, GP Synergy
Linda Klein, BSc MSc PhD. I have an academic background in psychology and public health, with over 30 years of practical experience in evaluation in sectors spanning health, education/training and business. At GP Synergy, I take primary responsibility for the evaluation of educational... Read More →

Presenters
Amy Lawton

Social Research and Information Officer, WESTIR Ltd
Amy is a social researcher who currently works for WESTIR Ltd (Western Sydney Regional Information and Research Service), which provides social research and data support to the community services sector. She is active in several community-based initiatives, including... Read More →
Cheryl Jackson

Coordinator, Marrin Weejali Aboriginal Corporation
Cheryl Jackson is an Aboriginal woman of the Murawarra tribe. She is the Coordinator of Aboriginal Family Planning Circle (AFPC). The AFPC was an initiative of the Aboriginal Family Workers Support Group. The Aboriginal Family Workers Group consists of Aboriginal workers within Government... Read More →


Tuesday September 17, 2019 12:00pm - 12:30pm AEST
C2.5

12:00pm AEST

How to integrate intercultural considerations in evaluation debate and practice
Rini Mowson (Clear Horizon), Sarah Leslie (Clear Horizon)

The context within which an evaluand exists matters in evaluation. The AES evaluators' professional learning competency framework dedicates an entire domain to "attention to culture, stakeholders and contexts". Oakley, Pratt and Clayton (1998) argued that context should be treated as being at the very heart of social development, and that impact assessment must take full account of the bigger picture in arriving at conclusions about the success or failure of social development programs. Thus, adapting to and managing the evaluation "context" is important for balancing sustainable, impactful evaluation for end users/beneficiaries with satisfying the needs of the program team and/or evaluation commissioner.

This paper will seek to answer two questions: "What are the domains of context that evaluators need to be aware of?" and "How can evaluators adapt their practice to fit the context where they work?".
The presenters will draw on their experiences in evaluation in multicultural contexts through their work in international development.

The presenters propose three domains of context that evaluators should consider before embarking on an evaluation journey. Firstly, studies demonstrate the importance of applying basic principles of evaluation such as participation, community empowerment and communicating evaluation results back to beneficiaries; however, most evaluations are donor-driven exercises. Given this limitation, how can evaluators empower funding recipients to insist on the application of these basic principles? Secondly, how can evaluators address power dynamics in the evaluation process to ensure the evaluation results represent the real outcomes of the program across different types of beneficiaries? Thirdly, the presenters propose that every evaluation should find ways to support capacity building of relevant stakeholders, including beneficiaries and communities.


Chairs
Kathryn Dinh

MEL Consultant/PhD Candidate, UNSW
Kathryn is a monitoring, evaluation and learning consultant with more than 20 years of experience across Asia, the Pacific (including Australia) and the Commonwealth of Independent States. She specialises in evaluation for health and international development programs, advocacy evaluation... Read More →

Presenters
Rini Mowson

Consultant, Clear Horizon
Rini has been working in the international development and Australian community development sectors for more than thirteen years. She has worked with a broad range of social change initiatives and businesses globally. Rini has extensive experience in designing and implementing monitoring... Read More →
Sarah Leslie

Senior Consultant, Clear Horizon
Interested in MEL frameworks, evaluation and portfolio level MEL


Tuesday September 17, 2019 12:00pm - 12:30pm AEST
C2.1

12:05pm AEST

Alternate realities in evaluation: Possibilities for emerging tech in evaluation
Matt Healey (First Person Consulting)

Over the last five years, we have seen the exponential growth of technologies previously seen only in science fiction films. Augmented, mixed and virtual reality are increasingly part of our everyday reality, and this growing accessibility means that evaluators will need to understand what these technologies are and their range of possible uses. By the end of the session, attendees will be more aware of what augmented, mixed and virtual reality platforms are and their possibilities. The presenter will also provide some examples of how such tools might be used by evaluators in the future.

Chairs
Anne Markiewicz

Director, Anne Markiewicz and Associates
Anne Markiewicz is a leading monitoring, evaluation, and training specialist with over 20 years’ experience designing monitoring and evaluation frameworks and developing and implementing a range of evaluation methodologies. She has designed and delivered M&E capacity development... Read More →

Presenters
Matt Healey

Senior Consultant, First Person Consulting
I've been working as an evaluation consultant since early 2014. In that time I've led over 50 projects spanning sustainability, climate change, public health, health promotion, emergency management, energy efficiency, renewable energy, waste management, innovation... There's probably... Read More →


Tuesday September 17, 2019 12:05pm - 12:10pm AEST
C2.4

12:10pm AEST

Collective impact evaluation in primary prevention of violence against women
Louise Falconer (Women's Health West)

Evaluation of the prevention of violence against women is rapidly growing, and collective impact evaluation is an effective methodology to influence policy, advocacy and funding. Preventing Violence Together (PVT), Melbourne's western region's strategy and partnership to prevent violence against women, has developed the PVT Shared Measurement and Evaluation Framework, which is in its first year of implementation and was piloted through the 2018 16 Days of Activism campaign. This framework is pivotal to PVT's vision that women and girls across Melbourne's west live free from violence and discrimination and have equal status, rights, opportunities, representation and respect.

Chairs
Anne Markiewicz

Director, Anne Markiewicz and Associates
Anne Markiewicz is a leading monitoring, evaluation, and training specialist with over 20 years’ experience designing monitoring and evaluation frameworks and developing and implementing a range of evaluation methodologies. She has designed and delivered M&E capacity development... Read More →

Presenters
Louise Falconer

Director - Strategy, Advocacy & Community Engagement, Women's Health West
Louise Falconer is the Director of Strategy, Advocacy and Community Engagement at Women’s Health West. Women’s Health West provides specialist family violence services to women and their children, and also runs primary prevention programs that promote equity and justice for women... Read More →


Tuesday September 17, 2019 12:10pm - 12:15pm AEST
C2.4

12:15pm AEST

Opening Up The Box: making evaluation useful to stakeholders
Hwee Lee Seah (Ministry of Education, Singapore)

Programme evaluation in the Ministry of Education (MOE) in Singapore is guided by the utility of the evaluation process and its findings. This presentation narrates the ongoing journey of the internal evaluators in MOE to ensure that the evaluation conducted is relevant, accessible and useful to the intended users. We will share the customised approaches and strategies adopted in engaging different levels of stakeholders (e.g., management, specialists) at every stage of the evaluation to engender ownership of the evaluation, as well as to promote evaluation literacy and capacity along the way.

Chairs
Anne Markiewicz

Director, Anne Markiewicz and Associates
Anne Markiewicz is a leading monitoring, evaluation, and training specialist with over 20 years’ experience designing monitoring and evaluation frameworks and developing and implementing a range of evaluation methodologies. She has designed and delivered M&E capacity development... Read More →

Presenters
Hwee Lee Seah

Lead Research Specialist, Ministry of Education
I am a Science teacher by training. After teaching in schools for 4 years, I went on to work as an educational researcher at the Headquarters and then ventured into evaluation. As a lead research specialist in evaluation, I conceptualise and conduct evaluation of policies and programmes... Read More →


Tuesday September 17, 2019 12:15pm - 12:20pm AEST
C2.4

12:20pm AEST

Let's focus on the Big M and little e (Me)
Damien Sweeney (Clear Horizon)

Monitoring is commonly defined as the systematic collection of data to inform progress, whereas evaluation is a more periodic 'evaluative' judgement that makes use of monitoring and other information. Continual improvement through monitoring requires an evaluative aspect too, so that implementers can reflect on progress and decide whether to keep going or adjust course. I refer to this regular reflection process as the little 'e', as differentiated from the more episodic assessment of progress, which is the big 'E' (in M&E). Focusing on M&e helps demystify M&E and empowers implementers.


Chairs
Anne Markiewicz

Director, Anne Markiewicz and Associates
Anne Markiewicz is a leading monitoring, evaluation, and training specialist with over 20 years’ experience designing monitoring and evaluation frameworks and developing and implementing a range of evaluation methodologies. She has designed and delivered M&E capacity development... Read More →

Presenters
Damien Sweeney

Principal Consultant, Clear Horizon
Damien Sweeney is a Senior Consultant at Clear Horizon Consulting. Damien is a sustainability generalist and M&E practitioner, bringing together his experiences across numerous sectors, from local government, to seafood industry, and consulting. Damien has worked with leading behaviour... Read More →
Dave Green

Principal Consultant, Clear Horizon
Dave has worked in international development for over 15 years, in public, not-for-profit and private sector roles. To date, his career has focused on Australia’s Aid Program in the Pacific and Southeast Asian regions, providing him with a deep understanding of DFAT approaches to aid delivery... Read More →


Tuesday September 17, 2019 12:20pm - 12:25pm AEST
C2.4

1:00pm AEST

#aes19SYD unconference: evaluation for a better world (continued)
The unconference provides the time and space to discuss what matters to you about the future of evaluation for a better world.

We’re using open space. Developed to “find a way towards meetings that have the energy of a good coffee break combined with the substance of a carefully prepared agenda” (Owen, 2018), it has been used in thousands of gatherings around the world over the past few decades.

If you’ve experienced it before, you understand the possibilities. If you haven’t, be prepared to be surprised.

Come and share what you’re passionate about. All ideas and forms of contribution are welcome – you might bring a topic you want to convene a group on, move from group to group, or take a pause and find yourself in a conversation you didn’t expect to have.


Tuesday September 17, 2019 1:00pm - 3:00pm AEST
C2.2-C2.3

1:30pm AEST

"Catching the MEL bug": Using an evaluation needs assessment to unpack evaluation capacity
Mark Planigale (Lirata Ltd), Kathryn Robb (Djirra)

Moving evaluation out of the box involves empowering organisations to shape and use Monitoring, Evaluation and Learning (MEL) for their own purposes. How can we demystify and reframe MEL so we can support organisations to design and use evaluation effectively?

An evaluation needs assessment can be a vital step in this journey. Through a needs assessment, we can engage stakeholders in identifying strengths, gaps and areas for development in MEL within a team or organisation. A needs assessment also explores how stakeholders value MEL and the types of MEL which will be meaningful and useful for their context. This informs the development of tailored strategies to improve MEL capacity, while also generating understanding and enthusiasm for change.

In this paper we outline a systematic approach to evaluation needs assessment. Building on previous approaches (e.g. Preskill & Torres 1999; Volkov & King 2007; Preskill & Boyle 2008), we present a framework of 11 capacity domains, organised using three lenses: individual capacity, team and organisational capacity, and MEL life cycle. A combination of quantitative and qualitative data collection oriented around these domains helps to generate a nuanced mapping of capacity, an overview of informational needs, and a baseline against which progress can be measured.

How can this approach be applied in practice? We share a case study of an evaluation needs assessment undertaken in partnership between an Aboriginal Community Controlled Organisation and an evaluation consultancy. We reflect together on why it was important to undertake the needs assessment, lessons learned through the organisation's experience of "catching the MEL bug", and the relationships, tools and conversations which have facilitated this journey. We conclude with practical suggestions for adapting and using this framework in other contexts.


Chairs
Carolyn Hooper

Evaluation + Research Consultant, Allen + Clarke
I'm a social scientist, with a preference for qualitative methods. I've been working in evaluation for four years and I love the variety of opportunities I get to make a difference. My doctorate in Public Health means I often work on health-related evaluations, examining interventions... Read More →

Presenters
Mark Planigale

CEO, Lirata Ltd
I am the CEO and evaluation practice lead for Lirata Ltd (www.lirata.com) - an independent not-for-profit organisation based in Melbourne. We work to advance social justice by strengthening the enablers and reducing the barriers to positive social change. We partner widely with service... Read More →


Tuesday September 17, 2019 1:30pm - 2:00pm AEST
C2.6

1:30pm AEST

Making sense of women's leadership through online SenseMaker
Alejandra Garcia Diaz Villamil (Vital Voices Global Partnership)

Stories told by participants are valued as one of the most relevant sources of information for assessing the impact of a program and, ultimately, producing social change. Narratives speak the truth and create opportunities for activism and influence change. Nowadays, the use of technology offers a more efficient way to compile, interpret and analyze stories through the lens of the storyteller. For this reason, Vital Voices Global Partnership, as part of its efforts to bring the voices of women leaders to the forefront and understand their impact, piloted SenseMaker®. SenseMaker is an online narrative-based evaluation methodology that enables leaders to be storytellers and make sense of their experience in order to understand complex change. Some of the questions this evaluation tries to address are: a) how has the perception of women's leadership changed since their participation in Vital Voices; b) what factors, including social and cultural norms, limit women's ability to lead; and c) how has the network helped challenge constraints around women's leadership? The methodology for answering these questions focuses on using participants' stories to uncover attitudes that inform and influence behavior. It draws upon anthropology, complexity theory and neuroscience, allowing for automated collection of large numbers of short stories that, together, create a nuanced picture of a given topic. Using the SenseMaker® approach, VV fellows told their stories and also conducted the primary analysis of their own stories. This helped reduce the potential for interpretive bias and empowered fellows to analyze and reflect on their leadership paths. This presentation offers valuable insights from using SenseMaker® as part of an evaluation focused on the impact of a women's global leadership network in shaping their context and changing perceptions of women's leadership.

Chairs
Dr. Suzanne Evas

Senior Evaluation and Research Officer, Victorian Department of Health and Human Services
Suzanne Evas has been working broadly in the social services sector for over 25 years. She began her professional life as an allied health practitioner and program coordinator, working across clinical and community settings in the US and Canada. A PhD scholarship at the University... Read More →

Presenters
Alejandra Garcia Villamil

Director of MEL, Vital Voices
Alejandra Garcia-Diaz Villamil has over 13 years of research, evaluation, and monitoring experience within Latin America, Africa, and the United States. Her prior work experience focused on maintaining monitoring and evaluation systems for ProMujer, Community Research Institute at... Read More →


Tuesday September 17, 2019 1:30pm - 2:00pm AEST
C2.4

1:30pm AEST

"Fellows Forum Contributions of Theory to Evaluation Practice: Examples from the Field"
John Owen (The University of Melbourne), Rick Cummings (Murdoch University)

Purpose. To identify the different meanings of theory as they have emerged within the evaluation discipline over time and, through this, to show how relevant evaluation theory can enhance the quality of evaluation practice.

Argument. The notion of theory can be a mystery, especially to neophyte evaluators. This is not surprising, as there is no one meaning assigned to evaluation theory; we have identified four such meanings in the literature: (1) Scriven/Fournier's basic theory (judgment of program worth); (2) Bennett/Chen's program theory (logic); (3) Shadish's theories of action of significant evaluators (e.g. Carol Weiss); and (4) 'funded knowledge' - social theories or meta-analyses, now known as best practice reviews, in a field of knowledge (e.g. staff development). Practising evaluators need to be mindfully clear about each of these theory types and distinctions, and to understand how they have influenced and advanced the development of evaluation as a discipline.

Implications for Practice. We believe that the quality of evaluation will be enhanced if evaluators are in a position to consider knowledge and concepts arising from these theories when scoping evaluation work, to incorporate them into their research designs, and to draw on them in disseminating findings to key audiences.

Session Format. A panel of AES Fellows will provide examples of how the use of theory has contributed to a study in which they have been the principal evaluator. There will be opportunities for other Fellows to comment, and for the audience to pose questions and/or contribute examples that illustrate the advantages of incorporating theory into evaluation designs.

Conclusion. The incorporation of theory in an evaluation is a sign that this area of knowledge can be regarded as a discipline, as distinct from a craft. An important message is that aspiring evaluators need to participate in formal or informal training programs which offer opportunities to come to grips with the salience of evaluation theory.


Chairs
Marion Norton

Manager Research and Evaluation, Qld DJAG

Presenters
John Owen

Principal Fellow, Centre for Program Evaluation, University of Melbourne
In keeping with his academic interests in knowledge utilization, John Owen is committed to evaluation strategies that promote change and improvement in policy and program delivery. He is of the view that commissioned evaluations should respond to the knowledge needs of decision makers... Read More →
Rick Cummings

Senior Research Fellow, Murdoch University
I have been conducting evaluations of public programs and policies since the late 1970s. A lot has changed and I am interested in the evolution of evaluation use over this time as well as the increasing demand to evaluate large social policies. Both of these challenge evaluators and... Read More →


Tuesday September 17, 2019 1:30pm - 2:30pm AEST
C2.1

1:30pm AEST

Designing Evaluations for Policy Coherence: The Differentiated Support for School Improvement Case
Janet Clinton (University of Melbourne), Ruth Aston (University of Melbourne), Emily Qing (University of Melbourne), Ghislain Arbour (University of Melbourne), Anne Tonkin (Department of Education and Training), Stephanie Moorhouse (Department of Education and Training)

What if an evaluation of the implementation of a policy could generate information about the relationships between interrelated policies, such that governments could identify how implementers can be supported, through actionable feedback, to deliver targeted and responsive policy implementation?

It is this question (among others) that we are tackling in the evaluation of the Differentiated support for school improvement (DSSI) initiatives, funded by the Victorian Government Department of Education and Training. More than ever, public sector policy evaluations need to incorporate the relationships between policies and identify the cumulative and collective influence of multiple policies which may not necessarily be the 'evaluand'.

This panel presentation will discuss:
  1.   Evaluation design for testing intended policy coherence, including fixed and flexible components for repeated measures over time, with responsive measures that adapt to changing information needs
  2.   Multi-purpose measurement model that supports data aggregation and triangulation across multiple evaluations
  3.   Co-design of the DSSI data portal and facilitating data ownership among the policy implementers
  4.   Functional partnerships between the evaluators and the commissioner
Presenters from the evaluation team (University of Melbourne) and the commissioner (Department of Education and Training) will discuss the methodological and practical considerations for the design of evaluations that involve gathering progressive, large-scale, mixed implementation and impact data to generate regular and tailored feedback. We will discuss how we are embracing disruption by using a personalised online platform to facilitate data collection, provide access to tailored feedback, and enable users to give feedback. Finally, the presenters will also share findings on how engagement in evaluation can become part of policy implementation, through predicting implementation behaviours and impact.

The session moderator will be Dr Ghislain Arbour, who will pose questions to the panel, and facilitate discussion from the audience. Dr Arbour has considerable expertise and evaluation experience in the public sector.


Chairs
Lee-Anne Molony

Director, Clear Horizon
Principles-focused evaluation. Evaluating place-based approaches. Org-level evaluation frameworks.

Presenters
Janet Clinton

Professor, University of Melbourne
Professor Janet Clinton is Director of the Centre for Program Evaluation (‘CPE’) at the Melbourne Graduate School of Education (‘MGSE’). She was previously the senior academic in Program Evaluation and the Academic Director for the School of Population Health at the University... Read More →
Ruth Aston

Postdoctoral Research Fellow, University of Melbourne
Dr Ruth Aston is a Research Fellow at the Centre for Program Evaluation. Ruth has project managed several large-scale evaluations across Australia and internationally. She has a background in public health, including working on a three-year national project investigating the workforce... Read More →
Emily Qing

Research Fellow, Centre for Program Evaluation
Emily is a Research Fellow at the Centre for Program Evaluation, Melbourne Graduate School of Education. She has worked on a range of large-scale evaluations of education programs, and has been engaged in the design of monitoring and evaluation systems for complex education initiatives... Read More →
Stephanie Moorhouse

Senior Project Officer, Department of Education and Training (Vic)
Stephanie Moorhouse is a Senior Project Officer at the Victorian Department of Education and Training. Since 2017, she has worked on the implementation of Differentiated support for school improvement (DSSI); a four-year project that takes a partnering approach to strengthen teaching... Read More →
Ghislain Arbour

Senior Lecturer, The University of Melbourne
Ghislain Arbour is a Senior Lecturer at the Centre for Program Evaluation at the University of Melbourne in Australia. He has a hard time not talking about terminology in evaluation (see what happens if you ask about his dictionary project), the nature of evaluation theory and models... Read More →


Tuesday September 17, 2019 1:30pm - 2:30pm AEST
Pyrmont Theatre

1:30pm AEST

Lessons learned co-designing a program and its evaluation in an emerging policy landscape
Poppy Wise (Urbis Pty Ltd), Malcolm Haddon (Multicultural NSW), Tim Carroll (Bankstown Youth Development Service)

Co-design is a buzzword in evaluation, and for good reason. Genuine engagement with the parties being evaluated supports strong outcomes for evaluations and strengthens relationships with stakeholders.

We will share lessons from the co-design process we adopted for the development and evaluation of the COMPACT Program - a community-based resilience program funded by the NSW Government as part of the Countering Violent Extremism (CVE) package. COMPACT funds 12 locally-based projects, focussed on young people, to "safeguard Australia's peaceful and harmonious way of life".

We will explore our learnings through a joint presentation:
  • The NSW Government agency will share the critical importance and key benefits of co-designing the first-of-its-kind COMPACT program with the communities it intended to influence. This ensured the design was culturally appropriate and strongly supported by the community in a potentially divisive policy landscape like CVE.
  • The consultant evaluator will outline our approach to co-designing the program logic and evaluation framework with the 30+ organisations involved in delivering the COMPACT program. We will reflect on the benefits and trade-offs associated with undertaking genuine co-design.
  • The community organisation will reflect on the experience of being a co-designer, and how this has influenced their own impact measurement practices. They are an arts-based community development organisation based in South Western Sydney, and were one of the 30+ partners selected to deliver the COMPACT program.
  • Finally, together we will summarise what it means to undertake genuine co-design approaches, for clients, evaluators and participants. We will share our learnings regarding the value of adopting a co-design approach when developing and evaluating programs in an emerging policy area like CVE, where there is currently a lack of robust evaluation work, and the definition of CVE is still evolving.

Chairs
Andy Moore

Senior Advisor Performance and Evaluation, NZDF
I have had thirty years of experience in conducting, designing, and evaluating vocational training within NZ Defence. In that time, I have deployed overseas, predominantly in the role of coordinating aid programmes on behalf of NZ Defence. My current role is the Senior Adviser Performance... Read More →

Presenters
Poppy Wise

Director, Urbis
Poppy Wise is a specialist in public policy research and evaluation. She’s worked in research and evaluation for 15 years in the private and not-for-profit sectors, most recently advising on policy and programs in mental health, disability, domestic violence and community safety... Read More →
Malcolm Haddon

Senior Manager, Community Resilience
Zainab Kadhim

Project Coordinator, Bankstown Youth Development Service (BYDS)
Zainab Kadhim is a mixed raced writer, musician, spoken-word poet and community arts worker from South-Western Sydney. Her works address themes of social justice, female empowerment and self-reflection. After completing her Masters degree in Environmental Change Management at the... Read More →


Tuesday September 17, 2019 1:30pm - 2:30pm AEST
C2.5

2:00pm AEST

The retrospective development of a monitoring and evaluation framework for the Northern Territory chronic conditions prevention and management strategy: Unpacking the problems and possibilities
James Smith (Menzies School of Health Research), Kalinda Griffiths (University of New South Wales), Moira Stronach (Northern Territory Department of Health), Liz Kasteel (Northern Territory Department of Health), Jenny Summerville (Northern Territory Primary Health Network), Julie Franzon (Northern Territory Primary Health Network), Michelle Ganzer (Northern Territory Department of Health), and the CCPMS Monitoring & Evaluation Working Group

In 2010, the Northern Territory Government released a ten-year Chronic Conditions Prevention and Management Strategy (CCPMS). This was followed by the release of three separate implementation plans (2010-2012; 2014-2016; 2017-2020) across the CCPMS timeframe. A longer implementation timeframe was adopted to allow for the measurement of longer-term outcomes. The CCPMS and subsequent implementation plans clearly outlined guiding principles, key goals, key action areas, objectives, strategies and indicators/progress measures. In theory, the 'evaluation box was built and neatly wrapped', providing a useful platform to undertake monitoring and evaluation functions, which had been considered from the outset. However, it has recently surfaced that the indicators/progress measures were poorly aligned to the objectives and strategies, and that in some instances data were not available to report against the indicators. Similarly, the indicators included in implementation plans changed across the life of the CCPMS, reflecting changes in policy direction and government priorities. This made it difficult to identify how best to measure the impact and outcomes of the CCPMS. That is, the 'structure of the evaluation box was weak'. In 2018, to address this conundrum, a multi-agency Monitoring and Evaluation Working Group, with independent Co-Chairs, was established to develop a retrospective Northern Territory CCPMS Monitoring and Evaluation Framework. In this presentation, we draw on multiple perspectives from the Working Group to track and discuss the process used to develop the framework. We will explain how we 'unwrapped, deconstructed and reconstructed the box'. We will explain how and why our multi-phased approach included: an indicator mapping process across multiple policy documents (2010-2018); preparing a retrospective logic model; identifying contemporary Indigenous evaluation principles; seeking expert advice on qualitative and quantitative measures; and prioritising indicators based on availability, utility or pre-existing reporting processes. In doing so, we will unpack the problems and possibilities encountered by the Working Group.

Chairs
Carolyn Hooper

Evaluation + Research Consultant, Allen + Clarke
I'm a social scientist, with a preference for qualitative methods. I've been working in evaluation for four years and I love the variety of opportunities I get to make a difference. My doctorate in Public Health means I often work on health-related evaluations, examining interventions... Read More →

Presenters
James Smith

Father Frank Flynn Fellow and Professor of Harm Minimisation, Menzies School of Health Research
James is the Father Frank Flynn Fellow and Professor of Harm Minimisation at Menzies School of Health Research - with much of his work sitting at the health/education nexus. Prior to this role, he was a 2017 Equity Fellow with the National Centre for Student Equity in Higher Education... Read More →
Jenny Summerville

Performance and Quality Manager, NT PHN
Dr Jenny Summerville is the Performance and Quality Manager at Northern Territory PHN. She has more than 20 years' experience conducting research and evaluation in academic and industry settings. Her work has spanned a variety of industry and sector contexts including health, community... Read More →


Tuesday September 17, 2019 2:00pm - 2:30pm AEST
C2.6

2:00pm AEST

Learning from feminist economics to measure what counts to women
Farida Fleming (Assai Consult), Neema Nand (Fiji Women's Fund)

Women’s economic empowerment is currently a key focus of funding for development agencies. Expanding women’s economic opportunities benefits both women and society. For example, research by the UN Foundation and the ExxonMobil Foundation found that the benefits of expanding women’s economic opportunities include greater investment in children, reduced poverty for all, and enhanced aspirations for the next generation of girls and women. However, the danger is that women’s economic empowerment programming, and its related monitoring and evaluation, is based on an interpretation of economics that privileges a male-identified, western, and heterosexual perspective.

The paper takes a feminist economic approach to further develop existing approaches to monitoring and evaluating women’s economic empowerment initiatives. This approach draws on three key ideas. Firstly, it emphasizes the importance of collectives, in contrast to a focus on individual women. Secondly, it problematizes the household unit, taking account of intra-household bargaining and differences in power. Thirdly, it encourages a focus on both women and men in order to see how women’s economic empowerment results in changes in gendered work, especially care activities.

The paper draws from the experience of women’s organisations, collectives and social enterprises in Fiji working to empower women socially and economically.

Chairs
Dr. Suzanne Evas

Senior Evaluation and Research Officer, Victorian Department of Health and Human Services
Suzanne Evas has been working broadly in the social services sector for over 25 years. She began her professional life as an allied health practitioner and program coordinator, working across clinical and community settings in the US and Canada. A PhD scholarship at the University... Read More →

Presenters
Farida Fleming

Principal Evaluator, Assai
I evaluate social justice programs, mostly in the fields of education and women's empowerment. I'm focused on getting people and organisations to better use evaluations to improve their practice. I facilitate learning processes, through evaluation activities, to help people and organisations... Read More →
Menka Goudan

Senior Program Manager, Fiji Womens Fund
I am Menka Goundan. I currently work as the Senior Program Manager at the Fiji Women's Fund. At the Fund, I manage grant administration and capacity support. I previously worked at the Fiji Women's Rights Movement as the Team Leader- Gender and Transitional Justice and prior to the... Read More →


Tuesday September 17, 2019 2:00pm - 2:30pm AEST
C2.4

2:30pm AEST

Communities of Practice, mentoring and evaluation advice: using soft power approaches to build capacity
Florent Gomez (NSW Department of Finance, Services and Innovation)

In the same way that some countries use culture as a soft power approach to extend their influence, evaluation should give serious consideration to soft capacity building tools such as Communities of Practice. This approach can be incredibly effective in diffusing evaluative thinking across organisations that are less familiar with it.

A New South Wales government department which is not a traditional stronghold for evaluation – as compared to human services departments such as education, health or community services – has run a successful Evaluation Community of Practice since November 2017. The Community of Practice brings together staff with varying levels of evaluation maturity to ‘share the love for evaluation’. The intent is to offer a more informal and less intimidating forum for participants to share challenges and learning than a traditional expert-to-learner approach. Over 50 people gather at each quarterly event, where presenters provide case studies, panel discussions and practical exercises such as collectively developing a program logic or crafting good survey questions.

After a year and a half, participants reported an increased understanding of what evaluation is about and of key tools such as program logic, as well as applying those learnings back in their workplaces. The Community of Practice has opened up the conversation on evaluation across the organisation. While a slow and diffuse process, there is now a growing interest in evidence-based approaches, outcome framing and evaluative thinking.

Other soft power approaches used involve staff mentoring and evaluation advice. These have proved particularly powerful in improving the quality of evaluations – and are not necessarily much more resource intensive than formal training. Provided at the initial stage, targeted evaluation advice contributes to getting the evaluation framing right, which generates a better evaluation brief. This, in turn, results in better evaluation outcomes, where the evaluation produces evidence around what the organisation is interested in learning about.

Chairs
Carolyn Hooper

Evaluation + Research Consultant, Allen + Clarke
I'm a social scientist, with a preference for qualitative methods. I've been working in evaluation for four years and I love the variety of opportunities I get to make a difference. My doctorate in Public Health means I often work on health-related evaluations, examining interventions... Read More →

Presenters
Florent Gomez

Manager, Planning and Evaluation, NSW Department of Customer Service
Michelle Bowron

NSW Department of Customer Service
Currently working in the NSW Department of Customer Service and responsible for delivering evaluation activities and increasing evaluation awareness. I have an interest in learning more about evaluation approaches and the value it adds to existing and future business initiatives... Read More →


Tuesday September 17, 2019 2:30pm - 3:00pm AEST
C2.6

2:30pm AEST

Ethics unveiled: Foregrounding who is holding the box in the evaluation of higher education equity programs
Penny Jane Burke (Centre of Excellence for Equity in Higher Education, The University of Newcastle), Matthew Lumb (Centre of Excellence for Equity in Higher Education, The University of Newcastle), Rhyall Gordon (Centre of Excellence for Equity in Higher Education, The University of Newcastle)

Evaluation is a highly contested field, with animated debates about appropriate methods and frameworks, as well as complex methodological dilemmas and considerations. This paper shares our journey in translating an innovative ‘Pedagogical Methodology’ (Burke and Lumb, 2018) into evaluation practice in the context of equity in higher education. By opening up participation in the evaluation process, we sought to draw on the knowledge of participants within a Children’s University outreach program. With a commitment to valuing the knowledge of participants in programs, and to exploring aspects of power in the question of ‘Who should hold the box?’ in terms of evaluation, we engaged Fraser’s (1996) social justice framework of recognition, redistribution and representation. In pursuing this approach, we critically examined discourses relating to what constitutes credible evidence of impact and the ways in which certain discourses can create the conditions for decontextualized and dehumanizing regimes of evaluation.

This paper is situated in the context of Equity and Widening Participation practice in Australia where evaluation commonly involves “measuring the easily measurable” (Harrison, 2018). Our efforts to reframe evaluation sought to examine what was outside the ‘box’ of the easily measurable. We did this by working with participants to foreground what it is they valued and how this understanding can enrich the re/development of university equity initiatives. Aligned to our University’s equity and social justice principles, we shifted emphasis to seeking representation of the perspectives of those whose values have been historically discounted or under-represented, rather than the assessment of what is valuable about equity programs being only a reflection of those in privileged or powerful positions.

Chairs
Marion Norton

Manager Research and Evaluation, Qld DJAG

Presenters
Matthew Lumb

Associate Director, Centre of Excellence for Equity in Higher Education
Matt's interest in evaluation developed through experiences as both a community development professional and classroom teacher. With colleagues at the Centre of Excellence for Equity in Higher Education, he works to foreground the politics of value and knowledge at play in processes... Read More →
Rhyall Gordon

Praxis Officer, The Centre of Excellence for Equity in Higher Education
Rhyall Gordon works in the role of Praxis Officer for the Centre of Excellence for Equity in Higher Education at the University of Newcastle. His role involves developing initiatives to support the exchange between theory and practice in the context of equity projects both with the... Read More →


Tuesday September 17, 2019 2:30pm - 3:00pm AEST
C2.1

2:30pm AEST

From impact evaluation to evaluating with impact: Trialling a new approach to increase uptake of evaluation results
Kathryn Dinh (ZEST Health Strategies), Peta Leemen (The Fred Hollows Foundation)

Too often we see effort put into evaluation fall away once the report is complete, leaving findings that fail to make an impact.

This presentation outlines a collaboration between The Fred Hollows Foundation and ZEST Health Strategies to evaluate a highly regarded program in a way that would maximise the usefulness of findings. From an initial intention to evaluate the impact of the Comprehensive Eye Care Model in Vietnam, the evaluation ultimately used a modified realist evaluation approach to understand which combinations of operating contexts, program activities and people's motivations produced effective outcomes. The aim was to see which learnings from Vietnam could inform future program design across The Foundation.

The realist evaluation approach was new for The Foundation, and at first staff in Vietnam and Australia could not see what it meant or how they could use the results. Introducing the approach involved communicating the rationale and building an understanding of what different groups might value from the process.

Throughout the evaluation, the evaluator and commissioner considered how best to communicate findings to different parts of the organisation, testing ideas with a reference group. A series of succinct and accessible communication products from the evaluation, tailored for different audiences, was developed to address different needs. A useful innovation was developing a series of case studies for program designers and an accompanying PowerPoint presentation for internal use. These are now being used by program designers across The Foundation to inform their work.

In this presentation we share learnings from the NGO's and the evaluation consultant's perspectives. We discuss how ongoing consultation and consideration of the evaluation outputs helped motivate staff to be involved, to see value in a new evaluation approach, and to use the outputs. The presentation contributes to discussion among evaluators on maximising the utility of evaluation results.


Chairs
Lee-Anne Molony

Director, Clear Horizon
Principles-focused evaluation. Evaluating place-based approaches. Org-level evaluation frameworks.

Presenters
Kathryn Dinh

MEL Consultant/PhD Candidate, UNSW
Kathryn is a monitoring, evaluation and learning consultant with more than 20 years of experience across Asia, the Pacific (including Australia) and the Commonwealth of Independent States. She specialises in evaluation for health and international development programs, advocacy evaluation... Read More →
Peta Leeman

Senior Monitoring and Evaluation Adviser, The Fred Hollows Foundation


Tuesday September 17, 2019 2:30pm - 3:00pm AEST
Pyrmont Theatre

2:30pm AEST

From theory to practice in gender evaluation: A systematic review of approaches in international development
Jess MacArthur (ISF-UTS), Naomi Carrard (Institute of Sustainable Futures), Juliet Willetts (Institute of Sustainable Futures)

In the field of evaluation, we are often tasked with untangling the complexity of social change. This is especially the case in assessing changes in gender equality in international development programming – both in programmes that explicitly seek to advance gender equality and in programmes with a wider development agenda. There is an opportunity to link evaluation theory to the breadth of research disciplines that explore the dynamics of gender equality, “unboxing” innovative ways to evaluate gender-related outcomes. These outcomes incorporate changes in position, power and equality for women, men, boys and girls. They can be foreseen, unforeseen, positive, negative, intended or unintended, and can include agency, decision-making, leadership, space, voice, and wellbeing.
 
This presentation will share findings from a systematic literature review of approaches exploring gender equality impacts of international development programming. We analysed approaches with reference to their alignment with research paradigms and their relative focus on different aspects of gender equality. The analysis illustrates how different approaches to gender equality evaluations are able to interrogate different aspects of gender equality. The review also revealed limitations in the breadth of approaches typically applied, with scope to reflect on the value of diversified and intentional approaches leading to transformative change.
 
Although measuring changes in gender equality is a notoriously complex evaluation space, this research highlights the opportunity to strengthen approaches to gender evaluation by bridging theory and practice, and by drawing on a more diverse set of disciplines as well as current approaches to measuring gender equality.

Chairs
avatar for Dr. Suzanne Evas

Dr. Suzanne Evas

Senior Evaluation and Research Officer, Victorian Department of Health and Human Services
Suzanne Evas has been working broadly in the social services sector for over 25 years. She began her professional life as an allied health practitioner and program coordinator, working across clinical and community settings in the US and Canada. A PhD scholarship at the University... Read More →

Presenters
Jess MacArthur

Doctoral Candidate, Institute for Sustainable Futures - University of Technology Sydney
Jess is a doctoral student at the Institute for Sustainable Futures, University of Technology Sydney. Her research focuses on recognising and discovering innovative ways to measure how water and sanitation programming in South and Southeast Asia affects women. Jess specialises in... Read More →


Tuesday September 17, 2019 2:30pm - 3:00pm AEST
C2.4

2:30pm AEST

Achieving successful outcomes through evaluation: A practical example of evidence-based practice for an Indigenous program
Janice Smith (Charles Sturt University), Shaarn Hayward (Charles Sturt University), Suellen Priest (Charles Sturt University), Christine Lindsay (Charles Sturt University)

The Indigenous Academic Success Program at Charles Sturt University offers a suite of academic services to Indigenous students to improve aspiration, retention, and success. The program has supported over 730 students enrolled across Charles Sturt University courses since its inception in 2016 and is largely composed of Indigenous staff, who oversee the planning, evaluation, implementation, reporting, and improvement of the program.

The program is deeply embedded within the Indigenous community, with six of the seven permanent staff currently employed in the program identifying as Indigenous and representing ten different Indigenous nations or language groups. Evaluative practices have been applied throughout program setup and delivery through the use of program logics and quantitative and qualitative methodologies. A key evaluation method is the use of interviews to gather feedback from students who are using, have used, or have been offered access to the service.

This presentation unpacks the program's evaluation process and design, detailing the ways the program's annual evaluation report determines the progress, outcomes, and development of the program in consideration of the student community the program works with. It will also consider how the evaluation of both participants and those who were invited but did not take up the offer of support has been conducted in a way that provides a safe and effective mechanism for Indigenous students to contribute feedback. Feedback from students at all levels of engagement is positioned as central to understanding the program's progress and success, and this presentation will look at how the evaluation data has been used to measure the program's progress in reaching outcomes and to inform its improvements and future direction.


Chairs
Andy Moore

Senior Advisor Performance and Evaluation, NZDF
I have had thirty years of experience in conducting, designing, and evaluating vocational training within NZ Defence. In that time, I have deployed overseas, predominantly in the role of coordinating aid programmes on behalf of NZ Defence. My current role is the Senior Adviser Performance... Read More →

Presenters
Janice Smith

Tutorial Coordinator, Indigenous Academic Success, Charles Sturt University
Janice Smith has been in her current role with Charles Sturt University (CSU) since August 2016. She has previously worked in the Public Sector administering Indigenous Education Programs that focused on improving the educational outcomes for Aboriginal and Torres Strait Islander... Read More →
Kristy Saarenpaa

Coordinator, Program Evaluation and Reporting, Charles Sturt University
Kirsty Saarenpaa has worked in the Higher Education sector for 17 years, predominantly in contract management and compliance, in both the domestic and international sectors at Charles Sturt University and the University of Newcastle. Most recently Kirsty joined Charles Sturt's Division... Read More →


Tuesday September 17, 2019 2:30pm - 3:00pm AEST
C2.5

3:30pm AEST

Plenary 4: Gary VanLandingham "Evaluation in the age of evidence-informed policy-making -- opportunities, challenges and paths forward"
Gary VanLandingham (Askew School of Public Administration and Policy)

Advocates have proclaimed that we are entering the age of evidence-based policy-making, in which the data generated by rigorous program evaluations will be used to inform the tough choices governments must make to solve wicked problems around the globe. Many promising developments are spurring this optimism, including big data approaches that transform our ability to assess program outcomes, the growth in research clearinghouses that are curating and aggregating evaluation findings to identify 'what works', and stronger economic models that enable evaluators to readily calculate the return on investment that programs can generate. However, we also face critical challenges including growing political polarization, limited resources for policy experimentation, and skill gaps in the profession. This presentation will discuss these issues and propose concrete steps that must be taken to achieve the field's long-held goals of 'speaking truth to power' and becoming key advisers to policymakers.

Presenters
avatar for Gary VanLandingham

Gary VanLandingham

Gary VanLandingham currently serves as Professor, MPA Program Director, and the Reuben Askew Senior Practitioner in Residence with the Askew School of Public Administration and Policy at the Florida State University. Previously, he was the founding Director of the Pew-MacArthur Results... Read More →


Tuesday September 17, 2019 3:30pm - 5:00pm AEST
Pyrmont Theatre
  Plenary
 
Wednesday, September 18
 

9:00am AEST

Plenary 5: Jane Davidson "Unboxing the core like our lives depend on it – because they do"
Jane Davidson (Real Evaluation LLC)

What’s one of the most undercooked ingredients in evaluation cuisine globally? Unfortunately for Planet Earth and its lovely inhabitants, it’s the critically important “evaluative” piece, the thing that makes evaluation … well, evaluation. That’s the part where we don’t just say what the results are, but how good they are – and (most importantly) why.

This sounds deceptively simple, I know, but in this address, I will share with you why undercooking the actually evaluative part of our work has far-reaching implications for our profession and those we serve.

Skip the evaluative piece or get it wrong, and what happens? At best, we will be delivering poor value for the evaluation investment. At worst, we will be perpetuating or even exacerbating social injustices.

In 2019, we are in unprecedented times, when misinformation is used as a weapon to cause division and discord. As purveyors of truth and justice, we have a critically important responsibility and an obligation to bring our very best evaluative game to this war on reality. Thankfully, we have more in our repertoire than you might have realized.

Let the unboxing begin!



Presenters
avatar for Jane Davidson

Jane Davidson

Real Evaluation LLC
Dr. Jane Davidson is best known for pioneering the increasingly popular Evaluation Rubrics Methodology, along with her various other refreshingly practical evaluation frameworks and approaches.Originally from Aotearoa New Zealand, Jane is former Associate Director of The Evaluation... Read More →


Wednesday September 18, 2019 9:00am - 10:00am AEST
Pyrmont Theatre
  Plenary

10:30am AEST

Front-end loading: The value of formative evaluation in setting program focus: a case study of the Australian Volunteers Program
Keren Winterford (University of Technology Sydney), Anna Gero (Institute for Sustainable Futures, University of Technology Sydney), Jake Phelan (Australian Volunteers Program)

This paper explores the practice of a formative evaluation for the Australian Volunteers Program and sets out why formative evaluation is valuable for setting program focus and defining approaches to impact evaluation. Reflections from the independent evaluators and the Monitoring, Evaluation and Learning team of the Australian Volunteers Program are provided within this presentation, drawing together multi-stakeholder and practitioner perspectives on the theory and practice of formative evaluation.

The overall objective of the formative evaluation presented in this paper was to map the global footprint of the Australian Volunteers Program in three impact areas in order to (i) establish a baseline; (ii) inform strategic options for strengthening engagement in the impact areas; and (iii) propose a methodology for demonstrating outcomes in the impact areas. The three impact areas of inclusive economic growth; human rights; and climate change/disaster resilience/food security are informed by the Australian Government Volunteers Program Global Program Strategy. Rather than setting out evaluation findings, the paper explores the practice of collaborative evaluation design and the use of mixed methods, including key informant interviews, document review, and quantitative analysis, to prepare working definitions of the impact areas. We explore the practice of drawing on local (country context) and global (Sustainable Development Goals) measures to define impact areas and how we have made sense of these to apply to the Australian Volunteers Program.

The paper distinguishes the theory and practice of formative evaluation and sets out the unique contribution it offers to policy and programming agendas. It discusses the value of evaluation at multiple points in the project cycle and the value of linking formative and summative evaluations, as highlighted within this case. Informed by this case study, the presenters offer tips and tricks for those commissioning and conducting evaluations to ensure formative evaluations make the best possible contribution to policy and programming agendas.


Chairs
avatar for Kiri Parata

Kiri Parata

Whakauae Research for Māori Health and Development
Kia ora I'm Kiri, living on the Sunshine Coast of Queensland. My whakapapa (genealogy) is to Te Atiawa, Ngāti Toa, Ngāti Raukawa, Ngāti Ruanui and Ngāi Tahu in Aotearoa, New Zealand. I am a Māori health researcher and evaluator and I'm committed to tino rangatiratanga where indigenous... Read More →

Presenters
avatar for Keren Winterford

Keren Winterford

Research Director, Institute for Sustainable Futures, University of Technology Sydney
Dr Winterford has 20 years' experience working in the international development sector, in multiple capacities with Managing Contractors and NGOs, as a private consultant, and more recently in development research. She currently provides research and consultancy services for numerous... Read More →
avatar for Farooq Dar

Farooq Dar

Monitoring, Evaluation and Learning Advisor, Australian Volunteers International
Farooq has accumulated 15+ years of experience as an international development practitioner designing and managing complex multi-sectoral humanitarian and development programs/projects, working on governance, compliance and policy issues across various countries around Asia including... Read More →
avatar for Anna Gero

Anna Gero

Research Principal, University of Technology Sydney
Anna Gero is a climate change and disaster risk leader and specialist with over 10 years experience in the Asia-Pacific region. She is an experienced project manager, and has led climate change and disaster resilience consultancies, evaluations and research projects since 2008 across... Read More →


Wednesday September 18, 2019 10:30am - 11:00am AEST
C2.6

10:30am AEST

Giving evaluation data back to the end user: experience from two workplace health initiatives
Jorja Millar (WorkSafe Victoria), Clara Walker (Cancer Council Victoria)

As program participants are increasingly saturated with requests to take part in surveys and other data collection, evaluation data collection must avoid becoming a burden on them. One way to do this is to collect and give back data that meet the needs of program participants.

This paper explores the process of developing and implementing data collection tools whose findings are used not only for the overall evaluation of an initiative, but also for end users' own purposes, including planning, the economics of prevention, and evaluation. These data collection approaches are therefore not passive; through a design-thinking approach, they also influence the intervention.

We will explore this topic via two case studies from Victorian workplace health and wellbeing initiatives. Our initiatives support workplaces to be prevention-led and to create healthier workplace environments that improve employee health and wellbeing. They are therefore multi-layered, with overall state-wide frameworks influencing workplace initiatives to ultimately benefit individual employees.

We will highlight key challenges and lessons learned from the process of developing and implementing multi-purpose data collection tools which support our evaluation objectives and also provide tailored feedback to workplace end users to support their own planning, implementation and evaluation. We will also provide an overview of initial user experience and feedback, and how the tools support ongoing improvement of our health and wellbeing initiatives. This paper will provide practical insights for evaluation practitioners of all backgrounds who are looking to use a client-centric approach and increase engagement with evaluation data collection through creating value for end users.


Chairs
Presenters
avatar for Jason Thompson

Jason Thompson

Dr Jason Thompson holds a PhD in Medicine, a Masters in Clinical Psychology, and a Bachelor of Science with Honours. Dr Thompson’s work is focused on the translation of research into practice across the areas of transportation safety, public health, post-injury rehabilitation, and... Read More →
avatar for Jorja Millar

Jorja Millar

With significant experience in designing and implementing evaluation frameworks and developing evidence used for high-level decision making in process-driven environments, spanning mental health, education, early childhood, planning, community services and development in local, state... Read More →


Wednesday September 18, 2019 10:30am - 11:00am AEST
C2.5

10:30am AEST

Inside, outside, all around: Three perspectives on evaluation capacity building
Stewart Muir (Australian Institute of Family Studies), Jessica Smart (Australian Institute of Family Studies), Emily Mellon (Outcomes Practice Evidence Network (OPEN)), Alisha Heidenreich (Relationships Australia South Australia)

Through evaluation capacity building we seek to grow and nurture the practice of evaluation and transform non-evaluators into evaluators. Understood broadly, evaluation capacity building consists of three components: capacity to "do" evaluation, capacity to "use" evaluation, and a culture that is supportive of evaluation (Stewart, 2014). The community services sector is experiencing a push towards evidence-based practice and outcomes measurement, yet despite increased investment, there are some common gaps in the sector's evaluation practice. This session explores different approaches to solving this problem through evaluation capacity building, drawing on the experiences of three panellists:

  • Stewart Muir, Australian Institute of Family Studies (AIFS). The Families and Children Expert Panel Project operates nationally to build the capacity of service providers to use evidence in practice.
  • Alisha Heidenreich. Relationships Australia South Australia's evaluation team work alongside staff to build their capacity to discover the best - and what must be improved - in their services, programs and practices.
  • Mandy Charman. Outcomes Practice Evidence Network aims to strengthen the evidence base and improve outcomes for Victorian children.

The session will provide a very brief overview of each project and compare and contrast experiences to examine the role of evaluation capacity building in "unboxing" evaluation. Nerida Joss (AIFS) will moderate, posing questions and facilitating conversation, including taking questions from the audience. Discussion will be centred on questions such as:
  • How do we promote the use of evaluation?
  • What are the benefits of being positioned "outside" or "inside" an organisation?
  • How do we build an organisational culture that supports evaluation?
  • What are effective evaluation capacity building strategies?
  • How do you evaluate evaluation capacity building?

Attendees will leave this session having critically engaged with the key challenges and enablers of evaluation capacity building, and with practical examples of capacity building activities and how they work in different contexts.

Chairs
avatar for Kerrie Ikin

Kerrie Ikin

Adjunct Senior Lecturer, UNE Business School, University of New England
Dr Kerrie Ikin FACE. Kerrie began her career as a teacher and executive member in government schools across New South Wales, Australia. For over 25 years she worked at the systems level of the education department as a director of schools, senior policy adviser, and senior administrator... Read More →

Presenters
SM

Stewart Muir

Executive Manager, Family Policy and Research, Australian Institute of Family Studies
Dr Stewart Muir is an anthropologist by training and the Executive Manager of the Family Policy and Research program at the Australian Institute of Family Studies. Stewart has undertaken research and evaluation in Australia and the UK and has a particular interest in family and kinship... Read More →
avatar for Alisha Heidenreich

Alisha Heidenreich

Program Evaluation Officer, Relationships Australia South Australia
Alisha is the Program Evaluation Officer at Relationships Australia South Australia. She has significant experience coordinating and evaluating programs and projects in multiple sectors including, community services, international development, independent consultancy, and higher education... Read More →
avatar for Nerida Joss

Nerida Joss

Senior Manager, Knowledge Translation and Impact, Australian Institute of Family Studies
MC

Mandy Charman

Project Manager, Outcome Practice and Evidence Network, Centre for Excellence in Child and Family Welfare
Dr Mandy Charman is the Project Manager for the Outcome, Performance and Evidence Network (OPEN) in the Centre for Excellence in Child and Family Welfare. OPEN, which represents a sector–government–research collaboration, has been developed to strengthen the evidence base of... Read More →


Wednesday September 18, 2019 10:30am - 11:30am AEST
C2.1

10:30am AEST

The early career evaluator experience: exploring pathways into and up in evaluation
Francesca Demetriou, Eunice Sotelo (Australian Institute for Teaching and School Leadership), Aneta Cram (Katoa Ltd)

In the context of professionalisation, the evolving role of the evaluator, and the varied and changing entry points into the field, what does it mean for early career evaluators entering and planning a career in this space?

In this interactive session, run by and aimed at early career evaluators, exploratory research into the experiences and needs of early career evaluators in Australia will be shared, reflected on and collectively built upon.

This session will begin with a short presentation on the preliminary findings of research exploring the experiences of early career evaluators from a variety of backgrounds and contexts, the commonalities and differences of their experiences, their supports and challenges, and the questions they have about their development.

This will be followed by a world café style session engaging the community of early career evaluators present to reflect on the themes and their own experience, explore their own learnings and challenges, and raise the burning questions that they want answered! Feedback will serve to validate, add nuance to and enrich the learnings from the initial research.

The session will provide the opportunity for early career evaluators to connect with a community of peers and build their support networks as they continue their journey of development, whilst actively contributing to research that will add to the knowledge base around what is working and what is needed to support pathways into and up in the evaluation profession.

Discussion generated will be synthesised and shared with session participants via email following the conference. The discussion will also contribute to the research findings, which we will write up into a cohesive report (final product to be determined) that may help inform the AES capacity building strategy.


Chairs
avatar for Jess MacArthur

Jess MacArthur

Doctoral Candidate, Institute for Sustainable Futures - University of Technology Sydney
Jess is a doctoral student at the Institute for Sustainable Futures, University of Technology Sydney. Her research focuses on recognising and discovering innovative ways to measure how water and sanitation programming in South and Southeast Asia affects women. Jess specialises in... Read More →

Presenters
avatar for Eunice Sotelo

Eunice Sotelo

Project Officer, Australian Institute for Teaching and School Leadership
Interested in evaluation capacity building in the policy influence space. As an embedded evaluator, I'm open to trading stories about what it's like and how you've faced your own challenges. Message me on LinkedIn or tap me on the shoulder at the conference.


Wednesday September 18, 2019 10:30am - 11:30am AEST
C2.3

10:30am AEST

Un-boxed: Developmental evaluation's great strength and ultimate challenge
Samantha Togni (S2 Consulting), Kate McKegg (Knowledge Institute / Kinnect Group), Nan Wehipeihana (Kinnect Group)

Complex social and environmental issues increasingly challenge us to innovate to promote equity and sustainability. Evaluation in these real-world settings is important to support innovation effectiveness; it is also challenging, as conventional evaluation is not a good fit with innovation and complexity. Developmental evaluation (DE) responds to these challenges by integrating evaluation with the innovation, informing development through iterative learning.

DE is agnostic to methods. Rather, it is a relationship-based approach guided by essential principles brought to life in ways and to degrees relevant to the context. In this way it is agile and adaptive to support innovation development and learning in real-time with rigorous evaluation. This is DE's greatest strength; it is 'un-boxed' from the constraints of conventional evaluation design, enabling the evaluation to move with the innovation and embrace emergence and complexity. Simultaneously, this is DE's ultimate challenge: what is DE exactly and how do you do it?

Australian and international Indigenous and non-Indigenous DE evaluator panellists respond to these challenging questions, critically reflecting on our practice. We will explore: how you describe and know you are doing DE when it looks different in different contexts; how you convince commissioners of its value when you cannot know in advance exactly what it will entail and what data will be collected; and how you define rigour in DE. We will examine DE's relationship-based nature and the assertion that the evaluator is the key 'tool' in this approach, exploring the strengths of the DE principles. We will invite the audience to share DE definitions and experiences.

Recognition is growing of DE's value in supporting innovation development in complexity. DE is challenging and re-defining what we mean by evaluation. We need to develop our knowledge base on DE practice to better understand how it works and what it takes.


Chairs
avatar for Ghislain Arbour

Ghislain Arbour

Senior Lecturer, The University of Melbourne
Ghislain Arbour is a Senior Lecturer at the Centre for Program Evaluation at the University of Melbourne in Australia. He has a hard time not talking about terminology in evaluation (see what happens if you ask about his dictionary project), the nature of evaluation theory and models... Read More →

Presenters
avatar for Samantha Togni

Samantha Togni

Evaluation & Social Research Consultant, S2 Consulting
Samantha Togni is an evaluation and social research consultant based in Alice Springs. She has more than 20 years’ experience in Indigenous health and wellbeing research and evaluation, working with rural and remote Aboriginal organisations in northern and central Australia. Her... Read More →
KM

Kate McKegg

Director, The Kinnect Group
Kate has specialist skills in supporting evaluative thinking and practice in complex settings where people are innovating to create systems change. She has been applying these skills for over 25 years in government, non-government, philanthropic and community contexts, including many... Read More →
avatar for Nan Wehipeihana

Nan Wehipeihana

Ms, Kinnect Group
Nan Wehipeihana has more than 20 years' experience designing, leading and managing evaluations. Nan's work is characterised by a commitment to protecting, evidencing and growing the space to be Māori in Aotearoa New Zealand and offering insights into Māori worldviews and values. Internationally... Read More →


Wednesday September 18, 2019 10:30am - 11:30am AEST
Pyrmont Theatre

10:30am AEST

Value for Investment: unboxing a transdisciplinary approach to valuing
Julian King (Julian King & Associates)

Value for money (VFM) is a challenge for evaluators. Today's governments and impact investors seek valid, convincing ways to understand the social, cultural and environmental value of their resource allocation decisions. Evaluation and economics share an interest in determining the quality and value of resource use - but there is a conundrum. On the one hand, too few evaluators are confident in cost analysis and economic methods of evaluation; on the other, too few economists realise that evaluative thinking offers complementary ways to understand value. This presentation shares a transdisciplinary 'Value for Investment' (VFI) approach: a practical, intuitive process that uses evaluative reasoning and mixed methods (qualitative, quantitative, and economic). It offers evaluators new ways to uncover and communicate the value of social investments, supporting resource allocation decisions for a fairer and more sustainable future. Theory, practice and examples will be shared.

Chairs
avatar for Natalie Fisher

Natalie Fisher

Director, NSF Consulting
I am an evaluation consultant with more than 15 years of experience working for clients predominantly in the arts and cultural sectors but also in environmental sustainability and human services. In 2017 I graduated with a Master of Evaluation from the University of Melbourne (First... Read More →

Presenters
avatar for Julian King

Julian King

Director, Julian King and Associates
Julian specialises in evaluation and value for money. He advises, teaches, presents, and writes on these topics globally, with a particular focus on combining evaluative reasoning with economic methods of evaluation. Julian is a member of the Kinnect Group, an Associate of Oxford... Read More →


Wednesday September 18, 2019 10:30am - 11:30am AEST
C2.4

10:30am AEST

Navigating Indigenous evaluation contexts: A time for critical reflection
James Smith (Menzies School of Health Research), Kim Robertson (Charles Darwin University), Donna Stephens (Menzies School of Health Research), Kalinda Griffiths (University of New South Wales)

The need to strengthen evaluation approaches in Indigenous evaluation contexts is well documented at national and global levels. In response, many evaluators have suggested that a greater understanding and use of Indigenist and decolonising evaluation methods is required, preferably with evidence of strong Indigenous leadership and participation. This has paralleled discussions about the importance of Indigenous data sovereignty. A deeper appreciation of the principles underpinning Indigenous evaluation work has also been a focus of recent policy and strategy discussions in Australia, with a notable increase in the development of Indigenous-focused evaluation frameworks as a result. In tandem, strategies to build capacity in Indigenous evaluation (of both Indigenous and non-Indigenous evaluators) have also started to surface through research commentary and evaluation practice.

Within the context of the conference theme, 'unboxed', this is about understanding the complex interplay between values, power, culture and diversity. Indeed, perhaps there is no box at all, and an intricately woven basket is a better metaphor. Nevertheless, there are relatively few forums in which people working in Indigenous evaluation contexts have the opportunity to critically reflect on their practice. This interactive session aims to provide a safe space to openly discuss the challenges and opportunities that both Indigenous and non-Indigenous evaluators face in undertaking Indigenous evaluation work. This includes an opportunity to engage in open dialogue about the anxieties, tensions and celebrations associated with Indigenous evaluation.

The session will be led by three Indigenous facilitators and one non-Indigenous facilitator who have worked collaboratively on various Indigenous evaluation projects at local, state and national levels. Key points of discussion will be documented as a communique for participants. This will be provided to the Cultural Capacity and Diversity Committee of the Australasian Evaluation Society to help inform further areas for development and action in this space.


Chairs
avatar for Ruth McCausland

Ruth McCausland

Senior Research Fellow, School of Social Sciences, UNSW
Dr Ruth McCausland is Director of Research and Evaluation for the Yuwaya Ngarra-li partnership between the Dharriwaa Elders Group and UNSW, and Senior Research Fellow in the School of Social Sciences, UNSW. Her research focuses on women, young people, people with disabilities and... Read More →

Presenters
avatar for James Smith

James Smith

Father Frank Flynn Fellow and Professor of Harm Minimisation, Menzies School of Health Research
James is the Father Frank Flynn Fellow and Professor of Harm Minimisation at Menzies School of Health Research - with much of his work sitting at the health/education nexus. Previous to this role he was a 2017 Equity Fellow with the National Centre for Student Equity in Higher Education... Read More →
avatar for Donna Stephens

Donna Stephens

Research and Project Manager Wellbeing and Preventable Chronic Diseases Division, Menzies School of Health Research
avatar for Kim Robertson

Kim Robertson

Senior Analyst, Indigenous Policies and Programs, Charles Darwin University
Kim Robertson, is Senior Analyst, Indigenous Policies and Programs with the Office of the Pro Vice-Chancellor Indigenous Leadership at Charles Darwin University and was a member of the Steering Group for Professor Smith’s 2017 NCSEHE Equity Fellowship investigating ways of strengthening... Read More →


Wednesday September 18, 2019 10:30am - 11:30am AEST
C2.2

11:00am AEST

Not champions, advocates! Supporting evaluation in non-profit organisations
Alison Rogers (PhD Candidate, Centre for Program Evaluation)

Evaluation is challenging for human service non-profit organisations. Evaluation advocates are attempting to use evaluation to demonstrate change in the lives of their client group and are trying to find ways of embedding evaluation to improve services and be accountable.

I undertook research with 17 advocates who worked in culturally diverse non-profit Australian organisations. The advocates had meaningful, productive, long-term and mutually beneficial working relationships with evaluators.

The advocates displayed positivity, enthusiasm and persistence and influenced others to understand and use evaluation as a tool to achieve the vision of the organisation. Regardless of level on the hierarchy, gender or cultural background, they highly valued evaluation logic, evaluation literacy and positive interpersonal relationships. The advocates built environments where colleagues felt welcome, respected, supported, valued and comfortable to participate by promoting equity, inclusion and individualised consideration.

This presentation shares findings from a study that sought out the perspectives of end-users directly. The conference theme suggests that evaluators need to “draw on the knowledge and practices of those they work with”; this presentation will enable evaluation advocates to learn from external and internal evaluators, and evaluators to learn about effective strategies and approaches from the way the advocates work with their colleagues to promote evaluation.


Chairs
Presenters
avatar for Alison Rogers

Alison Rogers

PhD Candidate, The University of Melbourne
Alison Rogers is a PhD candidate with the Centre for Program Evaluation at The University of Melbourne. She is also the Strategic and Innovation Adviser with the Indigenous Australian Program of The Fred Hollows Foundation based in Darwin, Northern Territory.


Wednesday September 18, 2019 11:00am - 11:30am AEST
C2.5

11:05am AEST

Surprise! No one read your organisation's annual corporate performance report. Now what?
Brooke Edwards (NSW Government)

Recent experience of a trend towards annual corporate performance reports leads me to question why alternative and more compelling performance reporting formats are being overlooked. What’s beyond the box? Or, what’s beyond the dusty corporate reports archive box? Isn’t it time we embraced new methods of sharing and showcasing our performance data?

With the benefit of hindsight, I discuss the downside risks of pursuing a corporate performance report as the cornerstone of your M&E reporting and communication strategy, consider what we actually want to achieve through M&E performance reporting, and present some alternative communication formats to get us really thinking outside the box!

Chairs
avatar for Kiri Parata

Kiri Parata

Whakauae Research for Māori Health and Development
Kia ora I'm Kiri, living on the Sunshine Coast of Queensland. My whakapapa (genealogy) is to Te Atiawa, Ngāti Toa, Ngāti Raukawa, Ngāti Ruanui and Ngāi Tahu in Aotearoa, New Zealand. I am a Māori health researcher and evaluator and I'm committed to tino rangatiratanga where indigenous... Read More →

Presenters
avatar for Brooke Edwards

Brooke Edwards

Evaluation Analyst, NSW Government
New(ish) to the field of evaluation! Started off in monitoring and reporting within a scientific research organisation and now working with the NSW Government completing process evaluations of grant management programs.


Wednesday September 18, 2019 11:05am - 11:10am AEST
C2.6

11:10am AEST

He Whetū Arataki (Guiding Star) youth leadership programme evaluation
Gill Potaka-Osborne (Whakauae Research Services), Teresa Taylor (Whakauae Research Services)

In 2018, Te Rūnanga o Ngāti Hauiti (tribal council) commissioned their research unit to complete an evaluation of their youth leadership programme, which had been running for nine years without change. The programme's purpose, 'to develop youth as leaders' (succession planning), was facilitated by tribal experts and elders who endeavoured to fuse past and present in a way that resonated with youth. The evaluation invited tribal members to reflect and consider what had worked well, the challenges, and how best to move forward. This evaluation models how Indigenous communities can commission and conduct independent evaluations to meet tribal aspirations.


Chairs
avatar for Kiri Parata

Kiri Parata

Whakauae Research for Māori Health and Development
Kia ora I'm Kiri, living on the Sunshine Coast of Queensland. My whakapapa (genealogy) is to Te Atiawa, Ngāti Toa, Ngāti Raukawa, Ngāti Ruanui and Ngāi Tahu in Aotearoa, New Zealand. I am a Māori health researcher and evaluator and I'm committed to tino rangatiratanga where indigenous... Read More →

Presenters
avatar for Gill Potaka-Osborne

Gill Potaka-Osborne

Researcher, Whakauae Research for Māori Health and Development
I am an Indigenous Māori New Zealander and grew up in Whanganui, Aotearoa New Zealand. My tribal affiliations are Te Ātihaunui-ā-Pāpārangi, Ngāti Tuera, Ngāti Pamoana, Ngāti Hauiti and Ngāti Pareraukawa. I joined Whakauae Research Services (Whakauae) in 2005 following stints... Read More →
TT

Teresa Taylor

Kaimahi, T & T Consulting Limited
Indigenous evaluation practice.


Wednesday September 18, 2019 11:10am - 11:15am AEST
C2.6

11:15am AEST

What's beyond the box: Learning from 'tribal' communities and encouraging community ownership of evaluation - a collaborative approach, building on translational research, using an implementation science evaluation framework
Robert Simpson (Mackay Institute of Research and Innovation (MIRI) - Mackay Hospital and Health), Dr Bridget Abell (Australian Centre for Health Services Innovation)

An entertaining and interactive presentation exploring the evaluation of a community-based program that combats the rising population health issues of obesity and diabetes across overweight and obese regional communities - Mackay, Isaac and the Whitsundays.

Evaluation can be part of inspiring communities to make healthier life changes and of combatting major social epidemics. This presentation discusses the evaluation of a collaborative "tribal" approach to behavioural change and how implementation science frameworks can highlight facilitators of and barriers to program sustainability and impact from various stakeholder viewpoints. Key features are innovative translational research, community partnerships/ownership of outcomes, and evaluation of a tribal innovation from beyond traditional perspectives.


Chairs
avatar for Kiri Parata

Kiri Parata

Whakauae Research for Māori Health and Development
Kia ora I'm Kiri, living on the Sunshine Coast of Queensland. My whakapapa (genealogy) is to Te Atiawa, Ngāti Toa, Ngāti Raukawa, Ngāti Ruanui and Ngāi Tahu in Aotearoa, New Zealand. I am a Māori health researcher and evaluator and I'm committed to tino rangatiratanga where indigenous... Read More →

Presenters
avatar for Robert Simpson

Robert Simpson

Project Manager, Mackay Hospital and Health
Robert Simpson BA MBA. Rob is a born and bred Queenslander specialising in enabling alignment of priorities to overcome challenging problems - whether these relate to community, organisational, cultural, or individual behavioural change. An experienced evaluator, previously responsible... Read More →


Wednesday September 18, 2019 11:15am - 11:20am AEST
C2.6

11:20am AEST

Design tips for visualising your data
David Wakelin (ARTD Consultants)

Every day we create, analyse and visualise a lot of data, and we need to share our findings effectively so they can be turned into actions. Small changes in how you visualise your data can make a big difference to whether your audience can understand and use your findings. I will share simple design tips to instil clarity in the visualisations you design, helping your audience to see what you see, know what you know, understand your message and turn evidence into action.

Chairs
avatar for Kiri Parata

Kiri Parata

Whakauae Research for Māori Health and Development
Kia ora I'm Kiri, living on the Sunshine Coast of Queensland. My whakapapa (genealogy) is to Te Atiawa, Ngāti Toa, Ngāti Raukawa, Ngāti Ruanui and Ngāi Tahu in Aotearoa, New Zealand. I am a Māori health researcher and evaluator and I'm committed to tino rangatiratanga where indigenous... Read More →

Presenters
avatar for David Wakelin

David Wakelin

Senior Consultant, ARTD Consultants
I am a keen data analyst with a passion for data visualisation. I've been working on a wide range of projects lately and have seen immense value in being able to tell stories with the data I am working with.


Wednesday September 18, 2019 11:20am - 11:25am AEST
C2.6

11:30am AEST

A Primer on Using Qualitative Comparative Analysis (QCA) in Evaluation
Brad Astbury (ARTD)

Qualitative Comparative Analysis (QCA) is a well-established family of research techniques from the applied social sciences. The QCA approach blends qualitative and quantitative sources to analyse causal patterns across a small to medium number of cases. Early QCA approaches emerged in the 1980s and have developed since then. While the potential of QCA for a range of evaluation applications has recently been recognised, there are few examples that demonstrate the steps involved in applying this technique in evaluation practice.

This presentation reports on the use, benefits and challenges of QCA in the context of a study that sought to identify different pathways of conditions leading to sustainability of demonstration projects. The session will provide advice on case selection, calibration of conditions and outcome(s), minimisation procedures, necessary and sufficient conditions, truth table analysis using fsQCA software, dealing with contradictory configurations and interpretation of results in the context of theoretical and case-specific knowledge.


Chairs
avatar for Natalie Fisher

Natalie Fisher

Director, NSF Consulting
I am an evaluation consultant with more than 15 years of experience working for clients predominantly in the arts and cultural sectors but also in environmental sustainability and human services. In 2017 I graduated with a Master of Evaluation from the University of Melbourne (First... Read More →

Presenters
avatar for Brad Astbury

Brad Astbury

Director, ARTD Consultants
Brad Astbury is a Director at ARTD Consulting, based in the Melbourne office. He has over 18 years’ experience in evaluation and applied social research and considerable expertise in combining diverse forms of evidence to improve both the quality and utility of evaluation. He has... Read More →


Wednesday September 18, 2019 11:30am - 12:00pm AEST
C2.4

11:30am AEST

Increasing policy impact of disability inclusive evaluation by using an inclusive citizenship lens
Karen Fisher (UNSW Sydney), Prof Sally Robinson (Disability and Community Inclusion, Flinders University)

In this paper we examine whether disability inclusive evaluation can demonstrate the values of inclusive citizenship to influence policy change. The purpose is to observe how inclusive evaluation enables a voice in policy and thereby improves the impact of evaluation.

The values of inclusive citizenship are justice, recognition, self-determination, and solidarity (Lister 2007). We apply these values to policy evaluation that uses methods inclusive of people with disability to enact diversity within evaluation.

We apply the question to a controversial evaluation about closing disability institutions in Australia. The evaluation team included people with and without disabilities - academics, a researcher who had lived in an institution and field researchers who had worked with people with complex communication needs - in partnership with a Disabled Persons Organisation and government. It used a human rights framework to analyse data from policy documents, site observations, interviews, secondary data and quality of life measures.

The evaluation team used various participatory strategies to ensure inclusion of people with intellectual disabilities to organise and conduct the evaluation and to apply the results. It had high policy impact by engaging the government in the evaluation process and applying the inclusive methods. Evaluation practice included reflective conversations in the team about benefits and challenges of the participatory methods.

The evaluation found that the inclusive citizenship framework enabled the team to complement the strengths of its various members. The range of inclusive methods was necessary to ensure that people's expertise was appropriately engaged.

Evaluators aiming to achieve impact and improve the lives of people affected by policy must consider and invest in inclusive methods for evaluation utility during the design, conduct and delivery of evaluation, even in technical evaluations such as this case study. The underlying values of inclusive citizenship can inform inclusive evaluation in all aspects of the evaluation process.


Chairs
Presenters
avatar for Karen Fisher

Karen Fisher

Professor, Social Policy Research Centre UNSW
I conduct research and evaluation about disability and mental health policy in Australia and China. I use inclusive methods with people with disability, families and other stakeholders. I am enjoying extending that to film, photos and other accessible methods to ensure the evaluations... Read More →
avatar for Sally Robinson

Sally Robinson

Professor, Disability and Community Inclusion, Flinders University
I’ve worked alongside people with disability my whole working life. In that time, I’ve been lucky enough to have colleagues with intellectual disability who gave me really important direction. Without their guidance, I wouldn’t have taken on my PhD research, which was about... Read More →


Wednesday September 18, 2019 11:30am - 12:00pm AEST
C2.5

11:30am AEST

A fundamental choice: Internal or external evaluation capacity building? Or a bit of both?
Vanessa Hood (Rooftop Social), Liam Downing (NSW Department of Education and Training)

Who is best placed to build the evaluation capacity of an organisation - internal staff members or external consultants? Or a combination? How do you make the decision about what will be best for your organisation? If you've ever contemplated these questions, then this interactive session is for you.

During the session, the facilitators will share the reality of their experiences in internal and external evaluation capacity building (ECB) roles: the pros and cons, the similarities and differences, and the different approaches they've tried (hint: it's not all about running a good workshop!).

Participants will also be encouraged to share their experience around ECB decision making in their context. As a group, we will pull apart how decisions are made in this space.

We will collaboratively develop a 'decision tree' that can be used to support the decision-making process. The facilitators will use creative processes that allow participants to interact with each other and contribute to their ideas.

The outputs of the session, including the draft decision tree, will be sent to participants afterwards. For those who wish to remain engaged, a follow-up Adobe Connect session will be offered to finalise the model, which will then be distributed to the AES Evaluation Capacity Building Special Interest Group (AES ECB SIG).


Chairs
avatar for Kerrie Ikin

Kerrie Ikin

Adjunct Senior Lecturer, UNE Business School, University of New England
Dr Kerrie Ikin FACE. Kerrie began her career as a teacher and executive member in government schools across New South Wales, Australia. For over 25 years she worked at the systems level of the education department as a director of schools, senior policy adviser, and senior administrator... Read More →

Presenters
avatar for Vanessa Hood

Vanessa Hood

Associate Director, Rooftop Social
I've been working as a facilitator and evaluator for over 15 years, in a wide range of contexts, including horticulture, sustainability and financial literacy. I work at Rooftop Social with Duncan Rintoul and a team of associates from around Australia, on evaluation capability building... Read More →
avatar for Liam Downing

Liam Downing

Evaluation Capacity Building Lead, NSW Department of Education


Wednesday September 18, 2019 11:30am - 12:30pm AEST
C2.1

11:30am AEST

Confidence for evaluators: The unspoken skill
Matt Healey (First Person Consulting)

Typically, evaluators are seen and presented as all-knowing experts across a never-ending range of areas: quantitative, qualitative and mixed research methods, engagement approaches, cultural competencies, and reporting tools and platforms. On top of this is the need to understand constant change within and across the social, health and environmental arenas, exponential changes in technology, and the implications of these for evaluation. In many ways, it is impossible for evaluators to know everything - even more so for evaluators at the earlier stages of their career.

During an emerging evaluators panel session at aes18, a key theme emerged: to move out of the intermediate "fuzzy middle" towards becoming 'experts', emerging and early career evaluators need to be both comfortable in uncertainty and confident in themselves, their knowledge and their practice in evaluation. While the AES competency framework emphasises competence in a range of areas, the need to be confident (and to develop confidence) is only implicit across domains, and explicit only in the context of building confidence in others, and in statistical methods!

This session will draw on practices and principles from the presenter's own experience in developing his confidence in presenting, facilitating and dealing with large audiences. Through a mix of lightning talks, light-hearted hands-on activities and reflective small group discussions, attendees will leave with tools and approaches they can implement immediately during and after the conference. Ultimately, this highly interactive skills session will make explicit the unspoken (but crucial) soft skill of confidence.




Chairs
avatar for Jess MacArthur

Jess MacArthur

Doctoral Candidate, Institute for Sustainable Futures - University of Technology Sydney
Jess is a doctoral student at the Institute for Sustainable Futures, University of Technology Sydney. Her research focuses on recognising and discovering innovative ways to measure how water and sanitation programming in South and Southeast Asia affects women. Jess specialises in... Read More →

Presenters
avatar for Matt Healey

Matt Healey

Senior Consultant, First Person Consulting
I've been working as an evaluation consultant since early 2014. In that time I've led over 50 projects spanning sustainability, climate change, public health, health promotion, emergency management, energy efficiency, renewable energy, waste management, innovation... There's probably... Read More →


Wednesday September 18, 2019 11:30am - 12:30pm AEST
C2.3

11:30am AEST

Evaluation: What is the value in the box?
Co-authors and moderators: Laurence Denholm (NSW Department of Premier and Cabinet), Anthea McClintock (NSW Department of Premier and Cabinet)

Panelists: Lyn Alderman (The Evaluators’ Collective), Geoff Gallop AC (Emeritus Professor ANZSOG and Director, Graduate School of Government, University of Sydney, former Premier of Western Australia), Nicholas Gruen (Lateral Economics), William Murphy (NSW Department of Customer Service), Simon Smith (Nous Group), Jonathan Wheaton (NSW Department of Planning, Industry and Environment)

Note: Audience participation (questions to panel and audience polling) by mobile phone, using sli.do. Please download the sli.do app from your app store or use the sli.do website.

Despite steady expansion in the scope and intensity of evaluation practice in government agencies throughout the OECD, criticism of evaluation continues amongst policy professionals and public administrators on the basis of cost and lack of timeliness in results. Emerging policy implementation practices such as "deliverology", which address alleged shortcomings in evaluation practice, now compete for the attention of policy and program managers, some of whom see evaluation as nothing more than a mandated compliance activity.

Although evidence from evaluation can contribute to accountability and communications in established programs, its paramount value lies in the application of evaluation results for long-term improvement of future policies and programs through evidence-based decision support. Improving the decision support value of evaluation will, however, require better tailoring of evaluation outputs to decision-makers' needs, addressing tensions between the timeliness, objectivity and cost of results, and broadening the application of evaluation results through informed inference.

Guided by an expert panel, this session will pursue a participatory approach to consensus on factors, real or perceived, that significantly constrain the adoption and hence the full potential of evaluation as a decision support tool. Participants and panel will propose and explore opportunities to mitigate those constraints that are real and to rebut those that are not. It is expected that the ground between evaluation and "deliverology" will be especially fertile.

Participation will provide practising evaluators and those commissioning evaluation services with an opportunity to contribute to the ongoing debate about the value of evaluation as a policy decision support tool. Importantly, it will equip those interested in advocating for evaluation with a toolkit to rebut incorrect criticism and, as far as practicable, to respond effectively to valid criticism from those who are often the end-users of evaluation outputs. The session will guide more effective and more valuable application, design and reporting of evaluation.

Chairs
avatar for Ghislain Arbour

Ghislain Arbour

Senior Lecturer, The University of Melbourne
Ghislain Arbour is a Senior Lecturer at the Centre for Program Evaluation at the University of Melbourne in Australia.He has a hard time not talking about terminology in evaluation (see what happens if you ask about his dictionary project), the nature of evaluation theory and models... Read More →

Presenters
avatar for Laurence Denholm

Laurence Denholm

Principal Policy Analyst, Program Evaluation Unit, NSW Department of Premier and Cabinet
Laurence Denholm BVSc(Hons) LLB(Hons) DipAgSc GradDipLegPrac PhD(Cornell) GAICD. https://au.linkedin.com/in/dr-laurence-denholm-gaicd-79217129 Dr Denholm has worked in public sector agencies since 1969, in Victoria, NSW, the Commonwealth and the United States. Following a career in... Read More →
avatar for Anthea McClintock

Anthea McClintock

Senior Manager Evaluation, NSW DPC
I lead the Program Evaluation Unit of Department of Premier and Cabinet. Our team is currently evaluating regional infrastructure programs funded by the NSW State Government. Prior to working with DPC, I worked for the Department of Primary Industries, ABARE and the Industry Commission... Read More →
avatar for Lyn Alderman

Lyn Alderman

Principal, The Evaluators' Collective
Lyn is passionate about evaluation as a discipline with deep work experience and research into strategy, evaluation theory and practice and the effectiveness of decision-making tools.
avatar for Geoff Gallop

Geoff Gallop

Emeritus Professor, University of Sydney
Premier of Western Australia 2001-2006. Director of the Graduate School of Government at Sydney University 2006-2015
avatar for Nicholas Gruen

Nicholas Gruen

CEO, Lateral Economics
Nicholas Gruen is a widely published policy economist, entrepreneur and commentator on our economy, society and innovation. He is CEO of Lateral Economics, Visiting Professor at Kings College London Policy Institute and Adjunct Professor at UTS Business School, Chair of Peach Financial... Read More →
avatar for William Murphy

William Murphy

A/Deputy Secretary - Customer, Delivery and Transformation, NSW Department of Customer Service
William Murphy’s role in the newly formed Department of Customer Service is to create a powerful, evidence-driven platform to drive transformation in the design, funding and delivery of government services. He brings together the Customer Service Commission, the Behavioural Insights... Read More →
avatar for Simon Smith

Simon Smith

Principal, Nous Group
Simon is a principal with Nous Group, an award winning Australian management consulting firm. Prior to consulting, Simon had 24 years’ experience in the NSW public sector, including as CEO and Secretary of the NSW Department of Industry, Chief Executive of the NSW Department of... Read More →
avatar for Jonathan Wheaton

Jonathan Wheaton

Director, Economic Programs, NSW Government - Department of Planning, Industry and Environment
Jonathan has 15 years experience working across social and economic programs in the NSW Government, specialising in regional economic development. He is currently responsible for administering economic and community infrastructure funding programs totalling over $1.7 billion under... Read More →


Wednesday September 18, 2019 11:30am - 12:30pm AEST
Pyrmont Theatre

11:30am AEST

Better Evaluation: Aboriginal and Torres Strait Islander Evaluation Project
Donna Stephens (Menzies School of Health Research), Sharon Babyack (Indigenous Community Volunteers), Belinda Gibb (Indigenous Community Volunteers), Debbie Hoger (Murawin Consulting), Carol Vale (Murawin Consulting), Kate Kelleher (Kate Kelleher Consulting), Greet Peersman (BetterEvaluation Project)

Better Evaluation is a global public good collaboration to improve how evaluation is planned, managed, conducted and used. Its website (betterevaluation.org) shares information on evaluation methods and processes, approaches and thematic pages, events and resources.

A new Better Evaluation project draws together evaluators, researchers and community development practitioners from Aboriginal and Torres Strait Islander organisations and academia to promote evaluation that builds knowledge and understanding of the heterogeneous nature of Aboriginal and Torres Strait Islander communities. The aim is to produce an ethical framework that draws on key principles and understandings in contemporary Aboriginal and Torres Strait Islander evaluation, and to use this framework to identify and highlight examples of evaluation practice that are rigorous, culturally appropriate and endorsed by community. In working with communities, the project team also had to negotiate its own shared ethical code while members worked from their individual, community and organisational standpoints. This navigation is indicative of much broader national and international conversations about data sovereignty and what constitutes the ethical principles of effective evaluation in these specific contexts. Most important is the translation of Aboriginal and Torres Strait Islander knowledge of effective evaluation, knowledge that translates into and transforms action, into language and constructs that can be understood by the broader community.

Successful evaluation does not exclude non-Indigenous researchers and evaluators; yet it requires parameters deeply embedded in an Aboriginal and Torres Strait Islander focused evaluation culture that are not yet solidified in current evaluation practice in these settings. This project has sought to unbox community voices by sharing only community-endorsed evaluation examples on the website, providing a unique platform for the voices of both the Indigenous evaluator and the Indigenous participant in the evaluation to be heard and privileged.


Chairs
avatar for Ruth McCausland

Ruth McCausland

Senior Research Fellow, School of Social Sciences, UNSW
Dr Ruth McCausland is Director of Research and Evaluation for the Yuwaya Ngarra-li partnership between the Dharriwaa Elders Group and UNSW, and Senior Research Fellow in the School of Social Sciences, UNSW. Her research focuses on women, young people, people with disabilities and... Read More →

Presenters
avatar for Donna Stephens

Donna Stephens

Research and Project Manager Wellbeing and Preventable Chronic Diseases Division, Menzies School of Health Research
avatar for Sharon Babyack

Sharon Babyack

General Manager Impact & Strategy, ICV - Indigenous Community Volunteers
While at ICV, I've delivered the Monitoring, Evaluation and Learning Review project, co-designed the M&E database and framework and developed and run the consultation and M&E training processes with our regional teams. I'm currently co-leading our team as we undertake participatory... Read More →
avatar for Belinda Gibb

Belinda Gibb

Belinda is a proud Dharug woman, the traditional owner group from Western Sydney Australia. She has over 20 years’ experience in education, policy and program delivery, in both government and the not for profit sector, including a senior manager role at the Healing Foundation, and... Read More →
avatar for Greet Peersman

Greet Peersman

Deputy Director, BetterEvaluation, BetterEvaluation/ANZSOG
My areas of expertise are:•Assessing and strengthening integrated and impact-focused monitoring and evaluation (M&E) systems for government and NGOs•M&E capacity strengthening at individual and organizational levels•Identifying evaluation priorities and selecting situationally-appropriate... Read More →


Wednesday September 18, 2019 11:30am - 12:30pm AEST
C2.2

11:35am AEST

The whole box and dice: economic evaluation trends and forecasts
Mark Galvin (EY)

Recent government moves towards outcomes budgeting are the latest illustration that outcomes thinking is here to stay. The use of outcomes evaluation coupled with economic evaluation is increasing, and the two are increasingly interdependent, especially in the social policy and services space. With such anticipation, the risk of an empty box looms large. Demonstrating and valuing outcomes requires intentional, fit-for-purpose measurement approaches. Sharing these approaches is critical to further innovation and to support robust public decision-making.

This Ignite presentation will showcase changes in the policy landscape, as well as visual depictions of evaluation methodologies that situate 'traditional' social outcomes as benefits and how significant economic value is derived through effective services delivery and cost avoidance.


Chairs

Kiri Parata

Whakauae Research for Māori Health and Development
Kia ora I'm Kiri, living on the Sunshine Coast of Queensland. My whakapapa (genealogy) is to Te Atiawa, Ngāti Toa, Ngāti Raukawa, Ngāti Ruanui and Ngāi Tahu in Aotearoa, New Zealand. I am a Māori health researcher and evaluator and I'm committed to tino rangatiratanga where indigenous... Read More →

Presenters

Mark Galvin

Partner, EY
Mark is a Partner in EY’s Government and Public Sector practice and leads the firm’s Evaluation Practice Network. Mark is an economist and evaluator with over 15 years of experience as a professional advisory consultant. He is passionate about the use of traditional economic and... Read More →

Alain Nader

Senior Manager, EY
Over the past ten years I have delivered strategic advice and implementation support to a number of government agencies, both State and Federal. Areas of particular interest include examining the roles and responsibilities of government, improving citizen outcomes and the allocative... Read More →


Wednesday September 18, 2019 11:35am - 11:40am AEST
C2.6

11:40am AEST

Using e-diaries to collect evaluation data
Carolyn Hooper (Allen and Clarke Policy and Regulatory Specialists)

During an intervention evaluation, front-line service delivery staff made periodic diary entries using an online portal. Diarists responded to prompts specific to the evaluation questions. The output provided valuable insights into the day-to-day realities of those delivering the intervention, resulting in front-line staff having a strong voice in the evaluation report. The e-diary is an accessible, innovative method for collecting data, suited to situations where a detailed view of the work at the intervention delivery interface is valuable but direct observation by an evaluator is problematic. Come and see how we did it.
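How such a portal is built is not described in the abstract; purely as an illustrative sketch (the class, field names and prompt IDs below are hypothetical, not the evaluation's actual design), prompted entries could be captured and collated against each evaluation question like this:

```python
from collections import defaultdict
from dataclasses import dataclass
from datetime import date

@dataclass
class DiaryEntry:
    """One prompted e-diary entry from a front-line staff member."""
    diarist_id: str   # anonymised identifier for the diarist
    entry_date: date
    prompt_id: str    # ties the entry to a specific evaluation question
    response: str     # free-text account of day-to-day delivery

def collate_by_prompt(entries: list[DiaryEntry]) -> dict[str, list[str]]:
    """Group free-text responses under each evaluation question,
    ready for thematic analysis by the evaluator."""
    grouped: dict[str, list[str]] = defaultdict(list)
    for entry in entries:
        grouped[entry.prompt_id].append(entry.response)
    return dict(grouped)

entries = [
    DiaryEntry("staff-01", date(2019, 5, 2), "EQ1-barriers",
               "Transport problems stopped two clients attending today."),
    DiaryEntry("staff-02", date(2019, 5, 3), "EQ1-barriers",
               "Clinic ran late; families waited over an hour."),
]
print(collate_by_prompt(entries))
```

Grouping by prompt rather than by diarist is what lets the front-line voice surface against each evaluation question.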

Chairs

Kiri Parata

Whakauae Research for Māori Health and Development
Kia ora I'm Kiri, living on the Sunshine Coast of Queensland. My whakapapa (genealogy) is to Te Atiawa, Ngāti Toa, Ngāti Raukawa, Ngāti Ruanui and Ngāi Tahu in Aotearoa, New Zealand. I am a Māori health researcher and evaluator and I'm committed to tino rangatiratanga where indigenous... Read More →

Presenters

Carolyn Hooper

Evaluation + Research Consultant, Allen + Clarke
I'm a social scientist, with a preference for qualitative methods. I've been working in evaluation for four years and I love the variety of opportunities I get to make a difference. My doctorate in Public Health means I often work on health-related evaluations, examining interventions... Read More →


Wednesday September 18, 2019 11:40am - 11:45am AEST
C2.6

11:45am AEST

Lessons from the Dark Side: How Corporates do Client Experience
Emily Verstege (ARTD Consultants)

I've been in the corporate wilderness for the last four years, working with for-profit organisations to gather evidence to understand their clients better. I quickly realised that corporations know things about their clients that we, as governments or non-profits, don't. This Ignite presentation un-boxes client experience for evaluators, with anecdotes from the "dark side".

Chairs

Kiri Parata

Whakauae Research for Māori Health and Development
Kia ora I'm Kiri, living on the Sunshine Coast of Queensland. My whakapapa (genealogy) is to Te Atiawa, Ngāti Toa, Ngāti Raukawa, Ngāti Ruanui and Ngāi Tahu in Aotearoa, New Zealand. I am a Māori health researcher and evaluator and I'm committed to tino rangatiratanga where indigenous... Read More →

Presenters

Emily Verstege

Senior Manager, ARTD Consultants


Wednesday September 18, 2019 11:45am - 11:50am AEST
C2.6

12:00pm AEST

Part 2 Take your idea and make it speak: Publishing your evaluation stories
Have you got a conference presentation you want to share with other audiences through other platforms? Has the conference sparked an idea for an article or a blog?

In this session, the editors of the EJA and AES blog are providing a space for you to take your ideas one step closer to publication. The session will be shaped by the interests of participants, but could include a focus on framing your ideas within a workable flow for publication; different types of papers; repurposing content across platforms; and how you can use the peer review process as a tool to support your writing. Who knows, you may even find a writing buddy in the room!

You do not need to have attended the Finding your voice session on Monday to attend this session.

Chairs
Presenters

Liz Gould

Associate Director, NSW Department of Premier and Cabinet
I've been evaluating and using evaluation methodologies to manage and conduct public health, community, and social services evaluation and review projects for commonwealth and state government for the better part of a decade. I am an Editor of the Evaluation Journal of Australasia... Read More →

Carol Quadrelli

Consultant, University of Queensland
I have over 25 years of experience in the higher education sector with additional time served in local and state government roles. I have worn, and continue to wear many hats! Academic roles include: qualitative research, unit coordinator /lecturer /tutor in Law (Criminology... Read More →

Bronwyn Rossingh

Chief Financial Officer, Tiwi Island Training and Employment
Bronwyn is passionate about supporting the vision of Aboriginal communities and organisations. She has worked extensively in remote Aboriginal Communities in the NT and WA in the areas of financial management, governance, community engagement, enterprise development, financial capability... Read More →

Jade Maloney

Partner & Managing Director, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research... Read More →

Eunice Sotelo

Project Officer, Australian Institute for Teaching and School Leadership
Interested in evaluation capacity building in the policy influence space. As an embedded evaluator, I'm open to trading stories about what it's like and how you've faced your own challenges. Message me on LinkedIn or tap me on the shoulder at the conference.


Wednesday September 18, 2019 12:00pm - 12:30pm AEST
C2.5

12:00pm AEST

Evaluating system change: Exploring how project innovations transform business as usual
Adrian Field (Dovetail), Julian King (Julian King and Associates), Kate McKegg (The Knowledge Institute)

How do project innovations create changes in wider organisational systems and practice? This short paper will discuss our learning from evaluating three dynamic road safety projects working within an innovation umbrella programme.

This session will highlight the challenges and opportunities in taking innovation to scale, reflecting on our learning from theoretical approaches outside evaluation that offer compelling new windows for evaluators' understanding of impact and change. Grounded in the real-world application of three innovative road safety projects, the paper will present the inter-weaving of socio-technical systems theory, developmental evaluation, rubrics, and learning from the innovation literature.

These approaches, applied practically through rubrics and multiple data collection methods, were used to explore the extent to which the projects fostered innovation that translated into sustained business operations.

This paper will provide useful ideas and reflections for participants, including how collaboratively developed evaluation rubrics were used, through a developmental process of engagement and reflection, to define and assess the levels and dimensions of system change against which each project could reflect.
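The session's actual rubrics are not reproduced in the abstract; the sketch below is a generic illustration only (dimensions, levels and descriptors are all invented) of how a collaboratively developed rubric can hold agreed level descriptors for each dimension of system change, against which a project is then rated:

```python
# Invented rubric: the dimensions and level descriptors are illustrative,
# not those developed for the road safety projects.
RUBRIC = {
    "organisational practice": {
        1: "innovation used only within the pilot team",
        2: "practices adopted by adjacent teams",
        3: "practices embedded in business-as-usual operations",
    },
    "policy influence": {
        1: "findings shared informally",
        2: "findings cited in internal strategy documents",
        3: "findings reflected in formal policy or standards",
    },
}

def report(project_ratings: dict[str, int]) -> None:
    """Print the agreed descriptor for the level each dimension reached."""
    for dimension, level in project_ratings.items():
        print(f"{dimension}: level {level} - {RUBRIC[dimension][level]}")

report({"organisational practice": 2, "policy influence": 1})
```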

We will also reflect on the contribution that collaborative partnerships, communities of practice, people-centred approaches, and reframing risk offer to evaluation practitioners as avenues for exploring the translation of innovation to system change. Socio-technical systems theory will be provided as a lens for understanding the potential for local or niche innovations to lever changes in wider systems.

The session will conclude with an exploration of the role of evaluation in capturing and catalysing innovation.

Chairs

Natalie Fisher

Director, NSF Consulting
I am an evaluation consultant with more than 15 years of experience working for clients predominantly in the arts and cultural sectors but also in environmental sustainability and human services. In 2017 I graduated with a Master of Evaluation from the University of Melbourne (First... Read More →

Presenters

Julian King

Director, Julian King and Associates
Julian specialises in evaluation and value for money. He advises, teaches, presents, and writes on these topics globally, with a particular focus on combining evaluative reasoning with economic methods of evaluation. Julian is a member of the Kinnect Group, an Associate of Oxford... Read More →

Adrian Field

Director, Dovetail Consulting Ltd
Adrian is the director of Dovetail, an Auckland-based evaluation consultancy, and a member of the Kinnect Group. Adrian has worked in evaluation in different capacities for some 20 years and doesn't really like how old that makes him feel. Adrian's experience traverses health, social... Read More →


Wednesday September 18, 2019 12:00pm - 12:30pm AEST
C2.4

1:30pm AEST

MEL in fragile and conflict-affected settings: Remote monitoring of the aid program in Afghanistan
Ulla Keech-Marx (DFAT)

How do you monitor and evaluate a large aid program in an active conflict zone? How do you effectively verify data from development projects and monitor for unintended consequences when the security situation presents significant risks to those on the ground?

The Afghanistan M&E Lab was set up in late 2017 with funding from DFAT's InnovationXchange. Its purpose is to explore creative ways to monitor and manage the Australian aid program to Afghanistan from afar. The findings from the Lab underpin our approach to MEL for the Australian aid program in Afghanistan, and have potential for application in other remote, conflict-affected or otherwise inaccessible settings.

The Lab encompasses a number of sub-projects investigating different potential remote monitoring options. This includes trialling the use of sentinel indicators to monitor change at the system level, developing monitoring and learning techniques drawing on tools for iterative adaptive programming, and investigating whether big data can be used to develop proxy indicators for verification purposes. Can banking big data be used to develop proxies for women's economic empowerment? And can changes in women's use of mobile phones tell us anything about changes in their mobility or status?
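The Lab's actual proxies are not detailed here; as a toy illustration of the proxy-indicator idea only (the records, fields and the tower-count heuristic are assumptions of this sketch, not DFAT's method), a crude mobility proxy could be derived from mobile phone metadata:

```python
# Hypothetical call records; in practice these would be large,
# anonymised datasets obtained under strict data-sharing agreements.
call_records = [
    {"subscriber": "A", "tower_id": "T1"},
    {"subscriber": "A", "tower_id": "T3"},
    {"subscriber": "B", "tower_id": "T1"},
]

def mobility_proxy(records):
    """Count the distinct cell towers each subscriber connects to;
    more towers is read (very roughly) as greater geographic mobility."""
    towers = {}
    for record in records:
        towers.setdefault(record["subscriber"], set()).add(record["tower_id"])
    return {subscriber: len(seen) for subscriber, seen in towers.items()}

print(mobility_proxy(call_records))  # {'A': 2, 'B': 1}
```

A proxy like this is only ever suggestive and would need triangulation against other data sources.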

In this session we will share our learnings to date on meeting the MEL challenge in Afghanistan.

Chairs

Peter Ellis

Director, Nous Group
Professionally I'm both an evaluator and a statistician, with a particular interest in using evidence to improve public sector outcomes. While I'm now in consultancy, I've previously run evaluation functions, including as Director Program Evaluation for AusAID, and Manager Tourism Research... Read More →

Presenters

Ulla Keech-Marx

Principal Consultant, Research Monitoring and Evaluation Practice, Coffey International Development
Ulla Keech-Marx is an experienced international development professional with specialist expertise in MEL, gender equality and governance. She holds post-graduate qualifications in Evaluation and Asian Studies (Indonesia). She has a particular interest in strategic MEL, politically-astute... Read More →

Sarah Ransom

Assistant Director Afghanistan Development, DFAT
Sarah is a development professional with a career focus on governance, civil society, gender, M&E, and all their intersections. She worked in and on South East Asia for over ten years (including postings in Vietnam and Laos) and now works on gender, politics and peace in the Australian... Read More →


Wednesday September 18, 2019 1:30pm - 2:00pm AEST
C2.4

1:30pm AEST

Participatory Action Research - An approach for evaluators to discover and celebrate community strengths
Sharon Babyack (Indigenous Community Volunteers), Belinda Gibb (Indigenous Community Volunteers)

Building from community strengths, recognising and celebrating culture, community ownership, and collaborative design and delivery are paramount for programming and evaluation in this Aboriginal and Torres Strait Islander organisation. Sharing benefits and reciprocal respect are important for any evaluator seeking to work with Aboriginal and Torres Strait Islander people and communities.

Participatory Action Research (PAR) provides a valuable option for embedding monitoring and evaluation into practical activities requested by communities.

In 2018 the organisation designed and launched a two-year PAR project. It took this approach to maximise the benefits of the research for the fourteen communities that agreed to participate. The project tests the organisation's Story of Change - a theory that cements the patterns of the steps many communities have taken towards holistic wellbeing. Improving governance has proven to be a key step to achieving longer-term community aspirations.

The PAR project asks, 'How does the organisation's approach strengthen understanding and implementation of governance to empower communities to achieve their dream?' The approach is flexible and multi-disciplinary and includes observation, co-design and delivery of activities, participatory monitoring and evaluation, co-authoring case studies with each community, and semi-structured interviews using a purpose-built participatory tool. The organisation has received ethical approval for the project from the Australian Institute of Aboriginal and Torres Strait Islander Studies (AIATSIS).

Participatory Action Research is a useful form of inquiry as it is close to the ground, values the contribution of those with the lived experience and facilitates shared learning. As a flexible, multi-disciplinary approach PAR can also accommodate the co-design and delivery of the activities.

Importantly, feedback loops are built into this dynamic and cyclical approach to evaluation. This accommodates shared learning and the immediate adaptation of activities and solutions for improved outcomes. It mobilises evidence. This makes it meaningful for the people involved.


Chairs

Mia Bromley

Manager, Paxton Partners
I am an experienced health and social service sector consultant and non-executive director with expertise across health, education and justice systems. I have a strong focus on emerging funding models, system redesign, and outcomes measurement. I enjoy partnerships and collaborations... Read More →

Presenters

Sharon Babyack

General Manager Impact & Strategy, ICV - Indigenous Community Volunteers
While at ICV, I've delivered the Monitoring, Evaluation and Learning Review project, co-designed the M&E database and framework and developed and run the consultation and M&E training processes with our regional teams. I'm currently co-leading our team as we undertake participatory... Read More →

Belinda Gibb

Belinda is a proud Dharug woman, the traditional owner group from Western Sydney Australia. She has over 20 years’ experience in education, policy and program delivery, in both government and the not for profit sector, including a senior manager role at the Healing Foundation, and... Read More →

Doyen Radcliffe

Regional Manager, Indigenous Community Volunteers
Doyen Radcliffe is a Yamatji Naaguja Wajarri man from the Midwest Region of Western Australia. Doyen  is a community minded individual with a passion for empowering  Indigenous communities to reach their real potential to improve  quality of life, health, social and economic wellbeing... Read More →


Wednesday September 18, 2019 1:30pm - 2:15pm AEST
C2.5

1:30pm AEST

Peer Assessment as a step toward professionalisation
Sue Leahy (ARTD Consultants), Helen Simons FAcSS FRSA (University of Southampton), Delyth Lloyd (Department of Health and Human Services )

Evaluators around the world are seeking to identify the unique set of skills needed to practise evaluation successfully and to professionalise our work, through systematic approaches to training and, in some cases, credentialing schemes. In Australia, we have been supported to develop as evaluation practitioners through conferences, training programs and resources, such as the Australian Evaluation Society (AES) competency framework and guidelines for ethical practice. But in other respects, our journey toward professionalisation is in its infancy, and there is an appetite for more structured pathways to support people's professional journeys into and within the evaluation sector. In some countries, evaluators have been trialling self and peer assessment schemes to help structure learning and offer professional development support. This paper showcases the experience of the United Kingdom Evaluation Society (UKES) in piloting its Voluntary Evaluator Peer Review System (VEPR). The session involves a videoconference link with the convenor of the UKES Professionalisation subgroup and a facilitated question and answer session that will allow participants to explore the implementation of the UK peer assessment process, which links strongly to the Society's capabilities framework and informs future training. Reflections and learnings from the session will be provided to the AES to inform the possible development of a peer-assessment scheme in Australia.

Chairs

Lee-Anne Molony

Director, Clear Horizon
Principles-focused evaluation
Evaluating place-based approaches
Org-level evaluation frameworks

Presenters

Sue Leahy

Managing Director, ARTD Consultants
Sue is an accomplished evaluator, policy analyst and facilitator and managing Principal at ARTD, a leading evaluation and public policy company based in NSW. She joined ARTD in 2009 from the NSW Department of Family and Community Services, where she managed a wide-ranging program... Read More →

Delyth Lloyd

Manager, Centre for Evaluation and Research, Department of Health and Human Services, Vic
Implementation and dissemination, health program evaluation, capacity building.

Professor Helen Simons FAcSS, FRSA

Professor of Evaluation and Education | Council member and Convenor of Professionalization subgroup of UK Evaluation Society, University of Southampton
I am a founder member of the United Kingdom Evaluation Society, a Council member since its inception and currently convenor of the Professionalization subgroup of the Society. This has four strands - Ethics, Evaluation Capabilities, Voluntary Evaluator Peer Review (VEPR) and Evaluation... Read More →


Wednesday September 18, 2019 1:30pm - 2:30pm AEST
C2.1

1:30pm AEST

Unpacking the complex boxes
Joanna Farmer (Deloitte)

Social problems appear more complex than ever before, as people - and the services that support them - are ever more connected.

Policy and program developers increasingly recognise that the solutions to interdependent challenges are complex interventions. However, as evaluators, we are often expected to work within boxes, constrained in the extent to which we can address complexity. Some of these constraints are practical - funding and program scope - while others pose fundamental challenges to how we do evaluation, such as balancing our accountabilities to all relevant stakeholders.

Evaluation is an important part of the policy-making cycle that provides valuable information on intervention design and implementation. But to maintain relevance in an increasingly complex world, evaluators have to adopt approaches that look to systems - not simply programs. Evaluators need to look not just at what's in the box, but at what's beyond.

In this world café session, attendees will be encouraged to step outside their current box - be it their discipline, sector or theoretical leanings - to share and learn with others while we unpack the big boxes of evaluation. In opening, the presenter will draw on evaluation theory and her experiences designing evaluations for complex social problems before providing key discussion topics for attendees.

Contributions will be summarised and provided back to participants after the session.


Chairs

Mathea Roorda

Senior consultant, Allen + Clarke Consulting
Let's get to the heart of that word 'evaluation'. What do we mean by value and how do we know we've included all relevant values (criteria) in our assessment of a programme? Questions that keep me awake at night...

Presenters

Jo Farmer

I'm a policymaker and evaluator focused on improving the health and wellbeing of all Australians, particularly those with mental illness. My experience includes strategic planning, evaluation and service design in not for profits, government and service delivery agencies. My particular... Read More →


Wednesday September 18, 2019 1:30pm - 2:30pm AEST
C2.3

1:30pm AEST

Unboxing the Inquiry - the Independent Inquiry into the Australian Public Service and its implications for evaluation
John Stoney (AES)

The Independent Inquiry into the Australian Public Service was announced in June 2018. It received over 600 initial submissions, including from the AES, which proposed the Inquiry should consider options for developing appropriate organisational infrastructure and support systems for evaluation and policy evidence, capable of informing policy decision-making and showing the effectiveness of the APS. This included:
  • investment in better systems
  • increasing the evidentiary and performance literacy of APS staff
  • a critical mass of staff with specialist technical expertise
  • encouraging a culture of performance management
  • institutional infrastructure

Following the release of its ‘Priorities for Change’ interim report and associated discussion papers in April 2019, the AES lodged a second submission, proposing:
  • an enabling environment for performance
  • a central whole-of-Australian Government evaluation hub
  • support for the introduction of a professions model and an ‘APS Academy’
  • consulting across and outside of government to inform the design and introduction of these and other reforms proposed by the Independent Review

With the Final Report due to be delivered shortly to the Australian Government, and the Prime Minister recently outlining some of his vision for the APS, this session will enable AES members to hear an update on the Inquiry, panel members' views about the issues raised in the AES submission, and their thinking on the possible implications for evaluation arising from the Inquiry's work.


Chairs

Susan Garner

Director, Garner Willisson
I've been involved in evaluation work since 1996, when I was asked to evaluate a health policy about funding of public hospitals. I got 'hooked' from that point onwards, and even after 30 years in the field of evaluation, there's always another challenge to embrace, something interesting... Read More →

Presenters

John Stoney

President, Australian Evaluation Society
An internal evaluation practitioner within the Australian Government for nearly 15 years, which he describes as his 'day job', John is the current AES President in his 'evening job'. Prior to that he has been (also effectively part-time) at varying stages a student and later... Read More →

Mary Welsh

Mary has recently retired from the Australian Public Service. She has a PhD in public policy evaluation and extensive experience in policy and program evaluation, working mainly with commissioned evaluations in the schools and early childhood sectors.

Alexandra Ellinson

Performance Audit Leader, Audit Office of NSW

Andrew Hawkins

Partner, ARTD Consultants
Andrew works as a trusted advisor and strategic evaluator for public policy professionals, generating insight and evidence for decision-making. Andrew has worked for a wide range of Australian and NSW public sector agencies and not-for-profits on the design, monitoring and evaluation... Read More →


Wednesday September 18, 2019 1:30pm - 2:30pm AEST
Pyrmont Theatre

1:30pm AEST

Disrupting power dynamics and bringing diverse voices to evaluation
Jade Maloney (ARTD), Emma Bedwin (NSW Fair Trading)

As evaluators, we need not only technical competencies, but the capacity to understand macro- and micro-politics, power dynamics and competing perspectives on what is of value and whose values count.

When we work with communities identified as 'vulnerable', we need to be particularly conscious of how past policies and ongoing practice can limit people's confidence to voice their perspective.

But it is not only when working with 'vulnerable' communities that we must be conscious of power dynamics. There are also power dynamics at play when working with program staff who are unfamiliar with evaluation theory and practice, and who fear evaluation and how it will be used by decision-makers.

There is another layer to the dynamic when an external funder is involved. The funder can set evaluation terms of reference and have a dominant voice in setting the parameters for what is valued.

If we are to recognise the rights of people with lived experience to shape the policies and programs that affect their lives, and recognise practice knowledge (which is often discounted in research literature), we need to find ways to recognise, navigate and disrupt power relationships.

We will use a series of creative techniques to enable evaluators first to embody the power dynamics involved in several evaluation scenarios and then to consider how these could be disrupted. As a group we will then explore what we as evaluators can do, and have done, to:
  • influence who is at the table in evaluation
  • welcome and give space to diverse voices
  • balance competing perspectives.

To facilitate the conversation, we draw on a case study evaluation of a co-designed, co-delivered community engagement program, delivered in partnership between government, community organisations and people with intellectual disability, people with psychosocial disability, and people with disabilities from Aboriginal and culturally and linguistically diverse communities.

The principles and ideas from the session will be collated in a final 'closing the circle' discussion and distributed to interested AES members.

Chairs

Karen Fisher

Professor, Social Policy Research Centre UNSW
I conduct research and evaluation about disability and mental health policy in Australia and China. I use inclusive methods with people with disability, families and other stakeholders. I am enjoying extending that to film, photos and other accessible methods to ensure the evaluations... Read More →

Presenters

Jade Maloney

Partner & Managing Director, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research... Read More →


Wednesday September 18, 2019 1:30pm - 2:30pm AEST
C2.2

1:35pm AEST

Personality preferences - Implications for influencing evaluation design and utilisation
Eve Barboza (Wholistic Learning Pty Ltd)

Can the personality preferences of the evaluator influence the design and utilisation of evaluation? Can differences in these personality preferences between evaluator and the client or audience of the evaluation explain some of the controversies in evaluation practice? This session explores how personality preferences could be drawn on to inform the design of evaluation and to influence the implementation and utilisation of evaluation findings. Drawing on some positive and negative experiences of the presenter, we will explore personality preferences as a framework to inform and support your work to improve the design and utilisation of your evaluation projects.

Chairs

Duncan Rintoul

Director, Rooftop Social

Presenters

Eve Barboza

Director / Facilitator, Wholistic Learning
BA Hons, MAPS (Member Australian Psychological Society) CAHRI (Certified Professional Australian Human Resources Institute) and an AES member since 1990. During a career spanning 25 years Eve has developed the knowledge and experience necessary to work as an evaluator, researcher... Read More →


Wednesday September 18, 2019 1:35pm - 1:40pm AEST
C2.6

1:40pm AEST

A live unboxing: The evaluation capacity building role
Liam Downing (Centre for Education Statistics and Evaluation)

In a session designed especially for those who LOVE watching those unboxing videos on YouTube, I will unbox, set up, and use a brand new evaluation capacity building role live on the AES 2019 stage. I will show you what's inside, how it works and what it can do. You can see if it's the right choice for you to build skills and grow the profession through capacity building. This Ignite presentation will also use props. PROPS!

Chairs

Duncan Rintoul

Director, Rooftop Social

Presenters

Liam Downing

Evaluation Capacity Building Lead, NSW Department of Education


Wednesday September 18, 2019 1:40pm - 1:45pm AEST
C2.6

1:45pm AEST

Evolving from academic researcher to evaluator
Natalia Krzyzaniak (NPS MedicineWise)

Contrary to common perception, evaluation and research are two distinct disciplines. Both require the application of data collection and analysis skills and centre on the shared objective of answering a question. However, the purpose of each discipline, and the dissemination of the data collected, differ. Entering the evaluation profession from a research background requires a level of adaptation to become an efficient and successful evaluator. This presentation will walk the audience through my journey from researcher to emerging evaluator, outline the key similarities and differences between research and evaluation, and describe the upskilling required to become an efficient evaluator.

Chairs

Duncan Rintoul

Director, Rooftop Social

Presenters

Natalia Krzyzaniak

NPS MedicineWise
Natalia is a recent PhD graduate, majoring in Pharmacy, from the University of Technology Sydney. She currently holds a position as a Program Evaluation Officer at NPS MedicineWise, and is involved in the evaluation of educational programs delivered to health professionals and consumers... Read More →


Wednesday September 18, 2019 1:45pm - 1:50pm AEST
C2.6

1:50pm AEST

Getting past the imposter syndrome: you don't have to be an expert to help build evaluation capacity in your organisation.
Margaret Moon (SafeWork NSW)

If you're new to evaluation you might feel like an imposter at least some of the time. You get appointed to a new role with "evaluation" in the title and suddenly you're expected to be an expert!

This can be daunting.

But many of the skills and qualities that evaluators need are transferable. For example, a good evaluator needs the right mindset and a positive attitude, good critical thinking skills and a penchant for asking lots of questions. These are excellent foundational skills.

This presentation will help emerging evaluators identify their strengths and feel more confident in building evaluation capacity.


Chairs

Duncan Rintoul

Director, Rooftop Social

Presenters

Margaret Moon

Senior Project Officer, SafeWork NSW
I manage the evaluation program at SafeWork NSW. This involves commissioning evaluations of programs designed to improve safety in NSW and building evaluation capacity across the organisation. I have previously worked as a film editor at the Australian Broadcasting Corporation, as... Read More →


Wednesday September 18, 2019 1:50pm - 1:55pm AEST
C2.6

2:00pm AEST

The dance of evaluation: Engaging stakeholders to develop an evaluation framework across a highly diverse training organisation
Racheal Norris (GP Synergy), Linda Klein (GP Synergy)

This presentation will outline the processes and challenges involved in developing an efficient evaluation framework, using a state-wide vocational training organisation as a case-study. GP Synergy delivers an accredited General Practice training program, across eight highly diverse subregions of NSW and the ACT, for doctors wishing to specialise as General Practitioners. A small Evaluation Team was established in 2017 to develop a rigorous, adaptive evaluation system to monitor and report on delivery of educational activities.

Using evidence-based methodology, the team adopted a participatory approach and engaged stakeholders across three key levels:

Education Executive
An interactive program logic workshop was held to discuss and identify various evaluation priorities at the senior-level.

Medical Educators
The team worked closely with individual educators to design evaluation tools that were standardised, yet responsive to the unique needs of each region. This involved careful consideration of psychometric properties to ensure robust and reliable measures of key outcomes. A semi-automated reporting system was created to maximise the efficiency of delivering timely feedback (a toy sketch of this kind of roll-up follows the abstract), and the team guided educators to correctly interpret and utilise this information for continuous improvement.

GP Registrars
The team consulted with registrars (trainees) to explore and develop pathways to "close the loop" and communicate evaluation findings and implications for the training program. This also involved educating registrars about the broader theoretical framework behind evaluation and how to provide useful, constructive feedback.

Evaluation at GP Synergy remains an evolving process, with ongoing multi-level engagement ensuring evaluation systems continue to be responsive and adaptable to stakeholder needs. The role of the Evaluation Team in educating stakeholders and colleagues about evaluation 'steps' has been fundamental to successful data collection and reflection on findings resulting in change. Insights will be offered to others developing evaluation frameworks/methods within settings where flexibility and responsiveness are key.
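The semi-automated reporting system mentioned under 'Medical Educators' is not specified in the abstract; the sketch below (invented field names and data) shows one minimal shape such a roll-up could take, grouping workshop feedback by region into a standardised summary for educators:

```python
from statistics import mean

# Hypothetical feedback records; the fields and ratings are invented.
feedback = [
    {"region": "New England", "workshop": "W1", "rating": 4},
    {"region": "New England", "workshop": "W1", "rating": 5},
    {"region": "Western NSW", "workshop": "W2", "rating": 3},
]

def regional_summary(records):
    """Group ratings by region and report the mean, so each educator
    receives timely, standardised feedback without manual collation."""
    by_region = {}
    for record in records:
        by_region.setdefault(record["region"], []).append(record["rating"])
    return {region: round(mean(vals), 2) for region, vals in by_region.items()}

for region, avg in regional_summary(feedback).items():
    print(f"{region}: mean rating {avg}")
```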


Chairs
avatar for Duncan Rintoul

Duncan Rintoul

Director, Rooftop Social

Presenters

Racheal Norris

Evaluations Officer, GP Synergy
Racheal is an Evaluation Officer within the NSW & ACT Research and Evaluation Unit of GP Synergy. Racheal is involved in the collection and reporting of feedback from GP registrar and supervisor development workshops. Racheal also contributes to the ongoing development of a broader... Read More →

Linda Klein

Deputy Director Research & Evaluation, GP Synergy
Linda Klein, BSc MSc PhD. I have an academic background in psychology and public health, with over 30 years of practical experience in evaluation in sectors spanning health, education/training and business. At GP Synergy, I take primary responsibility for the evaluation of educational... Read More →


Wednesday September 18, 2019 2:00pm - 2:30pm AEST
C2.6

2:00pm AEST

Exploring 'beyond the box': Applying implementation theory to evaluate a quality improvement project in Aboriginal and Torres Strait Islander primary health care
Alison Laycock (University Centre for Rural Health), Gillian Harvey (The University of Adelaide), Nikki Percival (University of Technology Sydney), Frances Cunningham (Menzies School of Health Research), Jodie Bailie (University Centre for Rural Health), Veronica Matthews (University Centre for Rural Health), Kerry Copley (Aboriginal Medical Services Alliance Northern Territory), Louise Patel (Aboriginal Medical Services Alliance Northern Territory), Ross Bailie (University Centre for Rural Health)

Implementation science examines what methods and strategies work to promote the use of research findings and other evidence in routine practice, to improve the quality and effectiveness of health services and care. It explores, for example, how health interventions can be adapted and scaled in ways that are accessible and equitable to improve health. Implementation science can provide important knowledge for improving Aboriginal and Torres Strait Islander health; however, little research addresses how implementation theories or frameworks have been applied to evaluate projects and programs in Indigenous health.

Drawing on developmental evaluation data, we used the integrated Promoting Action on Research Implementation in Health Services (iPARIHS) framework to examine factors contributing to the success, or otherwise, of a large-scale interactive dissemination project. The project engaged stakeholders with continuous quality improvement data from Aboriginal and Torres Strait Islander primary health care services to co-produce knowledge for improving care.

In this presentation, we describe how we selected and applied this theoretical framework as an evaluation tool. We examine the extent to which use of the framework enhanced our understanding of project interactions, limitations and success in the Aboriginal and Torres Strait Islander health care context and influenced our ongoing work to improve health.


Chairs

Peter Ellis

Director, Nous Group
Professionally I'm both an evaluator and a statistician, with a particular interest in using evidence to improve public sector outcomes. While I'm now in consultancy, I've previously run evaluation functions, including as Director Program Evaluation for AusAID, and Manager Tourism Research... Read More →

Presenters

Alison Laycock

PhD Candidate, Menzies School of Health Research
Alison is an evaluator and PhD candidate at Menzies School of Health Research and the Centre for Research Excellence in Integrated Quality Improvement in Indigenous primary health care. At aes19, Alison is presenting the evaluation of a collaborative knowledge translation project... Read More →


Wednesday September 18, 2019 2:00pm - 2:30pm AEST
C2.4

2:15pm AEST

Aboriginal engagement, Aboriginal evaluation: Owning an evaluation through comprehensive co-design.
Lisa Jackson Pulver (Health Performance Council of South  Australia; University of Sydney), Andrew Wineberg (Health Performance Council Secretariat)

The various state-run health services in South Australia are charged with implementing 'an effective consumer engagement system'. In 2015, one of the state's several local health networks published a strategy for engagement with its Aboriginal consumers and community members. As part of our remit to review the effectiveness of our state's community engagement methods, we decided to evaluate how well that engagement strategy had been implemented.

As the strategy we were evaluating was about Aboriginal health consumers and community members, we recognised early that they themselves were best placed, with the necessary experience and legitimacy, to guide our review. We therefore set up a governing advisory group made up of people with a strong mix of Aboriginal health perspectives, including - crucially - members of the very same grass roots Aboriginal community register that was itself the flagship creation of the strategy being evaluated.

In this session, we will introduce our project governance and the creation of our Aboriginal advisory group and explain the lengthy but worthwhile collaborative process our group then used to create an evaluation logic model and to design the evaluation. We will explain how our advisory group provided strong governance for the substantive components of the evaluation, including their advice on protecting Aboriginal cultural property by procuring external expert assistance from a majority Aboriginal social research firm to undertake primary data collection. Finally, we will present the iterative validation process we used to prove and refine our draft findings and results to ensure that they resonated with the community.

This session is a must-see for anyone interested in making their evaluations of community-targeted strategies truly collaborative and empowering, giving ownership and validity to the community that is the prime stakeholder in a strategy under study.


Chairs

Mia Bromley

Manager, Paxton Partners
I am an experienced health and social service sector consultant and non-executive director with expertise across health, education and justice systems. I have a strong focus on emerging funding models, system redesign, and outcomes measurement. I enjoy partnerships and collaborations... Read More →

Presenters

Lisa Jackson Pulver

Member, Health Performance Council
Professor Jackson Pulver is the Deputy Vice-Chancellor Indigenous Strategy and Services at the University of Sydney. A visionary epidemiologist and medical educator who played a key role in the development of a designated Aboriginal and Torres Strait Islander Health Unit (Murri Marri... Read More →

Tosh Kelly

Aboriginal Expert by Experience, Health Performance Council
Tosh Kelly is a proud Wiradjuri / Barkindji man from central New South Wales who has lived in the rural Riverland region of South Australia for 13 years where he works as an entrepreneur. Tosh has many years’ experience on a number of advisory boards and committees, including the... Read More →


Wednesday September 18, 2019 2:15pm - 3:00pm AEST
C2.5

2:30pm AEST

Unpacking the competencies - among commissioners, managers and evaluators
This presentation merges the two sessions listed below and features a free-ranging discussion on evaluator competencies.


Advanced Tips For Commissioning and Managing High-Quality, Useful Evaluation 
Jane Davidson (Real Evaluation LLC), Tessie Catsambas (Encompass LLC)

What are the most important traps to avoid and tips for commissioning and managing high-quality, value-for-money evaluation? This interactive panel session will be an informative helicopter tour for evaluation commissioners, evaluation team leaders, and internal and external professionals who oversee or manage evaluation projects. It will provide: (1) a deeper appreciation of the role of evaluation management in commissioning and delivering high-quality, value-for-money evaluations; (2) an overview of the role and essential competencies of evaluation managers; and (3) sample strategies and tools for commissioning and managing better and more useful evaluations for organizational learning and stronger leadership. Participants are invited to share their own experiences and engage in a highly interactive discussion with the presenters, who will draw on decades of practical experience leading both large international multi-country evaluations and small-team and solo evaluation projects, as well as providing advice to client organizations on how to scope, commission, and manage highly effective evaluations.

Unpacking the competencies - in theory and practice
 
Amy Gullickson (University of Melbourne Centre for Program Evaluation), Delyth Lloyd (Department of Health and Human Service), Sue Leahy (ARTD)

The AES Professional Learning Competency Framework was developed in 2012; in 2019, the Learning and Professional Practice Committee engaged the AES community in research with the intention of updating the competency set. The goal was to assess the framework in relation to what evaluation theorists have discussed in the literature about the skills and knowledge needed for evaluation practice. In this interactive session, we'll report on recent theoretical work in this area and the findings to date of the community research project, get community feedback on those findings and their relevance to evaluation practice, and discuss next steps.

Chairs

Amy Gullickson

Senior Lecturer, University of Melbourne - Centre for Program Evaluation

Presenters

Jane Davidson

Real Evaluation LLC
Dr. Jane Davidson is best known for pioneering the increasingly popular Evaluation Rubrics Methodology, along with her various other refreshingly practical evaluation frameworks and approaches. Originally from Aotearoa New Zealand, Jane is former Associate Director of The Evaluation... Read More →

Sue Leahy

Managing Director, ARTD Consultants
Sue is an accomplished evaluator, policy analyst and facilitator and managing Principal at ARTD, a leading evaluation and public policy company based in NSW. She joined ARTD in 2009 from the NSW Department of Family and Community Services, where she managed a wide-ranging program... Read More →

Delyth Lloyd

Manager, Centre for Evaluation and Research, Department of Health and Human Services, Vic
Implementation and dissemination, health program evaluation, capacity building.


Wednesday September 18, 2019 2:30pm - 3:00pm AEST
Pyrmont Theatre

2:30pm AEST

Assessing achievements in implementing place-based initiatives - unboxing the assessment process
Patricia O'Connor (Australian Healthcare Associates), Tracey Marriner (Australian Healthcare Associates), Shantanu Sheshgir (Australian Healthcare Associates), Jill Waddell (Australian Healthcare Associates)

Assessing the incremental achievements of place-based initiatives (PBIs) has become an increasingly important component of contemporary evaluation practice. While much is known about the characteristics of successful PBIs, the practicalities of assessing implementation progress across multiple PBIs in a single program remain a complex challenge.

When tasked with evaluating a national program jointly funded by the Australian Government Departments of Health and Education and Training, aimed at improving Aboriginal health and education outcomes, this challenge became a reality for our evaluation team.

This presentation explores the four-stage process undertaken to develop a tool to assess implementation progress across a 13-site PBI program. These sites included a mix of urban, regional and remote locations. PBI maturity ranged from several months to multiple years, with some sites adopting a collective impact approach.

In Stage 1, a literature scan was undertaken to identify the attributes of successful PBIs and the breadth of indicators/measures used to evaluate them. Stage 2 involved mapping each indicator/measure against the eight PBI domains identified in Stage 1. Duplicates were removed and multiple codes were applied in some cases to facilitate measurement by sub-themes such as collective impact and integration. Indicators/measures were then converted to a plain language statement format, so that achievements could be assessed using a five-point scale, ranging from 'not yet started' (0) to 'achieved' (4). A rubric was constructed from the literature findings to guide the rating process undertaken by the evaluation team (Stage 3).

Summing up ratings by PBI domain (Stage 4) identified the domains where a site had made achievements and domains that required a concentrated effort going forward. This standardised tool also facilitated reporting of program-level findings and insights.
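As a purely arithmetical illustration of Stage 4 (the domain names and ratings below are invented; the real statements and rubric came from the literature scan), summing the five-point ratings by domain and comparing each total against its achievable maximum flags where a site has progressed and where effort is needed:

```python
# Invented example: two PBI domains, each with indicator statements
# rated on the five-point scale from 'not yet started' (0) to 'achieved' (4).
site_ratings = {
    "community engagement": [4, 3, 2],
    "collective impact":    [1, 0, 2],
}

for domain, scores in site_ratings.items():
    total, maximum = sum(scores), 4 * len(scores)
    print(f"{domain}: {total}/{maximum} ({100 * total / maximum:.0f}% of achievable)")
# Higher shares flag domains of achievement; lower shares flag domains
# needing concentrated effort going forward.
```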

Factor analysis will later be used to determine the most important indicators within each domain, thereby reducing the number of questions being asked.


Chairs

Mathea Roorda

Senior consultant, Allen + Clarke Consulting
Let's get to the heart of that word 'evaluation'. What do we mean by value and how do we know we've included all relevant values (criteria) in our assessment of a programme? Questions that keep me awake at night...

Presenters

Patricia O'Connor

Senior Consultant, Australian Healthcare Associates
Dr. Trish O’Connor has worked as an evaluator for more than 15 years. In a career that has included positions in health, education, public and private sectors, Trish has developed and applied her specialist skills in research, evaluation, training and communication in a range of... Read More →

Greer Edsall

Consultant, Australian Healthcare Associates
Greer is passionate about health equity and social justice and has a particular interest in working to improve outcomes for disadvantaged Australians. She has excellent written communication skills (including undertaking literature reviews, policy analyses, writing tenders and summarising... Read More →


Wednesday September 18, 2019 2:30pm - 3:00pm AEST
C2.3

2:30pm AEST

Operationalising systems-thinking approaches to evaluating health system innovations: The example of HealthPathways Sydney
Carmen Huckel Schneider (University of Sydney), Sarah Norris (University of Sydney), Sally Wortley (University of Sydney), Angus Ritchie (University of Sydney), Fiona Blyth (University of Sydney), Adam Elshaug (University of Sydney), Andrew Wilson (University of Sydney)

There have been increasing calls to take a systems-thinking approach to evaluating health policies and programs - acknowledging the complexity of health systems and the many actors, institutions, relationships, drivers and values that impact on health system change. Several key frameworks have emerged that support systems-thinking, including the WHO's "Framework for Action"; "NASSS - Non-Adoption, Abandonment, and Challenges to Scale-Up, Spread and Sustainability"; and the "Vortex Model". However, little has been written on how to operationalise systems framework elements into practical evaluation studies comprising methodologically rigorous data collection and analysis methods - all while staying true to the principles of systems-thinking.

In this presentation we seek to unbox the challenge of operationalising a systems-thinking approach to evaluating healthcare delivery innovations. We use the NASSS framework as our example to demonstrate how to expand systems-thinking frameworks, progress towards theories and pose systems-thinking-driven, yet researchable, questions. This requires crossing epistemological boundaries and taking a 'multiple studies' approach that adopts various methods of inquiry. We report on applying these principles to evaluate HealthPathways Sydney, a website for GPs to navigate care pathways for their patients through primary and specialist care. We followed a two-phase approach, beginning with a series of sub-studies using standard qualitative and quantitative methods, and reflected on the conduct of these studies to pinpoint system-level factors (macro contexts, institutional settings, critical events, agents and relationships) that were necessary to understand in order to determine how the innovation interacted with the system. Our second phase adopted systems-thinking study methods including geo-spatial mapping, social network analysis, process tracing, frames analysis and situational analysis. Results were then synthesised into a rich case study of the introduction of an innovation into the system. We uncovered progress towards desired outcomes, but also barriers to consolidating and embedding the technology when other system factors were in play.
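None of the phase-two methods are elaborated in the abstract; as a tiny, generic illustration of the social network analysis strand only (the actors and referral edges are invented, and this is not the study's analysis), degree centrality can indicate which actors sit at the centre of a care-pathway network:

```python
import networkx as nx

# Invented referral network between hypothetical GPs and specialist services.
G = nx.DiGraph()
G.add_edges_from([
    ("GP-1", "Cardiology"),
    ("GP-2", "Cardiology"),
    ("GP-2", "Endocrinology"),
    ("GP-3", "Cardiology"),
])

# Degree centrality highlights heavily connected actors in the system.
for node, centrality in sorted(nx.degree_centrality(G).items()):
    print(f"{node}: {centrality:.2f}")
```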


Chairs

Duncan Rintoul

Director, Rooftop Social

Presenters

Carmen Huckel Schneider

Senior Lecturer, Health Policy, University of Sydney
I am Deputy Director at the Menzies Centre for Health Policy, and Program Director of the Master of Health Policy at the University of Sydney. I am co-lead of the Health Governance and Financing, and Applied Policy Analysis Groups at the Menzies Centre for Health Policy, a Senior... Read More →

Sarah Norris

Senior Research Fellow, Menzies Centre for Health Policy
How broader approaches to evaluation can be applied to health technology evaluation, and vice versa.


Wednesday September 18, 2019 2:30pm - 3:00pm AEST
C2.6

2:30pm AEST

Co-designing a place-based evaluation
Roxanne Bainbridge (Central Queensland University), Robyn Bailey (Allen + Clarke), Julia Carr (Griffith University), Robert Monaghan (Monaghan Dreaming), Ned Hardie-Boys (Allen + Clarke)

Evaluating large-scale, complex health programs poses a host of challenges. Traditional evaluation designs which compare locations with and without a given program are not appropriate because many of the programs are available in most locations. Place-based evaluation designs are promoted as a potential way to enhance understanding of context and address the lack of counterfactual comparisons. However, there are few published examples of evaluation designs that use a place-based approach to guide implementation.

In this presentation, we describe the emergent and multi-layered approach to co-designing a system-level evaluation using a place-based approach. We aim to advance the understanding of place-based approaches to evaluation and research by illustrating how the approach is being used in the evaluation of the Australian Government's investment in Aboriginal and Torres Strait Islander primary health care through the Indigenous Australians Health Programme (IAHP).

Chairs

Peter Ellis

Director, Nous Group
Professionally I'm both an evaluator and a statistician, with a particular interest in using evidence to improve public sector outcomes. While I'm now in consultancy, I've previously run evaluation functions, including as Director Program Evaluation for AusAID, and Manager Tourism Research... Read More →

Presenters

Roxanne Bainbridge

Director Centre for Indigenous Health Equity Research, Central Queensland University
I am a Gungarri/Kunja Aboriginal researcher from South Western Queensland in Australia and Professorial Research Fellow at Central Queensland University where I am Director for the Centre for Indigenous Health Equity Research. My current priority evaluation is embedded in a partnership... Read More →

Robyn Bailey

Senior Associate, Evaluation + Research, Allen + Clarke
Hello! I am a Pakeha (European) New Zealander, currently working in both Australia and New Zealand. Evaluation, and contributing to better outcomes for people and communities through evaluation, is my passion. Along with colleagues, I work with both government agencies and NGO providers... Read More →


Wednesday September 18, 2019 2:30pm - 3:00pm AEST
C2.4

2:30pm AEST

Buddhist Evaluation: Thinking outside the box of Western-derived methods
Kathryn Dinh (UNSW), Heather Worth (UNSW), Bridget Haire (UNSW)

The field of evaluation tends to be dominated by certain Western-derived understandings of the way the world works and an underlying belief that these understandings are universal. Culturally responsive evaluation recognises the existence of diverse world views, and some of its exponents argue that it needs to encompass more than simply working closely in collaboration with locally-based partners. It should additionally involve modifying and creating new evaluation approaches that are grounded in non-Western world views.

While there has been significant innovation in evaluation approaches that reflect Indigenous world views in Australia, New Zealand, the US and elsewhere, there has been less progress in reflecting the world views of South-East and East Asia. Buddhism has a significant global influence today, particularly in these regions, where it is practised by a large majority of the population.

In this presentation, we suggest an applied approach to culturally responsive evaluation by first analysing the world views underpinning Buddhism and the Most Significant Change (MSC) technique, a participatory monitoring and evaluation method based on collecting stories of significant change. We then identify where these world views converge and diverge, and suggest practical ways in which the MSC technique could be adapted to reflect a Buddhist world view. Finally, we look at how, in a globalised world, societies are made up of a complex and dynamic mix of values, philosophies, traditions, religions and cultures. We discuss how, as evaluators, we can use this approach to work with locally-based colleagues to unpack the theory and value systems underpinning existing evaluation methods, then repackage those methods, or create new ones, so that they reflect and are responsive to the complex and dynamic world views of the local context being evaluated.


Chairs
avatar for Karen Fisher

Karen Fisher

Professor, Social Policy Research Centre UNSW
I conduct research and evaluation about disability and mental health policy in Australia and China. I use inclusive methods with people with disability, families and other stakeholders. I am enjoying extending that to film, photos and other accessible methods to ensure the evaluations... Read More →

Presenters
avatar for Kathryn Dinh

Kathryn Dinh

MEL Consultant/PhD Candidate, UNSW
Kathryn is a monitoring, evaluation and learning consultant with more than 20 years of experience across Asia, the Pacific (including Australia) and the Commonwealth of Independent States. She specialises in evaluation for health and international development programs, advocacy evaluation... Read More →


Wednesday September 18, 2019 2:30pm - 3:00pm AEST
C2.2

2:30pm AEST

This presentation is now in Pyrmont
Amy Gullickson (University of Melbourne Centre for Program Evaluation), Delyth Lloyd (Department of Health and Human Service), Sue Leahy (ARTD)

The AES Professional Learning Competency Framework was developed in 2012. In 2019, the Learning and Professional Practice Committee engaged the AES community in research intended to update the competency set. The goal was to assess the framework against what evaluation theorists have written about the skills and knowledge needed for evaluation practice. In this interactive session, we'll report on recent theoretical work in this area and the findings to date of the community research project, seek community feedback on those findings and their relevance to evaluation practice, and discuss next steps.

Chairs
avatar for Lee-Anne Molony

Lee-Anne Molony

Director, Clear Horizon
Principles-focused evaluation; evaluating place-based approaches; organisation-level evaluation frameworks

Presenters
avatar for Amy Gullickson

Amy Gullickson

Senior Lecturer, University of Melbourne - Centre for Program Evaluation
avatar for Sue Leahy

Sue Leahy

Managing Director, ARTD Consultants
Sue is an accomplished evaluator, policy analyst and facilitator, and Managing Principal at ARTD, a leading evaluation and public policy company based in NSW. She joined ARTD in 2009 from the NSW Department of Family and Community Services, where she managed a wide-ranging program... Read More →
avatar for Delyth Lloyd

Delyth Lloyd

Manager, Centre for Evaluation and Research, Department of Health and Human Services, Vic
Implementation and dissemination, health program evaluation, capacity building.


Wednesday September 18, 2019 2:30pm - 3:00pm AEST
C2.1

3:00pm AEST

Closing plenary: Evaluation, un-boxed
Now we’ve un-boxed it, it’s time to discuss where to next for evaluation. Keynote speakers, the unconference convenors and experts will share their thoughts on exactly what’s in the box of evaluation, and the tools and skills evaluators need to ensure we stack up into the future. To do this, they’ll look beyond the box at what we can draw from other disciplines, how we can learn from as well as share with the communities we work with, and what this means for evaluation as a ‘profession’. Stick around for the conversation to create connections and think about how we shape the ever-evolving role of evaluation and evaluators beyond #aes19SYD.

Followed by:
Conference close, John Stoney, AES President
Handover to aes20 International Evaluation Conference, Brisbane, Australia

Chairs
avatar for Jade Maloney

Jade Maloney

Partner & Managing Director, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research... Read More →

Presenters
avatar for Jane Davidson

Jane Davidson

Real Evaluation LLC
Dr. Jane Davidson is best known for pioneering the increasingly popular Evaluation Rubrics Methodology, along with her various other refreshingly practical evaluation frameworks and approaches. Originally from Aotearoa New Zealand, Jane is former Associate Director of The Evaluation... Read More →
avatar for David Fetterman

David Fetterman

President & CEO, Fetterman & Associates
David Fetterman is President and CEO of Fetterman & Associates, an international evaluation consulting firm. He has 25 years of experience at Stanford University, serving as a School of Education faculty member, School of Medicine director of evaluation, and senior member of Stanford... Read More →
avatar for Gary VanLandingham

Gary VanLandingham

Gary VanLandingham currently serves as Professor, MPA Program Director, and Reuben Askew Senior Practitioner in Residence with the Askew School of Public Administration and Policy at Florida State University. Previously, he was the founding Director of the Pew-MacArthur Results... Read More →
avatar for Kiri Parata

Kiri Parata

Whakauae Research for Māori Health and Development
Kia ora I'm Kiri, living on the Sunshine Coast of Queensland. My whakapapa (genealogy) is to Te Atiawa, Ngāti Toa, Ngāti Raukawa, Ngāti Ruanui and Ngāi Tahu in Aotearoa, New Zealand. I am a Māori health researcher and evaluator and I'm committed to tino rangatiratanga where Indigenous... Read More →
avatar for Gill Westhorp

Gill Westhorp

Professorial Research Fellow, Charles Darwin University
Gill leads the Realist Research Evaluation and Learning Initiative (RREALI) at Charles Darwin University. RREALI develops new methods and tools within a realist framework, supports development of competency in realist approaches and provides realist evaluation and research services... Read More →
avatar for Jo Farmer

Jo Farmer

I'm a policymaker and evaluator focused on improving the health and wellbeing of all Australians, particularly those with mental illness. My experience includes strategic planning, evaluation and service design in not-for-profits, government and service delivery agencies. My particular... Read More →
BB

Ben Barnes

Director Evaluation, NSW Department of Education
I began in consultancy, and made the move to the public sector in 2012. I am now Director of Evaluation at the Centre for Education Statistics and Evaluation in the NSW Department of Education. We have a team of over 30 internal evaluators, data analysts and evaluation capacity builders... Read More →


Wednesday September 18, 2019 3:00pm - 4:30pm AEST
Pyrmont Theatre
  Plenary