C2.6
Monday, September 16
 

11:00am AEST

Evaluation governance: creating fertile ground
Julian Thomas (Urbis), Ariane Hermann (Australian Government Attorney-General's Department), Amanda Shipway (Queensland Department of Justice and Attorney-General), Adam Nott (Australian Government Attorney-General's Department), Kay Hackworth (Victorian Department of Justice and Community Safety), Alison Wallace (Urbis), Frances McMurtrie (Urbis)

This joint presentation by clients (the Commonwealth Attorney-General's Department and the Queensland Department of Justice and Attorney-General) and commissioned evaluators (Urbis) will show how inclusive evaluation governance arrangements for complex, large scale investments can create a strong foundation for collaborative future action.

This presentation presents a case study of the client and evaluator experiences during a national review of the funding for legal assistance services, which involved Australian federal, state and territory governments as joint-commissioning clients. The work focused on the extent to which the $1.3b National Partnership Agreement on Legal Assistance Services 2015-2020 (the NPA) supports shared interests among governments to "improve access to justice and resolve legal problems for the most disadvantaged people in Australia and maximise service delivery through the effective and efficient use of available resources".

For the commissioning governments, the review was an important precursor to inter-governmental negotiations over the future shape of national collaboration on legal assistance services, negotiations which can often develop into contested and sometimes protracted processes. Planning for the review was a catalyst for Commonwealth and State/Territory governments to establish a Steering Committee to formulate terms of reference and guide a procurement process. A by-product of this early collaborative work was the establishment of productive, multi-lateral relationships and a shared ownership of process and purpose. The Steering Committee structure sustained a high level of engagement after the appointment of the evaluator, whose participation introduced a new and constructive dynamic.

Our take-away observation is that well-designed governance arrangements for large, multi-stakeholder evaluation projects addressing contested issues have significant benefits that extend beyond delivery of the evaluation. In facilitating relationship building around a collective purpose, effective evaluation governance promotes broader, post-evaluation collaboration.


Chairs
Larissa Brisbane

Team Leader, Strategic Evaluation, Dept of Planning and Environment NSW
It was only a short step from training in environmental science, and a background in cross-disciplinary problem-solving, to evaluation where I still ask 'why' and 'how do you know that'. I love hearing your stories of what you've done and what you've learned, especially in the areas...

Presenters
Julian Thomas

Director, Urbis
I'm an evaluator with a particular interest in policy/programs tackling complex social issues, and my work at Urbis is concentrated on health, education and justice portfolios. My favourite evaluations have focused on how large scale/national investments are planned, delivered and...
Ariane Hermann

Assistant Secretary, Attorney-General's Department
Starting my career as an organisational and counselling psychologist, I joined the Commonwealth Public Service in 2009. I have had a range of positions across the Attorney-General's Department, including leading the development of a National Impact Assessment Framework and a National...
Amanda Shipway

Director, Legal Assistance, Strategy and Funding (LASF), Department of Justice and Attorney-General (DJAG)
Amanda Shipway has worked for the Department of Justice and Attorney-General since 2009. Her first role with DJAG was at Victim Assist Queensland, where she worked for five years as the Victim Services Coordinator, before moving to her current role as Director, Legal Assistance...


Monday September 16, 2019 11:00am - 11:30am AEST
C2.6

11:30am AEST

Early insights from evaluating post-disaster community recovery
Claire Grealy (Urbis), Christina Bagot (Urbis)

This presentation draws upon our experience over the last decade in undertaking evaluations of disaster recovery efforts in Victoria and Queensland. Key themes explored in this presentation consider the need for: evaluation-informed program design; accessible platforms for communities to participate in the evaluation; consultations that are tailored and trauma-informed; and evaluation methods that consider the dynamic, ongoing recovery context.

Evaluating recovery efforts presents unique challenges for evaluators and our work in this area emphasises the importance of careful planning and consideration around the logistics of the consultation (data collection) phase. In particular, communities recovering from disaster face additional barriers to traditional consultation methods and there is a need for evaluators to create accessible platforms for a range of people to provide their input.

Evaluators also need to be aware of, and equipped to consult with, communities and service users who have experienced acute and recent trauma. Our experience has shown that trauma takes many different forms for individuals, and this affects their interaction with services and participation in consultations. Ethical conduct is paramount: using trauma-informed research methods and consultation processes that enable the collection of a range of perspectives but still safeguard informants from re-traumatisation.

In addition, our evaluations have found that consultations need to be conscious of prevailing 'don't speak up' cultures and their impact on data collection activities, particularly in rural communities where stoicism is the norm. Stigma around mental health symptoms and attitudes towards help-seekers can hinder the willingness of community members to acknowledge the range of consequences arising from the disaster.


Chairs
Larissa Brisbane

Team Leader, Strategic Evaluation, Dept of Planning and Environment NSW
It was only a short step from training in environmental science, and a background in cross-disciplinary problem-solving, to evaluation where I still ask 'why' and 'how do you know that'. I love hearing your stories of what you've done and what you've learned, especially in the areas...

Presenters
Claire Grealy

Partner, Urbis
Motivated to influence - better experiences, better systems and better places, for everyone. Evaluation + is how I think about it - always with an eye to translating the learning into knowledge and change for the better. Passionate about working with our partners to influence change...

Christina Bagot

Senior Consultant, Urbis


Monday September 16, 2019 11:30am - 12:00pm AEST
C2.6

12:00pm AEST

Empathy mapping - Discovering what they value
Andrew Moore (NZ Defence), Victoria Carling (NZ Defence)

Empathy mapping is an emerging collaborative approach that focuses on the results of a programme. Used to gain the perspective of different stakeholders, from the commissioner to the programme participants, it seeks to define what they truly value from a programme. Empathy mapping requires participants to reflect on what success looks like, according to them, by considering what they would see, hear, do, say, think, or feel during and after the programme. The results can then be used as the building blocks of evaluation rubrics to define measurable criteria. The collaborative approach ensures a shared understanding is achieved on the quality, value, and effectiveness of a programme.

Drawing from their experience, the presenters will demonstrate how empathy mapping has been used to build the foundations for successful evaluation within NZ Defence, highlighting how empathy mapping can maximise contact time with key stakeholders, document the shared understanding of programme results and subsequently promote a collective interpretation of evaluation reports.

The session will allow participants to gain insight into: What is empathy mapping? Where did it come from? What are the components of an empathy map? Why are they useful as building blocks for evaluation practice? How can they be used to build evaluation rubrics?


Chairs
Larissa Brisbane

Team Leader, Strategic Evaluation, Dept of Planning and Environment NSW
It was only a short step from training in environmental science, and a background in cross-disciplinary problem-solving, to evaluation where I still ask 'why' and 'how do you know that'. I love hearing your stories of what you've done and what you've learned, especially in the areas...

Presenters

Victoria Carling

Regional Evaluator, NZ Defence Force
Andy Moore

Senior Advisor Performance and Evaluation, NZDF
I have had thirty years of experience in conducting, designing, and evaluating vocational training within NZ Defence. In that time, I have deployed overseas, predominantly in the role of coordinating aid programmes on behalf of NZ Defence. My current role is the Senior Adviser Performance...


Monday September 16, 2019 12:00pm - 12:30pm AEST
C2.6

1:30pm AEST

Beyond co-design to co-evaluation: Reflections on collaborating with consumer researchers
Rachel Aston (ARTD Consultants), Amelia Walters (ARTD Consultants), Amber Provenzano (ARTD Consultants)

There is increasing recognition that consumers of mental health services and consumer researchers play an essential role in creating quality and effective research (Lammers & Happell, 2004; Hancock et al., 2012). However, little evidence exists around the engagement of consumer researchers in research and even less in evaluation (Lammers & Happell, 2004). Consumer researcher inclusion can enhance the utility, relevance, and validity of the evaluation process, conclusions, and judgements of programs, policies and initiatives that directly involve and impact on the lives of end-users.

A Victorian Primary Health Network has introduced an innovative Mental Health Stepped Care Model designed to match services with individual and local population needs. Using this as an evaluation case example, we show how collaboration with a consumer researcher was critical to the success of the evaluation, given the design of the methodology and, in particular, its emphasis on qualitative data gathering and case studies of primary health services.

Supporting the emergent literature and challenging the historical view of consumers as passive potential beneficiaries of the research and evaluation process, we propose that the active involvement of a consumer researcher in all stages of the evaluation process creates powerful mutual learning (Brosnan, 2012).

We will discuss how to practically support consumer researchers in evaluation to contribute their lived experience, to further develop their professional skills, and to foster greater ownership of evaluation for the community. We suggest minimising potential power disparities between the evaluation team and the consumer researcher through a mentoring and allyship model (Happell et al., 2018).

Finally, we will raise important implications for the practice and wider discipline of evaluation. Progressing beyond co-design to co-evaluation, we will elucidate how embedding consumer researchers' values and lived experience in evaluation maximises the utility, relevance and accuracy of findings.


Chairs

Christina Kadmos

Principal, Kalico Consulting

Presenters
Rachel Aston

Manager, ARTD Consultants
Rachel is an experienced social researcher and evaluator at ARTD Consultants. She brings eight years’ experience conducting research and evaluation for government, NGOs and in the higher education sector. Rachel has a high level of expertise in qualitative and mixed-methods research...
Amber Provenzano

Analyst, ARTD Consultants
Amber joined ARTD in 2018. She supports evaluation and research in areas of complex social policy, particularly in the health, mental health and disability sectors. She has experience with undertaking document and literature reviews, conducting interviews, survey design and administration...
Amelia Walters

Lived Experience Researcher, ARTD Consultants
Amelia is a mental health advocate, consultant, researcher, and peer worker. Amelia is the Lived Experience Researcher on ARTD Consultants and the University of Melbourne’s independent evaluation of the Eastern Melbourne Primary Health Network Mental Health Stepped Care model. She...


Monday September 16, 2019 1:30pm - 2:00pm AEST
C2.6

2:00pm AEST

Does empowerment evaluation work? Findings from a case-study
Kerrie Ikin (University of New England)

End users running their own evaluations!
End users owning the evaluation results!
End users influenced by the evaluation processes!
This paper is all about empowerment: values, capacity building, ownership, power.
Curious?

Come and find out how an entire staff became involved in their school's three-year journey in an empowerment evaluation process, and what the research about this process revealed.

In the New South Wales government education system in Australia, school reviews have undergone a sea change. Community-of-practice approaches to school planning and evaluation, followed by external but peer-led validation, have become the norm. This model presumes a high level of competence in collaborative strategic planning and evaluation, as well as a high level of evaluation capacity among school principals and staff. One school principal, realising the challenges that the new model posed, engaged an evaluator to develop and implement a process (empowerED) that would help his school rise to these challenges.

EmpowerED was specifically designed to strengthen the school's learning community by creating, in partnership across it, stronger and better professional practice. Traditional evaluation roles were turned on their heads, challenging who held the power in the evaluation: the staff became the evaluators; the evaluator became their critical friend. Through this process, it was envisaged that staff would build capacity for change, be empowered as whole-of-school evaluators, and embrace ownership of their school's plan. The ultimate goal was to improve student learning outcomes. And the approach paid off. Findings from the concurrent research show how, as staff developed transparency, openness, and trust in the process and with each other, their understanding of and input into the school's plan and directions increased, and their evaluation capacity was built. Early indications also suggest improved student learning outcomes may be in part attributable to empowerED.


Chairs

Christina Kadmos

Principal, Kalico Consulting

Presenters
Kerrie Ikin

Adjunct Senior Lecturer, UNE Business School, University of New England
Dr Kerrie Ikin FACE. Kerrie began her career as a teacher and executive member in government schools across New South Wales, Australia. For over 25 years she worked at the systems level of the education department as a director of schools, senior policy adviser, and senior administrator...


Monday September 16, 2019 2:00pm - 2:30pm AEST
C2.6

2:30pm AEST

The Perpetrator Perspective: Breaking down the barriers in family violence research and evaluation
Luke Condon (Deloitte), Kate Palmer (Deloitte Access Economics), Sasha Zegenhagen (Deloitte Access Economics), Karen Kellard (Social Research Centre), Scott Pennay (Social Research Centre), Jenny Anderson (Department of Health and Human Services), Sally Finlay (Family Safety Victoria), Ilana Jaffe (Family Safety Victoria)

The Victorian Royal Commission into Family Violence placed a strong emphasis on the need to better understand who is experiencing family violence, their circumstances, and how they can be supported. The unique experiences of both the victim and the perpetrator are critical to measuring the impact of family violence programs, and contributing to best practice for changing the behaviour of people who use violence. However, engaging with perpetrators and victims presents an ethical minefield. It requires us to 're-evaluate' our approach to evaluation, view risks from a different lens, and think outside the box, all whilst meeting ethical standards.

In Victoria, interventions to address perpetrator behaviour are being redefined to be both broader and better integrated into wider family violence responses. This includes improving the inclusivity of these programs to better target the diverse needs and circumstances of perpetrators of family violence. Evaluation of these new programs will inform policy and drive system improvement, making it more responsive to the needs of our diverse community. As such, it is important to understand the perspective of the 'service users' and how their experience is contributing to evidence of outcomes. Inclusion of the perpetrator and victim voice within the evaluation design requires complex consideration of the potential risks involved for both victim and researcher, balanced with the anticipated benefits of the research at both an individual and community-wide level.

Drawing on the perspective and expertise of program service providers is key to understanding and addressing the broad range of considerations and sensitivities when engaging with this typically complex population. From recruitment strategies, to participant incentives, and discussion guides, the standard methods do not apply, and a 'one-size-fits-all' approach does not work. We discuss how a collaborative approach to evaluation design is key to ensuring research is centred on the needs of participants, thus maximising the positive impact of perpetrator programs in the future.


Chairs

Christina Kadmos

Principal, Kalico Consulting

Presenters
Luke Condon

Partner, Deloitte Access Economics
I've been an evaluator for around 12 years after originally starting my career in the public sector. My clients are primarily state and federal government and while my main areas of focus are health, justice and community services I've done evaluations on all sorts of topics. I enjoy...
Karen Kellard

Director, Social Research Centre (SRC)
Karen Kellard is the Exec Director of the Qualitative Research Unit at the Social Research Centre (owned by the Australian National University) in Melbourne, Australia. Her interests are in the use of qualitative approaches in evaluation, and in conducting research on sensitive topics...
Jenny Anderson

Global Director MEL, Movember
My organisation Movember is dedicated to changing the face of men's health. We are a leading global men’s health organisation that focuses on three key health areas: prostate cancer, testicular cancer, and mental health and suicide prevention. Funds raised are directed into research...


Monday September 16, 2019 2:30pm - 3:00pm AEST
C2.6

3:30pm AEST

When the West Meets the East: Collaborative design, analysis and delivery of program evaluation in rural generalist training program in Japan
Takara Tsuzaki (Western Michigan University)

This presentation demonstrates a case study of a mixed-methods, bilingual program evaluation conducted on a newly launched rural medicine/rural generalist program in Japan, with a focus on collaborative and iterative learning processes. The client (GENEPRO LLC) and the evaluator will share challenges in designing and implementing the evaluation, and how we have been successful in building trust among stakeholders, integrating evaluation into practice, and fostering iterative learning within the organization.

The model - Rural Generalist Program Japan (RGPJ) - is based on the Australian model, which has been regarded as the most comprehensive and mature rural generalist medicine training scheme in the world. To meet the specific needs of rural generalist medicine in Japan, the provision of rural healthcare needed to be tailored to the regional and local context. Exporting this medical training scheme from Australia to Japan also meant a new collaborative endeavor to develop a unique program evaluation model and approach in Japan.

This presentation will highlight the contextual differences between the East and the West in terms of philosophies and cultural values, and how they are manifest in evaluation practices. Both the theory and practice of evaluation have developed differently in Japan over the past 50 years compared to the West. Furthermore, evaluation in the Japanese medical and healthcare sector has been conducted predominantly using quantitative data. However, rural generalist medicine requires a distinctly broad scope of practice as well as a unique combination of abilities and aptitudes to respond to the community needs of rural and remote areas of Japan. As a result, the evaluation approach, including the underlying values, philosophies and methodologies, had to be thoroughly examined and openly discussed to bring all the stakeholders on board.

We will share the lessons from the collaborative evaluation process by discussing: what evaluative thinking and collaborative evaluation design mean in Japanese rural and medical settings; how we have come up with innovative approaches to communicate with stakeholders who have evaluation anxiety and fear of a modernist undertaking; how we have acknowledged and overcome (in)translatability issues in language, embedded values, and the social contexts of each stakeholder group; and how the collaborative evaluation processes impacted the organizational culture during and after the evaluation.

Chairs

Rebecca Arnold

Senior Project Officer - MERI, Department of Environment and Water Resources (SA)

Presenters
Takara Tsuzaki

Interdisciplinary Ph.D. in Evaluation, Western Michigan University
Takara Tsuzaki is a specialist in public relations, social policy research and evaluation. She has worked as researcher, consultant and evaluator for 15 years in the private, public, academic and not-for-profit sectors in Japan and the United States. Working extensively in the fields...


Monday September 16, 2019 3:30pm - 4:00pm AEST
C2.6

4:00pm AEST

Evaluating a place-based partnership program: Can Get Health in Canterbury
Amy Bestman (Health Equity Research & Development Unit (HERDU), Sydney Local Health District), Jane Lloyd (Health Equity Research & Development Unit (HERDU), Sydney Local Health District), David Lilley (Health Equity Research & Development Unit (HERDU), Sydney Local Health District), Barbara Hawkshaw (Central and Eastern Primary Health Network)

This presentation wrestles with the balance between ensuring a robust community-led, inter-sectoral, public health program in a culturally and linguistically diverse (CALD) location and effectively providing sufficient monitoring, evaluation, reflection and improvement opportunities while the intervention is in situ.

Can Get Health in Canterbury (CGHiC) is a unique inter-sectoral program with three key partners (the University of New South Wales, Sydney Local Health District and Central Eastern Primary Health Network) and many local partnerships with community organisations. It was established in 2013 to address high health needs among CALD population groups within Canterbury, NSW.

CGHiC's partnership with the community is supported by the employment of community networkers and the establishment of collective control projects. Bengali and Arabic networkers link the community with the health system, and also provide insight to the health system on the unique needs of the community. The collective control projects enable the community to have greater power over decision making, priority setting and allocation of resources. These projects aim to improve capacity of both community groups and the health system and encourage bi-directional learning and reflection.

Two external evaluations have previously been conducted, which provide point-in-time reflections on the impact of the project. Now that CGHiC is in its sixth year of operation, we are evaluating the program in-house with the following foci: the external impact of the program; the governance structure, priority setting and decision making of the program; and the activities of the program. While this process is ongoing, the program team have implemented monitoring tools and processes to measure recent activities. The CGHiC evaluation will contribute to the field of evaluation through the development of novel methodologies, approaches and insights for evaluating complex place-based, multi-sectoral, population-level programs in situ.


Chairs

Rebecca Arnold

Senior Project Officer - MERI, Department of Environment and Water Resources (SA)

Presenters
Amy Bestman

Community Partnerships Fellow, UNSW
Dr Bestman’s work has been driven by a strong public health approach and has focused on the translation of research to practice and policy. Her research has focused on public health qualitative studies that address inequity in vulnerable populations such as children, disadvantaged...


Monday September 16, 2019 4:00pm - 4:30pm AEST
C2.6
 
Tuesday, September 17
 

11:00am AEST

The challenges of establishing and growing an internal evaluation unit: Experiences from two large state government departments
Eleanor Williams (Department of Health and Human Services), Josephine Norman (Department of Health and Human Services)

A number of government departments and agencies across Australia have established new evaluation units of varying sizes and functions within the past decade, all with some objective of unboxing evaluation and evidence for use in policy design and implementation.

In Victoria, two large state government departments have made significant commitments to new internal evaluation units with functions that extend beyond traditional capacity building and oversight roles to direct delivery of evaluations and cross-portfolio evidence reviews. The two presenters have played a leading role in the establishment and growth of these units.

While there has been significant research into what constitutes effective and efficient evaluation capacity building activities, less attention has been given to what is required to establish and grow an internal unit.

In this presentation, these two Victorian departments reflect and share practice examples on the challenges and successes of developing and maintaining an internal delivery function. The moderator will highlight and contrast experiences including:
  • Determining the unit's value proposition. Will key stakeholders get excited about your value proposition and believe in what you are doing?
  • Getting the right people (and the right mix of people). Are new staff skills and competencies needed?
  • Delivering proof of concept early to key stakeholders. How do you earn your stakeholders' confidence and become seen as the 'fuel, not the brakes'?
  • Designing fit-for-purpose products and answering difficult questions. How will you know if the unit's work is independent, quality, and fit-for-purpose?
This presentation aims to advance the national discussion about strategies for pragmatic implementation of increased in-house evaluation based on theory and practice.

The session will feature a strong participatory element where attendees will be invited to share lessons learned, success stories and examples and challenges from their own organisations.


Chairs

Florent Gomez

Manager, Planning and Evaluation, NSW Department of Customer Service

Presenters
Eleanor Williams

Assistant Director, Centre for Evaluation and Research, Department of Health and Human Services Victoria
In my current role as Assistant Director of the Centre for Evaluation and Research at the Department of Health and Human Services, I am working to build an evaluation culture across a large government department of over 12,000 employees. My team and I aim to improve the use of data...
Josephine Norman

Manager, Centre for Evaluation and Research, DHHS
I’m endlessly optimistic about our profession & the ways in which we can contribute to better outcomes, particularly in my context, for vulnerable Victorians. I’m always interested in finding new ways to give better advice, based on increasingly good evidence. I am fortunate to...
Amanda Reeves

A/Manager, Performance and Evaluation Division, Department of Education and Training
Amanda is an experienced evaluation practitioner and policy analyst at the Department of Education Victoria. Amanda has led evaluation projects in a variety of roles in government, the not-for-profit sector and as an external consultant in education, youth mental health and industry...


Tuesday September 17, 2019 11:00am - 12:00pm AEST
C2.6

12:00pm AEST

Internal Evaluation Capacity Building: Unpacking what works in a (very) large government department
Liam Downing (Centre for Education Statistics and Evaluation), Rydr Tracy (Department of Education)

While evaluation capacity building is not an exact science, practitioners can benefit from understanding what has worked in other settings. This session will provide insight for evaluators at all levels into the factors underlying a successful and growing evaluation capacity building strategy within a large, state-level education department, with lessons applicable across different sectors.

Strengthening evaluation capacity is a key component of evaluative practice within large sectors (or - more specifically - very large sectors). This is particularly apparent in spaces where practice and outcomes are constantly under scrutiny, and where stakes - for beneficiaries, policymakers and practitioners - are high. The early childhood, primary and secondary education sector is a perfect example of this high-stakes space, and a space where evaluation capacity building can be of benefit.

The NSW Department of Education is home to a small but influential team that focuses on building evaluation capacity among school leaders, teachers and corporate personnel. Established in 2016, the Evaluation Capacity Building (ECB) project is well regarded within the Department and has been identified by the Department of Premier and Cabinet as an example of effective service delivery in the NSW public sector. This presentation will outline key activities undertaken in this space over the last three years, and identify five key enabling factors that have been instrumental in the project's success so far:
  1. Leveraging existing structures and reforms
  2. Establishing and maintaining a strong authorising environment
  3. Effective collaboration at multiple levels
  4. Operating with the right mix of skills and support
  5. Engaging in a disciplined design process

The session will detail how each factor influenced the impact of evaluation capacity building efforts, and provide practitioners with a potential roadmap for what might work in their own sectors.


Chairs

Florent Gomez

Manager, Planning and Evaluation, NSW Department of Customer Service

Presenters
Liam Downing

Manager, Evaluation and Data, Quality Teaching Practice, NSW Department of Education
Liam is an experienced and impactful evaluation leader, with 17+ years of experience. He is focused on ensuring that evaluation is rigorous in its design, meaningful in informing next steps, and driven by building the capacity of as many people as possible to engage deeply in evaluative...

Rydr Tracy

R/CEO School Engagement, CESE
Evaluation capacity building in education. Funny jokes.


Tuesday September 17, 2019 12:00pm - 12:30pm AEST
C2.6

1:30pm AEST

"Catching the MEL bug": Using an evaluation needs assessment to unpack evaluation capacity
Mark Planigale (Lirata Ltd), Kathryn Robb (Djirra)

Moving evaluation out of the box involves empowering organisations to shape and use Monitoring, Evaluation and Learning (MEL) for their own purposes. How can we demystify and reframe MEL so we can support organisations to design and use evaluation effectively?

An evaluation needs assessment can be a vital step in this journey. Through a needs assessment, we can engage stakeholders in identifying strengths, gaps and areas for development in MEL within a team or organisation. A needs assessment also explores how stakeholders value MEL and the types of MEL which will be meaningful and useful for their context. This informs the development of tailored strategies to improve MEL capacity, while also generating understanding and enthusiasm for change.

In this paper we outline a systematic approach to evaluation needs assessment. Building on previous approaches (e.g. Preskill & Torres 1999; Volkov & King 2007; Preskill & Boyle 2008), we present a framework of 11 capacity domains, organised using three lenses: individual capacity, team and organisational capacity, and MEL life cycle. A combination of quantitative and qualitative data collection oriented around these domains helps to generate a nuanced mapping of capacity, an overview of informational needs, and a baseline against which progress can be measured.

How can this approach be applied in practice? We share a case study of an evaluation needs assessment undertaken in partnership between an Aboriginal Community Controlled Organisation and an evaluation consultancy. We reflect together on why it was important to undertake the needs assessment, lessons learned through the organisation's experience of "catching the MEL bug", and the relationships, tools and conversations which have facilitated this journey. We conclude with practical suggestions for adapting and using this framework in other contexts.


Chairs
Carolyn Hooper

Evaluation + Research Senior Consultant, Allen + Clarke
I'm a social scientist, with a preference for qualitative methods. I've been working in evaluation for seven years and I love the variety of opportunities I get to make a difference. My doctorate in Public Health means I often work on health-related evaluations, examining interventions...

Presenters
Mark Planigale

CEO, Lirata Ltd
I am the CEO and evaluation practice lead for Lirata Ltd (www.lirata.com), an independent not-for-profit organisation based in Melbourne. We work to advance social justice by strengthening the enablers and reducing the barriers to positive social change. We partner widely with service...


Tuesday September 17, 2019 1:30pm - 2:00pm AEST
C2.6

2:00pm AEST

The retrospective development of a monitoring and evaluation framework for the Northern Territory chronic conditions prevention and management strategy: Unpacking the problems and possibilities
James Smith (Menzies School of Health Research), Kalinda Griffiths (University of New South Wales), Moira Stronach (Northern Territory Department of Health), Liz Kasteel (Northern Territory Department of Health), Jenny Summerville (Northern Territory Primary Health Network), Julie Franzon (Northern Territory Primary Health Network), Michelle Ganzer (Northern Territory Department of Health), and the CCPMS Monitoring & Evaluation Working Group

In 2010, the Northern Territory Government released a ten-year Chronic Conditions Prevention and Management Strategy (CCPMS). This was followed by the release of three separate implementation plans (2010-2012; 2014-2016; 2017-2020) across the CCPMS timeframe. A longer implementation timeframe was adopted to allow for the measurement of longer-term outcomes. The CCPMS and subsequent implementation plans clearly outlined guiding principles, key goals, key action areas, objectives, strategies and indicators/progress measures. In theory, the 'evaluation box was built and neatly wrapped', providing a useful platform to undertake monitoring and evaluation functions, which had been considered from the outset. However, it has recently surfaced that indicators/progress measures were poorly aligned to the objectives and strategies, and that in some instances data was not available to report against the indicators. Similarly, the indicators included in implementation plans changed across the life of the CCPMS, reflecting changes in policy direction and government priorities. This made it difficult to identify how best to measure the impact and outcomes of the CCPMS. That is, the 'structure of the evaluation box was weak'. In 2018, to address this conundrum, a multi-agency Monitoring and Evaluation Working Group, with independent Co-Chairs, was established to develop a retrospective Northern Territory CCPMS Monitoring and Evaluation Framework. In this presentation, we draw on multiple perspectives from the Working Group to track and discuss the process used to develop the framework. We will explain how we 'unwrapped, deconstructed and reconstructed the box'. We will explain how and why our multi-phased approach included: an indicator mapping process across multiple policy documents (2010-2018); preparing a retrospective logic model; identifying contemporary Indigenous evaluation principles; seeking expert advice on qualitative and quantitative measures; and prioritising indicators based on availability, utility or pre-existing reporting processes. In doing so, we will unpack the problems and possibilities encountered by the Working Group.

Chairs
Carolyn Hooper

Evaluation + Research Senior Consultant, Allen + Clarke
I'm a social scientist, with a preference for qualitative methods. I've been working in evaluation for seven years and I love the variety of opportunities I get to make a difference. My doctorate in Public Health means I often work on health-related evaluations, examining interventions...

Presenters
James Smith

Father Frank Flynn Fellow and Professor of Harm Minimisation, Menzies School of Health Research
James is the Father Frank Flynn Fellow and Professor of Harm Minimisation at Menzies School of Health Research, with much of his work sitting at the health/education nexus. Prior to this role he was a 2017 Equity Fellow with the National Centre for Student Equity in Higher Education...
Jenny Summerville

Performance and Quality Manager, NT PHN
Dr Jenny Summerville is the Performance and Quality Manager at Northern Territory PHN. She has more than 20 years' experience conducting research and evaluation in academic and industry settings. Her work has spanned a variety of industry and sector contexts including health, community...


Tuesday September 17, 2019 2:00pm - 2:30pm AEST
C2.6

2:30pm AEST

Communities of Practice, mentoring and evaluation advice: using soft power approaches to build capacity
Florent Gomez (NSW Department of Finance, Services and Innovation)

In the same way that some countries use culture as a soft power approach to extend their influence, evaluation should give serious consideration to soft capacity building tools such as Communities of Practice. This approach can be incredibly effective in diffusing evaluative thinking across organisations that are less familiar with it.

A New South Wales government department which is not a traditional stronghold for evaluation – as compared to human services departments such as education, health or community services – has run a successful Evaluation Community of Practice since November 2017. The Community of Practice brings together staff with varying levels of evaluation maturity to ‘share the love for evaluation’. The intent is to offer a more informal and less intimidating forum for participants to share challenges and learning than a traditional expert-to-learner approach. Over 50 people gather at each quarterly event, where presenters provide case studies, panel discussions and practical exercises such as collectively developing a program logic or crafting good survey questions.

After a year and a half, participants reported an increased understanding of what evaluation is about and of key tools such as program logic, as well as applying those learnings back in their workplaces. The Community of Practice has opened up the conversation on evaluation across the organisation. While a slow and diffuse process, there is now a growing interest in evidence-based approaches, outcome framing and evaluative thinking.

Other soft power approaches used include staff mentoring and evaluation advice. These have proved to be particularly powerful in improving the quality of evaluations – and are not necessarily much more resource intensive than formal training. Provided at the initial stage, targeted evaluation advice contributes to getting the evaluation framing right, which generates a better evaluation brief. This, in turn, results in better evaluation outcomes, where the evaluation produces evidence around what the organisation is interested to learn about.

Chairs
Carolyn Hooper

Evaluation + Research Senior Consultant, Allen + Clarke
I'm a social scientist, with a preference for qualitative methods. I've been working in evaluation for seven years and I love the variety of opportunities I get to make a difference. My doctorate in Public Health means I often work on health-related evaluations, examining interventions...

Presenters

Florent Gomez

Manager, Planning and Evaluation, NSW Department of Customer Service
Michelle Bowron

NSW Department of Customer Service
Currently working in the NSW Department of Customer Service and responsible for delivering evaluation activities and increasing evaluation awareness. I have an interest in learning more about evaluation approaches and the value it adds to existing and future business initiatives...


Tuesday September 17, 2019 2:30pm - 3:00pm AEST
C2.6
 
Wednesday, September 18
 

10:30am AEST

Front-end loading: The value of formative evaluation in setting program focus: a case study of the Australian Volunteers Program
Keren Winterford (University of Technology Sydney), Anna Gero (Institute for Sustainable Futures, University of Technology Sydney), Jake Phelan (Australian Volunteers Program)

This paper explores the practice of a formative evaluation for the Australian Volunteers Program and sets out why formative evaluation is valuable in setting program focus and defining approaches to impact evaluation. Reflections from the independent evaluators and the Monitoring, Evaluation and Learning team of the Australian Volunteers Program are provided within this presentation, drawing together multi-stakeholder and practitioner perspectives on the theory and practice of formative evaluation.

The overall objective of the formative evaluation presented in this paper was to map the global footprint of the Australian Volunteers Program in three impact areas in order to (i) establish a baseline; (ii) inform strategic options for strengthening engagement in the impact areas; and (iii) propose a methodology for demonstrating outcomes in the impact areas. The three impact areas of Inclusive economic growth; Human Rights; and Climate Change/Disaster Resilience/Food Security are informed by the Australian Government Volunteers Program Global Program Strategy. Rather than setting out evaluation findings, the paper explores the practice of collaborative evaluation design and the use of mixed methods, including key informant interviews, document review, and quantitative analysis, to prepare working definitions of the impact areas. We explore the practice of drawing on local (country contexts) and global measures (Sustainable Development Goals) to define impact areas and how we have made sense of these to apply to the Australian Volunteers Program.

The paper distinguishes the theory and practice of formative evaluation and sets out the unique contribution it offers to policy and programming agendas. It discusses the value of evaluation across multiple points in the project cycle and the value of linking formative and summative evaluations, as highlighted within this case. Informed by this case study, the presenters offer tips and tricks for those commissioning and conducting evaluations to ensure formative evaluations make the best possible contribution to policy and programming agendas.


Chairs
Kiri Parata

President, Australian Evaluation Society
Kia ora I'm Kiri, living on Kabi Kabi Country on the Sunshine Coast of Queensland. I share my time between here and my home in Aotearoa New Zealand where my iwi (tribal) affiliations are Te Ātiawa, Ngāti Toarangatira, Ngāti Raukawa, Ngāti Ruanui, and Ngāi Tahu. I'm completing...

Presenters
Keren Winterford

Research Director, Institute for Sustainable Futures, University of Technology Sydney
Dr Winterford has 20 years of experience working in the international development sector, in multiple capacities with Managing Contractors, NGOs, as a private consultant, and more recently in development research. She currently provides research and consultancy services for numerous...
Farooq Dar

Monitoring, Evaluation and Learning Advisor, Australian Volunteers International
Farooq has accumulated 15+ years of experience as an international development practitioner designing and managing complex multi-sectoral humanitarian and development programs/projects, working on governance, compliance and policy issues across various countries around Asia including...
Anna Gero

Research Principal, University of Technology Sydney
Anna Gero is a climate change and disaster risk leader and specialist with over 10 years' experience in the Asia-Pacific region. She is an experienced project manager, and has led climate change and disaster resilience consultancies, evaluations and research projects since 2008 across...


Wednesday September 18, 2019 10:30am - 11:00am AEST
C2.6

11:05am AEST

Surprise! No one read your organisation's annual corporate performance report. Now what?
Brooke Edwards (NSW Government)

Recent experience of a trend towards annual corporate performance reports leads me to question why alternative and more compelling performance reporting formats are being overlooked. What’s beyond the box? Or, what’s beyond the dusty corporate reports archive box? Isn’t it time we embraced new methods of sharing and showcasing our performance data?

With the benefit of hindsight, I discuss the downside risks of pursuing a corporate performance report as the cornerstone of your M&E reporting and communication strategy, consider what we actually want to achieve through M&E performance reporting, and present some alternative communication formats to get us really thinking outside the box!

Chairs
Kiri Parata

President, Australian Evaluation Society
Kia ora I'm Kiri, living on Kabi Kabi Country on the Sunshine Coast of Queensland. I share my time between here and my home in Aotearoa New Zealand where my iwi (tribal) affiliations are Te Ātiawa, Ngāti Toarangatira, Ngāti Raukawa, Ngāti Ruanui, and Ngāi Tahu. I'm completing...

Presenters
Brooke Edwards

Evaluation Analyst, NSW Government
New(ish) to the field of evaluation! Started off in monitoring and reporting within a scientific research organisation and now working with the NSW Government completing process evaluations of grant management programs.


Wednesday September 18, 2019 11:05am - 11:10am AEST
C2.6

11:10am AEST

He Whetū Arataki (Guiding Star) youth leadership programme evaluation
Gill Potaka-Osborne (Whakauae Research Services), Teresa Taylor (Whakauae Research Services)

In 2018, Te Rūnanga o Ngāti Hauiti (tribal council) commissioned their research unit to complete an evaluation of their youth leadership programme, which had been running for nine years without change. The programme's purpose, 'to develop youth as leaders' (succession planning), was facilitated by tribal experts and elders who endeavoured to fuse past and present in a way that resonated with youth. The evaluation invited tribal members to reflect on and consider what had worked well, the challenges, and how best to move forward. This evaluation models how indigenous communities can commission and conduct independent evaluations to meet tribal aspirations.


Chairs
Kiri Parata

President, Australian Evaluation Society
Kia ora I'm Kiri, living on Kabi Kabi Country on the Sunshine Coast of Queensland. I share my time between here and my home in Aotearoa New Zealand where my iwi (tribal) affiliations are Te Ātiawa, Ngāti Toarangatira, Ngāti Raukawa, Ngāti Ruanui, and Ngāi Tahu. I'm completing...

Presenters
Gill Potaka Osborne

Researcher, Whakauae Research Services
Ko Aotea te waka. Ko Ruapehu te maunga. Ko Whanganui te awa. Ko Ātihaunui-ā-Pāpārangi, ko Raukawa ki te Tonga ngā iwi. Ko Ngāti Tuera, Ngāti Pamoana, Ngāti Pareraukawa ngā hapū. Ko Pungarehu, ko Parikino, ko Koriniti, ko Ngātokowaru Marae ngā marae. E rere kau mai te awa nui mai i...

Teresa Taylor

Kaimahi, T & T Consulting Limited
Indigenous evaluation practice.


Wednesday September 18, 2019 11:10am - 11:15am AEST
C2.6

11:15am AEST

What's beyond the box: Learning from 'tribal' communities and encouraging community ownership of evaluation - a collaborative approach, building on translational research, using an implementation science evaluation framework
Robert Simpson (Mackay Institute of Research and Innovation (MIRI), Mackay Hospital and Health), Dr Bridget Abell (Australian Centre for Health Services Innovation)

An entertaining and interactive presentation exploring a community-based program evaluation that combats the rising population health issues of obesity and diabetes across overweight and obese regional communities: Mackay, Isaac and the Whitsundays.

Evaluation can be part of inspiring communities to healthier life changes and combatting major social epidemics. This presentation discusses evaluation of a collaborative "tribal" approach to behavioural change and how implementation science frameworks can highlight facilitators and barriers to program sustainability and impact from various stakeholder viewpoints. Key features are innovative translational research, community partnerships/ownership of outcomes and evaluation of a tribal innovation from beyond traditional perspectives.


Chairs
Kiri Parata

President, Australian Evaluation Society
Kia ora I'm Kiri, living on Kabi Kabi Country on the Sunshine Coast of Queensland. I share my time between here and my home in Aotearoa New Zealand where my iwi (tribal) affiliations are Te Ātiawa, Ngāti Toarangatira, Ngāti Raukawa, Ngāti Ruanui, and Ngāi Tahu. I'm completing...

Presenters
Robert Simpson

Project Manager, Mackay Hospital and Health
Robert Simpson BA MBA. Rob is a born and bred Queenslander specialising in enabling alignment of priorities to overcome challenging problems, whether these relate to community, organisational, cultural, or individual behavioural change. Experienced evaluator, previously responsible...


Wednesday September 18, 2019 11:15am - 11:20am AEST
C2.6

11:20am AEST

Design tips for visualising your data
David Wakelin (ARTD Consultants)

Every day we create, analyse and visualise a lot of data, and we need to share our findings effectively so they can be turned into actions. Small changes when visualising your data can make a big difference to whether your audience can understand and use your findings. I will share simple design tips to instil clarity in the visualisations you design, helping your audience to see what you see, know what you know, understand your message and turn evidence into action.

Chairs
Kiri Parata

President, Australian Evaluation Society
Kia ora I'm Kiri, living on Kabi Kabi Country on the Sunshine Coast of Queensland. I share my time between here and my home in Aotearoa New Zealand where my iwi (tribal) affiliations are Te Ātiawa, Ngāti Toarangatira, Ngāti Raukawa, Ngāti Ruanui, and Ngāi Tahu. I'm completing...

Presenters
David Wakelin

Senior Consultant, ARTD Consultants
I am a keen data analyst with a passion for data visualisation. I've been working on a wide range of projects lately and have seen immense value in being able to tell stories with the data I am working with.


Wednesday September 18, 2019 11:20am - 11:25am AEST
C2.6

11:35am AEST

The whole box and dice: economic evaluation trends and forecasts
Mark Galvin (EY)

Recent government moves towards outcomes budgeting are the latest illustration that outcomes thinking is here to stay. Outcomes evaluation coupled with economic evaluation is increasing, and the two are increasingly interdependent, especially in the social policy and services space. With such anticipation, the risk of an empty box looms large. Demonstrating and valuing outcomes requires intentional and fit-for-purpose measurement approaches. Sharing approaches is critical to further innovation and support for robust public decision making.

This Ignite presentation will showcase changes in the policy landscape, as well as visual depictions of evaluation methodologies that situate 'traditional' social outcomes as benefits and show how significant economic value is derived through effective service delivery and cost avoidance.


Chairs
Kiri Parata

President, Australian Evaluation Society
Kia ora I'm Kiri, living on Kabi Kabi Country on the Sunshine Coast of Queensland. I share my time between here and my home in Aotearoa New Zealand where my iwi (tribal) affiliations are Te Ātiawa, Ngāti Toarangatira, Ngāti Raukawa, Ngāti Ruanui, and Ngāi Tahu. I'm completing...

Presenters
Mark Galvin

Partner, EY
Finding solutions to complex problems. Rugby tragic. Beach holiday enthusiast. Dad to two awesome kids and Ollie the Irish terrier. I have over 20 years' experience as a professional advisory consultant and have spent much of my career advising state and federal governments, Ministerial...
Alain Nader

Senior Manager, EY
Over the past ten years I have delivered strategic advice and implementation support to a number of government agencies, both State and Federal. Areas of particular interest include examining the roles and responsibilities of government, improving citizen outcomes and the allocative...


Wednesday September 18, 2019 11:35am - 11:40am AEST
C2.6

11:40am AEST

Using e-diaries to collect evaluation data
Carolyn Hooper (Allen and Clarke Policy and Regulatory Specialists)

During an intervention evaluation, front-line service delivery staff made periodic diary entries using an online portal. Diarists responded to prompts specific to the evaluation questions. The output provided valuable insights into the day-to-day realities of those delivering the intervention, giving front-line staff a strong voice in the evaluation report. The e-diary is an accessible, innovative method for collecting data, suited to situations where a detailed view of the work at the intervention delivery interface is valuable but direct observation by an evaluator is problematic. Come and see how we did it.

Chairs
Kiri Parata

President, Australian Evaluation Society

Presenters
Carolyn Hooper

Evaluation + Research Senior Consultant, Allen + Clarke
I'm a social scientist with a preference for qualitative methods. I've been working in evaluation for seven years and I love the variety of opportunities I get to make a difference. My doctorate in Public Health means I often work on health-related evaluations, examining interventions... Read More →


Wednesday September 18, 2019 11:40am - 11:45am AEST
C2.6

11:45am AEST

Lessons from the Dark Side: How Corporates do Client Experience
Emily Verstege (ARTD Consultants)

I've been in a corporate wilderness for the last four years, working with for-profit organisations to gather evidence to understand their clients better. I quickly realised that corporations know things about their clients that we, as governments and non-profits, don't. This Ignite presentation unboxes client experience for evaluators, with anecdotes from the "dark side".

Chairs
Kiri Parata

President, Australian Evaluation Society

Presenters
Emily Verstege

Senior Manager, ARTD Consultants


Wednesday September 18, 2019 11:45am - 11:50am AEST
C2.6

1:35pm AEST

Personality preferences - Implications for influencing evaluation design and utilisation
Eve Barboza (Wholistic Learning Pty Ltd)

Can the personality preferences of the evaluator influence the design and utilisation of evaluation? Can differences in personality preferences between the evaluator and the client or audience of the evaluation explain some of the controversies in evaluation practice? This session explores how personality preferences could be drawn on to inform the design of evaluation and to influence the implementation and utilisation of evaluation findings. Drawing on the presenter's positive and negative experiences, we will explore personality preferences as a framework to inform and support your work to improve the design and utilisation of your evaluation projects.

Chairs
Duncan Rintoul

Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, still have heaps to learn. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health and you name it. Looking forward to catching up with... Read More →

Presenters
Eve Barboza

Director / Facilitator, Wholistic Learning
BA Hons, MAPS (Member, Australian Psychological Society), CAHRI (Certified Professional, Australian Human Resources Institute), and an AES member since 1990. During a career spanning 25 years, Eve has developed the knowledge and experience necessary to work as an evaluator, researcher... Read More →


Wednesday September 18, 2019 1:35pm - 1:40pm AEST
C2.6

1:40pm AEST

A live unboxing: The evaluation capacity building role
Liam Downing (Centre for Education Statistics and Evaluation)

In a session designed especially for those who LOVE watching those unboxing videos on YouTube, I will unbox, set up, and use a brand-new evaluation capacity building role live on the AES 2019 stage. I will show you what's inside, how it works and what it can do. You can see if it's the right choice for you to build skills and grow the profession through capacity building. This Ignite presentation will also use props. PROPS!

Chairs
Duncan Rintoul

Director, Rooftop Social

Presenters
Liam Downing

Manager, Evaluation and Data, Quality Teaching Practice, NSW Department of Education
Liam is an impactful evaluation leader with 17+ years of experience. He is focused on ensuring that evaluation is rigorous in its design, meaningful in informing next steps, and driven by building the capacity of as many people as possible to engage deeply in evaluative... Read More →


Wednesday September 18, 2019 1:40pm - 1:45pm AEST
C2.6

1:45pm AEST

Evolving from academic researcher to evaluator
Natalia Krzyzaniak (NPS MedicineWise)

Contrary to common perception, evaluation and research are two distinct disciplines. Both apply data collection and analysis skills and centre on the shared objective of answering a question; however, they differ in purpose and in how the data collected are disseminated. Entering the evaluation profession from a research background requires a level of adaptation to become an effective and successful evaluator. This presentation will walk the audience through my journey from researcher to emerging evaluator, outline the key similarities and differences between research and evaluation, and describe the upskilling required to become an effective evaluator.

Chairs
Duncan Rintoul

Director, Rooftop Social

Presenters
Natalia Krzyzaniak

NPS MedicineWise
Natalia is a recent PhD graduate in Pharmacy from the University of Technology Sydney. She currently holds a position as a Program Evaluation Officer at NPS MedicineWise, and is involved in the evaluation of educational programs delivered to health professionals and consumers... Read More →


Wednesday September 18, 2019 1:45pm - 1:50pm AEST
C2.6

1:50pm AEST

Getting past the imposter syndrome: you don't have to be an expert to help build evaluation capacity in your organisation.
Margaret Moon (SafeWork NSW)

If you're new to evaluation, you might feel like an imposter at least some of the time. You get appointed to a new role with "evaluation" in the title and suddenly you're expected to be an expert!

This can be daunting.

But many of the skills and qualities that evaluators need are transferable. For example, a good evaluator needs the right mindset and a positive attitude, good critical thinking skills, and a penchant for asking lots of questions. These are excellent foundational skills.

This presentation will help emerging evaluators identify their strengths and feel more confident in building evaluation capacity.


Chairs
Duncan Rintoul

Director, Rooftop Social

Presenters
Margaret Moon

Senior Project Officer, SafeWork NSW
I manage the evaluation program at SafeWork NSW. This involves commissioning evaluations of programs designed to improve safety in NSW and building evaluation capacity across the organisation. I have previously worked as a film editor at the Australian Broadcasting Corporation, as... Read More →


Wednesday September 18, 2019 1:50pm - 1:55pm AEST
C2.6

2:00pm AEST

The dance of evaluation: Engaging stakeholders to develop an evaluation framework across a highly diverse training organisation
Racheal Norris (GP Synergy), Linda Klein (GP Synergy)

This presentation will outline the processes and challenges involved in developing an efficient evaluation framework, using a state-wide vocational training organisation as a case study. GP Synergy delivers an accredited General Practice training program across eight highly diverse subregions of NSW and the ACT for doctors wishing to specialise as General Practitioners. A small Evaluation Team was established in 2017 to develop a rigorous, adaptive evaluation system to monitor and report on the delivery of educational activities.

Using evidence-based methodology, the team adopted a participatory approach and engaged stakeholders across three key levels:

Education Executive
An interactive program logic workshop was held to discuss and identify various evaluation priorities at the senior-level.

Medical Educators
The team worked closely with individual educators to design evaluation tools that were standardised yet responsive to the unique needs of each region. This involved careful consideration of psychometric properties to ensure robust and reliable measures of key outcomes. A semi-automated reporting system was created to deliver timely feedback efficiently, and the team guided educators in correctly interpreting and using this information for continuous improvement (a sketch of this style of reporting follows the abstract).

GP Registrars
The team consulted with registrars (trainees) to explore and develop pathways to "close the loop" and communicate evaluation findings and implications for the training program. This also involved educating registrars about the broader theoretical framework behind evaluation and how to provide useful, constructive feedback.

Evaluation at GP Synergy remains an evolving process, with ongoing multi-level engagement ensuring evaluation systems continue to be responsive and adaptable to stakeholder needs. The Evaluation Team's role in educating stakeholders and colleagues about the 'steps' of evaluation has been fundamental to successful data collection and to reflection on findings that leads to change. Insights will be offered to others developing evaluation frameworks and methods in settings where flexibility and responsiveness are key.
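As a rough illustration of the semi-automated reporting mentioned above, here is a minimal, hypothetical sketch that aggregates standardised workshop ratings by subregion; the field names, region labels, and values are invented and are not GP Synergy's actual system.

```python
# Hypothetical sketch of semi-automated feedback reporting: group
# standardised workshop ratings by subregion and print a short summary
# that could be returned to each region's educators.
from collections import defaultdict
from statistics import mean

responses = [
    {"region": "Region A", "rating": 4},
    {"region": "Region A", "rating": 5},
    {"region": "Region B", "rating": 3},
    {"region": "Region B", "rating": 4},
]

ratings_by_region = defaultdict(list)
for response in responses:
    ratings_by_region[response["region"]].append(response["rating"])

for region, ratings in sorted(ratings_by_region.items()):
    print(f"{region}: n={len(ratings)}, mean rating={mean(ratings):.1f}")
```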


Chairs
Duncan Rintoul

Director, Rooftop Social

Presenters
Racheal Norris

Evaluations Officer, GP Synergy
Racheal is an Evaluation Officer within the NSW & ACT Research and Evaluation Unit of GP Synergy. She is involved in the collection and reporting of feedback from GP registrar and supervisor development workshops, and also contributes to the ongoing development of a broader... Read More →

Linda Klein

Deputy Director Research & Evaluation, GP Synergy
Linda Klein, BSc MSc PhD. I have an academic background in psychology and public health, with over 30 years of practical experience in evaluation in sectors spanning health, education/training and business. At GP Synergy, I take primary responsibility for the evaluation of educational... Read More →


Wednesday September 18, 2019 2:00pm - 2:30pm AEST
C2.6

2:30pm AEST

Operationalising systems-thinking approaches to evaluating health system innovations: The example of HealthPathways Sydney
Carmen Huckel Schneider (University of Sydney), Sarah Norris (University of Sydney), Sally Wortley (University of Sydney), Angus Ritchie (University of Sydney), Fiona Blyth (University of Sydney), Adam Elshaug (University of Sydney), Andrew Wilson (University of Sydney)

There have been increasing calls to take a systems-thinking approach to evaluating health policies and programs - acknowledging the complexity of health systems and the many actors, institutions, relationships, drivers and values that affect health system change. Several key frameworks that support systems-thinking have emerged, including the WHO's Framework for Action; NASSS (Non-Adoption, Abandonment, and Challenges to Scale-Up, Spread and Sustainability); and the Vortex Model. However, little has been written on how to operationalise systems framework elements into practical evaluation studies comprising methodologically rigorous data collection and analysis methods - all while staying true to the principles of systems-thinking.

In this presentation we seek to unbox the challenge of operationalising a systems-thinking approach to evaluating healthcare delivery innovations. We use the NASSS framework as our example to demonstrate how to expand systems-thinking frameworks, progress towards theories, and pose systems-thinking-driven yet researchable questions. This requires crossing epistemological boundaries and taking a 'multiple studies' approach that adopts various methods of inquiry. We report on applying these principles to evaluate HealthPathways Sydney, a website that helps GPs navigate care pathways for their patients through primary and specialist care. We followed a two-phase approach, beginning with a series of sub-studies using standard qualitative and quantitative methods; we then reflected on the conduct of these studies to pinpoint the system-level factors (macro contexts, institutional settings, critical events, agents and relationships) that had to be understood in order to determine how the innovation interacted with the system. Our second phase adopted systems-thinking study methods, including geo-spatial mapping, social network analysis, process tracing, frames analysis and situational analysis. Results were then synthesised into a rich case study of the introduction of an innovation into the system. We uncovered progress towards desired outcomes, but also barriers to consolidating and embedding the technology when other system factors were in play.
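As a rough illustration of one phase-two method, here is a minimal social network analysis sketch using the networkx library; the actors and ties are invented for illustration and are not data from the HealthPathways Sydney evaluation.

```python
# Minimal social network analysis sketch with networkx. The graph below is
# a made-up set of relationships between actors in a health system.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("GP practice A", "HealthPathways team"),
    ("GP practice B", "HealthPathways team"),
    ("GP practice A", "Specialist clinic X"),
    ("Specialist clinic X", "HealthPathways team"),
])

# Degree centrality flags which actors broker the most relationships in the
# system - a simple entry point to the relational factors discussed above.
centrality = nx.degree_centrality(G)
for actor, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{actor}: {score:.2f}")
```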


Chairs
Duncan Rintoul

Director, Rooftop Social

Presenters
Carmen Huckel Schneider

Senior Lecturer, Health Policy, University of Sydney
I am Deputy Director of the Menzies Centre for Health Policy and Program Director of the Master of Health Policy at the University of Sydney. I co-lead the Health Governance and Financing group and the Applied Policy Analysis group at the Menzies Centre for Health Policy, a Senior... Read More →

Sarah Norris

Senior Research Fellow, Menzies Centre for Health Policy
How broader approaches to evaluation can be applied to health technology evaluation, and vice versa.


Wednesday September 18, 2019 2:30pm - 3:00pm AEST
C2.6
 