Monday, September 16, 2019 • 4:00pm - 4:30pm AEST
How can implementation quality be evaluated? An example from a pilot initiative in Victorian child and family services.


Jessica Hateley-Browne, Tom Steele, Vanessa Rose, Bianca Albers, Robyn Mildon (Centre for Evidence and Implementation)

Background and aim: High-quality program implementation is a precondition for program effectiveness. However, evaluation of the implementation process is rare, leaving impact evaluations with null effects difficult to interpret (i.e. was the program ineffective, or was it implemented poorly?). We report on an implementation evaluation of the Victorian Government's pilot of five manualised therapeutic programs for vulnerable families (four developed in the USA) across seven service provider agencies; this is the first evaluation of this nature and scope in Australia. The aim was to assess the comprehensiveness, pace and quality of program implementation to inform government decisions about if/how such programs should be funded, implemented, supported and scaled.

Method: A real-world mixed-methods observational study design was used. The Stages of Implementation Completion checklist assessed implementation pace and comprehensiveness. Theory-based structured interviews were conducted with agency staff (N=29) to explore program appropriateness, acceptability and feasibility. Fidelity data were extracted from agency databases.

Results: Most agencies (n=6) were still in early implementation, having not yet achieved sustainability. Highly concentrated and overlapping implementation activity was observed, reflecting funding pressures and putting implementation quality at risk. The programs were generally well accepted, perceived as high-quality and a good fit. While most agency staff 'believed in' the programs, perceived appropriateness was compromised by the lack of adaptability for Aboriginal and Torres Strait Islander communities. Threats to feasibility included high demands on practitioners and a lack of Australian-based implementation support (trainers, consultants). It was too early for valid fidelity assessments.

Conclusions: Policy-makers should afford agencies more time and resources to incorporate initiatives into 'business as usual'. Ongoing monitoring of implementation outcomes is highly recommended to facilitate data-driven decisions about when to commence impact evaluation (i.e. once sustainability is achieved and fidelity has been demonstrated).


Leanne Kelly

Research & Development Coordinator, Windermere Child & Family Services
I am the research and development coordinator at Windermere Child & Family Services in Melbourne. I have recently submitted my PhD on the effectiveness of evaluation in small community development non-profits.


Dr Jessica Hateley-Browne

Senior Advisor, Centre for Evidence and Implementation
Jessica is a Senior Advisor at the Centre for Evidence and Implementation. She has a PhD in health psychology, and has expertise in behavioural science and implementation science. She has authored more than 40 journal articles that span the fields of health services, population health...
