Outcomes

This section lays out the stated goals of The VPS Academy, and makes some recommendations for creating a framework to evaluate the initiative.

Theory of Change

The main aim of the VPS Academy is to rigorously investigate how to create an efficient model for scaling peer-learning in the Victorian Government.

A basic theory of change for the Academy looks like this:

[Figure: Theory of Change Model]

The key goals for the Academy, agreed by the Academy governance team, are:

  • Build Social Capital
      • Increased connections for participants
      • Increased trust
  • Increase Organisational Effectiveness
      • Participant activities
      • Changes in professional behaviour
      • Changes in quality of work

Evaluating Outcomes

The VPS Academy is an innovation initiative, and as such it should be evaluated using a framework developed specifically for innovation contexts.

When evaluating innovation work, it is important to recognise the emergent nature of the initiative, and evaluate across a range of outcomes.

Evaluation Methodologies

Several evaluation frameworks and methodologies are considered best practice for emergent or experimental innovation initiatives, including:

  1. Developmental Evaluation
  2. Integrated Reporting
  3. Innovation Accounting
  4. Lean Data & Social Impact Analysis

Thinking specifically about The VPS Academy and its stated goals, we recommend paying attention to the following:

Breadth of impact:

  • Participants - how many people have signed up as producers, and how many as peers? How many people become both producers and peers?
  • Diffusion - what is the average number of work connections participants have outside the Academy but within the VPS? How influential are these people? Are they able to influence the practice of their team or department? (A minimal counting sketch follows this list.)
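
The participation and diffusion questions above lend themselves to simple counting once sign-up data is available. The sketch below is a minimal illustration in Python; the record fields ("roles", "external_connections") and the sample values are hypothetical placeholders, not the Academy's actual data model.

    from statistics import mean

    # Hypothetical export of participant records from the Academy sign-up data.
    participants = [
        {"name": "A", "roles": {"producer"}, "external_connections": 12},
        {"name": "B", "roles": {"peer"}, "external_connections": 5},
        {"name": "C", "roles": {"producer", "peer"}, "external_connections": 20},
    ]

    # Participation counts: producers, peers, and people who are both.
    producers = [p for p in participants if "producer" in p["roles"]]
    peers = [p for p in participants if "peer" in p["roles"]]
    both = [p for p in participants if {"producer", "peer"} <= p["roles"]]
    print(f"Producers: {len(producers)}, Peers: {len(peers)}, Both: {len(both)}")

    # Diffusion proxy: average number of work connections outside the Academy
    # but within the VPS, per participant.
    avg_connections = mean(p["external_connections"] for p in participants)
    print(f"Average external work connections: {avg_connections:.1f}")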

Depth of impact:

  • Engagement (participants) - how many interactions are happening between participants? Are participants meeting up in person? If so, what are they doing in that time?
  • Engagement (website) - how many visits are there to the Academy website? How long do people spend there, and how many pages do they visit?
  • Attitudes - how much trust is there in the network, and is it growing over time? How likely are participants to share best practice with people outside their own team, to collaborate on a problem if asked by another participant, or to contact someone in the network when they face a challenge? (These questions can be turned into survey items; see the sketch after this list.)
  • Behaviour Change - what is changing in participants' working lives because of this initiative? What is happening to their teams and departments? How has their attitude towards the VPS changed as a result?
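
The attitude questions above can be expressed as a small set of Likert items and tracked across evaluation cycles. The sketch below assumes a 1-5 response scale and shows how per-question and overall averages could be compared between a baseline and a later cycle; the question keys and response values are hypothetical.

    from statistics import mean

    # Hypothetical Likert items (1 = very unlikely, 5 = very likely).
    QUESTIONS = [
        "share_best_practice_outside_own_team",
        "collaborate_on_problem_if_asked",
        "contact_network_when_facing_challenge",
    ]

    # cycle -> question -> list of 1-5 responses (placeholder values).
    responses = {
        "baseline": {q: [2, 3, 3, 4] for q in QUESTIONS},
        "cycle_1": {q: [3, 4, 4, 5] for q in QUESTIONS},
    }

    for cycle, answers in responses.items():
        scores = {q: mean(answers[q]) for q in QUESTIONS}
        overall = mean(scores.values())
        detail = ", ".join(f"{q}={s:.2f}" for q, s in scores.items())
        print(f"{cycle}: overall attitude score {overall:.2f} ({detail})")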

Evaluation Methods

To evaluate the Academy, we recommend using both qualitative and quantitative methods that establish a baseline, gather data during each cycle, and involve Academy participants in synthesising and prioritising the insights and outcomes.
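
As a purely illustrative example of the baseline-and-cycles approach, the sketch below compares a few quantitative indicators from each cycle against the baseline. The metric names and values are hypothetical placeholders; the real indicators would come from the measures listed under breadth and depth of impact, and qualitative data would be synthesised alongside them with participants.

    # Hypothetical baseline and per-cycle measurements.
    baseline = {"participants": 40, "interactions_per_month": 15, "site_visits": 300}
    cycles = {
        "cycle_1": {"participants": 55, "interactions_per_month": 22, "site_visits": 410},
        "cycle_2": {"participants": 70, "interactions_per_month": 31, "site_visits": 520},
    }

    # Percentage change of each indicator relative to the baseline.
    for cycle, metrics in cycles.items():
        for metric, value in metrics.items():
            change = (value - baseline[metric]) / baseline[metric] * 100
            print(f"{cycle} {metric}: {value} ({change:+.0f}% vs baseline)")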

A suggested starting point for evaluation methods, for people who do not have an evaluation background: http://www.betterevaluation.org/