Exploring The Danish
Approach to Evaluating
Public Sector Innovation
Webinar May 23rd 2018
Lene Krogh Jeppesen, Senior Consultant
Mail: lkj@coi.dk Twitter: @lenekrogh
Innovation is a new or
significantly changed way
of improving the workplace's
activities and results.
COI:
Established 2014
Covers all three layers
of government:
• national
• regional
• local
9.4 FTE
14-12-2017
Minister for Public Sector Innovation,
Sophie Løhde
(since November 28, 2016)
Minister for
Public Sector
Innovation
Agency for the
Modernization of the
Public Sector
The National Center for Public
Sector Innovation
Agency for Digitization
Agency for Governmental
Administration
Agency for Governmental IT-
services
COI strategic framework 2017-
Agenda:
1. WHAT is the state of evaluating public
sector innovation in Denmark?
#Facts
2. WHY did COI decide to work in these
two fields?
#DefiningTheProblem
3. WHAT did COI do?
#ShapingTheSolution
4. NOW WHAT?
#NextSteps
1. WHAT is the state of
evaluating public sector
innovation in Denmark?
#Facts
Fact: How many?
Fact: Who evaluates?
External: consultants or researchers, for example, have run or contributed to the evaluation
Internal: evaluation run by the workplace itself
Fact: How?
Fact: Purposes
• Learning to improve (innovation) efforts
• Examining whether the innovation has achieved its goal
• Spreading one's experiences with others
• Documenting the value of the innovation to decision makers
• Better managing the innovation process along the way
• Other
2. WHY did COI decide to
work in these two fields?
#DefiningTheProblem
Defining the problem pt. 1:
Based on the InnovationBarometer
1. (not) knowing the value
2. the needs of the internal evaluators
3. (not) documenting results
Defining the problem pt. 2:
Based on field work
1. A lack of a systematic evaluation approach within public sector innovation
2. A wish for:
  • More evaluation of innovation
  • Tools to evaluate innovation
  • An increase in internal evaluation capacity
  • Network and knowledge sharing on evaluating public sector innovation
3. How do we increase internal evaluation capacity?
  • New models and methods
  • Innovators need evaluation skills
  • Specific guides or tools to evaluate public sector innovation
  • Network and knowledge sharing on evaluating public sector innovation
Two fields crossing paths
(or not crossing paths)
3. WHAT did COI do?
#ShapingTheSolution
Our approach:
1. Exploring the problem: What troubles us when evaluating innovation?
2. Community building: Involving experts and practitioners
3. Developing and testing prototypes
Outcome: Publication
COI's evaluation activities (2015-16):
Innovators getting evaluation skills
• Conference: HELP! I'm evaluating an innovative activity (network and knowledge sharing; models and methods)
• Publication: Guidance on evaluating an innovative activity, by professor Peter Dahler-Larsen (models and methods)
• Working community on evaluating innovation (network and knowledge sharing; specific guides and tools)
• Publication: Hands-on guide with dialogue tools (specific guides and tools)
Working community?
• Who:
  Participants: 30 consultants with knowledge on innovation/evaluation
  Advisory board: 4 experts on theory and practice
• Purpose:
  To enhance evaluation of public sector innovation (more evaluations)
  To make evaluation of innovation easier for the internal evaluators (capacity building)
• Objectives:
  To create networks, knowledge sharing and development of new approaches between actors working on evaluating public innovation
Process: Working Community
Developing the framework:
• Introducing: theory, framework, definitions
• Further developing and qualifying content in a framework for evaluation of innovation
Testing a prototype:
• Introducing the suggested framework
• Testing it on an innovative activity
Testing the prototype:
• Open space: tools, framework, skills, planning, roles
• Evaluating the work
The concept: Four phases when evaluating innovation
Dialogue tool - example
An evaluation artefact
An evaluation artefact
4. NOW WHAT?
#NextSteps
Since 2016
• Distribution, introductory workshops and training
• Article and evaluation conferences: combining spreading and evaluation
• Organizational evaluation capacity: when political leadership asks for evaluation of innovation
Current challenges:
Attention & playmates
Thank you
Find our (limited) products and information in English at coi.dk/about-coi
Our tool kit is available for download (in Danish) at coi.dk/evaluering
Lene Krogh Jeppesen: lkj@coi.dk // @lenekrogh
Editor's Notes

  1. I'm Lene Krogh Jeppesen, a senior consultant at the Danish National Centre for Public Sector Innovation. In this talk I will introduce you to our take on public sector innovation, the challenge of evaluating public sector innovation, the approaches COI put together to evaluate Danish innovations, and their results so far.
  2. First, a few words on what COI is and how we understand public sector innovation. We span all three layers of the public sector and have been around since 2014. It is, as far as we can tell, a rather unique structure in that there is common ownership and financing by the government, by the association of local municipalities and by the parallel association for the regional-level organizations. There are nine of us and a part-time student. Most of us have a background in doing public sector innovation: we know what it's like getting our hands dirty and trying to change the public sector from within. To us, innovation is a new or significantly changed way of improving the workplace's activities and results. In other words: DOING, not just talking and getting ideas.
  3. Some of you might have heard that we have what we think is the world's first minister for public sector innovation: Sophie Løhde. Here she is together with our director. She is one of two ministers within the Ministry of Finance, where COI is now organizationally housed, but with a high degree of independence thanks to the aforementioned common ownership and financing by the government, by the association of local municipalities and by the parallel association for the regional-level organizations. Within the Ministry of Finance, the Minister for Public Sector Innovation is responsible for the major agencies that constitute the machine room of the Danish public sector: the four agencies for Modernization of the Public Sector, Digitization, Governmental Administration and Governmental IT-services, plus us at the COI. COI's obligation is to bring more quality and efficiency to and across the public sector by strengthening the valuable outcomes of innovation.
  4. Until the end of 2016, COI prioritized building a strong foundation for the continuous work with public sector innovation in Denmark. Our InnovationBarometer, the world's first statistic on public sector innovation, has shown that 80% of Danish public workplaces are innovative and that three out of four innovations are inspired by or even copied from other workplaces, thus killing myths about the sorry state of public sector innovation. We've established an innovation internship that builds relations and connections across the public sector, enabling the spreading of innovation. Our two networks for innovation leaders and innovators have focused on establishing deep relationships so that the hardships of being innovative within the public sector can be shared openly and solutions can be found. We've developed dialogue tools and kits to enable innovators to evaluate and spread innovation. Since 2017 we have worked on using these building blocks in new combinations and on new territories in order to achieve changes in the system and in behaviors, and to continuously ensure that (new) knowledge is used. We step into the roles of dialogue and leverage partner, capacity builder and knowledge partner. All of this also means that, as we are 9 people supporting 800,000 public sector employees, our focus is very much on enabling the actors in the public sector who make innovation happen on a day-to-day basis, be it the consultant, project manager, middle management, top-level managers or politicians. The work we did on evaluating public sector innovation is one example of this. We knew there were still challenges to doing this, and we worked with the innovators and evaluators to find a path forward: a structure that can help innovators.
  5. So, now for today's actual agenda: evaluating public sector innovation. I've structured the talk in four sections: WHAT is the state of evaluating public sector innovation in Denmark? #Facts. WHY did COI decide to work in these two fields (the two fields being innovation and evaluation)? #DefiningTheProblem. WHAT did COI do? #ShapingTheSolution. NOW WHAT? #NextSteps.
  6. I'll start with some facts from our InnovationBarometer, which is the world's first statistic on public sector innovation in accordance with the Oslo Manual. We've collected data twice, regarding 2013/14 and 2015/16. The following numbers are from the first data collection; we haven't fully crunched the numbers from the second collection.
  7. From the statistic we know that 44% of the public sector innovations have been evaluated.
  8. We know that a vast majority of those are internal evaluations.
  9. We know that nearly a third of the evaluations are based on the workplaces' own professional assessments, whereas only 11% are based on studies among the citizens and/or companies that the innovations are meant to serve.
  10. We know that learning is very high on the purpose agenda, whereas documenting the value to decision makers and using the evaluation to better manage the innovation process along the way only happen in about one in five of the evaluations.
  11. So, knowing these facts, how does COI see the problems with public sector innovation?
  12. Our definition of innovation is that it is an actual improvement, but how do we know if it actually is an improvement? How do we know if public sector innovations show any value and are an improvement if only 44% are evaluated? Seeing as so many of the evaluations are done internally: how can we establish a basis for better internal evaluation practices? And especially considering how relatively mature the innovation approach is in the Danish public sector: how can we continue to argue that innovation is a proper approach in the public sector when we have so little focus on documenting results? We chose to supplement these defining questions with field work to gather even more knowledge.
  13. The field work, carried out among some of the places in the public sector where we knew they actively worked with evaluation of innovation, showed us that the numbers were right: there is a lack of a systematic evaluation approach within public sector innovation. We also saw a wish for more evaluation of innovation, tools to evaluate innovation, an increase in internal evaluation capacity, and network and knowledge sharing on public sector innovation. The evaluators pointed to several ways to increase internal evaluation capacity, with several ideas for actions and solutions for building capacity: new models and methods, evaluation skills for innovators, specific guides or tools to evaluate public sector innovation, and network and knowledge sharing on public sector innovation.
  14. One of our realizations was that it really was a matter of two different mindsets and paradigms having trouble crossing paths. Whereas the evaluator works retrospectively, the innovator works "futurespectively". Whereas the evaluator works in a linear manner, the innovator works in a circular manner. Whereas the evaluator is dedicated to measuring, the innovator is dedicated to the process. In the intersection they unite around a systematic approach to their respective fields of work. The intersection is not crowded, and we want more people to play with in this intersection.
  15. So what did we actually do to find answers to this complex of problems?
  16. Having explored the problem we started community building, then developed and tested prototypes in order to publish tools and guides to help innovators evaluate more and better.
  17. Aside from showing you specifically how we worked, I wanted to give you an idea of the timeline: all of this took place in under a year. The conference was held in late November 2015, and the hands-on guide went to print over the summer of 2016. I want to show you that these four main activities combined were the first part of a solution to the problems with evaluating innovation. We launched the intensive work towards a solution with a conference, which initiated the community building. We had talks on evaluation of innovation in theory and practice and worked on an evaluation case. We published a 50-page booklet written by an esteemed Danish professor of evaluation; it gives an overview of the field of evaluation, the various theoretic approaches and some overall advice on evaluating innovation. We initiated a working community to help us co-design exactly how we could help innovators evaluate. The output from the working community was published as a hands-on guide with dialogue tools. And now I'll dig a little deeper into the working community, where we co-designed the hands-on guide with experts and practitioners.
  18. So what is a working community? To us it was a group of participants who helped us co-design what they actually needed to help the field of public sector innovation do more and better evaluations. An advisory board of researchers and practitioners kept us on track and was useful for discussing the general direction and implementation of the products. During the three sessions when the community met, we switched between different types of work: presentations on evaluation of innovation in theory and practice (so the participants felt they could take something away from the meetings as well as give us something), discussions, and hands-on work with an evaluation case and joint exercises.
  19. This was the overall flow of the three meetings in the co-design process. The entire process took place over five weeks: there was a week's gap between the first two meetings and four weeks between the last two.
  20. These were just half-day meetings, so the key in planning was to be structured: structured in the testing during the meetings, and in making sure we got all the insights from the practitioners and used them to develop relevant tools. We continued testing and adjusting the prototype after the last meeting.
  21. The result was a model consisting of four phases when evaluating innovation. I'll briefly give you a bit of insight into why we chose these phases. CLARIFY: Why evaluate? Because anchoring an innovative project in the organization is a challenge. The dialogue should start early and be returned to continually; we should connect innovation and evaluation at the beginning so they do not run off in different directions. PLAN: How to evaluate? Because evaluation is not something that "just" happens (neither is innovation, for that matter), and because the evaluation must be structured and systematic in order to allow a valid assessment. DO IT! Create knowledge! Because we should describe and retain the knowledge we create during the innovation and evaluation process. We use this to adjust and steer the way, and to come to a conclusion on our evaluation: does the innovation create value? USE IT! Use the knowledge. We need the knowledge from the evaluation of the innovation for further work. How can this knowledge be diffused, used for further learning and reported to relevant users? I wouldn't have guessed in advance that the clarify phase would be so important to practitioners, but this is really where there are still plenty of problems: if you don't have a clear idea in your organization of what you're working on, you will never succeed in doing a good evaluation or in getting people to use the knowledge from the evaluations.
  22. This is just an example of one of the 10 dialogue tools that have become part of the hands-on guide. Aside from the nice print version, they can be downloaded from our website as PPT or PDF. It's A3 in size, so you can sit your project group down and have a joint discussion around the questions, and thus establish a common ground for the evaluation and innovation process. The guidebook that frames the 10 dialogue tools consists of: an introduction (Why evaluate innovation? Providing arguments and definitions); a section on the requirements for your evaluation (evidence, tendencies and experiences, with a focus on structure, and important terms such as validity, control groups and baseline); and an overview of 8 qualitative and quantitative methods you might choose for your evaluation: observation, interview, service journeys, RCT, registers (count things!), vox pop, survey/questionnaire and register-based analysis. All are described with a focus on pros/cons and how to collect/analyse data, as well as what kind of information the method will give me and what tools/requirements I need. This is not targeted at larger-scale program evaluations, and I don't expect people to be able to do RCTs based on this. But for the innovation projects run internally in public sector workplaces, this should help provide structure and a basis for doing better evaluations.
  23. As with our other work at COI, we put a lot of effort into delivering our products in nice packages. The design and quality make them into artefacts that legitimize innovation work at the public sector workplaces. And when we put an effort into making tools and materials that are not only built on a solid foundation of knowledge but also look nice, this encourages people to also put in the effort when they work with them.
  24. So where did we go, and where are we going, from this point?
  25. We launched and distributed the publications. Our materials are free for Danish public sector employees. We ordered 1,000 prints of the guidance booklet written by the professor, and we are pretty much out of those now; it can only be downloaded from our website. We ordered 1,500 prints of the hands-on publication with the dialogue tools and have distributed 600 of those, so for the hard-working innovative evaluators there are still nice supplies to get from us. I've done some introductory workshops introducing the evaluation kit, and one training session where I spent three days over half a year taking a unit through working with the kit on specific evaluations they were doing. We haven't evaluated the kit, and I honestly don't know the effect it has had on the community. I don't expect it alone to push the numbers higher than the 44% evaluated, but I do hope, over a longer time, to push the field in the right direction. I've co-written an article on evaluating innovation (this model is from that article) where we take a stab at describing the role of the evaluator. I've done a learning session at the major Danish evaluation conference where we focused on how to further integrate spreading and evaluating innovation: what actions can you take in your evaluation if you want to make it easier for yourself to spread the innovation? And this year I'm doing a session at the conference on organisational evaluation capacity, working with a municipality where the local politicians have asked for evaluations of innovations in order for them to make better decisions.
  26. This is still a field that lacks attention. I still see innovators giving up: they get overwhelmed by the complexity of evaluating and choose to do nothing, and they still don't get started early enough. And neither political nor administrative leaders ask for the right kind of evaluation or give it continuous attention. In the intersection between the fields of evaluation and innovation, we are still lacking playmates.
  27. That was it. Thank you for showing an interest in our work. I hope it made some sort of sense to you, and I'm happy to answer any questions you might have.