This document provides an overview of developmental evaluation, which aims to support social innovation in complex environments. It discusses how a developmental evaluator works collaboratively with innovators to conceptualize, design, and test new approaches through an ongoing process of adaptation. The evaluator helps elucidate the innovation process, track implications and results, and facilitate real-time decision making. Complexity concepts are important to developmental evaluation. The case study presented examines how developmental evaluation can support innovation from the initial stages.
A Case Study on the Use of Developmental Evaluation for Navigating Uncertainty during Social Innovation
1. A Case Study on the Use of Developmental Evaluation for Navigating Uncertainty during Social Innovation
Chi Yan Lam, MEd
AEA 2012 @chiyanlam
October 25, 2012
Assessment and Evaluation Group, Queen's University
Slides available now at www.chiyanlam.com
2. "The significant problems we have cannot be solved at the same level of thinking with which we created them."
3. Developmental Evaluation in 1994
collaborative, long-term partnership
purpose: program development
observation: clients who eschew clear, specific, measurable goals
4. Developmental Evaluation in 2011
takes on a responsive, collaborative, adaptive orientation to evaluation
complexity concepts
systems thinking
social innovation
5. Developmental Evaluation (Patton, 1994, 2011)
DE supports innovation development to guide adaptation to emergent and dynamic realities in complex environments.
DE brings to innovation and adaptation the processes of: asking evaluative questions; applying evaluation logic; and gathering and reporting evaluation data to inform and support project, program, product, and/or organizational development in real time. Thus, feedback is rapid.
The evaluator works collaboratively with social innovators to conceptualize, design, and test new approaches in a long-term, ongoing process of adaptation, intentional change, and development.
Primary functions of the evaluator: elucidate the innovation and adaptation processes; track their implications and results; facilitate ongoing, real-time, data-based decision-making in the developmental process.
7. Developmental Evaluation (Patton, 1994, 2011)
Improvement vs Development
10. DE: novel, yet-to-be-developed empirical & practical basis
Research on Evaluation: scope and limitations; utility & suitability; different contexts
Legitimize DE
12. Research Purpose
to learn about the capacity of developmental evaluation to support innovation development
(from nothing to something)
13. Research Questions
1. To what extent does the Assessment Pilot Initiative qualify as a developmental evaluation?
2. What contribution does developmental evaluation make to enable and promote program development?
3. To what extent does developmental evaluation address the needs of the developers in ways that inform program development?
4. What insights, if any, can be drawn from this development about the roles and responsibilities of the developmental evaluator?
14. Social Innovation
Social innovations aspire to change and transform social realities (Westley, Zimmerman, & Patton, 2006)
generating novel solutions to social problems that are more effective, efficient, sustainable, or just than existing solutions and for which the value created accrues primarily to society as a whole rather than private individuals (Phills, Deiglmeier, & Miller, 2008)
16. Simple - Complicated - Complex - Chaos
Simple: predictable; replicable; known; causal if-then models working in sequence
Complicated: predictable; replicable; known; many variables/parts working in tandem; requires expertise/training; causal if-then models
Complex: unpredictable; difficult to replicate; unknown; many interacting variables/parts; systems thinking; complex dynamics
(Westley, Zimmerman, & Patton, 2006)
18. Complexity Concepts
understanding the dynamical behaviour of systems
description of behaviour over time
metaphors for describing change: how things change
NOT predictive, not explanatory
(existence of some underlying principles; rules-driven behaviour)
19. Complexity Concepts
Nonlinearity (a butterfly flaps its wings; the black swan): cause and effect are not proportional (see the sketch below)
Emergence: new behaviours emerge from interaction; indicators can't really be predetermined
Adaptation: systems respond and adapt to each other and to their environments
Uncertainty: processes and outcomes are unpredictable, uncontrollable, and unknowable in advance
Dynamical: interactions within, between, and among subsystems change in unpredictable ways
Co-evolution: change in response to adaptation (growing old together)
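To make nonlinearity concrete, here is a minimal illustrative sketch (my addition, not from the slides) using the logistic map, a standard toy model in writing on complexity: two nearly identical starting points diverge within a few dozen steps, which is why prediction fails in complex systems.

# Minimal sketch (hypothetical, not from the presentation): the logistic
# map, x_{n+1} = r * x_n * (1 - x_n), illustrates nonlinearity. Two nearly
# identical starting points diverge sharply: the "butterfly effect."

def logistic_map(x0, r=4.0, steps=30):
    """Iterate the logistic map from x0 and return the trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_map(0.2000)
b = logistic_map(0.2001)  # a "butterfly flap" of initial difference
for n in (0, 10, 20, 30):
    print(f"step {n:2d}: {a[n]:.4f} vs {b[n]:.4f}")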
20. Systems Thinking
pays attention to the influences and relationships between systems in reference to the whole
a system is a dynamic, complex, structured functional unit
there is flow and exchange between systems
systems are situated within a particular context
22. Practicing DE
adaptive to context, agile in methods, responsive to needs
evaluative thinking - critical thinking
bricoleur
"purpose-and-relationship-driven, not [research] method-driven" (Patton, 2011, p. 288)
23. Five Uses of DE (Patton, 2011, p. 194)
Five Purposes and Uses
1. Ongoing development in adapting a program, strategy, policy, etc.
2. Adapting effective principles to a local context
3. Developing a rapid response
4. Preformative development of a potentially broad-impact, scalable innovation
5. Major systems change and cross-scale developmental evaluation
24. Method & Methodology
Questions drive method (Greene, 2007; Teddlie & Tashakkori, 2009)
Qualitative case study
understanding the intricacies of the phenomenon and the context
a case is "a specific, unique, bounded system" (Stake, 2005, p. 436)
understanding the system's activity, and its function and interactions
qualitative research to describe, understand, and infer meaning
25. Data Sources
Three pillars of data:
1. Program development records
2. Development artifacts
3. Interviews with clients on the significance of various DE episodes
26. Data Analysis
1. Reconstructing the evidentiary base
2. Identifying developmental episodes
3. Coding for developmental moments
4. Time-series analysis (see the sketch below)
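As a rough illustration of steps 3 and 4 (a sketch under invented assumptions: the codes, dates, and record format below are hypothetical, not drawn from the study), coded developmental moments can be tallied by design meeting to form a simple time series:

from collections import Counter

# Hypothetical coded record of "developmental moments": each entry is
# (meeting_month, code). Real data would come from meeting documents.
coded_moments = [
    ("2010-09", "definition"), ("2010-09", "delineation"),
    ("2010-11", "collaboration"), ("2011-01", "prototyping"),
    ("2011-01", "prototyping"), ("2011-03", "illumination"),
]

# Time-series tally: how many coded moments occurred in each meeting.
by_meeting = Counter(month for month, _ in coded_moments)
for month in sorted(by_meeting):
    print(month, "#" * by_meeting[month])  # crude text-based time series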
28. Assessment Pilot Initiative
describes the innovative efforts of a team of 3 teacher educators promoting contemporary notions of classroom assessment
teaching and learning constraints ($, time, space)
interested in integrating social media into teacher education (classroom assessment)
The thinking was that assessment learning requires learners to actively engage with peers and challenge their own experiences and conceptions of assessment.
30. Book-ending: Concluding Conditions
22 teacher candidates participated in a hybrid, blended-learning pilot. They tweeted about their own experiences in trying to put into practice contemporary notions of assessment.
Guided by the script "Think - Tweet - Share," grounded in e-learning and learning theories.
Developmental evaluation guided this exploration between the instructors, evaluator, and teacher candidates as a collective in this participatory learning experience.
DE became integrated; the program became agile and responsive by design.
33. Key Developmental Episodes
Episode 1: Evolving understanding in using social media for professional learning
Episode 2: Explicating values through Appreciative Inquiry for program development
Episode 3: Enhancing collaboration through structured communication
Episode 4: Program development through the use of evaluative data
"Again, you can't connect the dots looking forward; you can only connect them looking backwards." - Steve Jobs
34. (Wicked) Uncertainty
uncertain about how to proceed
uncertain about in what direction to proceed (given many choices)
uncertain how teacher candidates would respond to the intervention
the more questions we answered, the more questions we raised
Typical of DE: clear, measurable, and specific outcomes and the use of planning frameworks didn't fit; traditional evaluation cycles wouldn't work.
35. How the innovation came to be...
reframing what constituted data: not intentional, but an adaptive response
informational needs concerning development were collected, analyzed, and interpreted
relevant theories, concepts, and ideas were introduced to catalyze thinking, leading to learning and un-learning
36. Major Findings
RQ1: To what extent does the API qualify as a developmental evaluation?
1. Preformative development of a potentially broad-impact, scalable innovation
2. Patton's test: "Did something get developed?" (improvement vs development vs innovation)
37. RQ2: What contribution does DE make to enable and promote program development?
1. Lent a data-informed process to innovation
2. Implication: responsiveness - the program-in-action became adaptive to the emergent needs of users
3. Consequence: resolving uncertainty
38. RQ3: To what extent does DE address the needs of developers in ways that inform program development?
1. Definition - defining the problem
2. Delineation - narrowing down the problem space
3. Collaboration - collaborative processes; drawing on collective strengths and contributions
4. Prototyping - integration and synthesis of ideas to ready a program for implementation
5. Illumination - iterative learning and adaptive development
6. Evaluation - formal evaluation processes to reality-test
39. Implications for Evaluation
one of the first documented case studies of developmental evaluation
contributes to understanding, analyzing, and reporting development as a process
delineates the kinds of roles and responsibilities that promote development
the notion of design emerges from this study
41. Program as co-created
Attending to the theory of the program
DE as a way to drive the innovating process
Six foci of development
Designing programs?
43. Design + Design Thinking
"Design is the systematic exploration into the complexity of options (in program values, assumptions, output, impact, and technologies) and decision-making processes that results in purposeful decisions about the features and components of a program-in-development that is informed by the best conception of the complexity surrounding a social need. Design is dependent on the existence and validity of highly situated and contextualized knowledge about the realities of stakeholders at a site of innovation. The design process fits potential technologies, ideas, and concepts to reconfigure the social realities. This results in the emergence of a program that is adaptive and responsive to the needs of program users." (Lam, 2011, pp. 137-138)
44. Implications for Evaluation Practice
1. Manager
2. Facilitator of learning
3. Evaluator
4. Innovation thinker
45. Limitations
Contextually bound, so not generalizable; but it does add knowledge to the field.
The data of the study are only as good as the data collected from the evaluation; better if I had captured the program-in-action.
Analysis of the outcomes of API could help strengthen the case study, but is not necessary for achieving the research foci.
Cross-case analysis would be a better method for generating understanding.
Welcome! Thanks for coming today. It's truly exciting to speak on the topic of developmental evaluation, and many thanks to the Evaluation Use TIG for making this possible. Today, I want to share the results of a project I've been working on for some time, and in doing so, challenge our collective thinking around the space of possibility created by developmental evaluation. For this paper, I want to focus on the innovation space and on the process of innovating. The slides are already available at www.chiyanlam.com and will be available shortly under the Eval Use TIG eLibrary.
Let me frame this presentation with a quote by Albert Einstein.
In 1994, Patton made the observation that some clients resisted typical formative/summative evaluation approaches because of the work that they do. They themselves don't see a point in freezing a program in time in order to have it assessed and evaluated. He described an approach where he worked collaboratively, as part of the team, to help render evaluative data that would help these program staff adapt and evolve their programs. So, the process of engaging in "developmental evaluation" becomes paramount.
Fast forward to 2011... These are concepts that I will only touch on briefly.
Developmental evaluation is positioned as a response for evaluators who work in complex spaces. First described by Patton in 1994, and further elaborated in 2011, developmental evaluation proposes a collaborative and participatory approach to involving the evaluator in the development process.
So what underlies DE is a commitment to reality testing. Patton positions it as one of the approaches one could take within a utilization-focused framework: formative/summative/developmental.
You only need to attend conferences like this one to hear the buzz and curiosity over developmental evaluation. In February, a webinar was offered by UNICEF, and over 500+...
If we move beyond the excitement and buzz around DE, we see that DE is still very new. There is not a lot of empirical or practical basis to the arguments. If we are serious about the utilization of DE, ...
So, in the remaining time, I want us to dig deep into a case of developmental evaluation.
The observation that Patton made back in 1994 is very acute: social innovations don't rest, programs don't stand still, and addressing social problems means aiming at a moving target that shifts as society changes...
Let's put it in a learning context. Simple would be teaching you CPR: I'll keep the steps simple and rehearse them many times, so that you can do it when needed. Complicated would be teaching teacher candidates how to plan a lesson while taking into consideration curriculum expectations, learning objectives, instructional methods/strategies, and assessment methods. Complex would be preparing teacher candidates to become professionals: we have many different parts (practicum, foci, professional classes), and they must think, behave, and participate like contributing members of the profession.
Document analysis of 10 design meetings over a 10-month period reveals the developmental "footprint": it tracks and identifies the specific developmental concerns being unpacked at a certain point in time in the project. Interviews with the core design team members (2 instructors, 1 lead TA) illuminate which aspects of the DE the program designers found helpful.
The instructors in the case were responsible for teaching classroom assessment to teacher candidates enrolled in a teacher education program. The field of teacher education, particularly classroom assessment, is experiencing a very tricky situation: teachers are not assessing in the ways that we know help with student learning. Much of what teachers currently do focuses on traditional notions of testing. At the level of teacher education, the problem is felt more acutely, because teacher candidates are not necessarily exhibiting the kinds of practice we would like to see from them. At my institution, we have two instructors responsible for delivering a module in classroom assessment in 7 hours total. In brief, there are many constraints that we have to work around, many of which we have little control over. What we do have control over is how we deliver that instruction. After a survey of different options, we were interested in integrating social media into teacher education as a way of building a community of learners. Our thinking was that assessment learning requires learners to actively engage with peers and to challenge their own experiences and conceptions of assessment. That was the vision that guided our work.
So those were the beginning conditions. Let me fast forward and describe for you how far we got in the evaluation. Then we'll look at how the innovation came to be.
The development was marked by several key representative episodes: (1) creating a learning environment; (2) the use of Appreciative Inquiry to help explicate values, so as to gain clarity into the QUALITY of the program; (3) an example of how DE promotes collaboration; and (4) the use of DE findings to engage clients in sense-making in order to formulate next steps.
When I look back at the data, uncertainty was evident throughout the evaluation. The team was uncertain about...
... typically, data in an evaluation are generated at the program level, describing quality...
So let's return now and unpack the case.