Chi Yan Lam, MEd
CES 2013
Insights on Using Developmental Evaluation for Innovating:
A Case Study on the Co-creation of an Innovative Program
@chiyanlam
June 11, 2013
Assessment and Evaluation Group, Queen's University
Slides available at www.chiyanlam.com
Acknowledgements
"The significant problems we have cannot be solved at the same level of thinking with which we created them."
http://yareah.com/wp-content/uploads/2012/04/einstein.jpg
Impersonal
Hard to Focus
Passive
Learn from peers
Reflect on prior experiences
Meaning-making
Active construction of knowledge
+
Dilemma
• Barriers to Teaching and Learning: $, time, space.
• PRACTICUM = out of sight, out of touch.
• Instructors became interested in integrating Web technologies into Teacher Education to open up possibilities.
• The thinking was that assessment learning requires learners to actively engage with peers and challenge their own experiences and conceptions of assessment.
So what happened...?
• 22 teacher candidates participated in a hybrid, blended learning pilot. They tweeted about their own experiences trying to put into practice contemporary notions of assessment.
• Guided by the script: "Think, Tweet, Share"
• Developmental evaluation guided this exploration between the instructors, evaluator, and teacher candidates as a collective in this participatory learning experience.
• DE became integrated; the program became agile and responsive by design.
Research Purpose
To learn about the capacity of developmental evaluation to support innovation development (from "nothing" to "something").
Research Questions
1. To what extent does the Assessment Pilot Initiative qualify as a developmental evaluation?
2. What contribution does developmental evaluation make to enable and promote program development?
3. To what extent does developmental evaluation address the needs of the developers in ways that inform program development?
4. What insights, if any, can be drawn from this development about the roles and the responsibilities of the developmental evaluator?
Literature Review
Developmental Evaluation in 1994
• collaborative, long-term partnership
• purpose: program development
• observation: "clients who eschew clear, specific, measurable goals"
Developmental Evaluation in 2011
• takes on a responsive, collaborative, adaptive orientation to evaluation
• complexity concepts
• systems thinking
• social innovation
Developmental Evaluation
DE supports innovation development to guide adaptation to emergent and dynamic realities in complex environments.
DE brings to innovation and adaptation the processes of:
• asking evaluative questions
• applying evaluation logic
• gathering and reporting evaluation data
to inform and support project, program, product, and/or organizational development in real time. Thus, feedback is rapid.
The evaluator works collaboratively with social innovators to conceptualize, design, and test new approaches in a long-term, ongoing process of adaptation, intentional change, and development.
Primary functions of the evaluator:
• elucidate the innovation and adaptation processes
• track their implications and results
• facilitate ongoing, real-time, data-based decision-making in the developmental process.
(Patton, 2011)
Improvement vs. Development
Developmental Evaluation is reality testing.
1. Social Innovation
• SI aspires to change and transform social realities (Westley, Zimmerman, & Patton, 2006)
• SI is about generating novel solutions to social problems that are "more effective, efficient, sustainable, or just than existing solutions and for which the value created accrues primarily to society as a whole rather than private individuals" (Phills, Deiglmeier, & Miller, 2008)
2. Complexity Thinking
Situational Analysis / Complexity Concepts
Sensitizing frameworks that attune the evaluators to certain things
Simple
• predictable
• replicable
• known
• causal if-then models

Complicated
• predictable
• replicable
• known
• many variables/parts working in tandem in sequence
• requires expertise/training
• causal if-then models

Complex
• unpredictable
• difficult to replicate
• unknown
• many interacting variables/parts
• systems thinking? complex dynamics?

Chaos

(Westley, Zimmerman, & Patton, 2008)
http://s3.frank.itlab.us/photo-essays/small/apr_05_2474_plane_moon.jpg
Complexity Concepts
• understanding dynamical behaviour of systems
• description of behaviour over time
• metaphors for describing change
• how things change
• NOT predictive, not explanatory
• (existence of some underlying principles; rules-driven behaviour)
Complexity Concepts
• Nonlinearity (butterfly flaps its wings, black swan); cause and effect
• Emergence: new behaviour emerges from interaction... can't really predetermine indicators
• Adaptation: systems respond and adapt to each other, to environments
• Uncertainty: processes and outcomes are unpredictable, uncontrollable, and unknowable in advance.
• Dynamical: interactions within, between, and among subsystems change in unpredictable ways.
• Co-evolution: change in response to adaptation ("growing old together")
3. Systems Thinking
• Pays attention to the influences and relationships between systems in reference to the whole
• a system is a dynamic, complex, structured functional unit
• there are flows and exchanges between sub-systems
• systems are situated within a particular context
Complex Adaptive Dynamic Systems
Practicing DE
• Adaptive to context, agile in methods, responsive to needs
• evaluative thinking - critical thinking
• bricoleur
• "purpose-and-relationship-driven not [research] method driven" (Patton, 2011, p. 288)
Five Purposes and Uses of DE
1. Ongoing development in adapting a program, strategy, policy, etc.
2. Adapting effective principles to a local context
3. Developing a rapid response
4. Preformative development of a potentially broad-impact, scalable innovation
5. Major systems change and cross-scale developmental evaluation
(Patton, 2011, p. 194)
Method & Methodology
• Questions drive method (Greene, 2007; Teddlie & Tashakkori, 2009)
• Qualitative Case Study
  • understanding the intricacies of the phenomenon and the context
  • "Case is a specific, unique, bounded system" (Stake, 2005, p. 436)
  • understanding the system's activity, and its function and interactions
• Qualitative research to describe, understand, and infer meaning.
Data Sources
Three pillars of data:
1. Program development records
2. Development artifacts
3. Interviews with clients on the significance of various DE episodes
Data Analysis
1. Reconstructing evidentiary base
2. Identifying developmental episodes
3. Coding for developmental moments
4. Time-series analysis
How the innovation came to be...
Key Developmental Episodes
• Ep 1: Evolving understanding in using social media for professional learning.
• Ep 2: Explicating values through Appreciative Inquiry for program development.
• Ep 3: Enhancing collaboration through structured communication.
• Ep 4: Program development through the use of evaluative data.

"Again, you can't connect the dots looking forward; you can only connect them looking backwards." - Steve Jobs
(Wicked) Uncertainty
• uncertain about how to proceed
• uncertain about which direction to proceed in (given many choices)
• uncertain about the effects and outcomes of how teacher candidates would respond to the intervention
• "the more questions we answered, the more questions we raised"
• Typical of DE:
  • clear, measurable, and specific outcomes
  • use of planning frameworks
  • traditional evaluation cycles wouldn't work
How the innovation came to be...
• Reframing what constituted "data"
• not intentional, but an adaptive response to emergent needs:
  • informational needs concerning development: collected, analyzed, interpreted
  • relevant theories, concepts, ideas: introduced to catalyze thinking; led to learning and un-learning
Major Findings
RQ1: To what extent does API qualify as a developmental evaluation?
1. Preformative development of a potentially broad-impact, scalable innovation
2. Patton: "Did something get developed?" (improvement vs. development vs. innovation)
RQ2: What contribution does DE make to enable and promote program development?
1. Lent a data-informed process to innovation
2. Implication: responsiveness
   • the program-in-action became adaptive to the emergent needs of users
3. Consequence: resolving uncertainty
RQ3: To what extent does DE address the needs of developers in ways that inform program development?
1. Definition - defining the problem
2. Delineation - narrowing down the problem space
3. Collaboration - collaboration processes; drawing on collective strength and contributions
4. Prototyping - integration and synthesis of ideas to ready a program for implementation
5. Illumination - iterative learning and adaptive development
6. Evaluation - formal evaluation processes to reality-test
Implications to Evaluation
• One of the first documented case studies of developmental evaluation
• Contributions to understanding, analyzing, and reporting development as a process
• Delineating the kinds of roles and responsibilities that promote development
• The notion of "design" emerges from this study
Implications to Theory
• Program as co-created
• Attending to the theory of the program
• DE as a way to drive the innovating process
• Six foci of development
• Designing programs?
Design and Design Thinking
Design + Design Thinking
Design is the systematic exploration into the complexity of options (in program values, assumptions, output, impact, and technologies) and decision-making processes that results in purposeful decisions about the features and components of a program-in-development that is informed by the best conception of the complexity surrounding a social need.
Design is dependent on the existence and validity of highly situated and contextualized knowledge about the realities of stakeholders at a site of innovation. The design process fits potential technologies, ideas, and concepts to reconfigure the social realities. This results in the emergence of a program that is adaptive and responsive to the needs of program users.
(Lam, 2011, pp. 137-138)
Implications to Evaluation Practice
1. Manager
2. Facilitator of learning
3. Evaluator
4. Innovation thinker
Limitations
• Contextually bound, so not generalizable
  • but it does add knowledge to the field
• The data of the study are only as good as the data collected from the evaluation
  • better if I had captured the program-in-action
• Analysis of the outcomes of API could help strengthen the case study
  • but not necessary for achieving the research foci
• Cross-case analysis would be a better method for generating understanding
Thank You!
Let's Connect.
@chiyanlam
chi.lam@QueensU.ca
www.chiyanlam.com