This document discusses three paths for implementing learning analytics: 1) "Do it to", a top-down, planned approach that is complex and likely to fail; 2) "Do it for", where innovative staff and technologists build systems on teachers' behalf; and 3) "Do it with", an emergent approach that engages teachers in their contexts through bricolage and improvisation, exploiting local resources. The best approach balances all three paths.
ASCILITE 2014. Three paths for Learning Analytics
1. Three paths for learning analytics and beyond:
Moving from rhetoric to reality
Colin Beer
CQUniversity
David Jones
University of Southern
Queensland
Rolley Tickner
CQUniversity
4. Oz/NZ Horizon Report
Year Time Frame Label
2010 4 to 5 years Visual data analysis
2012 1 year or less (#2) Learning analytics
2013 1 year or less (#1) Learning analytics
2014 2 to 3 years Learning analytics
http://flickr.com/photos/boskizzi/3241710/
5. Learning analytics is essential for penetrating
the fog that has settled over much of higher
education
(Siemens and Long, 2011, p. 40)
Learning analytics can contribute to course
design, student success, faculty
development, predictive modelling and
strategic information
(Diaz & Brown, 2012)
Learning analytics has been identified as a
key future trend in learning and teaching
(Johnson et al, 2013; Lodge & Lewis, 2012)
http://www.flickr.com/photos/rchughtai/2121560287/
6. http://bit.ly/16uz8vU
I'm not familiar with (m)any universities that
have taken a systems-level view of LA.
Most of what I've encountered to date is specific
research projects or small deployments of LA. I have
yet to see a systemic approach to analytics use/adoption.
http://www.flickr.com/photos/mikecogh/5959192031/
7. Basing decisions on data and evidence
seems stunningly obvious
(Siemens and Long, 2011, p. 31)
http://www.flickr.com/photos/johnhaydon/5042881685/
8. Management fashion is "relatively transitory collective
beliefs, disseminated by the discourse of knowledge
entrepreneurs, that a management technique is at the
forefront of rational management progress"
(Abrahamson and Fairchild, 2003)
Amplified by hyperbole, the fashionable vision
may exert a strong, if transitory, normative pull
among managers.
(Swanson and Ramiller, 2004)
http://flickr.com/photos/boskizzi/3241710/
25. Early Alert Student Indicators (EASI)
Formal project proposed during 2012
12+ iterations of the project initiation
documentation
Details on project scope, deliverables,
budget, milestones and quality
3 x major budget revisions
Project officially started in 2014 term 1 with
an institution wide pilot
26. Early Alert Student Indicators (EASI)
Over 85,000 nudges delivered
Used by a majority of teaching staff
Deemed by senior management to be a
successful project
Contributed to Tier 1 & 2 L&T awards
Project was delivered on time and on budget
31. Do it to
Pitfalls: complex and likely to fail; failures of rationality and loss of information; resistance and compliance; disappearing data; the tail wagging the dog.
http://flickr.com/photos/tonymangan/754511201/
32. Do it to: Planning vs Learning
Weick & Quinn (1999): Episodic change | Continuous change
Brews & Hunt (1999): Planning school | Learning school
Seely Brown & Hagel (2005): Push systems | Pull systems
Hutchins (1991): Supervisor reflection and intervention | Local adjustment
Truex et al (2000): Traditional design | Emergent design
March (1991): Exploitation | Exploration
Boehm & Turner (2003): Plan-driven | Agile
Mintzberg (1989): Deliberate strategy | Emergent strategy
Kurtz & Snowden (2007): Idealistic | Naturalistic
http://flickr.com/photos/tonymangan/754511201/
34. Arguably, teachers are the primary change agents
in any educational system.
(Mor & Mogilevsky, 2013, p.1 )
Academics are pivotal to implementing changes in
learning and teaching
http://www.flickr.com/photos/davidking/2202649444/
(Radloff, 2008)
35. Teachers operate in a complex and dynamic domain:
the background knowledge and practices of their
students constantly change, the technologies and
resources at their disposal are perpetually evolving,
and the guidance and directives they receive are
frequently updated.
http://www.flickr.com/photos/davidking/2202649444/
36. …underlined the importance of understanding context,
and of involving teachers in the process of developing
and deploying analytics (Sharples et al, 2013, p. 15)
37. Any attempt to introduce wide-scale educational
analytics and accountability processes thus requires
a thorough understanding of the pedagogical and
technical context in which the data are generated.
(Lockyer et al., 2013, p. 2)
38. …drop the language of planning, controlling, and
measuring through which organisations, teams
and projects have been managed so far.
That language stems from heavy and slow-changing
industries and infrastructures.
(Ciborra, 2002)
http://www.flickr.com/photos/9422878@N08/7788404750/
39. The power of bricolage, improvisation and hacking is
that these activities are highly situated; they exploit,
in full, the local context and resources at hand, while
often pre-planned ways of operating appear to be
derooted, and less effective because they do not fit
the contingencies of the moment.
(Ciborra, 2002)
http://www.flickr.com/photos/9422878@N08/7788404750/
40. Competitive advantage related to ICTs can only stem
from the cognitive and organisational capability to
convert such systems, applications and data into
practical, situated, and unique knowledge for action.
(Ciborra, 2002)
http://www.flickr.com/photos/9422878@N08/7788404750/
Welcome to the gold rush. It seems like everyone is talking about learning analytics, and it's an unusual university that isn't investing time and effort into it.
CQUniversity is no exception, with a long-running learning analytics research project starting to move into CQUniversity's learning and teaching arena.
Learning analytics is the use of data about learners and their contexts, for the purposes of understanding and optimising learning and the environments in which it occurs
As an indication of the increasing level of interest in learning analytics, a quick scan of ASCILITE proceedings over the last few years shows a marked increase in the number of publications that include learning analytics in the title.
We know that it has certainly risen to the attention of the sector.
We would argue that this is primarily because one conception of learning analytics aligns nicely with the tendency toward the techno-rationalist approach to management that is dominant in Australian universities.
We're being rational by doing this, so we should do it.
This makes sense as learning analytics has the potential to contribute greatly to learning and teaching across the sector.
Learning analytics seems to be the new light that will help us achieve the goal of data-driven decision making and help us deal with the uncertainty and change facing higher education.
That said, we are yet to see any large-scale LA implementations, although small-scale projects are cropping up all over.
Basing decisions on data and evidence? Let's start with the obvious. Of course we want to do this. We know that senior management like to do this and that academics should (and like to) make decisions based on data and evidence.
(Of course, it's also obvious that most human beings tend to fail badly at this.)
The rapid increase in level of interest in LA over the last few years is showing some of the hallmarks of a management fad or fashion
According to Gibson & Tesone, Fads are innovations that appear to be rational and functional and are aimed at encouraging better institutional performance.
Fads often speak to managers in that they appear to be common-sense and appeal to organisational rationality around efficiency and effectiveness, and this makes counter-argument difficult.
We believe that learning analytics talks strongly to management due to its potential to complement existing business intelligence efforts and the developing hype has the potential to swamp deliberate mindful learning analytics implementations
And I bet you are expecting the next slide to be the Gartner hype cycle which always seems to appear in these sorts of presentations
We prefer the Birnbaum cycle, which is more pessimistic than Gartner's; the Gartner hype cycle assumes that every fad eventually finds a plateau of productivity.
Fads tend to follow a cycle and over a decade ago Birnbaum wrote a book about fads in higher education and proposed this cycle to describe management fads.
We think that learning analytics is somewhere in this cycle at the moment.
This brings me to our argument.
We are excited about the possibilities of learning analytics.
David is a faculty member at USQ teaching a course with 300+ students, and Rolley and I are part of the central learning and teaching support area at CQUniversity.
For us, the potential of learning analytics to contribute to learning and teaching is very exciting.
The trouble is that, based on our experience, we expect the implementation to be where it goes wrong.
The increasing hype around learning analytics generates a desire by institutions to get on the bandwagon, and as said before, this can swamp mindful implementation.
The last large scale information-based revolution in Universities was ERP systems.
ERPs promised to reduce costs, increase growth, improve business processes, heighten productivity and increase agility.
I'm not aware of any of these systems where the reality actually lived up to the rhetoric that was bandied around when they were adopted.
This is something we would all like to avoid with learning analytics.
Today we'll use an example of a real learning analytics project to describe what we think are the three possible approaches institutions might take when implementing learning analytics from a learning and teaching perspective.
These are:
Do it to the teachers
Do it for the teachers
Do it with the teachers
Early alert student indicators or EASI is a learning analytics application currently in operation at CQUniversity
EASI was conceived out of a strategic imperative to help improve student retention by providing teaching staff with better information on students who might be struggling during the term
It combines data from the student information system with student activity data from Moodle and presents them in a view for teaching staff
At the core, EASI is simply a webpage that is linked from every Moodle course and provides a list of students sorted on an estimate of their success or EOS throughout the term.
The system allows teachers to conduct nudges, whereby a teacher can nudge students who appear less engaged than they could be, in the hope that they re-engage
The page is sorted by default on the automatically generated EOS, which is calculated by an algorithm that combines factors such as GPA, previous fails of the course, and pass/fail rates with each student's current level of Moodle activity at that point in the term, as indicated by the number of clicks they have made within the Moodle course site.
The EOS is displayed using traffic-light colours beside each student. Note that in this case all the students are yellow, as it's using fabricated data that we use for testing purposes.
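The EOS algorithm itself isn't published in detail, but its shape can be sketched. The following is a minimal, hypothetical sketch — the weights, field names and thresholds are our own illustrative assumptions, not CQUniversity's actual model — of combining historical factors with current Moodle click counts and mapping the result to a traffic-light colour:

```python
# Hypothetical sketch of an "estimate of success" (EOS) calculation in the
# spirit of EASI. All weights, field names and thresholds are illustrative
# assumptions, not the actual CQUniversity algorithm.

def estimate_of_success(gpa, prior_fails, pass_rate, clicks, cohort_median_clicks):
    """Return a 0..1 score: higher suggests the student is more likely to succeed.

    gpa: grade point average on a 7-point scale
    prior_fails: times the student has previously failed this course
    pass_rate: proportion of attempted courses passed (0..1)
    clicks: the student's Moodle clicks so far this term
    cohort_median_clicks: median clicks across the course so far
    """
    history = 0.5 * (gpa / 7.0) + 0.3 * pass_rate - 0.2 * min(prior_fails, 2) / 2.0
    # Relative engagement: this student's activity vs the cohort median.
    engagement = min(clicks / cohort_median_clicks, 1.0) if cohort_median_clicks else 0.0
    return max(0.0, min(1.0, 0.6 * history + 0.4 * engagement))

def traffic_light(score):
    """Map a score to the red/yellow/green indicator shown beside each student."""
    if score < 0.4:
        return "red"
    if score < 0.7:
        return "yellow"
    return "green"
```

Under this sketch a strong student with healthy activity lands green while a student with prior fails and few clicks lands red; in a real system the weighting of historical versus current-term factors would need calibration against actual outcome data.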
Just to explain the columns that are being displayed here, we have:
Student name and number
If they have failed this course previously a number will appear beside the student in this column.
If the student is in their first term at CQU a star will appear in the first term column
Any nudges that have been performed on this student will be displayed in the nudges column
Then we have the number of courses they have attempted, their GPA for this program, their program of enrolment, campus, course load this term, pass rate, courses passed, gender and age
The total of their Moodle activity appears in the sigma column while each weeks activity appears in the corresponding week column
Now, there is a lot of data being displayed by default, so we added a settings button whereby staff can reorder the column display or remove columns that they aren't interested in.
Teaching staff can also export the data from the page into Excel should they wish to slice and dice some more
Or they can filter the page by Moodle groups
A real course looks something like this after the first few weeks of term. The student names have been removed to protect the innocent.
If anyone wants to see the system in more detail, just grab one of us during one of the breaks.
Perhaps the most important part of the page is its ability to facilitate and record nudges with students
You can filter and select multiple students and use the inbuilt mailmerge tool to email multiple students in a personalised manner
You can also record events that have occurred outside of the EASI system such as face-to-face meetings, phone conversations or emails.
The system also allows you to make notes about each student and gives you the ability to share these notes with other staff who might be teaching these students this term.
These nudges are recorded in the nudges column and are also represented in the weekly Moodle activity timeline, so you can ascertain whether or not the nudge has had any impact
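The mail-merge and nudge-recording behaviour just described can be sketched roughly as follows; the template wording, field names and log structure are hypothetical stand-ins for illustration, not EASI's actual implementation:

```python
# Rough sketch of mail-merge personalisation plus nudge logging, in the spirit
# of EASI's nudge tool. Template wording, field names and the log structure
# are hypothetical stand-ins for illustration only.
from datetime import date
from string import Template

NUDGE_TEMPLATE = Template(
    "Hi $first_name,\n\n"
    "I noticed your activity in $course has dropped off recently. "
    "Is there anything I can help with?\n"
)

def build_nudges(students, course, nudge_type="email"):
    """Return (personalised_body, log_entry) pairs for the selected students.

    The log entry is what would be recorded against each student so the nudge
    appears in the nudges column and the weekly activity timeline.
    """
    nudges = []
    for student in students:
        body = NUDGE_TEMPLATE.substitute(first_name=student["first_name"], course=course)
        log = {
            "student_id": student["id"],
            "course": course,
            "type": nudge_type,  # e.g. email, phone, face-to-face
            "date": date.today().isoformat(),
        }
        nudges.append((body, log))
    return nudges
```

Recording off-system contacts (phone calls, meetings) would reuse the same log structure with a different `nudge_type`, which is what lets the timeline show whether a nudge changed the student's activity.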
The system began with the creation of a project initiation document or PID that started around the beginning of 2013.
This document wove its way through many managers and committees and consequently many changes were made from what was originally proposed.
The PID and subsequent project documents conformed to the project management framework and included all the usual detail that any project plan requires.
There were three major budget changes, which led to more project revisions. I hasten to add that these revisions were all in a downward direction.
The project officially started with an institution-wide pilot conducted in term 1 this year.
As of the middle of last week there were over 85,000 nudges delivered by the system to students.
Most teaching staff have used the system with 90% of staff surveyed saying that it has positively contributed to their learning and teaching.
Management like it due to its potential to help with student retention
Most importantly the project adhered to the project plan and has so far been delivered on time and on budget.
Now on the surface this project would appear to be a fairly successful example of ICT planning and implementation in that it represents the typical approach to ICT implementation in higher education.
A small group of people at the institutional level made some plans around a new system that could be used to contribute to a strategic imperative, in this case, student retention.
There was a formal project set up based on an approach that was idealistic, in that it followed a deliberate plan or strategy.
On the surface this would appear to be a successful example of the Do it to path
However the reality is very different.
Our thinking behind what was actually produced began back in 2008, when a group of us were publishing around what was then known as academic analytics.
We are part of the central learning and teaching area and are tasked with supporting teaching academics, particularly around the use of technology.
This means we were interacting on a daily basis with teaching staff so we were getting a sense of their context and its inherent complexity.
We'd read papers, come to conferences like this, and talk with folk from other universities about what they were doing; we'd take these ideas on board and go back and tinker, conducting experiments with teaching academics within their course contexts.
We were fortunate in that we all came from IT backgrounds which dovetailed with our learning and teaching roles. So we could design the system based on an assumption that it will change.
As an example of this, there have been over 100 changes recorded in the change log this year alone, built upon dozens of preceding small-scale trials or iterations.
Here is what was essentially our first attempt at something like EASI from back in term 2, 2009
We tried this across a couple of courses in collaboration with a couple of interested course coordinators.
Needless to say, it didn't work very well, but we learned a lot, as did the staff. As they used the system, they got to know what they didn't know, and fed that back into future iterations.
The whole time we were still tinkering, tweaking and figuring out what works and what doesn't.
So despite appearances, the approach we took was more like a "do it for" or a "do it with" approach. This has meant that the EASI system has co-evolved with the teaching staff who use it.
Now is probably a good time to examine the three likely paths that learning analytics implementations will take.
We've spoken about the "do it to" approach, and our paper details some of the obvious pitfalls associated with it.
There are a number of pitfalls associated with the typical top-down ICT implementations that we're all probably familiar with.
But on the positive side it does provide for some concrete KPIs. Finish project X by time Y and show Z
One of the flaws with this path is that it uses a planning approach and there are questions around the suitability of this approach in complex settings.
In complex settings it has been said that learning approaches might be more suitable.
The debate between the planning and learning schools has been one of the most pervasive in the management literature and on the screen are some of the authors in this area.
The "do it for" path describes an approach where innovative teaching staff, perhaps with central IT staff or vendors, get together and develop something that teachers can use to enhance student learning.
This has sometimes been referred to as the technologists' alliance, and there are some pitfalls associated with this approach.
The chasm talks about the gap between these early adopters and the remaining majority.
There are two parts to the black-box pitfall. The first is that new systems and processes often fail to impact on what is transpiring within courses during the teaching process.
The other part concerns what is happening within the teacher's head, and how new systems or processes often fail to meaningfully impact teachers' conceptions of learning and teaching.
We don't know how this relates specifically to learning analytics, as it's relatively new and we don't yet have a handle on how to make the best use of it.
These next few slides will hopefully give you a sense of the do it with path.
It's recognised within the literature that teachers are a primary change agent within the education system.
For learning analytics this means that it needs to engage with their contexts which will increase the likelihood of adoption, use and innovation.
But increasingly the context in which teachers find themselves is incredibly complex.
Existing institutional and individual approaches arent dealing with this well.
Teaching staff cant deal with this by themselves.
The institutional environment needs to have teams engaged in the day to day teaching context.
Not in designing courses or facilitating post-course surveys, but involved, helping and responding to situations during the teaching process.
This is what we mean by the do it with approach.
This focus on context is echoed in the learning analytics literature.
University e-learning is not exactly the same as big data.
Knowledge of the specifics of individual students, courses and pedagogies is essential for generating value
This isn't just education academics wanting to do their own thing and work around the system.
It's a long and well-known finding from the management and Information Systems literature.
Traditional strategic management processes being employed by universities today are anathema to the actual requirements of these institutions.
It is bricolage and bottom-up work, harnessed appropriately to avoid its weaknesses, that generates the most benefit, as it deals with the realities of the situation.
And that has certainly been our experience with the EASI project
David will expand on some of this tomorrow.
In fact, it is from bricolage that organisations gain advantage.
You do not get any significant advantage from adopting and using the same systems and processes as every other university (i.e. best practice).
So, to wrap up: with learning analytics, we believe it's a matter of balance. The three paths described here are not mutually exclusive, and a successful implementation will probably need to span all three.
There is developing hype around learning analytics which means universities are keen to quickly get on board.
The typical way they achieve this is by taking a do it to path which has primacy at the moment.
We're suggesting that a more balanced approach that engages with teachers within their contexts will lead to more sustainable and meaningful learning analytics implementations.
This aligns with the sentiments of Henry Mintzberg, a notable author and academic in the business and management world, who suggests we need a more natural approach to strategy, centred on a balance between deliberate and emergent processes.