Once upon a time, education consisted of small, intimate Socratic circles where learners gathered around a sage who actually knew the first names of his or her learners. The massification of higher education, whether through increasing enrolments in higher and distance education or through Massive Open Online Courses (MOOCs), has often removed the personal link between educator and learner, so that higher education increasingly looks like a cattle farm with more than 10,000 head of cattle, where the mere suggestion of naming them individually is bizarre. Besides the sheer number of students, there is the “fog of uncertainty around how to allocate resources, develop competitive advantages, and most important, improve the quality and value of the learning experience” (Long & Siemens, 2011, p. 40). Facing thousands of nameless students also increases our uncertainty: we do not quite know whether we intervene at the right places, whether we use resources optimally, and whether our interventions will make any difference to the success of our learners. Learning analytics, according to a number of authors (e.g., Ferguson, 2012; van Barneveld, Arnold & Campbell, 2012), promises to clear the fog…
Just in case you are not familiar with the concept of learning analytics, and think it is the same as business intelligence or academic analytics, let me briefly recap some aspects. Learning analytics is “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” (in Ferguson, 2012, p. 2; emphasis added). Note that while there are different initiatives in higher education institutions to harvest and analyse data, these initiatives take place at the level of the institution, of regional systems, or of national and international reporting frameworks, and focus on groups or cohorts of students. Most of these initiatives are, furthermore, retrospective and long-term. Learning analytics zooms in on individual students in a specific course or at departmental level, with the aim of influencing their learning in real time. For further clarification on the terms, see Long and Siemens (2011), Ferguson (2012) and van Barneveld, Arnold and Campbell (2012).
Higher education institutions have always been involved in the mining and analysis of data, but mostly at the level of cohort analyses and the success and retention rates of groups of nameless students. Long and Siemens (2011, p. 32) state that not only has higher education been rather inefficient in its use of data, but “Evaluating student dropouts on an annual basis leaves gaping holes of delayed action and opportunities for intervention.”
Apart from the ineffective use of data, the level of analysis and the delayed response rates, we also have to take cognisance of the fact that the nature of available data has changed dramatically. Google’s Marissa Mayer (in Long & Siemens, 2011) suggests that data has changed in three ways:
- The speed at which data has become available, in real time, is unprecedented
- The amount and scale of data is growing exponentially
- The nature of data has changed as a result of the use of “sensors, smart grids, and connected devices” (p. 34)
These changes not only make our current approaches and policy frameworks for data management and decision-making outdated, but necessitate that we review the harvesting and use of real-time data to influence learning. “Learning analytics is essential for penetrating the fog that has settled over much of higher education” (Long & Siemens, 2011, p. 40). [If you are curious to explore the potential and implications of learning analytics, please check the references at the end of this blog.]
There is no doubt in my mind that learning analytics has huge potential to enable educators and departments to respond, in real time, to individual learners or groups of learners registered in the same course. I firmly believe that learning analytics can help to clear the “fog of uncertainty around how to allocate resources, develop competitive advantages, and most important, improve the quality and value of the learning experience” (Long & Siemens, 2011, p. 40). There is, however, some fog left in learning analytics as a research field and as an analytical tool that we need to take cognisance of, such as…
- Like all analyses and tools, the “black box” of learning analytics and the algorithms we use flow from a number of epistemological and ontological assumptions and ways of seeing the world, and learning. Not only does the way we envision learning analytics flow from these often unconscious assumptions, but these assumptions also have the potential to perpetuate themselves and to become sedimented into organizational structures and practices. It is therefore crucial that we also critically explore and deconstruct our assumptions about learning. Learning, in my view, is a complex and multi-layered phenomenon. The student journey consists of mostly non-linear, multidimensional, interdependent interactions at different phases in the nexus between student, institution and broader societal factors. We simply cannot afford to use incorrect constructs in learning analytics. One example of a simplistic (if not crude) construct is the notion that “Student Success = Academic Preparation + Performance + Effort” (Arnold, Fritz, & Kunnen, 2012, slide 9). There is ample evidence that student success is not so simple, but much more complex (Subotzky & Prinsloo, 2012). If our analytics models are based on crude constructions of student success, learning analytics will not clear the mist.
- There is also a need for agile and well-resourced institutional systems in order to respond, in real time, to the data we harvest. It is one thing to know more about our students; if we cannot respond in integrated, effective and holistic ways, learning analytics will not deliver on its promise.
- There are a number of ethical issues in harvesting more data from our students that we will need to interrogate and explore. Though some authors (e.g., Ferguson, 2012; Slade & Prinsloo, 2012) point to these, one of the most pertinent is that knowing more increases our responsibility. It is one thing to know that you have 10,000 cattle. The moment you know their names, you are in trouble…
- Learning analytics presupposes access to, and optimal analysis of, students’ digital data. I am writing this blog from a large open distance education provider in an emerging economy, and I am painfully aware of how the institution’s move into digitized and online teaching and learning is often ham-fisted and uncomfortable for students and faculty alike. While the current institutional discourse focuses on concerns regarding access to, and the cost of, technologies, we also need to explore the huge potential of having more real-time data on which to base our interventions. As Long and Siemens (2011) indicate, learning analytics offers huge potential to customize not only the support we offer to students, but also the curricula themselves.
- Learning analytics allows us to notice when students disappear from discussion forums, or when they have not handed in an assignment. It is, however, one thing to notice that a learner has disappeared (however temporarily) from the radar, and quite another to create an adaptive system that welcomes the learner back after an intervention. Learners often disappear when an assignment is due, or when they have failed one. If we pick this up in real time and the date of the assignment has passed, how do we create opportunities for them to submit their assignments after the closing date? How adaptive are our systems in creating alternative learning journeys?
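To make the kind of real-time flag described in the last point concrete, here is a minimal, purely illustrative sketch in Python. It is not any institution’s actual system: the field names, the 14-day inactivity threshold, and the learner records are all assumptions for the sake of the example. The point is simply that a flag of this sort is trivial to compute; the hard part, as argued above, is what the institution does once a learner is flagged.

```python
from datetime import date

def flag_disengaged(learners, today, inactivity_days=14):
    """Flag learners who have gone quiet on the forums or missed an assignment.

    learners: list of dicts with hypothetical fields 'name',
              'last_forum_post' (date or None), 'assignment_submitted' (bool)
              and 'due_date' (date of the learner's assignment deadline).
    Returns a list of (name, reasons) tuples for follow-up, so an educator
    can respond while the course is running rather than at year end.
    """
    flags = []
    for learner in learners:
        reasons = []
        last_post = learner.get("last_forum_post")
        # No forum activity at all, or none within the threshold window.
        if last_post is None or (today - last_post).days > inactivity_days:
            reasons.append("inactive on discussion forum")
        # Assignment deadline has passed without a submission.
        if not learner["assignment_submitted"] and today > learner["due_date"]:
            reasons.append("assignment overdue")
        if reasons:
            flags.append((learner["name"], reasons))
    return flags
```

Even this toy version illustrates the adaptive-systems question raised above: the flag identifies the learner, but it says nothing about whether the system can then reopen the submission date or offer an alternative learning journey.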
In conclusion: “If you have three pet dogs, give them names. If you have 10,000 head of cattle, don’t bother” (David Gelernter, in Long & Siemens, 2011, p. 32). Learning analytics provides higher education with the opportunity to give our students names, and to customize administration, assessment, student support and curricula. Knowing the names does, however, increase our responsibility…
Arnold, K., Fritz, J., & Kunnen, E. (2012, 20 January). Using analytics to intervene with underperforming college students. Retrieved from http://www.slideshare.net/ekunnen/using-analytics-to-intervene-with-underperforming-college-students
Ferguson, R. (2012). The state of learning analytics in 2012: a review and future challenges. Technical Report KMI-12-01, Knowledge Media Institute. Milton Keynes, UK: Open University. Retrieved from http://kmi.open.ac.uk/publications/techreport/kmi-12-01
Long, P., & Siemens, G. (2011). Penetrating the fog: analytics in learning and education. EDUCAUSE Review, September, 31-40.
Slade, S., & Prinsloo, P. (2012). Learning analytics: ethical issues and dilemmas. In press.
Subotzky, G., & Prinsloo, P. (2012). Turning the tide: a socio-critical model and framework for improving student success in open distance learning at the University of South Africa. Distance Education, 32(2), 177-193. doi:10.1080/01587919.2011.584846
Van Barneveld, A., Arnold, K.E., & Campbell, J.P. (2012). Analytics in higher education: establishing a common language. ELI paper 1: 2012. EDUCAUSE. Retrieved from http://net.educause.edu/ir/library/pdf/ELI3026.pdf