It is not a good time in higher and distance education to say: “I don’t know.” Technology is changing the higher education landscape; we face unprecedented changes in funding regimes; the private for-profit sector is growing fast; and the increasing diversity of our student profiles, levels of preparedness and literacies makes it almost impossible to plan based on our predictions. As higher and distance education institutions respond to this constant flux, we rely more and more on analysis and data to get a grip on what to do next. Though the belief that knowing the past will help us to understand the future is as old as humanity, there is now a proliferation of initiatives to harvest and analyse ‘real-time’ data to increase our understanding (and often, our control). We harvest more and more data from students, faculty and operational systems and processes, trying to stay one step ahead and reduce the number of surprises and unforeseen changes in our context.
I firmly believe that knowing more and having access to more data, both historical and real-time, may help us to understand and respond to the various challenges our faculty, students and systems face. Knowing more and having more data is, however, not a magic bullet; it often results in an epistemic arrogance that makes us even more vulnerable to the unforeseen and the unexpected. Considering what higher education has, up to now, done with what we already know, the promise that ‘big data’ holds to help us understand and plan better may not be realized. Often what we already know about our students is left unused, or used in “bang-bang” approaches as we shoot at noises we hear in the dark.
Higher education is increasingly living “under the sword of data” (Wagner & Ice, 2012, p.40) and we celebrate learning analytics as “the new black” (Booth, 2012, p.52). Learning analytics proposes that if we know more about our students by harvesting the different data trails they leave on course management and student information systems (Oblinger, 2012), as well as on systems external to higher education (e.g. Twitter, Facebook, etc.), we may be able to understand student behaviour better in order to determine their, and our, next moves. Higher education institutions will then be able to suggest courses and actions to students in much the same way that Amazon suggests books you are likely to buy, based on analysing your past behaviour. Many institutions also use analytics to identify at-risk students early enough to ensure that institutions do not waste unnecessary resources on students who will, in any event, not make the grade (literally). If these ‘at risk’ students are allowed to register, they are put into appropriate streams according to algorithms harvesting a range of data: primary and secondary school histories, demographic data and, increasingly, the data trails and digital records students leave on social networking sites. There are claims that we should have analytics producing 360-degree views of students (Crow, 2012), providing us with flawless ways to prevent wasting resources (on both students’ and the institution’s side).
As I reflected on knowing too little, doing too little with what we already know, and our commitment to know even more about our students, I could not help thinking of the work of Nassim Nicholas Taleb (2007). According to him, the human mind suffers from three ailments, the “triplet of opacity”, which encompasses
- “The illusion of understanding, or how everyone thinks he knows what is going on in a world that is more complicated (or random) than they realize;
- The retrospective distortion, or how we can assess matters only after the fact, as if they were in a rearview mirror…
- The overvaluation of factual information and the handicap of authoritative and learned people, particularly when they create categories – when they ‘Platonify’” (Taleb, 2007, p.8)
We often believe that the more we know, the more we will understand, yet the causal link between knowing more and understanding does not necessarily exist. We assume that by categorizing what we know we prove our understanding, but Taleb (2007) warns that “Categorizing is necessary for humans, but it becomes pathological when the category is seen as definitive, preventing people from considering the fuzziness of boundaries, let alone revising their categories” (p.15). There is a real danger that our obsession with categorizing students as Generation X, Y and Z, as “Millennials” or as ‘at risk’ reduces the complexities within different cohorts of students, resulting in homogeneous Platonic categories which provide us with the ‘evidence’ to plan interventions and predict future behaviour. Our seeming obsession with predictive modelling and categorization speaks of the “weight of the epistemic arrogance of the human race” (Taleb, 2007, p.17).
As we gather more data, and the number of variables that may impact student retention and success increases, we look for linear relationships, forgetting that “linear relationships are truly the exception” (Taleb, 2007, p.89). Humans, and specifically higher education administrators, are “explanation-seeking animals who tend to think that everything has an identifiable cause and grab the most apparent one as the explanation” (Taleb, 2007, p.119). Taleb does not suggest that there are no causal links between variables, but he does suggest that we should use the word ‘because’ “sparingly” and “with care” (2007, p.120). While there is a need to plan and strategize after making sense of a highly dynamic higher education landscape where flux is the new normal, we should be wary of the danger of epistemic arrogance, which “bears a double effect: we overestimate what we know, and underestimate uncertainty” (Taleb, 2007, p.140). Knowing the limitations of our predictions and knowledge does not result in paralysis and an inability to plan and predict, but rather allows us to “…plan while bearing in mind such limitations. It just takes guts” (Taleb, 2007, p.157; emphasis added).
Juxtaposed to epistemic arrogance and the “pretence of knowledge”, Taleb proposes “epistemic humility”, which he describes as follows: “Think of someone heavily introspective, tortured by the awareness of his (sic) own ignorance. He lacks the courage of the idiot, yet has the rare guts to say ‘I don’t know’. He does not mind looking like a fool or, worse, an ignoramus. He will not commit, and he agonizes over the consequences of being wrong. … This does not necessarily mean that he lacks confidence, only that he holds his own knowledge to be suspect” (Taleb, 2007, p.190).
In closing: I started this post by stating that it is not a good time in higher education to say “I don’t know.” As “explanation-seeking animals” we constantly look for causal links in the belief that, somehow, life and the lives of our students are more predictable than they really are. We harvest more and more data, looking for causes and links in our endeavour to manage students’ learning more effectively. We re-engineer big processes, make big scientific claims based on big data, and celebrate the potential of big data mining and analysis in a continuous ritual of epistemic arrogance. We state our claims with the courage of the idiot and banish those who dare to question, or who hold their own (and others’) knowledge to be suspect.
A good place to start realising the huge potential of learning analytics in higher and distance education is with a strong dose of “epistemic humility.”
[Image retrieved from http://www.dataminingtechniques.net/, 3 September 2012]
Booth, M. (2012). Learning analytics: the new black. EDUCAUSE Review, July/August, 52-53.
Crow, M. M. (2012). No more excuses. EDUCAUSE Review, July/August, 14-22.
Oblinger, D.G. (2012). Let’s talk analytics. EDUCAUSE Review, July/August, 10-13.
Taleb, N. N. (2007). The black swan: the impact of the highly improbable. London, UK: Penguin Books.
Wagner, E., & Ice, P. (2012). Data changes everything: delivering on the promise of learning analytics in higher education. EDUCAUSE Review, July/August, 33-42.