Seeing Jesus in toast: Irreverent ideas on some of the claims pertaining to learning analytics

“In 2004 Diane Duyser sold a decade-old grilled cheese sandwich that bore a striking resemblance to the Virgin Mary. She got $28,000 for it on eBay” (Matter, 2014), and in 2009 Linda Lowe found an image of Jesus staring at her from a piece of toast. The phenomenon is called pareidolia and has been explained as completely natural, and all too human (Liu, Li, Feng, Li, Tian, & Lee, 2014; Tanne, 2014). Pareidolia is described as a “psychological phenomenon involving a stimulus (an image or a sound) wherein the mind perceives a familiar pattern of something where none actually exists.”

In the world of data and predictive analysis, a closely related phenomenon is called apophenia – seeing patterns where none actually exist (boyd & Crawford, 2012). In the context of higher education, where we have access to ever increasing volumes, velocity and variety of student digital data, apophenia is an uncomfortable companion in the analysis of student data. Harvesting and combining student data from disparate sources opens up the opportunity, and the risk, of inferring relations unthinkable ten years ago.

As we engage with an ever increasing amount and scope of student data, we may be tempted to rush to look for patterns without considering our own assumptions and epistemologies, how we select our data, how we slice and dice it, how we clean our data sets, and how we deal with the often uncomfortable silences in our data and analysis. We may be tempted to make claims about the quality and impact of student engagement based on the number of clicks, participation in online discussion forums, and the number of downloads of resources. From this evaluation of the depth and quality of their engagement (often based on the quantification of our definition of ‘engagement’) we then design personalised assessments, curricula and the allocation of resources. Increasingly, our predictive analyses also utilise the immense speed and scope of algorithmic decision-making, destining students for a learning journey over which they have very little control or insight into its machinations.

It may be worthwhile to heed the words of Silver (2012), who warns that in noisy systems with underdeveloped theory there is a real danger of mistaking the noise for a signal, not realising that the noise pollutes our data with false alarms, “setting back our ability to understand how the system really works” (p. 162). In a context where our abilities to harvest and analyse student data may outpace not only our regulatory frameworks but, more importantly, our theoretical and ethical dispositions, the latest issue of the Journal of Learning Analytics comes as a welcome relief.

In this issue the relationship between learning analytics and theory is recognised as important (“Why theory matters more than ever in the age of big data”) and this relationship is explored in contributions such as “Theory-led design of instruments and representations of learning analytics” and “Beyond time-on-task: The relationship between spaced study and certification in MOOCs.” Other contributions include, inter alia, “Does seeing one another’s gaze affect group dialogue? A computational approach” and “Learning analytics to support teachers during synchronous CSCL: Balancing between overview and overload.”

As the amount, velocity and variety of student data increase, so will the noise and the potential to see patterns that either do not exist, or that do not contribute to understanding student success as the result of increasingly messy, complex and non-linear interactions between context, students and institutions at the intersection of curricula, pedagogies, assessment and institutional (in)efficiencies and operations. We therefore need to slow down our conversations (e.g. Selwyn, 2014) on the potential of learning analytics and create spaces to think critically about the epistemologies and ontologies that shape our harvesting and analyses of student data.
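The risk of finding patterns in pure noise grows mechanically with the number of metrics we harvest. A small, purely illustrative simulation makes this concrete (the cohort size, the metric count and all the “data” here are invented, not drawn from any real learning analytics system): generate a random “success” outcome and a few hundred random “engagement” metrics, and the strongest correlation among them will often look convincingly like signal.

```python
import random
import statistics

random.seed(42)

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

n_students = 50    # hypothetical cohort size
n_metrics = 200    # clicks, downloads, logins, ... all pure noise here

# An outcome ("student success") that is, by construction, pure chance
outcome = [random.gauss(0, 1) for _ in range(n_students)]

# 200 "engagement metrics" that are also pure chance, unrelated to the outcome
metrics = [[random.gauss(0, 1) for _ in range(n_students)]
           for _ in range(n_metrics)]

# The best-looking metric is apophenia in action: noise posing as signal
best = max(abs(pearson(m, outcome)) for m in metrics)
print(f"Strongest 'signal' among {n_metrics} noise metrics: r = {best:.2f}")
```

With enough metrics, the maximum spurious correlation routinely clears conventional significance thresholds; the pattern is an artefact of the search, not a property of the students.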

The plea to slow down our discussions on, and embrace of, the potential of learning analytics seems out of place in higher education as an increasingly privatised and/or costly commodity, characterised by an obsession with return on investment and just-in-time products delivered by just-in-time labour, aiming to get the products off the shelves as fast as possible. Rather than slowing down, we would like to speed up our search for the signal amidst the noise. There is, however, a danger that we may find Jesus in toast.

Let me state it clearly: there is no doubt in my mind that evidence, and the ethical harvesting and analysis of student data, can and should inform the management of teaching and learning, the development of curricula, and assessment and student support strategies and interventions. My question is not whether we should harvest and analyse student data, but rather how we engage with student data and the search for relationships that matter, in light of the fact that higher education is an open and recursive system. How do we engage with evidence of what works when the evidence does not tell us whether the intervention was appropriate and ethical?

I agree with Biesta (2007, 2010) that the issue is not the usefulness of evidence, but rather how we define evidence, what we include and exclude, acknowledging our assumptions about data, and honesty about the implications of our research design decisions. Current evidence-based decision-making practices favour technocratic modes that assume that “the only relevant research questions are questions about the effectiveness of educational means and techniques, forgetting, among other things, that what counts as ‘effective’ crucially depends on judgments about what is educationally desirable” (Biesta, 2007, p. 5). We need to understand the limitations of our designs when we explore, in an open, semiotic and recursive system, how interventions work. We need to acknowledge our search for the “magic bullet of causality” (Biesta, 2010, p. 496). “Much talk about ‘what works’ … operates on the assumption of a mechanistic ontology that is actually the exception, not the norm in the domain of human interaction” (Biesta, 2010, p. 497).

There is a real danger that we think of and apply learning analytics as if education were a closed and isolated environment, such as a laboratory setting, where we can limit the number of variables and report on those variables that made a difference. Contrary to such an understanding, it would be safer (and most probably closer to reality) to think of education in terms of the Cynefin framework’s (Snowden & Boone, 2007) distinction between simple, complicated, complex and chaotic environments. In all four environments it is possible to harvest and analyse evidence, but with very different results. I have a strong suspicion that we think of education as simple, or at most complicated, while education is, most probably, complex if not chaotic at times. In complicated environments, Snowden and Boone (2007) suggest, cause-and-effect relationships are discoverable, but there is more than one ‘right’ answer. In complex systems there are no right answers, and though it may be possible to trace correlation, causation becomes almost impossible to prove and, more importantly, to replicate.

At many educational conferences, when I listen to reports and evidence on interventions that resulted in an increase in student retention and success, I cannot help but see Jesus smiling at me from a piece of toast.



Biesta, G. (2007). Why “what works” won’t work: evidence-based practice and the democratic deficit in educational research, Educational Theory, 57(1), 1–22. DOI: 10.1111/j.1741-5446.2006.00241.x.

Biesta, G. (2010). Why ‘what works’ still won’t work: from evidence-based education to value-based education, Studies in Philosophy of Education, 29, 491–503. DOI 10.1007/s11217-010-9191-x.

boyd, D., & Crawford, K. (2012). Critical questions for big data: Provocations for a cultural, technological, and scholarly phenomenon. Information, communication & society, 15(5), 662-679.

Liu, J., Li, J., Feng, L., Li, L., Tian, J., & Lee, K. (2014). Seeing Jesus in toast: Neural and behavioral correlates of face pareidolia. Cortex, 53, 60-77.

Selwyn, N. (2014). Distrusting educational technology: Critical questions for changing times. New York, NY: Routledge.

Silver, N. (2012). The signal and the noise: Why most predictions fail – but some don’t. New York, NY: Penguin Press.

Snowden, D. J., & Boone, M. E. (2007). A leader’s framework for decision making. Harvard Business Review, 85(11).

Tanne, J. H. (2014). Seeing Jesus in a piece of toast and other scientific discoveries win Ig Nobel awards. BMJ, 349, g5764.

About opendistanceteachingandlearning

Research professor in Open Distance and E-Learning (ODeL) at the University of South Africa (Unisa). Interested in teaching and learning in networked and open distance and e-learning environments. I blog in my personal capacity and the views expressed in the blog do not reflect or represent the views of my employer, the University of South Africa (Unisa).
