Humans and Learning Analytics

Some time ago, when we were just starting out on our Learning Analytics journey, one article in particular helped kick-start our thinking: "Academic Analytics" by John Campbell and Diana Oblinger (Campbell, Oblinger, & DeBlois, 2007). At the time we were tinkering in the backend of our then Blackboard database, looking for correlations between students' online behavior and their resulting grades. The article made a lot of sense, as it spoke about how the plethora of data collected by universities could be better harnessed to improve learning and teaching. Looking back at the paper now, however, there are some significant contrasts between it and what we are now thinking.

Note that this is not a criticism of the paper; it simply shows how much our thinking has shifted in the seven years since it was written. The paper represented the thinking of the time, which was heavily influenced by business intelligence principles borrowed from industry. I recently re-read the "Institutional Benefits" section, which talks about "A new approach to making decisions," and the following paragraph caught my eye:

“At its simplest level, decision making can be based on intuition – an individual can draw conclusions based on accumulated experience, without specific data or analysis. In higher education many institutional decisions are too important to be based on intuition, anecdote, or presumption; critical decisions require facts and the testing of possible solutions.”

There are some subtle assumptions here that link to my research interests around learning analytics that form the remainder of this post.

There seems to be an assumption that the analysis of data affords better decision-making than accumulated experience and intuition. This appears to be an accepted notion in the business intelligence world, where decision-making models are set in mostly linear environments (Mika): a classic sense-making approach suited to relatively simple, stable environments. However, we have shown (Beer, Jones, & Clark, 2012), and indeed it has been known for some time (Mason, 2008a, 2008b), that university learning and teaching environments are not simple, stable systems.

In complex environments, the interactions of agents within the system make predictive modeling and decision-making difficult except at the most abstract levels. This limits the contribution that retrospective data analysis can have in a complex system. It is an easy trap to fall into as complex systems display coherence in retrospect; coherence that is not evident in real-time (Kurtz & Snowden, 2003).

To go one step further, there is also an assumption that data analysis will lead to rational decisions being made from the analysis. It is assumed that when faced with a choice, human beings will choose rationally. In reality, people's sense-making is tied to their pre-existing cognitive frameworks, based on internal mental models of the environment (Endsley, 2001). The objectivity and rationality sit not with the choice they make, but with the process of trying to match what is sensed to parameters acquired through experience (deMattos, Miller, & Park, 2012). Humans are not rational by definition, especially when faced with complex and often ambiguous data (Simon, 1955). I also suspect that this issue of human objectivity and rationality applies to the actual analysis of the data in the first place.

To me, this is but one aspect where current learning analytics research is lacking: people. People are at the center of learning analytics. It is people who decide what information they need; it is people who decide what the best representation of the data is in their context; it is people who need to take action as a result of what the data represents; and it is people who find that their information needs change as a result of the information/action cycle. When I'm reading analytics papers and articles, I see a lot of references to data, information and knowledge, but not many that integrate human cognition and action into learning analytics. Or am I just looking in the wrong places?

Beer, C., Jones, D., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. Paper presented at ASCILITE 2012: Future challenges, sustainable futures, Wellington, New Zealand.

Campbell, J. P., Oblinger, D. G., & DeBlois, P. B. (2007). Academic Analytics. Educause Review (July/August 2007).

deMattos, P. C., Miller, D. M., & Park, E. H. (2012). Decision making in trauma centers from the standpoint of complex adaptive systems. Management Decision, 50(9), 1549-1569. doi:10.1108/00251741211266688

Endsley, M. R. (2001). Designing for situation awareness in complex systems. Paper presented at the Proceedings of the Second International Workshop on symbiosis of humans, artifacts and environment.

Kurtz, C. F., & Snowden, D. J. (2003). The new dynamics of strategy: Sense-making in a complex and complicated world. IBM Systems Journal, 42(3), 462-483.

Mason, M. (2008a). Complexity theory and the philosophy of education. Educational Philosophy and Theory, 40(1), 15. doi:10.1111/j.1469-5812.2007.00412.x

Mason, M. (2008b). What is complexity theory and what are its implications for educational change? Educational Philosophy and Theory, 40(1), 35-49.

Mika, A. Multi-ontology, sense-making and the emergence of the future. Futures, 41, 279-283. doi:10.1016/j.futures.2008.11.017

Simon, H. A. (1955). A behavioral model of rational choice. The Quarterly Journal of Economics, 69(1), 99-118.

