Learning analytics is still an emerging concept in that it lacks a broadly accepted definition. Johnson et al. (2013) define learning analytics as the collection and analysis of data in educational settings in order to inform decision-making and improve learning and teaching. Siemens and Long (2011) describe learning analytics as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” (p. 34). Others have said that learning analytics provides data related to students’ interactions with learning materials, which can inform pedagogically sound decisions about learning design (Fisher, Whale, & Valenzuela, 2012). Suffice to say that there is uncertainty around exactly what learning analytics is. I also think it’s safe to say that there is uncertainty around how learning analytics can contribute to decision-making at the various levels of the institution (unit, program, faculty, institution).
In thinking about this I decided to look a little closer at uncertainty and how it influences decision-making. Firstly, what is uncertainty? Lipshitz and Strauss (1997) identify three main conceptualizations of uncertainty:
- Inadequate understanding
- Undifferentiated alternatives
- Lack of information
They go on to describe five strategies people use to manage uncertainty:
- Reduction
- Assumption-based reasoning
- Weighing pros and cons
- Forestalling
- Suppression
They presented data on how decision-makers dealt with each form of uncertainty, and I thought the results were interesting:
- Inadequate understanding was primarily managed by reduction
- Incomplete information was primarily managed by assumption-based reasoning
- Conflict among alternatives was primarily managed by weighing pros and cons
- Forestalling was equally likely to be used as a backup with all forms of uncertainty
So what does this have to do with learning analytics?
The tools and processes are only the first step for learning analytics; integrating these tools and processes into the practices of learning and teaching is far more difficult (Elias, 2011). This is on top of the enormous complexity inherent in learning and teaching (Beer, Jones, & Clark, 2012). Seems like a great big pile of uncertainty to me. I wonder how inadequate understanding and incomplete information will influence the management of learning analytics implementations?
Beer, C., Jones, D., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. Paper presented at ASCILITE 2012: Future Challenges, Sustainable Futures, Wellington, New Zealand.
Elias, T. (2011). Learning analytics: Definitions, processes and potential. Learning, 23, 134-148.
Fisher, J., Whale, S., & Valenzuela, F.-R. (2012). Learning analytics: A bottom-up approach to enhancing and evaluating students’ online learning (p. 18). University of New England: Office for Learning and Teaching.
Johnson, L., Adams, S., Cummins, M., Freeman, A., Ifenthaler, D., & Vardaxis, N. (2013). Technology Outlook for Australian Tertiary Education 2013-2018: An NMC Horizon Report Regional Analysis. New Media Consortium.
Lipshitz, R., & Strauss, O. (1997). Coping with uncertainty: A naturalistic decision-making analysis. Organizational Behavior and Human Decision Processes, 69(2), 149-163.
Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. Educause Review, 46(5), 9. Retrieved from http://www.educause.edu/ero/article/penetrating-fog-analytics-learning-and-education