The following post airs something that I am thinking about with regard to my PhD. It is some early thinking, so apologies in advance if it turns out to be nonsense.
Something that has concerned me for a while about learning analytics is the use of terms like ‘prediction’ and ‘understanding’. For example:
“Research in learning analytics and its closely related field of education data mining, has demonstrated much potential for understanding and optimizing the learning process”
(Siemens & Baker, 2012)
The Society for Learning Analytics Research defines Learning Analytics as: “… the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs”
(Siemens & Baker, 2012).
“… learning analytics (LA) offers the capacity to investigate the rising tide of learner data with the goal of understanding the activities and behaviors associated with effective learning”
(Macfadyen, Dawson, Pardo, & Gasevic, 2014)
“The intention is to develop models, algorithms, and processes that can be widely used. Transferability is a key factor here; analytic and predictive models need to be reliable and valid at a scale beyond the individual course or cohort”
(Ferguson et al., 2014)
“Learning analytics can penetrate the fog of uncertainty around how to allocate resources, develop competitive advantages, and most important, improve the quality and value of the learning experience” (Siemens & Long, 2011)
“Learning analytics is a technology on the rise in higher education. Adapted from business analytics, this approach is being used to track and predict student engagement and success in higher education.” (Lodge, 2011)
“There is evidence to suggest that learning analytics can be successfully used to predict students at risk of failing or withdrawing and allow for the provision of just in time intervention.”
“While the use of learning analytics to track and predict student success in higher education is rapidly becoming mainstream practice in higher education institutions, it is predominantly being used to predict and prevent student attrition.”
(Lodge & Lewis, 2012)
“Sophisticated systems might even recommend learning activities, predict a student’s success or give advice”
(Dyckhoff, Lukarov, Muslim, Chatti, & Schroeder, 2013)
“… much of the early work has focused on the statistical prediction of outcomes, chiefly grades or retention, by relating these target variables to predictor variables harvested from students’ demographic and institutional variables, and their interaction with the LMS”
These quotes are from a two-minute search through my Endnote library for terms like ‘prediction’ or ‘understanding’. There are a lot more, but you get the point. The trouble is that I am becoming increasingly skeptical about the ability of learning analytics to contribute to prediction, or even understanding. What follows is an attempt to explain this skepticism, starting from the view that learning analytics is data or information arising from interactions occurring within and between complex adaptive systems (Beer, Jones, & Clark, 2012).
Broadly speaking, there are assumptions inherent in terms like ‘prediction’ and ‘understanding’. They both, to some extent, assume that certainty and full knowledge can be attained, and that the agents and systems involved are fixed and will not evolve (Allen & Boulton, 2011). Likewise, prediction is based on a snapshot in time and cannot capture the impact of interactions between the agents and systems after the snapshot is taken. The snapshot is essentially only a view of something that is in transition (Allen & Boulton, 2011). There are also assumptions related to the stability, immovability or changelessness of:
- The initial system’s starting state
- The mechanisms that link the variables
- The internal responses inside each agent or element
- The system’s environment
Prediction only becomes possible when the system or agent is isolated from external influences; something that can only ever occur under laboratory conditions (Allen & Boulton, 2011). Uncertainty is always present when we are talking about complex systems. The only way we can banish uncertainty is by making simplifying assumptions, and I am not sure those assumptions can hold when we are talking about systems with many agents interacting in non-linear ways. For learning analytics, the power of prediction implicit in deterministic models can only be realised if the assumptions made in their creation are true (Allen & Boulton, 2011). My sense is that there are many people looking closely at the predictive potential of learning analytics, myself included. I am beginning to question why, especially when prediction, control and complete understanding are always an illusion, except in exceptional, controlled, closed and fixed situations (Allen & Boulton, 2011).
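The point about snapshots and sensitivity to starting conditions can be made concrete with a standard toy example from complexity science (not learning-analytics data; the logistic map and the parameter values below are my illustration, not anything from the papers cited). A fully deterministic, fully known one-variable system still defeats prediction once measurement of the starting state is even slightly imprecise:

```python
# Illustration: the logistic map x -> r * x * (1 - x), a deterministic
# system with no hidden variables. Two trajectories whose starting
# 'snapshots' differ by one part in a billion diverge completely within
# a few dozen steps, so finite-precision measurement of the initial
# state destroys long-run predictability.

def logistic_trajectory(x0, r=3.9, steps=50):
    """Iterate the logistic map from x0, returning every state."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000000)
b = logistic_trajectory(0.400000001)  # the 'same' snapshot, to 9 decimals

# The gap between the two runs grows from negligible to order-one.
print(f"gap at step 0:  {abs(a[0] - b[0]):.2e}")
print(f"largest gap:    {max(abs(x - y) for x, y in zip(a, b)):.2e}")
```

If even this closed, fixed, one-equation system resists prediction from a snapshot, the odds look worse for open systems of many interacting, adapting agents, which is what the quotes above implicitly claim to predict.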
I am speculating, but I wonder how much of the predictive potential we perceive in learning analytics exists because we have constrained the system. For example, most universities have a single learning management system, a single student information system, and some form of homogenised approach to course design and delivery, and so on. Have we suppressed the evolutionary potential to such an extent that we have created an environment that makes prediction and understanding possible?
My quick scan of the Endnote library also revealed a couple of articles that fit with some of the things I have mentioned here. The following is from Doug Clow:
“Learning analytics is widely seen as entailing a feedback loop, where ‘actionable intelligence’ is produced from data about learners and their contexts, and interventions are made with the aim of improving learning”
I intend to explore this paper further for a couple of reasons. It seems to align nicely with my thinking around the applicability of situation awareness, and there also appears to be an attempt, albeit a limited one, at distributed cognition with regard to the operationalisation of learning analytics. And the following from Jason Lodge has some real gems that I need to look at more closely than I have to date:
“LA is unable to elucidate the student approach to learning, relationships between apparent levels of engagement online and overall student experiences, and is therefore limited as a measure of the process and pathways students may undertake to complete their learning, let alone for higher cognitive processes or ways of being” (Lodge & Lewis, 2012)
Allen, P., & Boulton, J. (2011). Complexity and limits to knowledge: The importance of uncertainty. The SAGE handbook of complexity and management, 164-181.
Beer, C., Jones, D., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. Paper presented at the ASCILITE2012 Future challenges, sustainable futures, Wellington, New Zealand.
Clow, D. (2014). Data wranglers: human interpreters to help close the feedback loop. Paper presented at the Proceedings of the Fourth International Conference on Learning Analytics And Knowledge.
Dyckhoff, A. L., Lukarov, V., Muslim, A., Chatti, M. A., & Schroeder, U. (2013). Supporting action research with learning analytics. Paper presented at the Proceedings of the Third International Conference on Learning Analytics and Knowledge.
Ferguson, R., Clow, D., Macfadyen, L., Essa, A., Dawson, S., & Alexander, S. (2014). Setting learning analytics in context: overcoming the barriers to large-scale adoption. Paper presented at the Proceedings of the Fourth International Conference on Learning Analytics And Knowledge.
Lodge, J. (2011). What if student attrition was treated like an illness? An epidemiological model for learning analytics. Paper presented at the ASCILITE – Australian Society for Computers in Learning in Tertiary Education Annual Conference 2011. http://www.editlib.org/p/43627
Lodge, J., & Lewis, M. (2012). Pigeon pecks and mouse clicks: Putting the learning back into learning analytics. Paper presented at ASCILITE 2012, Wellington, New Zealand.
Macfadyen, L. P., Dawson, S., Pardo, A., & Gasevic, D. (2014). Big Data in Complex Educational Systems: The Learning Analytics Imperative and the Policy Challenge. Research & Practice in Assessment, 9(Winter, 2014), 11. Retrieved from http://go.galegroup.com/ps/i.do?action=interpret&v=2.1&u=cqu&it=JIourl&issn=2161-4210&authCount=1&p=AONE&sw=w&selfRedirect=true
Rogers, T. (2015). Critical realism and learning analytics research: epistemological implications of an ontological foundation. Paper presented at the Proceedings of the Fifth International Conference on Learning Analytics And Knowledge, Poughkeepsie, New York.
Siemens, G., & Baker, R. S. J. d. (2012). Learning analytics and educational data mining: towards communication and collaboration. Paper presented at the Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, Vancouver, British Columbia, Canada. http://delivery.acm.org/10.1145/2340000/2330661/p252-siemens.pdf?ip=18.104.22.168&acc=ACTIVE SERVICE&CFID=145100934&CFTOKEN=10069569&__acm__=1345679404_18f65a315d7b4ba9014a8f150ad6189c
Siemens, G., & Long, P. (2011). Penetrating the Fog: Analytics in Learning and Education. Educause Review, 46(5), 9. Retrieved from http://www.educause.edu/ero/article/penetrating-fog-analytics-learning-and-education