Are we to blame?

This post is an exercise in procrastination brought about by late-afternoon writer’s block. I’m currently preparing a paper that examines, through a theoretical lens, a rare example of an institution-wide learning analytics implementation. The purpose of the paper is to contribute to the theoretical understanding of learning analytics implementation and to represent this as an initial set of design principles or heuristics for practitioners. However, I am noticing a fascinating irony in the research, an irony that is further reinforced by my experience with an enterprise-wide learning analytics implementation over the last five years.

Basically, I suspect the biggest barrier to organisation-wide learning analytics implementation is the organisation itself. I’m reminded of the oft-cited quote:

“We have met the enemy and he is us”

(https://en.wikipedia.org/wiki/Pogo_(comic_strip))

Learning analytics, and particularly its implementation, seems to me to sit in an organisational no-man’s-land. Because it involves data and employs some information technology, it is far too often pushed in the direction of the IT department. But we know that this doesn’t work:

“All too frequently, LA is conceptualised as a technical solution to an education problem. As such oversight and management of LA is assigned to core administrative units (e.g. IT units) who establish the various rules and regulations guiding access to the data and adoption of the technologies”

(Dawson et al., 2018)

There seems to be a problem whereby we struggle to link what we know theoretically about learning analytics implementation with how we actually approach implementation. For example:

  • We know that learning analytics is, by and large, an applied research field (Dawson, Gašević, Siemens, & Joksimovic, 2014)
  • We know that learning analytics is a multidisciplinary concept (Dawson et al., 2018)
  • We know that one-size-fits-all approaches to learning analytics are fundamentally flawed (Gašević, Dawson, Rogers, & Gasevic, 2016)
  • We know that our organizations are mired in technical, social and cultural challenges when it comes to learning analytics adoption (Macfadyen & Dawson, 2012)

So if we know these things as a sector, why does the gap between our research-based knowledge of learning analytics and our knowledge of how to implement learning analytics continue to grow (Colvin, Dawson, Wade, & Gasevic, 2017), and why do we see these mistakes repeated? I am wondering just how much of an impact our organisational structures and arrangements have on something like learning analytics, which needs to span our internal organisational boundaries. I also wonder whether my anecdotal knowledge of the politicking that happens with learning analytics implementation across the sector is somehow linked with this apparent homelessness.

If we are struggling to apply what we know when we conceptualise our learning analytics implementations, it follows that we will struggle to implement an approach that favours learning and adaptation; something that is needed while learning analytics remains generally undertheorised (Dawson, Mirriahi, & Gasevic, 2015).

Just a thought.

References

Colvin, C., Dawson, S., Wade, A., & Gasevic, D. (2017). Addressing the Challenges of Institutional Adoption. In C. Lang, G. Siemens, A. Wise, & D. Gasevic (Eds.), Handbook of Learning Analytics (Vol. 1, pp. 281-289). Australia: Society for Learning Analytics Research.

Dawson, S., Gašević, D., Siemens, G., & Joksimovic, S. (2014). Current state and future trends: A citation network analysis of the learning analytics field. Paper presented at the Proceedings of the fourth international conference on learning analytics and knowledge.

Dawson, S., Poquet, O., Colvin, C., Rogers, T., Pardo, A., & Gasevic, D. (2018). Rethinking learning analytics adoption through complexity leadership theory. Paper presented at the Proceedings of the 8th International Conference on Learning Analytics and Knowledge.

Gašević, D., Dawson, S., Rogers, T., & Gasevic, D. (2016). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success. The Internet and Higher Education, 28, 68-84.

Macfadyen, L. P., & Dawson, S. (2012). Numbers Are Not Enough. Why e-Learning Analytics Failed to Inform an Institutional Strategic Plan. Journal of Educational Technology & Society, 15(3), 149-163.

 


The tensions around learning analytics

This post captures some thinking around my PhD, prompted by conversations and presentations at this year’s wonderful ALASI2018 conference, held recently in Melbourne.

In a recent post, I mentioned that technology, or solutions to problems, do not just spring into existence. They emerge from a complex network of interactions between people and technology (Hannon, 2013; Introna, 1996, 2007). This is especially true of learning analytics, which aims to help some people better understand other people and their learning environments, using some technology and what is, at best, incomplete data. So learning analytics does not just spring into existence; it results from cycles of interaction between people, technology (data), and educational objects (units, assessments, students, staff, problems and so on). A real example might help explain this better:

Some colleagues and I have been researching and tinkering with learning analytics since 2007. We noted that the information systems available to our teaching staff were woefully inadequate at providing the information and affordances that staff needed, when they needed it, during their day-to-day activities. Our university has high proportions of online, low-socioeconomic and first-in-family students, so our initial focus around this information and affordance deficit was on how to provide teachers with an evidence-informed approach to interacting with their students. We also wanted to make it much easier for teachers to access engagement and performance information about their students during the term, when it could be acted upon. This led to a series of research-development-feedback cycles that resulted in the EASICONNECT system back in 2014.

In the following video, Damo explains a little about EASICONNECT:

[Video: EASICONNECT overview]

The EASI system is a basic risk-and-intervention style of learning analytics that allows academic staff to view their students’ activity and performance, organised around an indicator of success for each student. Staff can “nudge” (mail-merge) their students from the same web page, a facility that has proven very popular with teaching staff. To date, teachers have delivered 1,077,732 EASI nudges, reaching 88.9% of all our HE students. While only 63% of units use EASI, and only 40% use it to nudge their students, that 40% of units accounts for 88.9% of all our students (perhaps pointing to EASI’s utility for teachers with larger undergraduate classes). We also noted a significant increase in activity, on average, by students who were sent an engagement nudge. EASI was not developed by a vendor; it was developed locally using an incremental, user-centred approach.
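
To make the “indicator plus nudge” idea concrete, here is a minimal, hypothetical sketch in Python. The data fields, thresholds and weighting are my own inventions for illustration only; they are not EASI’s actual data model or success indicator.

```python
# A minimal, hypothetical sketch of an "indicator plus nudge" workflow of the kind
# described above. The field names, thresholds and weighting are invented for
# illustration; they are NOT EASI's actual data model or success indicator.
from dataclasses import dataclass


@dataclass
class StudentActivity:
    student_id: str
    email: str
    logins_last_week: int       # LMS logins during the reporting window
    assessments_submitted: int  # assessment items submitted so far
    assessments_due: int        # assessment items due so far


def success_indicator(s: StudentActivity) -> float:
    """A crude 0-1 indicator combining engagement and submission behaviour."""
    engagement = min(s.logins_last_week / 5, 1.0)  # assume ~5 logins/week counts as engaged
    submission = (s.assessments_submitted / s.assessments_due) if s.assessments_due else 1.0
    return 0.5 * engagement + 0.5 * submission


def nudge_candidates(cohort, threshold=0.4):
    """Students below the threshold become candidates for a teacher-authored nudge."""
    return [s for s in cohort if success_indicator(s) < threshold]


if __name__ == "__main__":
    cohort = [
        StudentActivity("s001", "s001@example.edu", 0, 0, 1),
        StudentActivity("s002", "s002@example.edu", 6, 1, 1),
    ]
    for s in nudge_candidates(cohort):
        # In a tool like EASI the teacher reviews and personalises the message before
        # it is mail-merged; the print here simply stands in for that step.
        print(f"Nudge candidate: {s.student_id} (indicator = {success_indicator(s):.2f})")
```

The point of the sketch is the shape of the workflow rather than the model: a simple, transparent indicator per student, surfaced alongside a low-friction way for the teacher to act on it during the term.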

However, the point is that EASI did not just spring into existence back in 2014. It was the result of a whole range of interactions and adaptations that preceded the formal implementation. Before and during EASI’s implementation, the developers were conducting what I would retrospectively suggest were cycles of interaction between people, technology and an education problem, as per the following diagram.


Figure 1. Interaction Cycles

One of the workshops I attended at ALASI2018 discussed complexity leadership and the tension that exists between administrative/mainstream systems, processes and mindsets, and adaptive systems, processes and mindsets. Innovation and problem solving (the adaptive side) in organisations always develops a tension with established, mainstream ways of doing things, and this is an area where leadership is often lacking. The following diagram builds upon what was shown at the conference.


Figure 2. Adapted from presentation slides by University of South Australia

The key point here is that an area of tension arises when these small-scale innovations or solutions try to scale into the mainstream “business as usual” environment. This is where Siemens, Dawson and Eshleman (2018) say that leadership is required, and complexity leadership provides a model that can help. Knowing that this tension exists and understanding its drivers can potentially help us develop processes and policies that cater for this inevitable tension and allow us to move forward with learning analytics. I think this is a vitally important concept for learning analytics, where one-size-fits-all and single-lens approaches simply don’t work. It also fits with our experience with EASI in this area of tension between bespoke, evidence-informed approaches to learning analytics and the orthodox preference for commercial off-the-shelf, vendor-supplied solutions.

The above diagram also helps articulate the continuum between complicated and complex (Snowden & Boone, 2007). The left-hand side is the domain of management, where managers strive for predictability and order, while risk, change, information flow and diversity are shunned. The right-hand side is the domain of leadership, where leaders are comfortable with change, accepting of failure, and strive to increase diversity and information flow (Freeburg, 2018). I would suggest that while organisations generally need both sides of this diagram, the left side can often override the right. In an era of cascading complexity and change, the adaptations required for prosperity, or even survival, will most likely come from the right side of the diagram (in my opinion).

From the perspective of learning analytics, I believe this diagram helps explain why learning analytics research is accelerating while learning analytics practice slowly pushes through this area of tension. From the perspective of my PhD, how does the meso-level practitioner, operating on the right-hand side, help “coal-face” teachers navigate or work around this middle area of tension and conflict?

References

Freeburg, D. (2018). Leadership and innovation within a complex adaptive system: Public libraries. Journal of Librarianship and Information Science. doi:10.1177/0961000618810367

Hannon, J. (2013). Incommensurate practices: Sociomaterial entanglements of learning technology implementation. Journal of Computer Assisted Learning, 29, 168-178.

Introna, L. D. (1996). Notes on ateleological information systems development. Information Technology & People, 9(4), 20-39.

Introna, L. D. (2007). Maintaining the reversibility of foldings: Making the ethics (politics) of information technology visible. Ethics and Information Technology, 9(1), 11-25.

Siemens, G., Dawson, S., & Eshleman, K. (2018). Complexity: A leader’s framework for understanding and managing change in higher education. Educause Review, 53(6), 27-42. Retrieved from https://er.educause.edu/articles/2018/10/complexity-a-leaders-framework-for-understanding-and-managing-change-in-higher-education

Snowden, D., & Boone, M. E. (2007). A Leader’s Framework for Decision Making. Harvard Business Review, 85(11), 68-76.

 

 

 

A little more on the single lens problem

In my previous post I talked about how a singular view and unilateral approach to learning analytics is actually a misinterpretation of the underlying system, and indeed of the concept of learning analytics itself. Universities tend to invest in learning and teaching systems based on orthodox, hierarchical and normative approaches modelled on the business world (Hannon, 2013). This business model, known as a “planning approach”, aims to achieve managerially planned organisational change (Hannon, 2013). Technological change projects employing this approach are reported to have failure rates of up to 70% in the business world, with comparable results in universities (Alvesson & Sveningsson, 2015; Malikowski, Thompson, & Theis, 2007). Note that failure does not mean that the systems are malfunctioning; it means that there is little evidence of significant impact upon teaching practices (Malikowski et al., 2007). This, I believe, is something very important for organisations to consider when it comes to learning analytics implementation.

The tendency for universities to focus on learning analytics from a technology-centred perspective overlooks the “sociomateriality” of technology systems more generally. The sociomaterial approach to practice acknowledges that practice is a complex entanglement of humans and technology (Hannon, 2013). In other words, technology does not miraculously spring into existence; it results from a complex and socially situated design process (Introna, 2007). To me, this sits at the heart of the single-lens issue. If learning analytics is conceptualised and enacted by a single entity, it is only representative of that particular entity’s view of the world, their foci and, more pragmatically, their performance indicators. From an organisational perspective I can see the attraction: here we have this complex and increasingly emotive thing called learning analytics. That’s data and some associated technology, isn’t it? Let’s throw it at that organisational unit over there to make it happen. This is a valid approach if we are talking about a system like finance, payroll or even a student information system, where the unknowns are known and not unknowable. However, learning analytics, at its core, is a decision support system that helps humans better understand themselves, other humans and technology-based learning environments, based on what can only ever be absurdly incomplete data.

So if top-down approaches to learning analytics are not going to work, and bottom-up approaches are incompatible with how our organisations are structured and operationalised, what can we do? In my PhD I am proposing that meso-level practitioners, with their knowledge of the complex learning analytics concept and their local knowledge of the organisation, are a key part of the solution. The meso-level is intermediate between the coal-face and the large-scale policy and institutional processes (Hannon, 2013). Much of what Siemens, Dawson, and Eshleman (2018) talk about in their recent Educause article resonates with me and the meso-level practitioner concept. However, countering the current operating norms and their entrenched mindsets is a monumental challenge. Our mindsets around organisations and how they operate are so entrenched that even voicing alternatives is almost treated as blasphemy. However, I hope that considering learning analytics from other perspectives might help move some of these mindsets and keep learning analytics practice in touch with rapidly advancing learning analytics research.

References

Alvesson, M., & Sveningsson, S. (2015). Changing organizational culture: Cultural change work in progress: Routledge.

Hannon, J. (2013). Incommensurate practices: Sociomaterial entanglements of learning technology implementation. Journal of Computer Assisted Learning, 29, 168-178.

Introna, L. D. (2007). Maintaining the reversibility of foldings: Making the ethics (politics) of information technology visible. Ethics and Information Technology, 9(1), 11-25.

Malikowski, S., Thompson, M., & Theis, J. (2007). A model for research into course management systems: Bridging technology and learning theory. Journal of Educational Computing Research, 36(2). doi:10.2190/1002-1T50-27G2-H3V7

Siemens, G., Dawson, S., & Eshleman, K. (2018). Complexity: A leader’s framework for understanding and managing change in higher education. Educause Review, 53(6), 27 – 42. Retrieved from Educause Review website: https://er.educause.edu/articles/2018/10/complexity-a-leaders-framework-for-understanding-and-managing-change-in-higher-education

 

Learning analytics: The single-lens problem

The following is essentially “thinking out loud” with regards to a particular section of my PhD. Please forgive any over-generalisations, as every organisation’s context and every learning analytics implementation is unique.

As anticipated (Beer & Jones, 2014), hype and faddism do appear to be impacting the ability of learning analytics to make meaningful and sustained contributions to learning and teaching. The hype around a concept like learning analytics, and a fear of missing out, can fuel a rushed approach to implementation by universities. On top of this, Australian public universities are corporate bureaucracies with rigid hierarchical structures (Kenny, 2009; Rowlands, 2013) that can hinder implementation. The strict hierarchical nature of Australian universities is constraining their ability to innovate, which in turn constrains their ability to adapt to emerging challenges such as reduced public funding, increased accountability and growing questions about the future viability of formal university degrees (Siemens, Dawson, & Eshleman, 2018). Irrespective of arguments for and against, this is the operating environment for most, if not all, Australian universities, and part of my PhD is looking at how to innovate with learning analytics within these hierarchical surroundings. The specific problem is that learning analytics is, in essence, multidisciplinary action research, and how we go about that within these hierarchical structures is unknown.

The following uses university IT as an oft-cited example of how university structures and operating norms can impact upon an organisation’s ability to innovate. Almost every facet of learning and teaching, from content delivery to assessment and grading, employs information technology in some capacity. Most universities clearly delineate between technical systems support and maintenance and the people who use those systems to deliver learning and teaching; that is, the IT department is organisationally separate from academic areas, which are separate from student support areas, and so on. Traditionally, IT departments are responsible for the procurement, development, installation, maintenance and operation of all of a university’s IT assets. While this approach to IT systems is appropriate for administrative or deterministic situations, in nondeterministic situations it impedes and frustrates innovation and organisational learning (Deakin-Crick, Huang, Godfrey, Taylor, & Carhart, 2018; Siemens et al., 2018). Given the position proffered by the burgeoning learning analytics literature that learning analytics is not exclusively an IT, business intelligence or academic concept, and that learning and teaching environments have been described by many as extraordinarily complex, I would argue that learning analytics cannot be considered a deterministic or administrative system. In other words, you cannot manage a learning analytics implementation project as you would manage a finance or logistics system implementation.

The traditional notion of “command and control” applied to technology is associated with stability and changelessness. Command and control is less effective where innovation and adaptation are required (Siemens et al., 2018). The administrative burden associated with established, invariant IT systems is prohibitively restrictive when applied to a design-based or experimental project. Strict outcomes-based management does not lead to direct, linear outcomes with these sorts of projects and actually inhibits the flow of information and the opportunities for the cross-fertilisation of ideas (Siemens et al., 2018). The notion that up-front requirements-gathering exercises can be conducted in advance of, and in isolation from, the operating environment is staggeringly flawed in the context of learning analytics as a design-based or action research concept.

The example above highlights just one issue that arises when an organisation considers learning analytics from a purely technological perspective: you end up with a technology-centred design. Technology-centred design is less than useful in nondeterministic situations, as it increases the likelihood of human error, results in sub-optimal human-machine interactions and requires the humans to adapt to the technology (Endsley, 2001, 2016). Given that learning analytics is about understanding and optimising learning and the environments in which it occurs (Siemens & Long, 2011), what is required is human- or user-centred design. User-centred design integrates information in ways that fit the goals, tasks and needs of the human users, and results in reduced errors, improved performance, improved user acceptance and satisfaction, and improved productivity (Endsley, 2016). User-centred design is time-consuming and more difficult to achieve, qualities that can make it less attractive for an organisation in a hurry.

While the above singles out the impact of an overly dominant IT lens being applied to learning analytics, other single-lens conceptualisations of learning analytics are just as fraught. Considering learning analytics using only an IT lens, only a data/BI lens, or only an academic lens (teachers and students) is fundamentally flawed and misinterprets the underlying system and the learning analytics concept. However, our organisational structures, managerial mindsets and operating norms tend to promote this way of doing things. Perhaps this goes some way towards explaining why organisational practice is lagging further and further behind learning analytics research (Dawson et al., 2018).

References
Beer, C., & Jones, D. T. (2014). Three paths for learning analytics and beyond: Moving from rhetoric to reality. Paper presented at the Australasian Society for Computers in Learning in Tertiary Education: Rhetoric and Reality, Dunedin, New Zealand. Conference Publication retrieved from http://ascilite2014.otago.ac.nz/files/fullpapers/185-Beer.pdf

Dawson, S., Poquet, O., Colvin, C., Rogers, T., Pardo, A., & Gasevic, D. (2018). Rethinking learning analytics adoption through complexity leadership theory. Paper presented at the Proceedings of the 8th International Conference on Learning Analytics and Knowledge.

Deakin-Crick, R., Huang, S., Godfrey, P., Taylor, C., & Carhart, N. (2018). Learning Journeys and Infrastructure Services: a game changer for effectiveness.

Endsley, M. R. (2001). Designing for situation awareness in complex systems. Paper presented at the Proceedings of the Second International Workshop on symbiosis of humans, artifacts and environment.

Endsley, M. R. (2016). Designing for situation awareness: An approach to user-centered design. Boca Raton, FL: CRC press.

Kenny, J. D. (2009). Managing a modern university: Is it time for a rethink? Higher Education Research and Development, 28, 629-642. doi:10.1080/07294360903206934

Rowlands, J. (2013). Academic boards: Less intellectual and more academic capital in higher education governance? Studies in Higher Education, 38, 1274-1289. doi:10.1080/03075079.2011.619655

Siemens, G., Dawson, S., & Eshleman, K. (2018). Complexity: A leader’s framework for understanding and managing change in higher education. Educause Review, 53(6), 27 – 42. Retrieved from Educause Review website: https://er.educause.edu/articles/2018/10/complexity-a-leaders-framework-for-understanding-and-managing-change-in-higher-education

Siemens, G., & Long, P. (2011). Penetrating the Fog: Analytics in Learning and Education. Educause Review, 46(5), 9.

Learning analytics implementation struggles

In a previous post I talked a little about how universities (well, their leaders anyway) are attracted to the predictive potential of learning analytics, and how this approach is fundamentally flawed. In a subsequent post I suggested that situation awareness can provide a useful theoretical platform that may help universities broaden their learning analytics implementation approaches beyond prediction-based endeavours. This new post has been inspired by a couple of fascinating LAK18 papers that talk about learning analytics implementation.

Universities in Australia generally fall into two camps based on their leadership approach to learning analytics implementation: top-down and bottom-up (Dawson et al., 2018).

Top-down or instrumental approaches to learning analytics adoption are often based on preconceived recipes or prescribed methodologies and are all too often doomed to failure (Dawson et al., 2018). The top-down implementation of analytics-related technology is the easy part; getting uptake and generating impact based on how, and if, people use the technology is where these systems fail (Marchand & Peppard, 2013). In other words, these systems are easy to install but are unlikely to generate the desired uptake and impact. It is well known that top-down implementation approaches are “less than ideal” when it comes to learning analytics (Beer & Jones, 2014, p. 244).

Bottom-up or emergent approaches to learning analytics take a much more consultative approach and usually begin on a small scale (Dawson et al., 2018). However, bottom-up approaches are difficult to scale beyond the local context to “a more holistic and complex organisational level” (Dawson et al., 2018, p. 236). So while the bottom-up approach might meet the needs of a small number of learning and teaching contexts, it may fail to scale beyond these due to the diversity of contexts found in a typical university.

As LA research continues to grow there is a very real danger of a widening gulf between identified research needs and outcomes and applied practice. (Dawson et al., 2018, p.242)

While seemingly discrete, I suspect these two approaches are unfortunately linked. Technology adoption in Australian Higher Education is dominated by vanilla implementations and centralised approaches (D. T. Jones & Clark, 2014). So even if the learning analytics system at an institution has been developed with a bottom-up approach and has a track record of uptake and impact, it may still be perceived as “feral” or “risky” due to its decentralised and perhaps unconventional origins (Spierings, Kerr, & Houghton, 2014).

In talking to various folk at the recent ALASI2017 conference, there seems to be a trend whereby universities want to quickly bring learning analytics to the enterprise. In at least one case that I am aware of, a platform developed using a bottom-up approach is being replaced with a commercial off-the-shelf product that will be implemented using a top-down, centralised approach, in spite of the evidence against such an approach.

Once an innovation such as [learning analytics] achieves a high public profile, it can create an urgency to ‘join the bandwagon’ that swamps deliberative, mindful behavior (Beer & Jones, 2014, p. 243)

There are two things at play here that I’m thinking about with regards to my PhD studies. The first is how learning analytics is being conceptualised by these universities. No matter the university, learning analytics is an applied research project (Dawson et al., 2018) and not an IT project. I suspect that this mistake or misinterpretation also contributes to the high failure rates experienced by analytics-related projects (Marchand & Peppard, 2013). The second is the role of meso-level practitioners and how they can potentially contribute to bridging the gap between the two approaches (Hannon, 2013). Meso-level practitioners assuage the tension between small-scale, local interactions and large-scale policy and institutional processes (C. Jones, Dirckinck‐Holmfeld, & Lindström, 2006; Uhl-Bien, Marion, & McKelvey, 2007).

As a personal aside with regards to LAK18, my application to the doctoral consortium got accepted and I was very much looking forward to attending. Unfortunately, I had to withdraw at the last minute due to a family illness. It is a fantastic event that sees the world’s foremost experts in learning analytics gather to talk and share their stories. I dearly hope I can make LAK19 next year.

References

Beer, C., & Jones, D. T. (2014). Three paths for learning analytics and beyond: Moving from rhetoric to reality. Paper presented at the Australasian Society for Computers in Learning in Tertiary Education: Rhetoric and Reality, Dunedin, New Zealand. Conference Publication retrieved from http://ascilite2014.otago.ac.nz/files/fullpapers/185-Beer.pdf

Dawson, S., Poquet, O., Colvin, C., Rogers, T., Pardo, A., & Gasevic, D. (2018). Rethinking learning analytics adoption through complexity leadership theory. Paper presented at the Proceedings of the 8th International Conference on Learning Analytics and Knowledge.

Hannon, J. (2013). Incommensurate practices: Sociomaterial entanglements of learning technology implementation. Journal of Computer Assisted Learning, 29, 168-178.

Jones, C., Dirckinck‐Holmfeld, L., & Lindström, B. (2006). A relational, indirect, meso-level approach to CSCL design in the next decade. International Journal of Computer-Supported Collaborative Learning, 1(1), 35-56.

Jones, D. T., & Clark, D. (2014). Breaking Bad to Bridge the Reality / Rhetoric Chasm. Paper presented at the ASCILITE2014 Rhetoric and Reality, Dunedin, New Zealand. Conference publication retrieved from http://ascilite2014.otago.ac.nz/

Marchand, D. A., & Peppard, J. (2013). Why IT Fumbles Analytics. Harvard Business Review, 91(1), 104-112.

Spierings, A., Kerr, D., & Houghton, L. (2014). What drives the end user to build a feral information system? Feral Information Systems Development: Managerial Implications, 161-188.

Uhl-Bien, M., Marion, R., & McKelvey, B. (2007). Complexity leadership theory: Shifting leadership from the industrial age to the knowledge era. The Leadership Quarterly, 18, 298-318. doi:10.1016/j.leaqua.2007.04.002

 

Learning analytics is about the here and now

Summary of this post for the time poor
Learning analytics needs to focus less on predicting the future and more on what’s happening right now.

A vexing question
This post extends my previous posts about learning analytics and the limitations of predictive modelling. Knowing that one-size-fits-all approaches to learning analytics are not going to work presents us with a dilemma: how do we approach our learning analytics implementations? There is almost an implied understanding that learning analytics and prediction go hand-in-hand. If this is not the case, what is it that we need to be doing? This post is some early thinking around this question.

Lessons from other industries
Historically, most industries designed and developed their systems from a technology-centred perspective (Endsley, 2016). As the systems became more sophisticated, operators had to cope with an exponential growth in the information those systems made available. People can only pay attention to a certain amount of information at once, and the gap between the volume of data provided by the systems and the operators’ ability to distil the information required for their tasks continued to grow. This was because the design and development of these systems centred around technical considerations rather than a detailed understanding of the operators’ tasks and goals. In industrial and military settings, much of what has been attributed to human error is the direct result of technology-centred designs that are ill-suited to augmenting human performance across the wide range of conditions and circumstances found in real-world environments (Endsley, 2016).

In complex systems, the problem becomes more pronounced as the elements in the system interact and change constantly. Automation that was quite suitable for linear, repetitive tasks breaks down in complex systems. Data and information on what worked previously are no longer useful as the context evolves and adapts. Trying to automate our way out of “human error” in complex environments leads to more complexity, more cognitive load and catastrophic errors (Endsley, 2001, 2016). Complex environments with many interacting and adapting agents are inherently unpredictable, so the utility of retrospective data and information is limited. In these environments, it is the operators’ ability to understand the situation as a whole that forms the basis of sound decision-making. This ability to understand the system as a whole is known as situation awareness (Endsley, 2016).

“Situation awareness is the engine that drives the train for decision making and performance in complex, dynamic systems” (Endsley, 2016).

In complex and dynamic environments, decision-making is highly dependent on situation awareness: a constantly evolving picture of the state of the environment. Situation awareness is goal-oriented, whereby the goals of the job determine the elements within the environment that people need to be aware of. There are three broad levels of situation awareness. The lowest level is the perception of the status, attributes and relevance of elements in the environment. The next level is understanding what the perceived data and cues mean in relation to what the endeavour is trying to achieve. Once a person can perceive the elements in the environment and understand what they mean, they can project what this means into the near future, which is the third level of situation awareness.
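
As a way of grounding the three levels, here is a small, hypothetical Python sketch of how they might map onto a teacher-facing view of a unit. The data elements, comprehension rule and projection heuristic are invented for illustration; they are not drawn from Endsley’s work or from any particular system.

```python
# A hypothetical mapping of the three levels of situation awareness onto a
# teacher-facing view of a unit. The rules below are illustrative only.

def level1_perception(lms_events):
    """Level 1: perceive the relevant elements, e.g. activity counts per student."""
    activity = {}
    for event in lms_events:                      # each event: {"student_id": ...}
        sid = event["student_id"]
        activity[sid] = activity.get(sid, 0) + 1
    return activity

def level2_comprehension(activity, cohort_median):
    """Level 2: interpret the elements against the goal (keeping students engaged)."""
    return {sid: ("below cohort median" if count < cohort_median else "on track")
            for sid, count in activity.items()}

def level3_projection(weekly_activity):
    """Level 3: project into the near future, e.g. flag a downward activity trend."""
    return {sid: ("at risk of disengaging" if weeks[-1] < weeks[0] else "stable")
            for sid, weeks in weekly_activity.items()}

# Example: two students and their recent activity.
events = [{"student_id": "s001"}] * 2 + [{"student_id": "s002"}] * 9
activity = level1_perception(events)                        # {'s001': 2, 's002': 9}
print(level2_comprehension(activity, cohort_median=5))      # s001 below median, s002 on track
print(level3_projection({"s001": [4, 1], "s002": [5, 8]}))  # s001 at risk, s002 stable
```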

“The best way to support human performance is to better support the development of high levels of situation awareness” (Endsley, 2016)

What does this have to do with learning analytics?
Learning analytics is data and information derived from agent interactions within a complex system, and aims to enhance our understanding of learners and the environments in which they learn (Colvin et al., 2015; Macfadyen, Dawson, Pardo, & Gasevic, 2014). Given the limited value of retrospective data and predictive modelling, situation awareness affords an alternative way of thinking about learning analytics. Rather than thinking of learning analytics as a tool for prediction, we can think of it as a tool to help us determine what is happening right now, a mechanism for enhanced situation awareness. As David has touched on previously, our information systems in higher education are wholly inadequate when it comes to providing teachers with the tools they need to manage the situated present. I believe that this is a gap where learning analytics can provide enormous value. However, it does change the approach we take with learning analytics implementation.

References
Colvin, C., Rogers, T., Wade, A., Dawson, S., Gašević, D., Buckingham Shum, S., . . . Corrin, L. (2015). Student retention and learning analytics: A snapshot of Australian practices and a framework for advancement. Canberra, ACT: Australian Government Office for Learning and Teaching.

Endsley, M. R. (2001). Designing for situation awareness in complex systems. Paper presented at the Proceedings of the Second International Workshop on symbiosis of humans, artifacts and environment.

Endsley, M. R. (2016). Designing for situation awareness: An approach to user-centered design. Boca Raton, FL: CRC press.

Macfadyen, L. P., Dawson, S., Pardo, A., & Gasevic, D. (2014). Big Data in Complex Educational Systems: The Learning Analytics Imperative and the Policy Challenge. Research & Practice in Assessment, 9(Winter, 2014), 11.

 

Learning analytics and magic beans

My PhD is broadly about helping universities implement learning analytics. This post relates to some of the things that I’m seeing around the Australian Higher Education Sector with regards to learning analytics.

There are two broad trajectories that universities tend to take when implementing learning analytics (Colvin et al., 2015). One trajectory is focused on measurement, broader performativity precepts and retention interventions. The second trajectory is underpinned by a pursuit of understanding, where the emphasis is on learning and on the recognition that retention is consequential to students’ broader teaching, learning and engagement experiences.

The first trajectory seems to be where a lot of universities are at the moment, a situation that is at loggerheads with the second trajectory. Rather than seeing these trajectories converge as I hoped, they seem to be diverging in a worrying way.

The first trajectory, in particular, fits with the perpetual concern that universities have with student attrition and its (real and potential) impact on their bottom lines. However, it is becoming more apparent that this approach is flawed, especially when considered in relation to how universities approach the adoption of technology – single centralised systems that are implemented top-down, often using external consultants and off-the-shelf enterprise software (Jones & Clark, 2014).

It is becoming increasingly evident that one-size-fits-all approaches to learning analytics do not work (Colvin et al., 2015). Meaning-making from learning analytics data is dependent on a sound understanding of the learning and teaching context and requires a human in the sense-making loop (Clow, 2014). Simplistic approaches (such as those proposed by consulting companies peddling off-the-shelf software solutions) are doomed to fail (Macfadyen & Dawson, 2012). The use of generalised models encapsulated in these simplistic approaches poses a threat to the potential of learning analytics to improve the quality of learning and teaching practice (Gašević, Dawson, Rogers, & Gasevic, 2016; Liu, Rogers, & Pardo, 2015). These generalised models and simplistic approaches are especially absurd when you consider the remarkably complex and diverse learning and teaching contexts involved.

When algorithms are black boxes, this prevents academics from identifying teaching or curriculum issues that may be at play (Liu et al., 2015)

Learning analytics aside, such approaches are also incompatible with the actual nature of student attrition as a problem construct. Student attrition is only rarely caused by a single problem that an external agency like a university can assist with (Beer & Lawson, 2016). It is the complex interplay between multiple, ever-changing variables that results in student attrition, a notion that contrasts with simplistic approaches to solutions (Beer & Lawson, 2017). The nature of student attrition further reinforces the point that one-size-fits-all approaches to learning analytics implementation aimed at helping with student attrition do not work. However, as organisations, we are still drawn to these simplistic solutions, which are often proffered by consulting companies with their arrays of glossy brochures and anecdotal evidence.

Universities as organisations have long struggled to overcome their active inertia, preferring to apply familiar approaches. In my mind, the consulting companies are well aware of this and know exactly which buttons to push to peddle their solutions. As such, I worry that we will see universities adopting off-the-shelf learning analytics systems with sexy names that are inherently rigid and based on generalised models. The lure of predictive models based on mysterious (often proprietary) algorithms is strong and has always been a successful consulting tactic. This is despite ample evidence showing that predicting outcomes from systems that involve humans is utterly futile except in a very narrow set of circumstances (Allen & Boulton, 2011).

The only way that prediction becomes possible is when the system or agent is isolated from external influences; something that can only ever occur in laboratory conditions (Allen & Boulton, 2011)

Directly addressing student attrition through one-shot projects and special funding has had little to no impact on the problem in the past. Limiting the potential of learning analytics by focusing only on student attrition is unlikely to meaningfully contribute in the long term. Learning analytics is complexly entangled with learning and teaching contexts, and thinking about it as just another IT project to be outsourced to the snappiest vendor is a mistake. These sorts of projects fail more than they succeed, often because they lack the contextualisation needed to be useful across diverse contexts (Goldfinch, 2007). Learning analytics requires a learning approach, something that institutions are not going to achieve by buying off-the-shelf and limiting their learning analytics to a single dimension.

References

Allen, P., & Boulton, J. (2011). Complexity and limits to knowledge: The importance of uncertainty. In P. Allen, S. Maguire, & B. McKelvey (Eds.), The SAGE Handbook of Complexity and Management (pp. 164-181). London, England: SAGE.

Beer, C., & Lawson, C. (2016). The problem of student attrition in higher education: An alternative perspective. Journal of Further and Higher Education, 1-12. doi:10.1080/0309877X.2016.1177171

Beer, C., & Lawson, C. (2017). Framing attrition in higher education: A complex problem. Journal of Further and Higher Education, 1-12. doi:10.1080/0309877X.2017.1301402

Clow, D. (2014). Data wranglers: Human interpreters to help close the feedback loop. Paper presented at the Proceedings of the fourth international conference on learning analytics and knowledge, Indianapolis, IN, USA.

Colvin, C., Rogers, T., Wade, A., Dawson, S., Gašević, D., Buckingham Shum, S., . . . Corrin, L. (2015). Student retention and learning analytics: A snapshot of Australian practices and a framework for advancement. Canberra, ACT: Australian Government Office for Learning and Teaching.

Gašević, D., Dawson, S., Rogers, T., & Gasevic, D. (2016). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success. The Internet and Higher Education, 28, 68-84.

Goldfinch, S. (2007). Pessimism, computer failure, and information systems development in the public sector. Public Administration Review, 67(5), 917-929.

Jones, D. T., & Clark, D. (2014). Breaking Bad to Bridge the Reality / Rhetoric Chasm. Paper presented at the ASCILITE2014 Rhetoric and Reality, Dunedin, New Zealand. Conference publication retrieved from http://ascilite2014.otago.ac.nz/

Liu, D. Y.-T., Rogers, T., & Pardo, A. (2015). Learning analytics-are we at risk of missing the point. Paper presented at the Proceedings of the 32nd ascilite conference, Perth, Australia.

Macfadyen, L. P., & Dawson, S. (2012). Numbers Are Not Enough. Why e-Learning Analytics Failed to Inform an Institutional Strategic Plan. Journal of Educational Technology & Society, 15(3), 149-163.