Complicated or Complex Article

I have been trying to catch up on some reading lately, and the following article caught my eye: “Complicated or Complex? Analytics Treats Them Differently” on the icrunchdatanews.com website. The article talks about something that interests me: the difference between complicated and complex in terms of analytics. It’s something we have written about previously and is very much an area that needs more work.

The article drops the usual buzzwords and phrases straight from a consultant’s handbook such as:

“the stakes have never been higher for managers to make better decisions with analyzable information”

and…

“They need powerful, high-performance analytics that can process the Big Data”

Actually, with the burgeoning faddism around analytics and big data, it’s getting difficult to avoid these sorts of phrases and find stuff that is meaningful. However, they do go on to make some points that align with our thinking around learning analytics:

“Business intelligence and drill-down queries are insufficient”

This is a valid point and links nicely to the contextual nature of learning analytics (pedagogical intent, the task being undertaken, etc.) and the difficulties associated with learning analytics and the reusability paradox.

“Delegate more decisions to employees”

It’s about time! This is something we were alluding to in our complexity paper some time ago, and it is a key part of the IRAC framework. Information needs to be represented appropriately in the context of the task it’s associated with, it needs to be coupled with affordances for action (what’s the point of data if it doesn’t lead to action?), and it needs to be able to change as the context changes. A rough sketch of these requirements follows.
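
To make that more concrete, here is a minimal sketch of those requirements as a Python interface. It is my own illustration, assuming nothing about how IRAC is formally specified; the class and method names are hypothetical, not taken from the IRAC literature.

```python
from abc import ABC, abstractmethod

class IRACStyleTool(ABC):
    """Hypothetical sketch only -- the names are mine, not the framework's."""

    @abstractmethod
    def represent(self, information: dict, task_context: dict) -> str:
        """Represent the information appropriately for the task at hand."""

    @abstractmethod
    def affordances(self, representation: str) -> list[str]:
        """Offer actions the user can actually take -- data that leads
        nowhere is of little use."""

    @abstractmethod
    def adapt(self, new_context: dict) -> None:
        """Change the tool's behaviour as the surrounding context changes."""
```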

Decision-making ponderings

As the previous post indicates, I’m having a closer look at decision-making and uncertainty with regard to learning analytics. The following is really just some thinking out loud, so apologies in advance for any errors or misinterpretations. Note that in this post I’m thinking about learning analytics through a learning and teaching lens.

It has been said that people base their decisions on their internal representations of the context rather than the context as sensed (Zachary, Rosoff, Miller, & Read, 2013). These internal representations are richer, more stylized, incorporate multiple levels of abstraction, and take on a structure that enables rapid retrieval of relevant decision-making heuristics and procedures (Zachary et al., 2013). This is known as recognition-primed decision-making (RPD).

According to Zachary et al. (2013) there are four context awareness levels:

  • Perception. What is there?
  • Comprehension. What does it mean?
  • Projection. How might it evolve?
  • Sense-making. How does it make sense?

This has some alignment with the situation awareness levels as described by Endsley that I’ve talked about in an earlier post (Endsley, 2001); a rough mapping is sketched after the list:

  • Level 1. Perception of the elements in the environment
  • Level 2. Comprehension of the current situation
  • Level 3. Projection of the future status
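
To make the overlap concrete, here is a minimal sketch. The alignment is my own reading, not a mapping proposed by either paper, and the dictionary structure is purely illustrative.

```python
# My own rough alignment of the two frameworks -- not taken directly
# from Zachary et al. (2013) or Endsley (2001).
CONTEXT_AWARENESS_LEVELS = {
    "perception":    {"zachary": "What is there?",
                      "endsley": "Level 1: perception of the elements"},
    "comprehension": {"zachary": "What does it mean?",
                      "endsley": "Level 2: comprehension of the situation"},
    "projection":    {"zachary": "How might it evolve?",
                      "endsley": "Level 3: projection of future status"},
    "sense-making":  {"zachary": "How does it make sense?",
                      "endsley": None},  # no direct Endsley counterpart
}
```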

Zachary et al. (2013) say that situation awareness theory and RPD work best in contexts that involve well-defined problem-solving in bounded problem domains, such as piloting aircraft and controlling complex mechanical systems. With regard to learning analytics, I see this as how it can contribute to operational, perhaps real-time, contexts within the course during the term.

They also talk about narrative reasoning, where the observer/participant constructs, analyses and explains complex situations through a narrative (storytelling) process (Zachary et al., 2013). They go on to say that people almost universally use story narratives to represent, reason about, and make sense of contexts involving multiple interacting agents, using motivations and goals to explain both observed and possible future actions (Zachary et al., 2013). With regard to learning analytics, I see this as how it can contribute to the retrospective understanding and sharing of what transpired within the operational contexts.

The message here for me is that learning analytics should aim to contribute to both operational/real-time components, and the reflective/retrospective components, as they are not mutually exclusive. This gets very interesting from an information systems and complexity science perspective when we start to think about affordances for distributed cognition and disintermediation.

References

Endsley, M. R. (2001). Designing for situation awareness in complex systems. Paper presented at the Second International Workshop on Symbiosis of Humans, Artifacts and Environment.

Zachary, W., Rosoff, A., Miller, L. C., & Read, S. J. (2013). Context as a cognitive process: An integrative framework for supporting decision making. Paper presented at STIDS 2013.

Learning analytics and uncertainty

Learning analytics is still an emerging concept in that it does not yet have a broadly accepted definition. Johnson et al. (2013) define learning analytics as the collection and analysis of data in education settings in order to inform decision-making and improve learning and teaching. Siemens and Long (2011) say that learning analytics is “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” (p. 34). Others have said that learning analytics provides data related to students’ interactions with learning materials, which can inform pedagogically sound decisions about learning design (Fisher, Whale, & Valenzuela, 2012). Suffice to say that there is uncertainty around exactly what learning analytics is. I also think it’s safe to say that there is uncertainty around how learning analytics can contribute to decision-making at the various levels of the institution (unit, program, faculty, institution).

In thinking about this, I decided to look a little closer at uncertainty and how it influences decision-making. Firstly, what is uncertainty? There are three main conceptualizations of uncertainty, according to Lipshitz and Strauss (1997):

  • Inadequate understanding
  • Undifferentiated alternatives
  • Lack of information

They go on to talk about five strategies for how people manage uncertainty:

  • Reduction
  • Forestalling
  • Assumption-based reasoning
  • Weighing pros and cons
  • Suppression

They also presented data on how decision-makers dealt with each form of uncertainty, and I thought the results were interesting (a toy encoding follows the list):

  • Inadequate understanding was primarily managed by reduction
  • Incomplete information was primarily managed by assumption-based reasoning
  • Conflict among alternatives was primarily managed by weighing pros and cons
  • Forestalling was equally likely to be used as a backup with all forms of uncertainty
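
As a toy illustration only (my own encoding; none of this code comes from Lipshitz and Strauss), those findings could be captured as a simple lookup, with forestalling as the shared fallback:

```python
# Toy encoding of the reported findings from Lipshitz & Strauss (1997).
# The terms paraphrase the paper; the structure and names are my own.
PRIMARY_COPING = {
    "inadequate understanding": "reduction",
    "incomplete information": "assumption-based reasoning",
    "conflict among alternatives": "weighing pros and cons",
}
FALLBACK = "forestalling"  # reported as a backup across all three

def coping_strategies(uncertainty: str) -> list[str]:
    """Return the primary strategy (if known) plus the common fallback."""
    primary = PRIMARY_COPING.get(uncertainty)
    return [primary, FALLBACK] if primary else [FALLBACK]

print(coping_strategies("incomplete information"))
# ['assumption-based reasoning', 'forestalling']
```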

So what does this have to do with learning analytics?

The tools and processes are only the first step for learning analytics; the integration of these tools and processes into the practices of learning and teaching is far more difficult (Elias, 2011). This is on top of the enormous complexity inherent in learning and teaching (Beer, Jones, & Clark, 2012). Seems like a great big pile of uncertainty to me. I wonder how inadequate understanding and incomplete information will influence the management of learning analytics implementations?

References

Beer, C., Jones, D., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. Paper presented at ASCILITE 2012: Future challenges, sustainable futures, Wellington, New Zealand.

Elias, T. (2011). Learning analytics: Definitions, processes and potential. Learning, 23, 134-148.

Fisher, J., Whale, S., & Valenzuela, F.-R. (2012). Learning analytics: A bottom-up approach to enhancing and evaluating students’ online learning. University of New England: Office for Learning and Teaching.

Johnson, L., Adams, S., Cummins, M., Freeman, A., Ifenthaler, D., & Vardaxis, N. (2013). Technology Outlook for Australian Tertiary Education 2013-2018: An NMC Horizon Report Regional Analysis. New Media Consortium.

Lipshitz, R., & Strauss, O. (1997). Coping with uncertainty: A naturalistic decision-making analysis. Organizational Behavior and Human Decision Processes, 69(2), 149-163.

Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. Educause Review, 46(5). Retrieved from http://www.educause.edu/ero/article/penetrating-fog-analytics-learning-and-education

Learning analytics and situation awareness

The following post is just a shallow airing of some of my thoughts around how I see the body of literature on situation awareness fitting in with my learning analytics research and a current project. Like many other universities, we are currently working on a project that aims to draw upon learning analytics to help with student retention. The project provides teaching academics with a simple tool to aid their awareness of their distance students by combining and presenting data from the learning management system and the student information system.
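
As a rough sketch of the data plumbing involved (the file names and columns below are hypothetical placeholders, not our actual systems’ schemas), the tool essentially joins LMS activity against SIS enrolment records and surfaces distance students who have gone quiet:

```python
import pandas as pd

# Hypothetical extracts -- real column names differ by institution.
lms = pd.read_csv("lms_activity.csv")    # student_id, course, clicks, days_since_login
sis = pd.read_csv("sis_enrolments.csv")  # student_id, course, mode

merged = lms.merge(sis, on=["student_id", "course"])

# Flag distance students who haven't logged in for a fortnight and whose
# activity sits below the course median -- a crude awareness cue for the
# teacher, not a prediction.
course_median = merged.groupby("course")["clicks"].transform("median")
flagged = merged[
    (merged["mode"] == "distance")
    & (merged["days_since_login"] > 14)
    & (merged["clicks"] < course_median)
]

print(flagged[["student_id", "course", "days_since_login"]])
```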

In a recent paper we suggested that e-learning is a type of complex adaptive system (CAS). CAS are defined by John Holland as networks of systems that learn and adapt as they interact (Holland, 2006). Each agent or element within a CAS is nested within, and alongside, other systems, evolving together as they interact (Holland, 2006). This means we cannot understand any of the agents or systems without reference to the others (Plsek & Greenhalgh, 2001). In other words, the whole represents more than just the sum of the parts. The agents interact and adapt, and this results in emergent behavior (Holland, 1995; Plsek & Greenhalgh, 2001). Stock markets, traffic flow, ecosystems, and cells are all examples of complex adaptive systems. From the perspective of learning analytics (LA), considering e-learning a CAS changes how we manage and monitor the system.

CAS are not systems where effect proportionally follows cause. Interventions within a CAS are likely to have diverse, far-reaching, unpredictable and non-linear effects (Shiell, Hawe, & Gold, 2008), known colloquially as the butterfly effect. Considering LA as something that results from the interactions occurring within a CAS enables us to evaluate and respond to the realities of the present, as opposed to targeting an idealistic future state (Beer, Jones, & Clark, 2012). The key point here is the phrase “respond to the realities of the present”. It has been said that LA can improve learning, teaching and student success through an awareness of patterns within the data (Campbell, Oblinger, & DeBlois, 2007), and it has also been said that teachers have the right mix of proximity to, and understanding of, the learning and teaching context (Beer et al., 2012). To me, based on a CAS model, this suggests that LA should engage with teachers, within their learning and teaching contexts, during the teaching term, and this is where I believe the body of knowledge around situation awareness (SA) might help.
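
The non-linearity point is easy to demonstrate with a stock example from complexity science (nothing specific to e-learning): the logistic map in its chaotic regime. Two trajectories that start a hair’s breadth apart bear no resemblance to each other within a few dozen steps.

```python
# Logistic map with r = 4 (chaotic regime): a minimal, standard
# demonstration of sensitivity to initial conditions.
def logistic_trajectory(x0: float, r: float = 4.0, steps: int = 40) -> list[float]:
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.300000)
b = logistic_trajectory(0.300001)  # differs only in the sixth decimal place

for step in (0, 10, 20, 30, 40):
    print(f"step {step:2d}: {a[step]:.6f} vs {b[step]:.6f}")
# By around step 30 the two runs have completely diverged.
```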

SA is a concept that describes how operators in complex systems develop and maintain a sufficient awareness of ‘what is going on’ in order to perform tasks successfully (Endsley, 1995). While the body of research around SA comes from the human factors arena (factories, aircraft and military operations), it touches on a problem that is probably quite familiar to folk involved with e-learning: the problem is not a lack of information, but finding the information we need when, and where, we need it (Endsley, 2001). There has been a staggering amount (and depth) of research within human factors around how to contextually present information to operators performing within complex systems. Although failures of situation awareness in an e-learning setting are not likely to have the dramatic consequences we might associate with industry or aviation, I wonder if it can help us with providing information when and where academics might need it? Thoughts?

References

Beer, C., Jones, D., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. Paper presented at ASCILITE 2012: Future challenges, sustainable futures, Wellington, New Zealand.

Campbell, J. P., Oblinger, D. G., & DeBlois, P. B. (2007). Academic analytics. Educause Review (July/August 2007).

Endsley, M. R. (1995). Toward a theory of situation awareness in dynamic systems. Human Factors, 37(1), 32-64. doi: 10.1518/001872095779049543

Endsley, M. R. (2001). Designing for situation awareness in complex systems. Paper presented at the Second International Workshop on Symbiosis of Humans, Artifacts and Environment.

Holland, J. (2006). Studying Complex Adaptive Systems. Journal of Systems Science and Complexity, 19(1), 1-8. doi: 10.1007/s11424-006-0001-z

Holland, J. H. (1995). Hidden order: How adaptation builds complexity. Reading, MA: Addison-Wesley.

Plsek, P. E., & Greenhalgh, T. (2001). Complexity science: The challenge of complexity in health care. BMJ (Clinical Research Ed.), 323(7313), 625-628.

Shiell, A., Hawe, P., & Gold, L. (2008). Complex interventions or complex systems? Implications for health economic evaluation. BMJ, 336(7656), 1281-1283.