A possible indicator of faddism?

I have been preparing a document for my PhD supervisors that mentions a paper we wrote last year. The paper, titled “Three paths for learning analytics and beyond: Moving from Rhetoric to Reality”, talks about the dangers associated with learning analytics and management fads, and how the hype around technological concepts can swamp deliberate and mindful adoption and implementation.

While looking at this paper I ran a quick search on Google Scholar, year by year, using the search term “learning analytics”. While it’s not particularly scientific, the trend is interesting nonetheless.

[Figure: Google Scholar results for “learning analytics” by year]
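For anyone wanting to repeat this kind of quick-and-dirty check, a minimal sketch of the tallying and plotting is below. The yearly counts are placeholders rather than the figures behind the chart above, and since Google Scholar has no official API the numbers have to be read off year-filtered searches by hand.

```python
import matplotlib.pyplot as plt

# Placeholder counts read manually from year-filtered Google Scholar searches
# for the phrase "learning analytics" -- illustrative only, not the real data.
results_per_year = {
    2008: 120, 2009: 180, 2010: 320, 2011: 750,
    2012: 1600, 2013: 2900, 2014: 4400, 2015: 5800,
}

years = sorted(results_per_year)
counts = [results_per_year[y] for y in years]

plt.plot(years, counts, marker="o")
plt.title('Google Scholar results for "learning analytics" by year')
plt.xlabel("Year")
plt.ylabel("Approximate number of results")
plt.tight_layout()
plt.show()
```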

Peer review of a colleague’s assessment item

This post is a quick peer review of a design-based research proposal that a colleague has developed for a unit in their master’s degree, where they are required to demonstrate peer review. Good luck Rebecca!

The proposal is centred on a five-week short course offered to high school students to give them some insight into tertiary education. The course uses Conley’s model of college readiness to guide what is taught, which includes the following facets of readiness:

  • Key cognitive strategies
  • Academic knowledge and skills
  • Academic behaviours
  • Contextual skills and awareness

It is clear that there are some issues with the short course as it stands. Some of these issues resonated with me as they are not limited to just this short course. For example, one of the problems mentioned is linked with the dominant online course delivery mechanism in higher education, the learning management system (LMS). Rebecca points out that LMS-delivered courses have transactional distance, are instructor-led and have to be completed in an allocated timeframe. According to the proposal’s introduction, the style of teaching and learning afforded by the LMS is not constructivist, connectivist or conducive to learner autonomy and critical thinking; all sentiments I agree with to some extent. However, given the dominance of the LMS as the way that eLearning is delivered in higher education (Coates, James, & Baldwin, 2005), and given that this is a preparatory course for high school students, it seems appropriate that future students gain some experience with this medium, warts and all.

The proposal mentions moving towards a “more heutagogical approach”, I assume to offset some of the limitations associated with LMS-based eLearning. I’m no expert in heutagogy, but I must admit that the idea of self-determined learning is very attractive in comparison to the current approaches to eLearning. The proposal also considers the student cohort, many of whom live in rural or remote areas and come from low socio-economic backgrounds. This can correspond to limited access to technology, such as reliable broadband internet connections.

The proposal describes three research questions:

  1. Does restructuring the Preparation for Success in Health course to include a Heutagogical approach allow students to collaborate, critically reflect and provide feedback in an open online environment?

  2. Does shifting the knowledge acquisition into the students’ hands mean they will access a wider variety of sources of information, including health professionals to answer their questions and build on their own ideas of what appropriate knowledge is?

  3. Will the students engage in the Preparation for Success in Health course more authentically if allowed to be more self-directed in their approach to learning, thus engaging in deeper cognitive learning?

The proposal’s literature review hinges upon student-centred learning and suggests that heutagogy might be an appropriate framework for digital-age learning. In a heutagogical approach the learners are highly autonomous and the focus is on building the learner’s capacity to learn; experiential and reflective learning is preferred over ‘transmissive’ approaches. On the surface at least, heutagogy seems to have a lot in common with the personal learning environment literature from a while back. Even the graduate/generic attributes folk talk about some of this, albeit from a different perspective, as do the problem-based learning folk. The double-loop learning approach described in the proposal interests me as it links nicely to my PhD around complex adaptive systems. Non-linear learning, to me, is a better match for how people really learn from an anthropological perspective, yet the dominant socio-technical approach is very linear (IMHO).

Some feedback on the proposal:

Very interesting and worthwhile proposal. I’ll be disappointed if it doesn’t happen. Have you considered an internal SOLT grant?

  • The chosen methodology is design-based research, or DBR. This needs to be unpacked more comprehensively. It is not clear to me how the implementation plan links with the methodology; DBR does seem to be driving the implementation plan, but the connection is not explicit.
  • I’m not sure about the three research questions. Together they describe a very broad scope that draws on a huge body of literature. My advice would be to narrow the scope somewhat. The first research question is great and, in my mind, enough.

More specifically:

  • Conley’s model needs to be unpacked in the introduction.
  • Heutagogy appears in the introduction without being introduced. I would suggest describing the problem in the introduction without heutagogy, and then introducing it as an alternative framework in the body of the proposal.
  • The proposal could benefit from an abstract/paragraph/exec summary to help set the scene for the introduction section.
  • The proposal touches on constructivism, connectivism and technology. It might just be my pattern-entrainment, but this could be a nice way to articulate how this proposal is different and is challenging the status quo. David has unpacked some of the technological issues associated with the LMS here.

All in all, a very interesting and worthwhile proposal. Well done and good luck.

References

Coates, H., James, R., & Baldwin, G. (2005). A critical examination of the effects of learning management systems on university teaching and learning. Tertiary Education and Management, 11(1), 19-36.

The ‘wickedness’ of student attrition and retention

Earlier this year Dr Celeste Lawson and I wrote a couple of papers (still in review) about student attrition in Australian higher education. In the first paper we looked at the actual nature of the student attrition problem, while in the second we looked at the approaches universities take in their attempts to address it. This post focuses on the first paper, in which we questioned the way that universities conceptualise their student attrition problems. Note: I should add that referring to student attrition as a problem is probably wrong. It creates a negative impression of the situation and also implies that it has to be solved. This is perhaps the wrong way to think about student retention and attrition.

“Reasons for student non-completion are complex”

(Maher & Macallister, 2013)

We analyzed a survey of students who started, but failed to finish, their degrees. The results were pretty much what folk in higher education have come to expect. Students leave for reasons such as work commitments, family commitments, financial problems, personal problems, health problems and so on: the usual array of reasons found throughout the student attrition literature.

If we consider student attrition as a problem within a linear (causal) system, which as a sector we tend to do, these factors can be addressed systematically within the organizational hierarchy. For example, many students mentioned work commitments as a significant factor in their decision to leave the university. The typical university response would be to develop an instructional time-management module for new students, allocate a learning support person who can help students with their study load, or provide a service whereby students can receive advice on how to better balance their work-study life. All of these are valid responses if the problem were single-dimensional.

We conducted a content analysis of the free-text comments that students made within these surveys and looked closely at the factors that led to attrition. We found that it was the accumulation of factors, not single reasons, that led to students dropping out. The complex interplay between a range of factors and the student’s context ends in their premature departure from university. The following diagram from our paper attempts to visualise this by showing relationships between attrition factors. Note that the strength of the line between two factors indicates the frequency with which those factors appeared together:

[Figure: weighted network of co-occurring attrition factors]
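For the curious, the general shape of that tallying can be sketched in a few lines. The factor labels and coded comments below are made up for illustration and are not our actual coding scheme; the point is simply how pairwise co-occurrence counts become the edge weights in a diagram like the one above.

```python
from collections import Counter
from itertools import combinations

# Hypothetical coded survey comments: each student's free-text comment has
# been tagged with the attrition factors it mentions (illustrative only).
coded_comments = [
    {"work", "family"},
    {"work", "financial", "health"},
    {"family", "health"},
    {"work", "financial"},
    {"personal", "health", "family"},
]

# Count how often each pair of factors appears in the same comment; these
# counts become the weights (line thickness) in the co-occurrence network.
edge_weights = Counter()
for factors in coded_comments:
    for pair in combinations(sorted(factors), 2):
        edge_weights[pair] += 1

for (a, b), weight in edge_weights.most_common():
    print(f"{a} -- {b}: {weight}")
```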

What is not shown here (and is perhaps an avenue for future research) is that a similar diagram, showing weighted interactions between contributing factors, could be developed for individual students. So from a university perspective we have not a single issue, or even a series of issues, to address, but a complex network of context-dependent issues, many of which are beyond our ability to address or even perceive. Add to this that even the small subset of contributing factors shown above has dependencies at multiple levels. For example, a student might identify as struggling with a financial situation that could have been externally triggered at a local, regional or national level.

It appears we have a complex web of inter-related and temporal factors that can contribute to a student withdrawing from their studies. We describe this in the paper as a wicked problem, which I have mentioned before. Wicked problems are difficult to define, have many interdependencies, are multi-causal, unstable and socially complex (Briggs, 2007). Importantly, traditional bureaucracies with their vertical silos are unable to tackle wicked problems that are ambiguous and lack clarity. Traditional bureaucracies are also risk-averse, which can inhibit the innovation, experimentation or bricolage needed to address wicked problems (Briggs, 2007).

“It’s the social complexity of wicked problems as much as their technical difficulties that make them tough to manage”
(Camillus, 2008)

Where to from here is the million-dollar question although there are some ideas in the literature about tackling wicked problems that require exploring. Two in particular grabbed my attention given my interest in complex adaptive systems:

  • Involve stakeholders, document opinions and communicate, especially horizontally. This appears to align with the complexity thinking around ongoing ethnographic collection and growing the network conduits between agents. There are also some links to the self-assertive and integrative paper that David has mentioned.
  • Focus on action. This is something I’ve been banging on about recently in regard to learning analytics. Detailed planning and analysis are of little use in complex systems, or in this case with wicked problems, as future system states cannot be predicted due to unknowable effects stemming from interactions between agents. Take a number of small-scale actions, monitor for emergence, and repeat, as opposed to upfront planning and analysis followed by a single course of action (a rough sketch of this loop follows this list). Safe-fail probes is the term that Snowden uses, and it makes a lot of sense (no pun intended).
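A rough sketch of the probe-monitor-adapt loop mentioned above is below. The probe names, the monitoring signal and the thresholds are all invented for illustration; in practice the “signal” would be a mix of qualitative and quantitative feedback, not a number from a random generator.

```python
import random

random.seed(0)

# Hypothetical safe-fail probes: small, cheap interventions run in parallel.
probes = ["peer-mentoring pilot", "early nudge emails", "orientation workshop"]

def observe(probe):
    """Stand-in for monitoring a probe's effect; a placeholder signal."""
    return random.random()

for cycle in range(3):
    for probe in probes:
        signal = observe(probe)
        if signal > 0.7:
            action = "amplify"      # promising pattern emerging
        elif signal < 0.3:
            action = "dampen/stop"  # failing safely, wind it back
        else:
            action = "keep watching"
        print(f"cycle {cycle}: {probe}: signal={signal:.2f} -> {action}")
```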

It is safe to say that there are no silver bullets when it comes to student attrition. However, I believe there is scope to start thinking about and tackling attrition differently. Attrition is a complex multi-causal issue that the sector continues to try and address using SET mindsets and methods. I’m saying we need to think about it differently, and perhaps engage in some BAD practices.

References

Briggs, L. (2007). Tackling wicked problems: A public policy perspective. Canberra: Australian Government, Commonwealth of Australia.

Camillus, J. C. (2008). Strategy as a Wicked Problem. Harvard Business Review, 86(5), 98-106. Retrieved from http://ezproxy.cqu.edu.au/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=31730150&site=eds-live&scope=site

Maher, M., & Macallister, H. (2013). Retention and attrition of students in higher education: Challenges in modern times to what works. Higher Education Studies, 3(2), 62.

Is learning analytics hamstrung from the outset?

Over the last six months I’ve been writing about student attrition and retention with a colleague from work. We’ve submitted a couple of journal articles, currently in review, about how universities continue to misinterpret the nature of their student attrition issue. To cut a long story short, we argue that attrition is a wicked problem, a problem in complexity. Climate change, deforestation and geopolitical conflicts are examples of wicked problems where there are no single solutions or even any obvious paths towards a solution. Conceptualizing student attrition as a wicked problem occurring within non-linear, complex systems changes how we approach these types of issues. As an aside, even classifying attrition as a problem (or issue) is probably the wrong way to think about it. It’s more like a symptom of a network of problems, a network where we can’t possibly know what or where most of the nodes are.

While writing these papers I saw some similarities between how universities are approaching student attrition and how they are approaching learning analytics adoption. In both cases they have mis-specified the nature of the organisation; their approaches are based on assumptions that organisations are machine-like.

“Managers want workers to respond predictably to incentives and to accomplish goals defined by managers and to do this with little deviation from plans that management has developed to improve performance”(McDaniel, 2007)

The Machine

The machine-like model of organizations is associated with management approaches based on command, control and planning (McDaniel, 2007). This is a valid approach for managing in a linear, stable environment where future states can be anticipated; in fact, these approaches depend on the ability of managers and workers to forecast future system states (McDaniel, 2007). However, if we view organisations and the environments in which they operate as complex adaptive systems, machine-model management no longer works. It is simply not possible to predict future states when the systems are made up of agents that are information processors with the capacity to modify their behavior based on the information they receive (J. Holland, 2006; J. H. Holland, 1995). An important contrast between viewing an organization as a machine and as a complex adaptive system is the diversity of the agents within the system: complex adaptive systems encourage diversity, whereas the machine model tends to favor agent homogenization (McDaniel, 2007).
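As a toy illustration of the “information-processing agents” idea, the sketch below has each agent adjust its behaviour towards what a couple of randomly chosen neighbours are doing, plus a little noise. It is deliberately trivial and is not a model from Holland or McDaniel; it just shows why a system of adapting agents drifts with its own interaction history rather than following a central plan.

```python
import random

random.seed(1)

# Ten agents, each with an "effort" level between 0 and 1.
agents = [random.random() for _ in range(10)]

for step in range(5):
    updated = []
    for effort in agents:
        # Each agent receives information from two random neighbours and
        # adapts its behaviour towards them, with some noise.
        a, b = random.sample(range(len(agents)), 2)
        neighbour_avg = (agents[a] + agents[b]) / 2
        adapted = effort + 0.5 * (neighbour_avg - effort) + random.uniform(-0.05, 0.05)
        updated.append(min(max(adapted, 0.0), 1.0))
    agents = updated
    print(f"step {step}: mean effort = {sum(agents) / len(agents):.3f}")
```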

“Participation of clinicians in hospital strategic decision making is more helpful in terms of bottom line performance than the participation of middle managers”
(Ashmos, Duchon, McDaniel Jr, & Huonker, 2002)

“If we want workers to be able to improve performance in the face of unknowability, we must invest in efforts to help them make sense of the world in a way that enables the organisation to take action and to learn about the world from the actions that are taken”
(McDaniel, 2007)

So we appear to have a misinterpretation of the actual nature of organisations and the environments in which they operate: organisations that are composed of information-processing agents that change and adapt with the information they receive. Add to this the rapid spread of hype around learning analytics, fueled by commercial entities and aimed at assisting decision-makers at many different levels of the academy. While I believe that learning analytics has enormous potential, I can’t help wondering if this fundamentally misunderstood nature of organisations is going to be its greatest limiting factor, especially when we are talking about information/action cycles.

References

Ashmos, D. P., Duchon, D., McDaniel Jr, R. R., & Huonker, J. W. (2002). What a mess! Participation as a simple managerial rule to ‘complexify’ organizations. Journal of Management Studies, 39(2), 189-206.

Holland, J. (2006). Studying Complex Adaptive Systems. Journal of Systems Science and Complexity, 19(1), 1-8. doi:10.1007/s11424-006-0001-z

Holland, J. H. (1995). Hidden order: How adaptation builds complexity. Reading, MA: Addison-Wesley.

McDaniel, R. R., Jr. (2007). Management Strategies for Complex Adaptive Systems: Sensemaking, Learning, and Improvisation. Performance Improvement Quarterly, 20(2), 21-41. Retrieved from http://onlinelibrary.wiley.com/store/10.1111/j.1937-8327.2007.tb00438.x/asset/j.1937-8327.2007.tb00438.x.pdf?v=1&t=h672itdt&s=a7c341c21237351ad995bdb34074c98db94bb026

A little about sensemaking

In my previous post I described situation awareness as it applies to learning analytics in complex adaptive systems. The aircraft analogy I used compared the black-box flight recorder with the cockpit instrumentation to differentiate real-time sensemaking from retrospective analysis. As David commented, learning and teaching is more complex than flying an aircraft, which means the instrumentation and the black box need to be more configurable and adaptable than the analogy would suggest. Irrespective of the analogy, my suspicion is that a majority of learning analytics projects are too focused on retrospective data analysis. This analysis has limited value in complex contexts where there is a need to act and adapt in the here and now.

That said, whether learning analytics data is retrospective or real-time, it is used by humans to make sense of something. Sensemaking is a well-researched phenomenon and I think it has the potential to help us improve learning analytics. Even my favorite learning analytics definition alludes to the important role that sensemaking plays:

“Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs”


What is sensemaking?

Sensemaking refers to how we structure the unknown so as to be able to act in it (Weick, 2005). Sensemaking involves coming up with a plausible understanding, a map, of a shifting world; testing this map with others through data collection, action and conversation; and then refining, or abandoning, the map depending on how credible it is (Ancona, 2010). Action is not a separate or later step in sensemaking, but an integral part of it (Ancona, 2010). This aligns neatly with how agents act in complex adaptive systems, whereby they probe, sense and respond. The unanticipated and unintended consequences of acting within a complex adaptive system make upfront analysis less valuable than typical organizational ways of doing things would suggest. Weick (2005) summarises this nicely:

“ To work with the idea of sensemaking is to appreciate that smallness does not equate with insignificance. Small structures and short moments can have large consequences”

I mention organisations (such as universities) deliberately because the risk-averse SET mindset drives approaches (such as those involving learning analytics) that are based on upfront analysis and one-off projects, I would say at the expense of sensemaking. The sense-making and decision-making models that I would associate with SET mindsets are outmoded models based on linear and stable environments (Mika, 2008). In these models analysis and upfront design make rational sense, but they do not match the unstable and turbulent contexts that we see today.

One of the things I notice with learning analytics is that data from information systems receives most, if not all, of the focus. According to Ancona (2010), when sensemaking you should “seek out and combine many different types of data. This includes system data and narrative of people involved”. Narrative is an area that I’m quite interested in, an interest picked up from Dave Snowden’s Cynefin podcasts from some time ago. I just wonder if the learning analytics community is a little too focused on data from information systems when there is an untapped human sensor network available?

The following are some other interesting quotes I found while scanning the sensemaking literature that I have to consider further:

“Failure is part of sensemaking”
(Ancona, 2010)

“People create their own environments and are then constrained by them”
(Ancona, 2010)

“much of the effort to design information technology to support cognition in organizations has not addressed its distributed quality”
(Boland, 1994)

“Sensemaking is about the interplay of action and interpretation rather than the influence of evaluation on choice”
(Weick, 2005)

“ignorance and knowledge coexist, which means that adaptive sensemaking both honors and rejects the past”
(Weick, 2005)

References

Ancona, D. Framing and Acting in the Unknown.

Boland Jr, R. J., Tenkasi, R. V., & Te’eni, D. (1994). Designing information technology to support distributed cognition. Organization Science, 5(3), 456-475.

Mika, A. Multi-ontology, sense-making and the emergence of the future. Futures, 41, 279-283. doi:10.1016/j.futures.2008.11.017

Weick, K. E., Sutcliffe, K. M., & Obstfeld, D. (2005). Organizing and the process of sensemaking. Organization Science, 16(4), 409-421.

Situation awareness, complex adaptive systems and learning analytics

According to Endsley (1988), situation awareness is:

“the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning and projection of their status in the near future”

While situation awareness has received particular attention in situations where spatial awareness is important (such as aircraft piloting), it is also applicable in non-moving systems such as industrial instrumentation systems (Pew, 1994). Situation awareness has become particularly important due to the introduction of advanced technologies and automation in many of today’s systems, which are being blamed for a reduction in situation awareness in many areas (Pew, 1994).

The “situation” in situation awareness has been defined as:

“a set of environmental conditions and system states with which the participant is interacting that can be characterized uniquely by a set of information, knowledge and response options”
(Pew, 1994).

There are a number of elements of awareness in situation awareness, including:

  • Current state of the system including all the relevant variables
  • Predicted state in the “near” future
  • Information and knowledge required in support of the person’s activities
  • List of current goals
  • Time
  • Information and knowledge needed to support anticipated “near” future contexts

The “awareness” in situation awareness has been defined in terms of the information resources available that can contribute to it:

  • Sensory information from the environment
  • Visual and auditory displays
  • Decision aids and decision support systems
  • Extra- and intra-team communication
  • Team member background knowledge and experience

It is worth noting that situation awareness is the product that results from situation assessments. It is also important to note that situation awareness is not exclusively knowledge nor exclusively process, but an adaptive, externally directed consciousness (Smith & Hancock, 1995). In other words, situation awareness is a component within an adaptable cycle of knowledge, action and information.

The process of situation assessment requires active effort, effort that competes with other aspects of task performance. The situation awareness process requires someone to be attentive to numerous pieces of information, usually from a variety of sources. Some information will be ignored while some will be deemed relevant. The structure of the information received from sensory inputs is critical as it determines how quickly and easily the input can be processed (Pew, 1994). This is an important consideration as it is well recognized that:

“human processing capabilities are not well suited to a multiplicity of tasks”
(Pew, 1994).

So what? What has this got to do with learning analytics?

Answering this requires some explanation of complex adaptive systems. We believe, and have to some extent shown in a previous paper, that learning analytics data results from the interactions of agents within complex adaptive systems (Beer, Jones, & Clark, 2012). Complex adaptive systems are systems containing agents that adapt and change as they interact (Holland, 2006). To cut a long story short, making predictions within complex adaptive systems is futile, as the patterns observed are unlikely ever to be reproduced (Mason, 2008). This limits the value of retrospective (or out-of-context) analysis of agent behavior within complex systems, as the patterns uncovered are not likely to happen again. Hence, the recommended approach when dealing with complex adaptive systems is to manage the situated present rather than targeting idealistic future states (Kurtz & Snowden, 2003).

This is where I see a role for learning analytics data, and I think it links to David’s sentiments about why dashboards suck. The retrospective nature of business-intelligence-style dashboards limits their usefulness in the here and now. I’m not quite as anti-dashboard as David; I think they do have their uses, but I also think that far too much time, effort and money is being spent on retrospective analysis of data. We need more real-time data that can aid decision-making in the here and now. The analogy I’m thinking of is the comparison between an airliner’s black-box flight recorder and its cockpit instrumentation.


The black box records what is happening so that, in the event of an accident, it can help retrospectively determine what transpired. The instrumentation in the flight deck is about providing the pilots with real-time situation awareness.


A process of analysis is required to interpret black-box recordings, whereas cockpit instrumentation supports in-context sense-making. Based on my (limited) understanding of how learning analytics is being deployed around the world, I wonder if we could benefit from more real-time situation awareness rather than allocating all of our resources to predicting the future?
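To make the contrast a little more concrete, here is a small sketch of a “cockpit instrument” style indicator: days since each student’s last LMS activity, computed from an event stream as it stands today. The event format, the dates and the 14-day threshold are assumptions for illustration; real LMS data and whatever counts as a meaningful signal would differ.

```python
from datetime import date

# Hypothetical LMS click-stream events: (student_id, date of activity).
events = [
    ("s1", date(2015, 10, 20)),
    ("s2", date(2015, 10, 28)),
    ("s1", date(2015, 10, 29)),
    ("s3", date(2015, 10, 1)),
]

today = date(2015, 10, 30)

# Last-activity recency per student, available in the here and now, rather
# than an end-of-term report produced after the fact.
last_seen = {}
for student, when in events:
    if student not in last_seen or when > last_seen[student]:
        last_seen[student] = when

for student, when in sorted(last_seen.items()):
    days_quiet = (today - when).days
    flag = "worth a check-in?" if days_quiet > 14 else "ok"
    print(f"{student}: last active {days_quiet} days ago -> {flag}")
```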

Things to do

The following is just a parking spot for some things to improve upon or explore based upon this post:

  • Look at specific examples whereby learning analytics is providing real-time decision support.
  • Explore the link between learning analytics, complex adaptive systems, situation awareness and the sense-making literature.
  • Explore the link between this post and (P)IRAC.
  • Explore distributed situation awareness along with distributed cognition.
  • Explore the hypothesis linking SET mindsets/non-complex systems with retrospective data analysis.

References

Beer, C., Jones, D., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. Paper presented at ASCILITE 2012: Future challenges, sustainable futures, Wellington, New Zealand.

Endsley, M. R. (1988). Situation awareness global assessment technique (SAGAT). Paper presented at the Aerospace and Electronics Conference, 1988. NAECON 1988., Proceedings of the IEEE 1988 National.

Holland, J. (2006). Studying Complex Adaptive Systems. Journal of Systems Science and Complexity, 19(1), 1-8. doi:10.1007/s11424-006-0001-z

Kurtz, C. F., & Snowden, D. J. (2003). The new dynamics of strategy: Sense-making in a complex and complicated world. IBM Systems Journal, 42(3), 462-483. Retrieved from http://ezproxy.cqu.edu.au/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=tnh&AN=10654356&site=ehost-live


Mason, M. (2008). What Is Complexity Theory and What Are Its Implications for Educational Change? Educational Philosophy and Theory, 40(1), 35-49. Retrieved from http://onlinelibrary.wiley.com/store/10.1111/j.1469-5812.2007.00413.x/asset/j.1469-5812.2007.00413.x.pdf?v=1&t=h672i83c&s=4c8fb933073ab8cf0e246a0ede0d3ad149ac5866

Norman, D. A. (1993). Things that make us smart: Defending human attributes in the age of the machine: Basic Books.

Pew, R. (1994). An introduction to the concept of situation awareness. Situational Awareness in complex systems, 17-26.

Smith, K., & Hancock, P. (1995). Situation awareness is adaptive, externally directed consciousness. Human Factors: The Journal of the Human Factors and Ergonomics Society, 37(1), 137-148.

Another example of a bad performance metric

A colleague sent me an interesting study from the Australian Council for Educational Research titled “Completing university in a growing sector: Is equity an issue?”. There is one particular page that caught my attention. It shows completion rates, nine years after commencement, for domestic bachelor students commencing in 2005.

It showed lower completion rates for low socioeconomic students, Indigenous students, remote and regional students, part-time students and students over 25 years of age. OK, nothing new here, and we know that (generally) the lower the tertiary entrance score, the higher the student attrition. Now, there are all sorts of arguments to be made about how this doesn’t apply to each and every student, in every case. However, the trend is evident at the macro level: students with one or more of these indicators are more likely not to complete their studies.

This, to me, highlights a real problem with funding plans for higher education such as the one from the Australian Labor Party last week. I would expect that regional universities have higher proportions of students who exhibit the indicators listed above, especially remote and regional students. So I hope that this plan takes into account how exposed regional universities are to plans based on, let’s be polite and say awkward, performance indicators.