Learning analytics implementation struggles

In a previous post I talked a little about how universities (well, their leaders anyway) are attracted to the predictive potential of learning analytics, and how this approach is fundamentally flawed. In a subsequent post I suggested that situation awareness can provide a useful theoretical platform that may help universities broaden their learning analytics implementation approaches beyond prediction-based endeavours. This new post has been inspired by a couple of fascinating LAK18 papers that talk about learning analytics implementation.

Universities in Australia generally fall into two camps based on their leadership approach to learning analytics implementation: top-down and bottom-up (Dawson et al., 2018).

Top-down or instrumental approaches to learning analytics adoption are often based on preconceived recipes or prescribed methodologies and are all too often doomed to failure (Dawson et al., 2018). The top-down implementation of analytics-related technology is the easy part; getting uptake and generating impact, based on how and whether people use the technology, is where these systems fail (Marchand & Peppard, 2013). In other words, these systems are easy to install but are unlikely to generate the desired uptake and impact. It is well known that top-down implementation approaches are “less than ideal” when it comes to learning analytics (Beer & Jones, 2014, p. 244).

Bottom-up or emergent approaches to learning analytics take a much more consultative approach and usually begin on a small scale (Dawson et al., 2018). However, bottom-up approaches are difficult to scale up beyond the local context to “a more holistic and complex organisational level” (Dawson et al., 2018, p. 236). So while the bottom-up approach might meet the needs of a small number of learning and teaching contexts, it may fail to scale beyond this due to the diversity of contexts found in a typical university.

As LA research continues to grow there is a very real danger of a widening gulf between identified research needs and outcomes and applied practice. (Dawson et al., 2018, p. 242)

While these two approaches seem discrete, I suspect they are unfortunately linked. Technology adoption in Australian Higher Education is dominated by vanilla implementations and centralised approaches (D. T. Jones & Clark, 2014). So even if the learning analytics system at an institution has been developed with a bottom-up approach and has a track record of uptake and impact, it may still be perceived as “feral” or “risky” due to its decentralised and perhaps unconventional origins (Spierings, Kerr, & Houghton, 2014).

In talking to various folk at the recent ALASI2017 conference, there seems to be a trend whereby universities are wanting to bring learning analytics to the enterprise quickly. In at least one case that I am aware of, a platform developed using a bottom-up approach is being replaced with a commercial off-the-shelf product that is to be implemented using a top-down, centralised approach, in spite of the evidence against such an approach.

Once an innovation such as [learning analytics] achieves a high public profile, it can create an urgency to ‘join the bandwagon’ that swamps deliberative, mindful behavior (Beer & Jones, 2014, p. 243)

There are two things at play here that I’m thinking about with regards to my PhD studies. The first is how learning analytics is being conceptualised by these universities. No matter the university, learning analytics is an applied research project (Dawson et al., 2018) and not an IT project. I suspect that this mistake/misinterpretation also contributes to the high failure rates experienced by analytics-related projects (Marchand & Peppard, 2013). The second is the role of meso-level practitioners and how they can potentially contribute to bridging the gap between the two approaches (Hannon, 2013). Meso-level practitioners assuage the tension between the small-scale, local interactions and the large-scale policy and institutional processes (C. Jones, Dirckinck‐Holmfeld, & Lindström, 2006; Uhl-Bien, Marion, & McKelvey, 2007).

As a personal aside with regards to LAK18, my application to the doctoral consortium got accepted and I was very much looking forward to attending. Unfortunately, I had to withdraw at the last minute due to a family illness. It is a fantastic event that sees the world’s foremost experts in learning analytics gather to talk and share their stories. I dearly hope I can make LAK19 next year.

References

Beer, C., & Jones, D. T. (2014). Three paths for learning analytics and beyond: Moving from rhetoric to reality. Paper presented at the Australasian Society for Computers in Learning in Tertiary Education: Rhetoric and Reality, Dunedin, New Zealand. Conference Publication retrieved from http://ascilite2014.otago.ac.nz/files/fullpapers/185-Beer.pdf

Dawson, S., Poquet, O., Colvin, C., Rogers, T., Pardo, A., & Gasevic, D. (2018). Rethinking learning analytics adoption through complexity leadership theory. Paper presented at the Proceedings of the 8th International Conference on Learning Analytics and Knowledge.

Hannon, J. (2013). Incommensurate practices: Sociomaterial entanglements of learning technology implementation. Journal of Computer Assisted Learning, 29, 168-178.

Jones, C., Dirckinck‐Holmfeld, L., & Lindström, B. (2006). A relational, indirect, meso-level approach to CSCL design in the next decade. International Journal of Computer-Supported Collaborative Learning, 1(1), 35-56.

Jones, D. T., & Clark, D. (2014). Breaking Bad to Bridge the Reality / Rhetoric Chasm. Paper presented at the ASCILITE2014 Rhetoric and Reality, Dunedin, New Zealand. Conference publication retrieved from http://ascilite2014.otago.ac.nz/

Marchand, D. A., & Peppard, J. (2013). Why IT Fumbles Analytics. Harvard Business Review, 91(1), 104-112.

Spierings, A., Kerr, D., & Houghton, L. (2014). What drives the end user to build a feral information system? Feral Information Systems Development: Managerial Implications, 161-188.

Uhl-Bien, M., Marion, R., & McKelvey, B. (2007). Complexity leadership theory: Shifting leadership from the industrial age to the knowledge era. The Leadership Quarterly, 18, 298-318. doi:10.1016/j.leaqua.2007.04.002

 


Learning analytics is about the here and now

Summary of this post for the time-poor
Learning analytics needs to focus less on predicting the future and more on what’s happening right now.

A vexing question
This post extends my previous posts about learning analytics and the limitations of predictive modelling. Knowing that one-size-fits-all approaches to learning analytics are not going to work presents us with a dilemma: how do we approach our learning analytics implementations? There is almost an implied understanding that learning analytics and prediction go hand-in-hand. If this is not the case, what is it that we need to be doing? This post is some early thinking around this question.

Lessons from other industries
Historically, most industries designed and developed their systems from a technology-centred perspective (Endsley, 2016). As these systems became more sophisticated, operators had to cope with an exponential growth in the information they provided. People can only pay attention to a certain amount of information at once, and the gap between the volume of data provided by the systems and the operators’ ability to distill the information required for their tasks continued to grow. This was because the design and development of these systems centred on technical considerations rather than a detailed understanding of the operator’s tasks and goals. In industrial and military settings, much of what has been attributed to human error is the direct result of technology-centred designs that are ill-suited for augmenting human performance across the wide range of conditions and circumstances found in real-world environments (Endsley, 2016).

In complex systems, the problem becomes more pronounced as the elements in the system interact and change constantly. Automation that was quite suitable for linear, repetitive tasks breaks down in complex systems. Data and information on what worked previously are no longer useful as the context evolves and adapts. Trying to automate our way out of “human error” in complex environments leads to more complexity, more cognitive load and catastrophic errors (Endsley, 2001, 2016). Complex environments with many interacting and adapting agents are inherently unpredictable, so the utility of retrospective data and information is limited. In these environments, it is the operators’ ability to understand the situation as a whole that forms the basis of sound decision-making. This ability to understand the system as a whole is known as situation awareness (Endsley, 2016).

“Situation awareness is the engine that drives the train for decision making and performance in complex, dynamic systems” (Endsley, 2016).

In complex and dynamic environments, decision making is highly dependent on situation awareness – a constantly evolving picture of the state of the environment. Situation awareness is goal-oriented, in that the goals of the job determine which elements within the environment people need to be aware of. There are three broad levels of situation awareness. The lowest level is the perception of the status, attributes and relevance of elements in the environment. The next level is understanding what the perceived data and cues mean in relation to what the endeavour is trying to achieve. Once a person can perceive the elements in the environment and understand what they mean, they can project what this means into the near future, which is the third level of situation awareness.

“The best way to support human performance is to better support the development of high levels of situation awareness” (Endsley, 2016)

What does this have to do with learning analytics?
Learning analytics is data and information derived from agent interactions within a complex system, and it aims to enhance our understanding of learners and the environment in which they learn (Colvin et al., 2015; Macfadyen, Dawson, Pardo, & Gasevic, 2014). Given the limited value of retrospective data and predictive modelling, situation awareness affords an alternative way of thinking about learning analytics. Rather than thinking of learning analytics as a tool for prediction, we can think of it as a tool to help us determine what is happening right now: a mechanism for enhanced situation awareness. As David has touched on previously, our information systems in higher education are wholly inadequate when it comes to providing teachers with the tools they need to manage the situated present. I believe that this is a gap where learning analytics can provide enormous value. However, it does change the approach we take with learning analytics implementation.
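To make the “here and now” framing a little more concrete, here is a minimal, entirely hypothetical sketch (not any institutional system) of what such a view could look like, assuming nothing more than an export of LMS activity events with student_id and timestamp columns (both assumptions for illustration). Rather than predicting anything, it simply summarises what each student has done over the past week relative to the rest of the class.

```python
# Minimal sketch of a "here and now" view rather than a prediction.
# Assumes a hypothetical export of LMS activity events (one row per click)
# with student_id and timestamp columns; names are illustrative only.
import pandas as pd

def current_engagement_snapshot(events: pd.DataFrame, days: int = 7) -> pd.DataFrame:
    """Summarise each student's LMS activity over the last `days` days."""
    events = events.copy()
    events["timestamp"] = pd.to_datetime(events["timestamp"])
    cutoff = events["timestamp"].max() - pd.Timedelta(days=days)
    recent = events[events["timestamp"] >= cutoff]

    # Include students with no recent activity so the quiet ones stay visible.
    all_students = events["student_id"].unique()
    counts = recent.groupby("student_id").size().reindex(all_students, fill_value=0)
    snapshot = counts.rename_axis("student_id").rename("recent_clicks").reset_index()

    # Compare each student with the class as a whole, right now.
    snapshot["vs_class_median"] = snapshot["recent_clicks"] - snapshot["recent_clicks"].median()
    return snapshot.sort_values("recent_clicks")

# Usage with a hypothetical log export:
# events = pd.read_csv("lms_activity_log.csv")   # student_id, timestamp
# print(current_engagement_snapshot(events).head(10))
```

The point is not the arithmetic; it is that the same data the LMS already collects can be re-presented to support the teacher’s awareness of the situated present.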

References
Colvin, C., Rogers, T., Wade, A., Dawson, S., Gašević, D., Buckingham Shum, S., . . . Corrin, L. (2015). Student retention and learning analytics: A snapshot of Australian practices and a framework for advancement. Canberra, ACT: Australian Government Office for Learning and Teaching.

Endsley, M. R. (2001). Designing for situation awareness in complex systems. Paper presented at the Proceedings of the Second International Workshop on symbiosis of humans, artifacts and environment.

Endsley, M. R. (2016). Designing for situation awareness: An approach to user-centered design. Boca Raton, FL: CRC press.

Macfadyen, L. P., Dawson, S., Pardo, A., & Gasevic, D. (2014). Big Data in Complex Educational Systems: The Learning Analytics Imperative and the Policy Challenge. Research & Practice in Assessment, 9(Winter, 2014), 11.

 

Learning analytics and magic beans

My PhD is broadly about helping universities implement learning analytics. This post relates to some of the things that I’m seeing around the Australian Higher Education Sector with regards to learning analytics.

There are two broad trajectories that universities tend to take when implementing learning analytics (Colvin et al., 2015). One trajectory is focused on measurement, broader performativity precepts and retention interventions. The second is underpinned by a pursuit of understanding, where the emphasis is on learning, and a recognition that retention is consequential to broader teaching, learning and engagement experiences for students.

The first trajectory seems to be where a lot of universities are at the moment, a situation that is at loggerheads with the second trajectory. Rather than converging as I had hoped, these trajectories seem to be diverging in a worrying way.

The first trajectory, in particular, fits with the perpetual concern that universities have with student attrition and its (real and potential) impact on their bottom lines. However, it is becoming more apparent that this approach is flawed, especially when considered in relation to how universities approach the adoption of technology – single centralised systems that are implemented top-down, often using external consultants and off-the-shelf enterprise software (Jones & Clark, 2014).

It is becoming increasingly evident that one-size-fits-all approaches to learning analytics do not work (Colvin et al., 2015). Meaning-making from learning analytics data is dependent on a sound understanding of the learning and teaching context and requires a human in the sense-making loop (Clow, 2014). Simplistic approaches (such as those proposed by consulting companies peddling off-the-shelf software solutions) are doomed to fail (Macfadyen & Dawson, 2012). The use of generalised models encapsulated in these simplistic approaches poses a threat to the potential of learning analytics to improve the quality of learning and teaching practice (Gašević, Dawson, Rogers, & Gasevic, 2016; Liu, Rogers, & Pardo, 2015). These generalised models and simplistic approaches are especially absurd when you consider the remarkably complex and diverse learning and teaching contexts involved.

When algorithms are black boxes, this prevents academics from identifying teaching or curriculum issues that may be at play (Liu et al., 2015)

Learning analytics aside, such approaches are also incompatible with the actual nature of student attrition as a problem construct. Student attrition is only rarely caused by a single problem that an external agency like a university can assist with (Beer & Lawson, 2016). It is the complex interplay between multiple, ever-changing variables that results in student attrition, a notion that contrasts with simplistic solutions (Beer & Lawson, 2017). The nature of student attrition further reinforces the point that one-size-fits-all learning analytics implementations aimed at helping with student attrition do not work. However, as organisations, we are still drawn to these simplistic solutions, which are often proffered by consulting companies with their array of glossy brochures and anecdotal evidence.

Universities as organisations have long struggled to overcome their active inertia, preferring to apply familiar approaches. In my mind, the consulting companies are well aware of this and know exactly which buttons to push to peddle their solutions. As such, I worry that we will see universities adopting off-the-shelf learning analytics systems with sexy names that are inherently rigid and based on generalised models. The lure of predictive models based on mysterious (often proprietary) algorithms is strong and has always been a successful consulting tactic. This is despite ample evidence showing that predicting outcomes from systems that involve humans is utterly futile except in a very narrow set of circumstances (Allen & Boulton, 2011).

The only way that prediction becomes possible is when the system or agent is isolated from external influences; something that can only ever occur in laboratory conditions (Allen & Boulton, 2011)

Directly addressing student attrition through one-shot projects and special funding has had little to no impact on the problem in the past. Limiting the potential of learning analytics by focusing only on student attrition is unlikely to meaningfully contribute in the long term. Learning analytics is complexly entangled with learning and teaching contexts, and thinking about it as just another IT project to be outsourced to the snappiest vendor is a mistake. These sorts of projects fail more than they succeed, often because they lack the contextualisation required to be useful across diverse contexts (Goldfinch, 2007). Learning analytics requires a learning approach, something that institutions are not going to achieve by buying off-the-shelf and limiting their learning analytics to a single dimension.

References

Allen, P., & Boulton, J. (2011). Complexity and limits to knowledge: The importance of uncertainty. In P. Allen, S. Maguire, & B. McKelvey (Eds.), The SAGE Handbook of Complexity and Management (pp. 164-181). London, England: SAGE.

Beer, C., & Lawson, C. (2016). The problem of student attrition in higher education: An alternative perspective. Journal of Further and Higher Education, 1-12. doi:10.1080/0309877X.2016.1177171

Beer, C., & Lawson, C. (2017). Framing attrition in higher education: A complex problem. Journal of Further and Higher Education, 1-12. doi:10.1080/0309877X.2017.1301402

Clow, D. (2014). Data wranglers: Human interpreters to help close the feedback loop. Paper presented at the Proceedings of the fourth international conference on learning analytics and knowledge, Indianapolis, IN, USA.

Colvin, C., Rogers, T., Wade, A., Dawson, S., Gašević, D., Buckingham Shum, S., . . . Corrin, L. (2015). Student retention and learning analytics: A snapshot of Australian practices and a framework for advancement. Canberra, ACT: Australian Government Office for Learning and Teaching.

Gašević, D., Dawson, S., Rogers, T., & Gasevic, D. (2016). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success. The Internet and Higher Education, 28, 68-84.

Goldfinch, S. (2007). Pessimism, computer failure, and information systems development in the public sector. Public Administration Review, 67(5), 917-929.

Jones, D. T., & Clark, D. (2014). Breaking Bad to Bridge the Reality / Rhetoric Chasm. Paper presented at the ASCILITE2014 Rhetoric and Reality, Dunedin, New Zealand. Conference publication retrieved from http://ascilite2014.otago.ac.nz/

Liu, D. Y.-T., Rogers, T., & Pardo, A. (2015). Learning analytics-are we at risk of missing the point. Paper presented at the Proceedings of the 32nd ascilite conference, Perth, Australia.

Macfadyen, L. P., & Dawson, S. (2012). Numbers Are Not Enough. Why e-Learning Analytics Failed to Inform an Institutional Strategic Plan. Journal of Educational Technology & Society, 15(3), 149-163.

 

Learning analytics, complex adaptive systems and meso-level practitioners: A way forward

This post provides a very succinct summary of my PhD for those strange folk out there who might be interested 🙂

Universities are struggling to develop the capabilities required to implement meaningful learning analytics (Colvin et al., 2015). This struggle is linked with the approach that universities take with learning analytics implementation whereby detailed planning is designed to achieve a predetermined and idealistic future state (D. T. Jones & Clark, 2014). The dominant managerial and bureaucratic approach that universities apply to technology adoption manifests in top-down centralised approaches that are deemed to be organisationally efficient (D. T. Jones & Clark, 2014). However, learning is a complex social activity that is situated in complex and diverse social environments (Macfadyen & Dawson, 2012). Consequently, learning analytics is a multifaceted construct with many interdependent and contributing variables, and is highly dependent on specific contextual variables (Clow, 2014). The misalignment between the complex nature of learning analytics implementation, universities as complex socio-technical systems, and the strategic operations of universities (Colvin, Dawson, Wade, & Gasevic, 2017) represents a challenge for universities trying to develop the capabilities for meaningful learning analytics implementation.

Other complex socio-technical systems, most notably healthcare, have applied an alternative ontological conceptualisation based on complex adaptive systems theory in an effort to move beyond hierarchical and mechanical models (Boustani et al., 2010; Plsek & Greenhalgh, 2001). This theory describes systems that are non-causal and non-linear, and that are comprised of many interacting and interdependent agents (Holland, 2006, 2014). Applying this alternative ontological conceptualisation to learning analytics shifts the epistemological approach from planning and strategy to an approach to implementation based on learning and improvisation (Juarrero, 1999; Kurtz & Snowden, 2003). While an alternative conceptualisation of learning analytics may assist with implementation and uptake, it also needs to fit within the current strategic and hierarchical operating norms of universities. The orthodox approach to technology-related implementations is to apply deliberate strategy and detailed planning (Kezar, 2001; Reid, 2009), which raises the question of how we can apply a complex adaptive systems lens in this environment.

The embryonic nature of learning analytics and the complexity of issues that influence its systemic uptake go some way to explaining the paucity of large-scale implementations, and highlight a need for further empirical and methodological studies (Colvin et al., 2017). Considering learning analytics as either top-down or bottom-up is unlikely to lead to meaningful learning analytics implementation. Top-down approaches are unlikely to meet the needs of specific learning and teaching contexts, while bottom-up approaches are unlikely to scale across multiple learning and teaching contexts. Framing learning analytics as either top-down or bottom-up also risks missing a key area of translation in between (Hannon, 2013). Much of the work of implementation occurs between the top-down and the bottom-up, at the meso-level. This is the level within the organisation that sits between the small-scale, local interactions and the large-scale policy and institutional processes (C. Jones, Dirckinck‐Holmfeld, & Lindström, 2006). Meso-level practitioners assuage the tension between the upper and lower levels (Uhl-Bien, Marion, & McKelvey, 2007) and can negotiate a balance between the contextual complexity of learning analytics and the conventional centralised approach to data services common to universities.

This project aims to produce design principles derived from complex adaptive systems theory to guide the contribution of meso-level practitioners in order to help universities address the challenges of institutional learning analytics implementation. The principles aim to improve and enhance the integration between tools, actionable data and educator practices in real-world settings. The principles will be iteratively tested through a cycle of design-based research at a regional Australian university with the broad goal of improving student outcomes. The study aims to answer the following research question:

How can a complex adaptive systems conceptualisation help meso-level practitioners enhance and transform university learning analytics implementation capability?

References

Boustani, M. A., Munger, S., Gulati, R., Vogel, M., Beck, R. A., & Callahan, C. M. (2010). Selecting a change and evaluating its impact on the performance of a complex adaptive health care delivery system. Clinical Interventions In Aging, 5, 141-148.

Clow, D. (2014). Data wranglers: Human interpreters to help close the feedback loop. Paper presented at the Proceedings of the fourth international conference on learning analytics and knowledge, Indianapolis, IN, USA.

Colvin, C., Dawson, S., Wade, A., & Gasevic, D. (2017). Addressing the Challenges of Institutional Adoption. In C. Lang, G. Siemens, A. Wise, & D. Gasevic (Eds.), Handbook of Learning Analytics (Vol. 1, pp. 281 – 289). Australia: Society for Learning Analytics Research.

Colvin, C., Rogers, T., Wade, A., Dawson, S., Gašević, D., Buckingham Shum, S., . . . Corrin, L. (2015). Student retention and learning analytics: A snapshot of Australian practices and a framework for advancement. Canberra, ACT: Australian Government Office for Learning and Teaching.

Hannon, J. (2013). Incommensurate practices: Sociomaterial entanglements of learning technology implementation. Journal of Computer Assisted Learning, 29, 168-178.

Holland, J. H. (2006). Studying complex adaptive systems. Journal of Systems Science and Complexity, 19, 1-8. doi:10.1007/s11424-006-0001-z

Holland, J. H. (2014). Complexity: A very short introduction. Oxford, England: Oxford University Press.

Jones, C., Dirckinck‐Holmfeld, L., & Lindström, B. (2006). A relational, indirect, meso-level approach to CSCL design in the next decade. International Journal of Computer-Supported Collaborative Learning, 1(1), 35-56.

Jones, D. T., & Clark, D. (2014). Breaking Bad to Bridge the Reality / Rhetoric Chasm. Paper presented at the ASCILITE2014 Rhetoric and Reality, Dunedin, New Zealand. Conference publication retrieved from http://ascilite2014.otago.ac.nz/

Juarrero, A. (1999). Dynamics in action: Intentional behavior as a complex system. Cambridge, Massachusetts: MIT press.

Kezar, A. (2001). Understanding and facilitating organizational change in the 21st century. ASHE-ERIC higher education report, 28(4), 147.

Kurtz, C. F., & Snowden, D. J. (2003). The new dynamics of strategy: Sense-making in a complex and complicated world. IBM Systems Journal, 42, 462-483. doi:10.1147/sj.423.0462

Macfadyen, L. P., & Dawson, S. (2012). Numbers Are Not Enough. Why e-Learning Analytics Failed to Inform an Institutional Strategic Plan. Journal of Educational Technology & Society, 15(3), 149-163.

Plsek, P. E., & Greenhalgh, T. (2001). Complexity science: The challenge of complexity in health care. BMJ: British Medical Journal, 323, 625-628.

Reid, I. C. (2009). The contradictory managerialism of university quality assurance. Journal of Education Policy, 24(5), 575-593. doi:10.1080/02680930903131242

Uhl-Bien, M., Marion, R., & McKelvey, B. (2007). Complexity leadership theory: Shifting leadership from the industrial age to the knowledge era. The Leadership Quarterly, 18, 298-318. doi:10.1016/j.leaqua.2007.04.002

 

 

Learning analytics

This blog post represents an exemplar for my students undertaking Learning in a Digital Age as part of their Graduate Certificate of Tertiary Education. It is based on the EASI system in place at CQUniversity and attempts to recreate/revisit our thinking when we saw an opportunity to adopt and integrate an emerging technology into our learning and teaching context.

Introduction to learning analytics
Associated with the almost universal adoption of digital technologies in higher education is their ability to store and track vast amounts of data on staff and student behaviour. The data captured by these digital systems can be analysed to improve decision making or to provide insight into the learning and teaching process. Learning analytics has been loosely defined as the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs (Gašević, Dawson, & Siemens, 2015). Learning analytics has been touted as a game-changer in higher education that can contribute to many areas within the academy.

There are two broad trajectories that universities tend to take with learning analytics: the use of learning analytics to help address student attrition and retention, and the use of learning analytics to contribute understanding to the learning and teaching process (Colvin et al., 2015). There is an argument to be made that learning analytics that contributes understanding to the learning and teaching process will also improve student attrition and retention. However, there is evidence to suggest that these two trajectories are in fact diverging, despite their apparently complementary nature.

My context
As an educational developer attached to a central learning and teaching support department of a regional Australian university, my learning and teaching context is broader than is usual for a faculty academic. My role within a regional university that has high proportions of low SES and online students means that I am constantly on the lookout for new ways of helping my university retain more students. Learning analytics is recognised as an approach that can help with student attrition and retention by providing improved visibility over students who, with the advent of digital classrooms, have become less visible to their teachers when compared with face-to-face classrooms. The notion of the invisible student means that online students are not directly observable by their teachers and so alternative mechanisms are required to monitor student engagement in these online environments.

Evaluating learning analytics in this context?
There is extant literature, and there are learning analytics projects, aimed at addressing student attrition through the early identification of ‘at risk’ students and the facilitation of subsequent interventions (Liu, Rogers, & Pardo, 2015). However, the use of learning analytics for student attrition and retention is not without criticism. Correlating variables in student behaviour with student success artificially isolates them from the real-world complexity of student life (Liu et al., 2015). It also promotes a deficit view of the student, in that being ‘at risk’ suggests that there is ‘something wrong’ with the student, something that needs to be fixed (Liu et al., 2015; Macfadyen & Dawson, 2012). This raises questions of ethics and privacy around the intent behind the institutional use of student data (Prinsloo & Slade, 2015, 2017a, 2017b), such as might occur with learning analytics.

Like any evaluation of a new technology in a specific context, there are going to be pros and cons. In this case there is evidence to suggest that learning analytics can help universities with their student retention, and many universities are investigating this approach. However, the research suggests that there has been a focus on the variables that contribute to student success or failure, despite the absence of an established link between student success or failure and student attrition or retention. Student lives are simply too complex to categorise in such a manner. As such, I would suggest an evolutionary approach starting with the representation of student activity within the learning management system as a proxy indicator of student engagement: an approach that is less about the factors that contribute to student success and more about what students are actually doing.

Integration of the technology in my context
I believe that there is some potential for learning analytics to help unit coordinators better focus their attention in online classrooms. For example, there is anecdotal evidence to suggest that the engaged students demand and receive a disproportionate amount of attention from their unit coordinators. They are active in forums, seek feedback on formative and summative activities and ask many questions of their teachers. With the increasing class sizes associated with online classrooms, this can mean that the less engaged, and often lower achieving, students can be overlooked or underserviced by their teachers.

My idea is to provide teaching staff with a view of their students’ activity in their Moodle sites. While there are not insignificant issues associated with clickstream information drawn from learning management systems, there appears to be an opportunity to take data that is already being collected and present it to teachers so as to highlight online students who may not be as engaged as others in their class. These students would otherwise be invisible to the busy and time-poor teacher who is struggling to keep abreast of the online forums and marking.
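A rough sketch of the idea follows. It is not the EASI implementation, just an illustration under the assumption of a hypothetical weekly export of Moodle click counts per student (the columns unit_code, student_id and weekly_clicks are invented for the example). It simply surfaces the students whose activity sits well below that of their class.

```python
# Hypothetical sketch only: highlight students whose Moodle activity is well
# below that of their class. Column names are assumptions for illustration.
import pandas as pd

def flag_low_activity(clicks: pd.DataFrame, threshold: float = 0.25) -> pd.DataFrame:
    """Return students whose weekly clicks fall below `threshold` of their
    class median, as a prompt for the teacher to take a closer look."""
    class_median = clicks.groupby("unit_code")["weekly_clicks"].transform("median")
    flagged = clicks[clicks["weekly_clicks"] < threshold * class_median]
    return flagged.sort_values(["unit_code", "weekly_clicks"])

# Usage with a hypothetical export:
# clicks = pd.read_csv("moodle_weekly_clicks.csv")  # unit_code, student_id, weekly_clicks
# print(flag_low_activity(clicks))
```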

The following image is an example of the correlation between student activity on the Moodle learning management system and the resulting grade at CQUniversity. While this is a nice, neat correlation, it hides a great deal of the underpinning complexity and diversity across the student cohort.

[Figure: scatter plot showing the correlation between Moodle activity and resulting grades at CQUniversity]
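The calculation behind a figure like this is trivial; the sketch below uses fabricated numbers (not CQUniversity data) purely to illustrate how such a correlation is computed, and why a single coefficient flattens all of the cohort diversity into one number.

```python
# Fabricated data purely for illustration; not CQUniversity data.
import pandas as pd

df = pd.DataFrame({
    "total_clicks": [120, 340, 560, 80, 900, 410, 230, 670],
    "final_grade":  [48,  62,  71,  35, 85,  66,  55,  78],
})

# One Pearson coefficient summarises the whole cohort...
print(df["total_clicks"].corr(df["final_grade"]))

# ...but it says nothing about why two students with similar click counts
# can end up with very different grades.
```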

References
Colvin, C., Rogers, T., Wade, A., Dawson, S., Gašević, D., Buckingham Shum, S., & Fisher, J. (2015). Student retention and learning analytics: A snapshot of Australian practices and a framework for advancement. Retrieved from http://www.olt.gov.au/project-student-retention-and-learning-analytics-snapshot-current-australian-practices-and-framework

Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends: Linking Research & Practice to Improve Learning, 59(1), 64-71. doi:10.1007/s11528-014-0822-x

Liu, D. Y.-T., Rogers, T., & Pardo, A. (2015). Learning analytics-are we at risk of missing the point. Paper presented at the Proceedings of the 32nd ascilite conference, Perth, Australia.

Macfadyen, L. P., & Dawson, S. (2012). Numbers Are Not Enough. Why e-Learning Analytics Failed to Inform an Institutional Strategic Plan. Journal of Educational Technology & Society, 15(3), 149-163.

Prinsloo, P., & Slade, S. (2015). Student privacy self-management: implications for learning analytics. Paper presented at the Fifth International Conference on Learning Analytics and Knowledge, Poughkeepsie, New York. Conference Publication retrieved from https://pdfs.semanticscholar.org/09d7/56d7a66f002f5c06b05237c3fc162b61a653.pdf

Prinsloo, P., & Slade, S. (2017a). Big Data, Higher Education and Learning Analytics: Beyond Justice, Towards an Ethics of Care Big Data and Learning Analytics in Higher Education (pp. 109-124): Springer International Publishing.

Prinsloo, P., & Slade, S. (2017b). An elephant in the learning analytics room: the obligation to act. Paper presented at the LAK’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference.

 

Post-truth

“Relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief”

https://en.oxforddictionaries.com/definition/post-truth

I’m quite fascinated by this term, which has been used in conjunction with politics and climate change. I find it fascinating, and increasingly frustrating, that scientific evidence can be overcome through induced ignorance or doubt. Climate change denialism, homeopathy, big tobacco and GMOs are all fields where ideology and self-interest drive efforts aimed at clouding or disputing scientific fact. Closer to home, I often find in conversations with colleagues and friends that scientific proof or overwhelming evidence is not enough to shake what are essentially beliefs.

For example, I was talking to a person recently who firmly believes in the effectiveness of homeopathic remedies, despite overwhelming evidence that suggests otherwise. This person is very intelligent, very qualified and has experience in the health sciences, yet despite the evidence they still believe that homeopathic treatments actually work. It never ceases to amaze me just how irrational we humans really are, yet we all tend to believe that we can evaluate objectively and without bias. This is especially interesting when you consider that these same flawed humans have created organisations like governments, corporations and universities; organisations that are obsessed with objectivity and quantitative measures of performance.

The trouble is, humans are not equipped to be truly objective; we have a blind spot when it comes to our own biases. Simply put, our view of the world is passed through our own cognitive filter, and we are not very good at processing and acting upon the information we receive. I think this is worth keeping in mind as we try to employ learning analytics as a foundation for evidence-based learning and teaching.

How to take a complexity approach to attrition/retention

A colleague and I authored a paper last year that questioned some of the assumptions that Australian higher education institutions (HEIs) make about student retention and attrition (Beer & Lawson, 2016). We suggested that student attrition is a complex, non-linear problem; a wicked problem set within a complex social system, and one with which universities are making little headway (Beer & Lawson, 2016). Despite the enormous interconnected complexity associated with student attrition, HEIs still use traditional problem-solving methods and mindsets when addressing their student attrition issues. We are now thinking about how we might convert our abstract writings on the topic of student attrition and retention into action. This post is intended to help get some of our thoughts down; writing as thinking, if you like.

We know that students leave their universities based on a culmination of many factors, most of which fall outside the university’s ability to influence. This isn’t to say universities can’t do anything about it, far from it, but maybe we need to think about student attrition in a different way. Universities tend to treat attrition like it is a traditional problem that can be solved using classic approaches to problem solving based on a process of understanding the problem, gathering information, synthesizing information and formulating a solution (Ritchey, 2002). We would argue that this is an ontological misinterpretation of the actual nature of the system we are dealing with, so maybe we need a different approach.

It could be argued that the underlying system is being treated as an ordered, linear system, whereby it makes sense to apply an approach based on detailed planning that aims to achieve an idealistic future state (most Australian universities, for example, mention increased retention in their strategic plans) (Boehm & Turner, 2003; Camillus, 2008). However, we suggest that the underlying system is (ontologically) an unordered system with many interacting and interdependent variables, and that it behaves more like a complex adaptive system (CAS) (Davis & Sumara, 2007; Davis & Sumara, 2006; Mason, 2008a, 2008b). The following sections are not mutually exclusive and look at some of the differences between how universities are currently approaching attrition and an approach based on CAS. This might help us determine where to go from here.

Approach to implementing change

How HEIs work at the moment (at least in my limited experience) is based around episodic change, where organisational change is stimulated by internal or external catalysts (Weick, 2012; Weick & Quinn, 1999; Weick, Sutcliffe, & Obstfeld, 2005): new technologies, new managers, financial situations, restructuring and so on. These changes are intentional, infrequent and discontinuous. The organisational metaphor here is inertial and the emphasis is on short-term adaptation (Weick & Quinn, 1999). When dealing with a CAS, unpredictability and disproportionate consequences are the norm. Change in these contexts is constant, always evolving, cumulative and endlessly reacting to small contingencies. The organisational metaphor here is based on agility and long-term adaptation.

Communications, responsibility and accountability

HEIs are, at least in Australia, rigidly organised as hierarchical bureaucracies. They are decomposed into organisational units where people are grouped by role; we often critically refer to these units as silos. Strategy is determined centrally by a small group of people, detailed plans are created and disseminated, and deviation from the plan is strongly discouraged. Communications, responsibility around who does what, and accountability all flow from this rigid structure and acquiescence to the plan. A CAS approach recognises that institutional memory, cognition and the ability to solve problems are distributed across the network of agents in the organisation. Cross-silo communications and collaboration are crucial in this case. CAS requires a network approach to organisational communications and collaboration.

Approach to problem solving and taking action

This is linked with the previous section but is another key difference worth mentioning. Currently, when universities are trying to address a complex issue like student attrition, they resort to detailed plans that aim to help the organisation achieve a desired future state: reduced attrition, increased enrolments and so on. These plans include a range of key performance indicators (KPIs) that are used to measure progress against the plan. Detailed planning and strict adherence to the plan assume that the interconnected array of systems involved is stable and fixed and won’t change as we implement the plan; an assumption that is almost universally wrong. CAS assume change, which then changes the approach to problem solving and action. Instead of targeting idealistic future goals through detailed planning, the organisation adapts to the here and now, at the local level, addressing issues as they arise day-to-day and sharing what works and what doesn’t. In other words, the organisation applies a strategy centred upon learning, not planning.

Where to from here

These are just three broad areas in which a CAS approach differs from the dominant approach, particularly as it pertains to addressing student retention. The challenge for us is to figure out how we can move towards a CAS approach, which we think has a greater chance of impacting upon student retention, given the dominant (for want of a better word) hierarchical approach. The reality is that our operating environment, with its associated mindsets, is rigidly hierarchical, and this is not going to change anytime soon. The next step for us is to figure out how to apply and test some of the CAS principles within an environment that, in many respects, contrasts markedly with them. So how can we apply an approach based on CAS principles to the application of CAS principles within a hierarchical environment?

References

Beer, C., & Lawson, C. (2016). The problem of student attrition in higher education: An alternative perspective. Journal of Further and Higher Education, 1-12. doi:10.1080/0309877X.2016.1177171

Boehm, B., & Turner, R. (2003). Using Risk to Balance Agile and Plan-Driven Methods. Computer, 36(6), 57.

Camillus, J. C. (2008). Strategy as a Wicked Problem. Harvard Business Review, 86(5), 98-106.

Davis, B., & Sumara, D. (2007). Complexity Science and Education: Reconceptualizing the Teacher’s Role in Learning. Interchange: A Quarterly Review of Education, 38(1), 53-67.

Davis, B., & Sumara, D. J. (2006). Complexity and education: Inquiries into learning, teaching, and research: Psychology Press.

Mason, M. (2008a). Complexity theory and the philosophy of education. Educational Philosophy and Theory, 40(1), 15. doi:10.1111/j.1469-5812.2007.00412.x

Mason, M. (2008b). What Is Complexity Theory and What Are Its Implications for Educational Change? Educational Philosophy and Theory, 40(1), 35-49.

Ritchey, T. (2002). Modelling complex socio-technical systems using morphological analysis. Adapted from an address to the Swedish Parliamentary IT Commission, Stockholm.

Weick, K. E. (2012). Making sense of the organization: Volume 2: The impermanent organization (Vol. 2): John Wiley & Sons.

Weick, K. E., & Quinn, R. E. (1999). Organizational change and development. Annual review of psychology, 50(1), 361-386.

Weick, K. E., Sutcliffe, K. M., & Obstfeld, D. (2005). Organizing and the process of sensemaking. Organization science, 16(4), 409-421.