I am currently working up a paper for ASCILITE2014 that is based on a couple of presentations and a workshop that David and I prepared last year. The paper discusses the gathering hype around learning analytics, and how this hype might contribute to ‘less than ideal’ learning analytics implementations. We believe that learning analytics implementations will tend to fall into three broad categories or approaches:
- Do it to the academics
- Do it for the academics
- Do it with the academics
While David describes these approaches in more detail on his blog, this post is a rough attempt to share some of my thinking around these approaches based on David’s work and against a backdrop of what type of system is assumed to be underpinning each approach. Any comments or suggestions are most welcome.
A bit about systems
Before I talk about these approaches and analytics, a quick primer on complex systems might be useful. It has been suggested that there are four types of systems (Snowden & Boone, 2007):
Simple systems are consistent and patterns are evident. Cause and effect are apparent, best practice can be applied, and the correct solution can be found. Simple systems require only a recipe approach. One example might be assembling a plastic model aircraft: simply follow the instructions.
Complicated systems still have cause-and-effect relationships, but they are much more difficult to determine. This is the domain of experts: expert knowledge and experience are required. An example might be assembling a real aircraft, which is enormously intricate and requires considerable expertise. There may be more than one right answer.
Complex systems are those where cause and effect are not clear, and where the system changes over time as the agents within it interact and learn. They often involve multiple interacting agents that exhibit emergent behaviors. Traffic flow, social systems, and even our brains and immune systems are common examples of complex systems. There are no right answers; instead, we watch for emergent patterns and respond accordingly.
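The idea that global patterns can emerge from purely local interactions, with no agent aware of the whole, can be illustrated with a toy simulation. The sketch below is purely illustrative and not from the paper: each cell on a ring repeatedly adopts the majority value of itself and its two neighbours, and coherent domains emerge from that local rule alone. The rule and parameters are my own choices for demonstration.

```python
import random

def step(cells):
    """One update of a majority-of-three rule on a ring: each cell
    adopts the majority value of itself and its two neighbours."""
    n = len(cells)
    return [1 if cells[i - 1] + cells[i] + cells[(i + 1) % n] >= 2 else 0
            for i in range(n)]

def boundaries(cells):
    """Count transitions between unlike neighbours (domain walls)."""
    n = len(cells)
    return sum(cells[i] != cells[(i + 1) % n] for i in range(n))

random.seed(1)
cells = [random.randint(0, 1) for _ in range(80)]  # random initial state
before = boundaries(cells)
for _ in range(30):
    cells = step(cells)
after = boundaries(cells)
print(before, after)  # domain walls never increase: order emerges locally
```

No cell has any notion of a ‘domain’, yet the ring self-organises into contiguous blocks; the pattern is a property of the interactions, not of any individual agent.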
Chaotic systems are systems in which no cause-and-effect relationships can be discerned. There is no point in looking for patterns, as none are apparent. The atmosphere, heart rate and the three-body problem are commonly cited examples of chaotic systems. Interestingly (and probably showing my mathematical ignorance here), chaotic systems are meaningful to mathematicians because they can be described by straightforward equations and yet are unpredictable and unrepeatable. Chaotic systems are highly dependent on their starting conditions.
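The point about straightforward equations producing unpredictable behaviour, and the dependence on starting conditions, can be seen in the classic logistic map. This is a standard textbook illustration, not something from the paper: a one-line deterministic rule where two trajectories starting a millionth apart soon bear no resemblance to one another.

```python
# Logistic map x -> r*x*(1-x). For r = 4 the iteration is chaotic:
# deterministic, trivially simple, yet effectively unpredictable.
r = 4.0
x, y = 0.2, 0.2 + 1e-6  # two starting points a millionth apart
max_gap = 0.0
for _ in range(50):
    x, y = r * x * (1 - x), r * y * (1 - y)
    max_gap = max(max_gap, abs(x - y))
print(max_gap)  # the tiny initial difference has grown to order one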
Do it to the academics
This will (unfortunately, I believe) be the most common approach to learning analytics implementation. It’s the typical top-down, techno-rational, episodic, plan-driven approach that we see in higher education (Kenny, 2009). It will typically involve setting up a formal project with appropriate management buy-in, performance indicators, budgets, project teams, user groups and so on. The project will produce a product that, along with an associated training and awareness regime, will be pushed out to the academics. This approach is perhaps attributable to some of the hype currently surrounding learning analytics and the tendency for everyone to “jump on the bandwagon”. While there are a whole host of reasons why this sort of approach will probably fail (Duke, 2001; Jones, 2012), my interest is in the assumption it makes about the type of system in play.
I am proposing that the assumed system type here is a simple system. The assumption is that we install this new learning analytics ‘thing’ (cause) and it improves our learning and teaching (effect). By installing this shiny new learning analytics product, we will change the way that academics think and teach. We follow the recipe prescribed by the project management framework in vogue at the time, provide some training and resources, and bingo: we’ve improved our learning and teaching and can move on to the next ‘big thing’. However, there is a wealth of evidence to suggest that learning and teaching is vastly more complex than this, and that even a single academic’s learning and teaching context at any particular point in a term involves an array of temporally dependent interactions between multiple agents or systems (Beer, Jones, & Clark, 2012; Davis & Sumara, 2007; Lodge & Lewis, 2012; Marra, Moore, & Klimczak, 2004; Mason, 2008a, 2008b; Morrison, 2006).
Do it for the academics
David suggests that there are two possible sub-paths here. Interested researchers might develop research-based approaches that can be used to improve what academics do, or a support area, such as an IT or central L&T department, might implement some learning analytics related ‘thing’. Either of these paths can often be attributed to hype and buzzwords. This approach is still teleological or top-down but might, perhaps, be more informed about the L&T context in which the academics are operating. It is predicated on a client/customer model where the approach or product is developed by others, based on their understanding or interpretation of the clients’ context. It’s still a plan-driven and idealistic approach based on a deliberate, predefined strategy.
I propose that this approach is aligned with a complicated systems model, whereby the solution (note “the”) is developed by experts on behalf of the clients. It is still underpinned by an assumption that cause and effect are evident and that considerable expertise is required to ascertain the causal link. As with the ‘do it to’ approach, this approach does not necessarily engage the academics within their contexts and all the diversity that this entails. Again, L&T is much more complex and diverse than this, and this approach will be constrained by institutional systems and the implementer’s network of connections.
Do it with the academics
This is an ateleological approach that is quite foreign to the typical teleological management approaches that we see. This approach is based on learning: it is naturalistic, agile, based on continuous improvement, and predicated on a strategy of emergence. It is not assumed up front that we know how learning analytics can best contribute to L&T. Instead, it’s a cycle of continuous improvement and change, with monitoring for emergence. We do not target an idealistic future state but observe and manage the situated present. The assumption is that the system(s) in place will change in response to the learning that occurs, and the cycle continues.
I propose that this approach is based on a complex systems model. Complex systems involve many components that adapt, learn or change as they interact (Holland, 1995, 2006). From an academic’s perspective, using learning analytics will change the way they do things, which will mean that the learning analytics product has to change accordingly. Interventions made by academics based on learning analytics data will also promote unpredictable changes in the way that students interact within their courses. This is an evolutionary approach based on a constant cycle of change and observation.
We know that learning analytics is relatively new, and that L&T is complex and diverse. Despite this, many companies are making extraordinary claims about how it can contribute to academic success, learning effectiveness, financial performance and a reduction in risk and complexity.
The temptation for universities to adopt these ‘turnkey’ solutions is growing. However, we need to be mindful that our approaches to learning analytics adoption aren’t predicated on flawed assumptions about the type of systems we are dealing with.
References
Beer, C., Jones, D., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. Paper presented at ASCILITE 2012: Future challenges, sustainable futures, Wellington, New Zealand.
Davis, B., & Sumara, D. (2007). Complexity Science and Education: Reconceptualizing the Teacher’s Role in Learning. Interchange: A Quarterly Review of Education, 38(1), 53-67.
Duke, C. (2001). Networks and Managerialism: field-testing competing paradigms. Journal of Higher Education Policy & Management, 23(1), 103-118. doi: 10.1080/13600800020047270
Holland, J. H. (2006). Studying complex adaptive systems. Journal of Systems Science and Complexity, 19(1), 1-8. doi: 10.1007/s11424-006-0001-z
Holland, J. H. (1995). Hidden order: How adaptation builds complexity. Reading, MA: Addison-Wesley.
Jones, D. (2012). Three likely paths for learning analytics and academics. Retrieved from http://davidtjones.wordpress.com/2012/10/11/three-likely-paths-for-learning-analytics-and-academic-in-oz-higher-education/
Kenny, J. D. (2009). Managing a Modern University: Is It Time for a Rethink? Higher Education Research and Development, 28(6), 629-642.
Lodge, J., & Lewis, M. (2012). Pigeon pecks and mouse clicks: Putting the learning back into learning analytics. Paper presented at ASCILITE 2012, Wellington, New Zealand.
Marra, R. M., Moore, J. L., & Klimczak, A. K. (2004). Content analysis of online discussion forums: A comparative analysis of protocols. Educational Technology Research and Development, 52(2), 23-40. doi: 10.1007/BF02504837
Mason, M. (2008a). Complexity theory and the philosophy of education. Educational Philosophy and Theory, 40(1), 15. doi: 10.1111/j.1469-5812.2007.00412.x
Mason, M. (2008b). What Is Complexity Theory and What Are Its Implications for Educational Change? Educational Philosophy and Theory, 40(1), 35-49.
Morrison, K. (2006, November 30). Complexity theory and education. Paper presented at the Asia Pacific Education Research Association, Hong Kong.
Snowden, D. J., & Boone, M. E. (2007). A leader’s framework for decision making. Harvard Business Review, 85(11), 68-76.