The challenge of learning analytics implementation: lessons learned

The following is a paper that I presented at the 2019 ASCILITE conference in Singapore.

Abstract

Despite broad interest in learning analytics across the Australian Higher Education sector, there remain few examples of institution-wide implementations. Learning analytics implementation is currently under-theorised, with little knowledge of the complexities that mediate the systemic uptake required for an institution-wide implementation. It has been established that approaches to learning analytics that are exclusively top-down or bottom-up are insufficient for successful implementation across an enterprise. Drawing upon an award-winning and institution-wide learning analytics intervention that has been used across almost 5,000 unit offerings, this paper formulates an initial set of theory-informed design principles that can help learning analytics practitioners mediate the complexities of institution-wide implementation.

Keywords
Learning analytics, complexity, action design research, sensemaking, situation awareness, feral information systems, design principles, emergence

Introduction
Despite sector-wide interest in learning analytics, there are currently few institution-wide deployments at scale (Dawson et al., 2018; Ferguson et al., 2014). The deficit of whole-of-institution implementations continues to deny the sector a comprehensive understanding of the complexity of issues that mediate systemic uptake of learning analytics across an enterprise (Dawson, Mirriahi, & Gasevic, 2015). The same deficit applies to the theories and methodological approaches required for learning analytics implementation in real-world environments. Knowing what works, or otherwise, and why, provides potentially valuable generalisations or abstractions that can inform future learning analytics implementations. A team at a regional Australian university has been researching and experimenting with learning analytics for over 10 years and has developed an institution-wide learning analytics system. For deidentification purposes, the system that was developed will be called System X throughout this study. System X was developed by the team during 2014, has been used in 63% of the university’s offerings, and has facilitated communications with almost 90% of the university’s higher education students. While System X is a rare example of an institution-wide learning analytics implementation, its life beyond implementation has been beset with organisation-related challenges. Reflecting upon the design, development and operation phases of a learning analytics implementation like System X can provide valuable insights, which can contribute to a theory of implementation (Marabelli & Galliers, 2017; Sanders & George, 2017). This is especially important for learning analytics where successful, institution-wide implementations are currently rare.

Learning analytics is a field of research and practice that is relatively new and still evolving (Colvin, Dawson, Wade, & Gasevic, 2017). While there are many examples of theory-informed, empirical research around learning analytics, there is a shortfall of theoretical knowledge of how learning analytics can be operationalised in a given situation (Wise & Shaffer, 2015). Theory provides learning analytics practitioners with guidance on the variables to include in their models, how to interpret their results, how to make the results actionable and how to evaluate their work (Wise & Shaffer, 2015). The current shortfall of theory in the learning analytics literature around design and action precludes the broad recipes and principles that can help solve problems in specific contexts (Colvin et al., 2015). Theories for design and action are needed to provide “explicit prescriptions for how to design and develop an artefact, whether it is a technological product or a managerial intervention” (Gregor & Jones, 2007, p. 313). In addition to the absence of theory around implementation, there is a proliferation of commercially available learning analytics tools that are marketed as institution-wide solutions to complex problems related to learning and teaching (Dawson et al., 2015). In a vacuum of theoretical knowledge that affords informed scepticism, organisations are naturally drawn to solutions marketed as learning analytics (Dawson et al., 2018).

The problem of learning analytics adoption at the organisational level needs to be considered in relation to how these organisations operate, how they conceptualise learning analytics, and the problems they are looking for learning analytics to solve. A recent study identified two classes of universities based on their approach to learning analytics adoption: those that followed an instrumental approach, and those that followed an emergent approach (Dawson et al., 2018). Instrumental approaches were identified by top-down leadership and large-scale projects with a heavy focus on technological considerations, and were associated with limited staff uptake (Dawson et al., 2018). Emergent approaches were identified by bottom-up, strongly consultative processes that proved highly resistant to being scaled to the organisational level (Dawson et al., 2018). While System X’s adoption was predominantly based on an emergent approach, it was successfully scaled to the institutional level within an organisation with a stated preference for instrumental approaches to technology adoption.

This paper aims to use the journey of System X’s development and operation to unpack the theories, methods and heuristics that contributed across its lifecycle to date. It is hoped that the insight into the organisational realities associated with a learning analytics implementation can help universities address the deficit of institution-wide implementations, and help bridge the growing divide between learning analytics research and practice (Colvin et al., 2017; Dawson et al., 2015). This paper develops a set of design principles that can help learning analytics practitioners within universities employ emergent approaches to learning analytics implementation that can scale to the institutional level. In essence, these principles form a nascent design theory (Gregor, 2006), a type of theory that can be used to guide implementation across a variety of contexts, and can contribute to our theoretical understanding of learning analytics implementation. This paper aims to answer the following research question: What theoretically derived design principles can help practitioners employ an emergent approach to learning analytics implementation?

Methodology
This paper aims to retrospectively analyse an example of the emergent development of learning analytics that scaled to the institutional level. It applies a methodological approach based on Action Design Research (ADR) to determine the theoretical elements that contributed to the design, development and operation of the IT artefact. ADR is a design research method that conceptualises the research process as containing the inseparable and inherently intertwined activities of building the information technology (IT) artefact, intervening in the organisation, and evaluating it concurrently (Sein, Henfridsson, Purao, Rossi, & Lindgren, 2011). ADR is not intended to solve problems as a software engineer might, but to generate design knowledge and reflections by building and evaluating IT artefacts in authentic organisational settings (Sein et al., 2011). ADR removes the sharp distinction between IT artefact development and its use by organisational stakeholders that is often assumed with design research and design thinking (Sein et al., 2011). ADR reflects the premise that IT artefacts are shaped by the organisational context during their development and operation (Sein et al., 2011). Organisation-specific structures such as hardware, software, processes and policies impact upon, and are subtly ensconced in, the development and operation of an IT artefact. That ADR encapsulates the IT artefact within a real-world organisational context establishes an obvious link with learning analytics, where a primary challenge is how it can be implemented across an organisation. ADR suggests that technological rigor often comes at the expense of organisational relevance and acknowledges that IT artefacts emerge from interactions within the organisational context (Sein et al., 2011).

ADR consists of four broad, non-linear stages. The impetus for the first stage, Problem Formulation, is a problem perceived in practice by the researchers that represents a research opportunity based on existing theories or technologies. The problem is viewed as a knowledge-creation opportunity at the intersection of the technological and organisational domains. The second stage of ADR is the building, intervention and evaluation (BIE) stage, whereby the problem framing and theoretical elements from stage one provide a platform for generating an IT artefact within an organisational context. The developing IT artefact and the real-world organisational context mutually influence each other as the artefact is iteratively and concurrently built and evaluated, generating reflections and learnings. The third stage of ADR parallels the first two stages and moves conceptually from building a solution in a specific context to applying what has been learned to the broader class of problems. Concentrated effort is directed at what emerges from the evaluation and research processes, ensuring that contributions to knowledge are identified. ADR refers to this process as guided emergence, whereby the external, intentional intervention is brought together with the organic evolution that results from real-world operation (Sein et al., 2011). The fourth and final stage of ADR is the formalisation of the learning that develops from the research process, which can be represented as generalised outcomes or principles (Sein et al., 2011).

The intention of the ADR process is not necessarily to solve the problem in its entirety, but to generate knowledge that can be applied to the broader range of problems that the specific problem exemplifies. For this study, the broader class of problem relates to how an emergent approach can be taken with learning analytics implementation and still scale to the organisational level. This study uses ADR to retrospectively analyse System X. System X’s journey has been divided into three chronological sections: explorations, formal development, and operation. Each of these three sections is described according to ADR in terms of its problem formulation, BIE, and reflections. Project plans, designer reflections, designer blog posts, project logs, emails and other empirical data sources are drawn upon to inform these sections. Following these three sections, this study reflects on the theoretical elements that emerge. These theoretical elements and reflections inform the main contribution of this study, a set of theoretically aligned design principles for learning analytics implementation.

Exploration and serendipity: 2008 – 2014

Problem formulation
While System X was established as a formal university project that officially began in 2014, its genesis was in the five years prior. During this time, the designers were conducting research around learning analytics, exploring patterns found in institutional datasets, and exploring how these patterns could help understand and respond to learning and teaching related problems (Beer, 2008). How these patterns and data could help with student attrition and student engagement were the specific problems the designers were investigating at the time, given a sector-wide trend of increasing online enrolments and its negative impact on student retention (Beer, 2010b; Beer, Clark, & Jones, 2010). The designers were part of the central learning and teaching support unit, which was tasked with supporting teaching staff with their learning and teaching. This provided the designers with broad perspectives on the problem based on their daily interactions with teachers. Their perspectives were further informed by their technical knowledge, their experience with local information systems, and their knowledge of institutional policies and processes.

BIE
The System X designers conducted a number of investigations between 2008 and 2014, investigations that included the development of experimental IT artefacts that were applied in real-world contexts (Beer, 2009b, 2009c, 2010a). These investigations were often centred upon patterns found within the data, patterns that required further exploration to determine their usefulness and how they could be applied. A simple and early example was the development of a small artefact that retrospectively showed teachers how online students interacted with their unit sites compared with the grades those students received (Beer, 2009a). In each case the designers worked with teaching staff using informal cycles of evaluation centred upon the interventions. Together with the teachers, the designers were learning what worked, what did not work, and gathering knowledge and experience about why. Variations in student behaviours, the diversity of pedagogical contexts, the different mental models of the teachers, along with large variations in teachers’ technical and teaching experience, made it difficult for the designers to distil which data would be useful across all contexts. As the BIE cycles progressed, a common theme emerged from the feedback from academic staff. They wanted simple indicators of student activity in their teaching contexts, and the ability to monitor student activity to determine whether the actions they were taking during the term were impacting upon student activity and results (Beer, 2009d). In addition, teachers wanted the data to help them with a range of questions they had about their students, depending on their teaching context. These questions included: which students have accessed the LMS site, and when; how often are they accessing the site; which students have failed this unit previously; and what is their GPA?
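
To make the kind of indicators teachers asked for more concrete, the following is a minimal, hypothetical sketch (not System X's actual code) that derives last access, click counts and academic history per student from an LMS click log. The dataframe and column names are assumptions for illustration only.

```python
# Hypothetical sketch of simple per-student indicators; column names assumed.
import pandas as pd

def unit_indicators(clicks: pd.DataFrame, history: pd.DataFrame) -> pd.DataFrame:
    """clicks: one row per LMS click (student_id, timestamp);
    history: one row per student (student_id, gpa, prior_fails)."""
    clicks = clicks.copy()
    clicks["week"] = clicks["timestamp"].dt.isocalendar().week

    activity = clicks.groupby("student_id").agg(
        last_access=("timestamp", "max"),
        total_clicks=("timestamp", "size"),
        active_weeks=("week", "nunique"),
    )
    # Left-join so students who have never accessed the site still appear.
    indicators = history.set_index("student_id").join(activity, how="left")
    indicators[["total_clicks", "active_weeks"]] = (
        indicators[["total_clicks", "active_weeks"]].fillna(0).astype(int)
    )
    return indicators.sort_values("total_clicks")
```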

During this period, the team’s supervisor, a Pro Vice Chancellor (PVC), was coordinating a large undergraduate science unit and asked whether the learning analytics activities the designers were engaged with could help with the unit’s relatively high failure rate. Using the learnings from the previous BIE cycles, the designers provided the unit coordinator with weekly lists of all students in the unit, arranged by an algorithmically developed estimate of success (EOS). The EOS was a simple algorithm that combined each student’s academic history (GPA, prior fails, grades received, withdraw fails) with their current level of Moodle activity as indicated by clicks, and was modelled using previous offerings of this and other units. The unit coordinator used this list to proactively contact students who were showing as being at risk of failing. This process, along with other instructional design changes, contributed to a 7% reduction in the failure rate of the unit over the next two offerings, alongside a small rise in student satisfaction survey results. The unit coordinator’s direct contextual experience and their influential position within the university led to the establishment of a formal, centrally-funded project to develop the concept into a system that could be used by all unit coordinators.
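
The paper does not detail the EOS weighting or how it was modelled against past offerings, so the following is a hedged, illustrative sketch of an "estimate of success" style score combining academic history with current activity. The weights, scales and column names are assumptions.

```python
# Illustrative EOS-style score; weights and columns are assumptions, not System X's model.
import pandas as pd

def estimate_of_success(students: pd.DataFrame,
                        w_gpa=0.5, w_fails=0.2, w_activity=0.3) -> pd.Series:
    """students has columns: gpa (assumed 0-7 scale), prior_fails, clicks_this_week."""
    gpa_score = students["gpa"] / 7.0                       # normalise GPA to 0-1
    fail_score = 1.0 / (1.0 + students["prior_fails"])      # more past fails -> lower score
    # Rank-normalise clicks against the current cohort rather than fixed cut-offs.
    activity_score = students["clicks_this_week"].rank(pct=True)

    eos = w_gpa * gpa_score + w_fails * fail_score + w_activity * activity_score
    return eos.clip(0, 1).rename("estimate_of_success")

# A unit coordinator would sort the class list by this score, lowest first,
# and proactively contact the students who appear most at risk.
```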

Reflections and learnings
The organisational position of the designers afforded a perspective that was situated between university-wide learning and teaching policies, processes and management, and the learning and teaching coalface. Their roles, their history with technology at the institution, their interest in learning analytics, who they worked for, and no small degree of serendipity associated with legacy access to institutional datasets meant that the designers had access to data and the technical skills to manipulate it and represent it in different ways as the situation required. The trial-and-error cycles of development, and the designers’ close proximity to the teachers, led to an understanding of what data could help with what problem, and how it might help. The multiple perspectives of the designers allowed them to consider the problem in a different manner than would be afforded by a single perspective, such as that of a software developer. The circumstance whereby the team’s supervisor held an influential position within the university, and was also teaching a unit with a specific problem, was happenstance, yet pivotal in the establishment of a formal university project to expand on the initial concept.

The rise of the feral system: formal development – 2014

Problem formulation
At the prompting of the PVC, a formal university project began, with the associated funding allocated at the end of 2013. A project initiation document was prepared by the designers, where the stated purpose of the overall project was to “help address student attrition by strengthening and focusing the interactions between academics and students” (Reed, Fleming, Beer, & Clark, 2013, p. 3). The broader project had multiple sub-components, one of which was System X, the learning analytics focused project. System X was aimed at helping with “the early identification of students who may be at-risk along with more effective targeting of student support for such students” (Reed et al., 2013, p. 6). A further requirement was added by the PVC, who specified that the system needed to be very easy to use with little to no training or guidance required. While earlier explorations provided the designers with some understanding of what was required at the unit level, moving beyond a small-scale intervention to an institution-wide IT artefact required a more formal approach due to the required investment in infrastructure, integration with other university systems, and consultation with other university departments. As such, the practical problems faced by the designers during this time were how to scale System X from a handful of units to a university-wide system, and how System X could be integrated with established university systems. These problems are represented in the wider learning analytics literature, where the transition from small, local learning analytics experimentation to institution-wide implementation is known to be difficult (Ferguson et al., 2014).

BIE
The formal development of System X began with the allocation of funding and the full-time secondment of the three designers. The team consisted of two teaching academics and a graphic designer, all of whom had web development skills. System X was unusual in the context of information systems procurement in that it was developed in-house and outside of the information technology (IT) department. The conceptual shift from a small-scale experiment to the institutional scale, while superficially a technical exercise given the learnings already developed from previous BIE cycles, required cycles of iteration beyond the purely technical. For example, as staff used and became more experienced with System X, the designers noticed that the feedback staff were providing changed from functionality-related commentary to requests for additional features. As an example of this, an iteration of System X released early in 2014 provided unit coordinators with a mail-merge feature that allowed them to send personalised emails to groups of students. Feedback from unit coordinators suggested that an indication of changes in student behaviour after the email would be useful in terms of assessing the need for a follow-up. This indicator was added to the Moodle weekly activity timeline in System X and meant that unit coordinators could quickly identify changes in student behaviour after the email “nudge”. The gradual shift in teacher feedback as they used System X was found to align with previous research showing that staff usage of educational systems in general becomes more refined as staff gain experience with the system (Malikowski, Thompson, & Theis, 2006). The iterative approach taken by the designers catered for the reciprocal evolution of both the technology and the human users of the information provided.
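
A minimal sketch of the kind of "did activity change after the nudge?" indicator described above, assuming access to a click log and the date a nudge email was sent. The window length, dataframe layout and column names are assumptions, not System X's implementation.

```python
# Hypothetical post-nudge activity-change indicator; names and window assumed.
import pandas as pd

def nudge_effect(clicks: pd.DataFrame, nudge_date: pd.Timestamp,
                 window_days: int = 7) -> pd.DataFrame:
    """clicks: one row per LMS click with columns student_id and timestamp."""
    before = clicks[(clicks["timestamp"] >= nudge_date - pd.Timedelta(days=window_days))
                    & (clicks["timestamp"] < nudge_date)]
    after = clicks[(clicks["timestamp"] >= nudge_date)
                   & (clicks["timestamp"] < nudge_date + pd.Timedelta(days=window_days))]

    summary = pd.DataFrame({
        "clicks_before": before.groupby("student_id").size(),
        "clicks_after": after.groupby("student_id").size(),
    }).fillna(0)
    # A positive "change" suggests the student became more active after the nudge.
    summary["change"] = summary["clicks_after"] - summary["clicks_before"]
    return summary
```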

Formal and informal consultations, and conversations with schools and discipline teams, contributed to System X’s evolution. Over 20 open discussions were conducted with various departments of the university during the first half of 2014. The ideas presented in these forums were collected by the designers with the intention of incorporating as many as possible in the available time. This was reflected in System X’s underlying modular design, which anticipated frequent change. However, the governance requirements associated with a funded institution-wide project often conflicted with the approach the designers employed. Detailed plans that included specific feature release dates and other detailed outcomes were required up-front, prior to the building process. In the case of System X, the project management framework used by the institution discouraged an evolutionary approach in which the next stage of the project was uncertain and dependent on cycles of feedback from the teachers rather than predetermined. For a concept like learning analytics, which was very new at that time, the absence of institution-wide implementation examples or recipes required an evolutionary and learning approach. From a methodological perspective, the design phase of System X differed from the exploration phase in that much of the effort became directed at the IT artefact, to achieve the required scale while still supporting adaptation. However, it could be argued that the approach was more focused on the educational intervention than on the IT artefact, as the end-users were still integral to the ongoing design process, and there is evidence of reciprocal shaping whereby the end-users and the IT artefact continued to shape each other.

Reflections and learning
The iterative approach taken with the development of System X supported the emergent development of the intervention, where the intervention consisted of both the IT artefact and the end-users. The approach allowed the IT artefact and the end-users to coevolve as the design was implemented into a complex organisational context. So while the overall project superficially conformed to the mandated top-down, plan-driven approach, the underlying development process was conducted with change and evolution in mind. Adopting a modular design from the outset afforded the ability for the IT artefact to change based on user feedback. The iterative and evolutionary approach also helped address the problem of scale. Beyond the addition of several hundred units, each with its own pedagogical context, student cohort and teachers, the technical design needed to be flexible enough to enable frequent change. The technical components required to support a flexible, iterative approach often conflicted with traditional enterprise implementation norms. So while the consultative approach and modular design allowed the IT artefact to adapt with the teachers as they became more experienced with the system, the design departed from IT procurement and architecture norms.

Maintenance and operation: 2015 – 2019

Problem formulation
The formal System X development project finished at the end of 2014 with the designers returning to their substantive positions in the central learning and teaching support unit, signalling the end of the development phase and the start of its maintenance phase. While the team’s supervisor had informally indicated that System X would continue to be maintained by the designers, the supervisor’s retirement created a new set of unanticipated problems. The idea of an institutional IT system operating outside of the central IT department was unconventional and associated with a myopic assumption of risk. How System X could continue to operate, never mind evolve, without senior-level advocacy, in an increasingly lean and homogeneous IT environment, became the core problem associated with this phase. This links with a broader problem noted in the research literature, associated with systems that are developed outside of central IT departments, systems that are often referred to as shadow systems or feral systems (Behrens & Sedera, 2004; Spierings, Kerr, & Houghton, 2014, 2016).

BIE
While System X was unable to secure funding or workload allocation for maintenance, the designers continued to keep the system operating in its current form in addition to their normal duties. This included a number of non-trivial adaptations required to cater for upstream changes that impacted upon System X’s data ingestion processes. However, from an ADR perspective, the BIE cycle was severely constrained by the lack of allocated resources, a situation that contrasted with the design intent. Despite this, System X had developed into a tool that continued to prove useful for many teaching staff. At the time of writing, System X has been used to view 63% (4,924) of the university’s higher education offerings, while 39% (3,016) of these offerings have used the personalised email (nudge) facility. These nudges were delivered to 89.7% (49,089) of the university’s higher education students over this period, and usage continues to grow. For example, 108,523 nudges were delivered in 2014 by 231 teaching staff, whereas across 2018, 315,192 nudges were delivered by 429 teachers. A sentiment analysis of 1,208,762 nudge texts showed that 61% of the nudges were worded positively, 30% used neutral language and 9% were deemed to be negatively worded. This aligns with the design intent whereby System X was not developed as a reliable predictive instrument, but as a tool of positive communication between teachers and students, with a focus on the students most at risk. While the usage of System X continues to grow, its long-term sustainability remains uncertain due to negative perceptions associated with its unorthodox implementation approach.
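
The paper reports the sentiment analysis of nudge texts but not the tool used. The sketch below shows one way such a positive/neutral/negative split could be produced, using NLTK's VADER analyser with conventional compound-score thresholds; the analyser choice and thresholds are assumptions, not the study's actual method.

```python
# Hedged sketch: classify nudge texts with VADER; tool and thresholds assumed.
from collections import Counter
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)

def classify_nudges(nudge_texts):
    sia = SentimentIntensityAnalyzer()
    labels = []
    for text in nudge_texts:
        compound = sia.polarity_scores(text)["compound"]
        if compound >= 0.05:
            labels.append("positive")
        elif compound <= -0.05:
            labels.append("negative")
        else:
            labels.append("neutral")
    return Counter(labels)

# Example usage:
# classify_nudges(["Great progress this week!", "You have not accessed the unit site."])
```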

System X was unable to secure funding for maintenance beyond the end of the development phase. That it was an IT system outside of central IT control proved to be an irreconcilable hurdle when funding was being sought for basic maintenance. Exacerbating the problem was tension with the IT department, which was concerned about what it considered to be an enterprise system operating within the organisation and outside of its direct control. A senior member of the IT department was quoted as saying “we don’t want another repeat of…System X”, referring to the approach taken in System X’s development. Ironically, the development approach taken by the design team had previously been recognised with a national learning and teaching award (Australian Office for Learning and Teaching, 2016). The IT department’s targets and performance indicators generated a different ontological perspective of System X to that of the designers and the teachers; a different perspective that encapsulates different assumptions about how such work gets done (Ellis & Goodyear, 2019). This demonstrates how an exclusive focus on IT considerations can be incompatible with learning approaches and problem solving in complex environments when an IT artefact is involved (Macfadyen & Dawson, 2012). While teacher usage of System X continues to grow, the lack of resourcing and investment has prevented its evolution beyond what was available at the end of the design phase, a situation that contrasts with the design intent, where a fundamental premise was adaptation and change.

Reflections and learning
Arguably, the approach the designers had taken with the development of System X approximated an emergent BIE cycle when framed with ADR. The primary focus was on solving a problem at the intersection of the users, their learning and teaching context, and the IT artefact. However, from the perspective of normal IT procurement and implementation, the primary consideration is technical matters relating to the IT artefact (Macfadyen & Dawson, 2012). This created an organisational tension around the IT artefact, a tension that still exists at the time of writing. As learning, teaching and technology become further entwined and interdependent, a focus on the IT artefact, coupled with rigid approaches to implementation, can become divorced from the problem context and the context of the human users, inherently limiting the ability of organisations to adapt to emerging and complex challenges. Imposing one-size-fits-all approaches onto changing and increasingly complex learning and teaching contexts makes exploration and implementation of hybrid concepts like learning analytics exceedingly difficult. Learning what works, why and how with learning analytics is unlikely to emerge from rigid, plan-based approaches to implementation. However, the notion that a well-used and reliable system can still be considered feral is represented in the research literature, and was a critical and unanticipated oversight by the design team.

Formalisation of learning
The following section reflects on the design, development and operational phases of System X from a theoretical perspective. It attempts to distil formal learnings from the System X phases into theoretical elements that can be translated into design principles supporting emergent approaches to learning analytics implementation.

Bottom-up, middle-out and meso-level practitioners
System X was designed and developed by academic staff whose roles involved the provision of learning and teaching support to teachers, as well as contributing to policy, processes and learning and teaching systems. The teachers were embedded in the process of System X’s development, which helped ensure the process was grounded in the teachers’ lived experience (Beer & Jones, 2014). The designers were able to help the teachers adapt to the new technology, and could also adapt the technology based on its real-world use by the teachers. The designers’ roles and position between the top-down and the bottom-up allowed them to balance the requirements of the end-users (teachers) and the socio-material requirements of the organisation. This aligns with a theoretical construct known as meso-level practitioners (MLP). Much of the work of implementation with learning and teaching related innovation happens at the meso-level, which sits between small-scale local interactions and large-scale policy and institutional processes (Hannon, 2013). MLP are well situated to mediate the tension between learning and teaching practice and the ambiguities associated with real-world technology change (Hannon, 2013). For learning analytics implementation, the MLP concept is a theoretical perspective that can help balance the tension between the top-down and the bottom-up, the instrumental and the emergent. However, the MLP concept requires a blurring of the sharp distinction between traditional organisational boundaries.

Shadow systems
The problems that arose for System X as an institution-wide system relate to how it was developed outside of normal IT procurement processes, and to the fact that it was perceived solely as an IT system. The lack of funding for maintenance and the tension that developed with the IT department had detrimental impacts on the system. IT systems that are not under the control of an organisation’s IT department are often referred to as shadow systems (Spierings et al., 2014; Zimmermann, Rentrop, & Felden, 2014). These are systems that sit outside of the control of IT management and often develop as work-arounds for deficiencies in existing institutional systems (Spierings et al., 2014; Zimmermann et al., 2014). There are two perspectives on shadow systems in an enterprise environment: on one hand, they introduce innovation into an organisation and allow for flexibility in specific contexts; on the other, they increase heterogeneity and complexity (Spierings et al., 2014, 2016; Zimmermann et al., 2014). However, most contemporary universities follow a strategic approach to deciding what work gets done (Jones & Clark, 2014), and shadow systems are generally considered an undesirable phenomenon in these environments (Behrens & Sedera, 2004). While shadow systems are often viewed unfavourably, it has been argued that their presence is an indication of a gap between the required business workflow and what the sanctioned systems provide (Spierings et al., 2016). Managers or supervisors can often insulate shadow systems from the enterprise system proponents who seek to close or suppress them (Spierings et al., 2016). With the loss of a key advocate from a senior leadership position, System X was increasingly perceived as a shadow system, as it sat outside centralised IT management. Emergent approaches to learning analytics implementation require a shared organisational conceptualisation of the process as applied research, rather than as a purely IT implementation process.

Complex adaptive systems (CAS)
The evolutionary approach taken in System X’s development acknowledged and supported the adaptation of the agents involved. The cyclical approach facilitated the reciprocal shaping of the IT artefact and the teachers within an organisational context. Plan-based approaches assume that there is sufficient knowledge about how to integrate the technology so that a recipe-based approach can be applied (Hannon, 2013). In the case of an emerging field like learning analytics, and in the absence of a critical mass of successful examples, there are currently no recipes to support a deterministic approach. Further to this, top-down and mechanical approaches assume that the system and its agents are stable and unchanging, an assumption that is fundamentally flawed in systems that involve humans (Beer, Jones, & Clark, 2012; Snowden & Boone, 2007). Reconceptualising learning and teaching as a complex system (Beer et al., 2012) or, more recently, applying principles of complexity leadership theory (Siemens, Dawson, & Eshleman, 2018) have been presented as theoretical foundations that can guide non-deterministic and emergent learning analytics implementation. Complexity theory contends that systems involving agents, such as humans, are always changing, often disproportionately, through the non-linear interactions between those agents (Boustani et al., 2010; deMattos, Ribeiro Soriano, Miller, & Park, 2012; Plsek & Greenhalgh, 2001). For learning analytics implementation, complex adaptive systems (CAS) theory provides a conceptualisation of the system that describes the interdependency and mutability of the agents and actors operating within it (Beer et al., 2012; Beer & Lawson, 2016; Dawson et al., 2018). The application of a CAS lens to learning analytics implementation provides a theoretical base for an emergent approach in a socio-technical system that is complexly and unpredictably entangled with other socio-technical systems.

Situation Awareness (SA)
The System X designers struggled to distil the types of data that the teachers required from the enormous volume of data available. Like most organisations, universities collect vast volumes of diverse data from their operations. The humans in these environments are exposed to increasing volumes of data, which has created a gap between the volume of data being produced and the data that the human needs to achieve their goals (Endsley, 2001). Situation awareness (SA) is a theory that helps to define the data that the human operator needs at a particular time (Endsley, 1995). Visibility over the elements interacting within their environment is crucial for decision-making, particularly as the complexity of our operating environments increases (Endsley, 1995). In essence, situation awareness is the operator’s internal model of the current state of their environment (Endsley, 2001). In the case of System X, although the designers had access to vast quantities of diverse data that could be provided to the teachers, situation awareness theory suggested limiting the data to key metrics related to the teachers’ tasks, and giving teachers the ability to filter the available information. It is vitally important that learning analytics be tethered to the learning design, and consequently to the goals and expectations of the users (Wise, 2014). For learning analytics implementation in data-rich university environments, situation awareness provides a theoretical framework and design principles for filtering, focusing and centring the information on the goals of the human operators (Endsley, 2016).
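
A small sketch of the situation-awareness idea of filtering a wide institutional dataset down to the few metrics tied to a teacher's current goal. The goals and column names are hypothetical examples, not System X's actual interface.

```python
# Hypothetical goal-driven filtering of a wide student dataset; names assumed.
import pandas as pd

GOAL_COLUMNS = {
    "who_has_disengaged": ["name", "last_access", "clicks_this_week", "clicks_last_week"],
    "who_is_at_risk_of_failing": ["name", "estimate_of_success", "prior_fails", "gpa"],
}

def view_for_goal(students: pd.DataFrame, goal: str) -> pd.DataFrame:
    """Return only the columns relevant to the selected goal,
    rather than every field the institution happens to collect."""
    return students[GOAL_COLUMNS[goal]]
```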

Sensemaking (SM)
System X included the ability for teachers to take action based on the information they were provided, along with an indicator that could help assess the subsequent impact of that action. Taking action and monitoring for a resultant change is a key property of a theoretical construct known as sensemaking. Sensemaking is the interplay of action and interpretation (Weick, Sutcliffe, & Obstfeld, 2005). The most basic question in sensemaking is “what’s going on here?”, closely followed by “what do I do next?” (Weick et al., 2005). Sensemaking is about the continual redrafting of an emerging story through cycles of action and interpretation (Weick et al., 2005). In other words, it is not enough to provide teachers with situation awareness alone; it needs to be coupled with the ability for them to take action (Jones, Beer, & Clark, 2013). The ability for teachers to take action based on learning analytics data can often be overlooked in the face of increasingly sophisticated and attractive data analysis and visualisation tools. However, sensemaking is a critical diagnostic process that allows us to develop plausible interpretations when faced with ambiguous cues (Weick, 2012). These plausible interpretations are coupled with actions whose results further refine our understanding in a cyclical process. As learning environments and student lives become more complex and busy, our ability to make sense of situations based on what can only ever be fragmented information is becoming increasingly important. This would suggest that the provision of near real-time information that augments the human operators, coupled with the ability to take action, is a more appropriate starting point than detailed statistical analysis and sophisticated predictive modelling (Liu, Bartimote-Aufflick, Pardo, & Bridgeman, 2017). For learning analytics implementation, sensemaking provides a theoretical framework for taking action based on near real-time and incomplete information.

Design Principles
Design principles are intended to be reusable, evidence-based heuristics that can inform future development and implementation decisions across contexts (Herrington, McKenney, Reeves, & Oliver, 2007). However, learning analytics is a diverse field and its purpose will be driven by the context in which it is applied. Variations in pedagogical intent, learning design, available organisational technology, staff capability and capacity will influence the design, development and operation of the learning analytics systems (Wise, 2014). Similar to the learning analytics concept, design principles for learning analytics implementation cannot be everything for everyone. Instead, this study is proposing a set of initial design principles for meso-level practitioners engaged in learning analytics implementation, where the design principles are empirically derived and theory-informed.

The following principles are intended as an initial starting point for meso-level practitioners engaging in learning analytics implementation in higher education, and represent the embarkation point of a longer journey. The concept of meso-level practitioners and the following design principles can potentially help bridge the divide between the currently polarised approaches that learning analytics implementations tend to take. These design principles offer a model of compromise that may help bridge these seemingly incompatible approaches to learning analytics implementation. Applying the principles in real-world contexts will refute or refine them and determine their applicability across multiple contexts. The application of these principles will also provide guidance on another under-theorised area of learning analytics: how to design the actual learning analytics artefact.

Design principles
Balance top-down and bottom-up – Learning analytics implementation requires an emergent approach that balances top-down and bottom-up requirements and considerations. Balancing the ambiguities of the teachers’ lived experience with the organisational requirement for homogenisation is the role of the meso-level practitioner.

Balance the socio-technical – Meaningful learning analytics requires equitable and contextual consideration of both the users and the technology. Recognise that effective learning analytics results from the complex interplay between humans, technology and context.

Consider learning analytics implementation to be applied research – Learning analytics implementation is a process of discovering what works, or otherwise, and why, in specific contexts. The objective is not to build an IT system, but to iteratively and methodically develop knowledge about what information the users require.

Allow for emergence – Outcomes from the learning analytics process emerge from complex interactions between humans, technology and information. Design for, expect and facilitate change.

Apply informed scepticism – Detailed plans, deterministic approaches and assumptions of certainty are incompatible with the complexities of learning analytics implementation.

Centre the learning analytics information around tasks and goals – Filter information to just what is needed to support specific tasks and goals in specific contexts. Resist the urge to provide additional information just because you can, or it is easy to do.

Link learning analytics with action – Provide affordances for action. Information and action are inseparable. Understanding in complex environments results from the combination of information and the taking of action.

Apply purpose-specific, theory-informed evidence-based practice – The purpose of specific learning analytics implementations will have related theoretical underpinnings that can help inform the implementation. Theory provides the implementation with guidance on a range of important functions including variable selection, model selection, data selection, result discrimination, result interpretation, actionable results and generalisability of results (Wise & Shaffer, 2015).

Conclusion
The experience provided by System X comes at an opportune time for Higher Education, with many universities looking to learning analytics to help solve complex problems. In many, if not most, of these cases, overly simplistic conceptualisations and mechanical approaches to implementation will limit the potential benefits in the longer term. Reconceptualising learning analytics implementation as cross-institutional applied research can help bridge the growing divide between learning analytics research and real-world practice, and lead to meaningful learning analytics implementations. Learning analytics is a relatively new concept in Australian Higher Education and the reality of real-world implementation is proving to be difficult and complex. Reframing our ontological conceptualisations of learning analytics implementation from the design of IT products to the co-design of a service that integrates IT and people is a vast and under-acknowledged challenge that has been recognised more broadly (Ellis & Goodyear, 2019). The design principles developed by this study provide an initial starting point that can help universities develop more meaningful learning analytics implementations through emergent development approaches.

References
Australian Office for Learning and Teaching. (2016). Citation for outstanding contribution to student learning: The educational technology team. https://docs.education.gov.au/node/41811: Government of Australia.

Beer, C. (2008). Is a data mine a gold mine? Retrieved from https://beerc.wordpress.com/2008/12/08/is-a-data-mine-a-gold-mine/

Beer, C. (2009a). LMS discussion forums and fully online students. Retrieved from https://beerc.wordpress.com/2009/11/14/lms-discussion-forums-and-fully-online-students/

Beer, C. (2009b). Quick Indicators Update. Retrieved from https://beerc.wordpress.com/2009/04/22/quick-indicators-update/

Beer, C. (2009c). Quick Indicators Update [Presentation to CQU on learning analytics and the indicators project.]. Retrieved from https://beerc.wordpress.com/2009/07/28/quick-indicators-update-2/

Beer, C. (2009d). What is learner engagement? Retrieved from https://beerc.wordpress.com/2009/10/22/what-is-learner-engagement/

Beer, C. (2010a). Moodlemoot: Enabling comparisons of LMS usage across institutions, platforms and time. Paper presented at the Moodlemoot Australia 2010, Melbourne.

Beer, C. (2010b). Using the Indicators project data to identify at risk students. Retrieved from https://beerc.wordpress.com/2010/05/14/using-the-indicators-project-data-to-identify-at-risk-students/

Beer, C., Clark, K., & Jones, D. T. (2010). Indicators of engagement. Paper presented at the Australasian Society for Computers in Learning in Tertiary Education, Sydney. Conference publication retrieved from http://www.ascilite.org.au/conferences/sydney10/procs/Beer-full.pdf

Beer, C., & Jones, D. T. (2014). Three paths for learning analytics and beyond: Moving from rhetoric to reality. Paper presented at the Australasian Society for Computers in Learning in Tertiary Education: Rhetoric and Reality, Dunedin, New Zealand. Conference Publication retrieved from http://ascilite2014.otago.ac.nz/files/fullpapers/185-Beer.pdf

Beer, C., Jones, D. T., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. Paper presented at the Australasian Society for Computers in Learning in Tertiary Education: Future Challenges, Sustainable Futures, Wellington, New Zealand. Conference Publication retrieved from http://www.ascilite.org/conferences/Wellington12/2012/images/custom/beer%2ccolin_-_analytics_and.pdf

Beer, C., & Lawson, C. (2016). The problem of student attrition in higher education: An alternative perspective. Journal of Further and Higher Education, 1-12. doi:10.1080/0309877X.2016.1177171

Behrens, S., & Sedera, W. (2004). Why do shadow systems exist after an ERP implementation? Lessons from a case study. PACIS 2004 Proceedings, 136.

Boustani, M. A., Munger, S., Gulati, R., Vogel, M., Beck, R. A., & Callahan, C. M. (2010). Selecting a change and evaluating its impact on the performance of a complex adaptive health care delivery system. Clinical Interventions In Aging, 5, 141-148.

Colvin, C., Dawson, S., Wade, A., & Gasevic, D. (2017). Addressing the Challenges of Institutional Adoption. In C. Lang, G. Siemens, A. Wise, & D. Gasevic (Eds.), Handbook of Learning Analytics (Vol. 1, pp. 281 – 289). Australia: Society for Learning Analytics Research.

Colvin, C., Rogers, T., Wade, A., Dawson, S., Gašević, D., Buckingham Shum, S., . . . Corrin, L. (2015). Student retention and learning analytics: A snapshot of Australian practices and a framework for advancement. Canberra, ACT: Australian Government Office for Learning and Teaching.

Dawson, S., Mirriahi, N., & Gasevic, D. (2015). Importance of theory in learning analytics in formal and workplace settings. Journal of Learning Analytics, 2(2), 1-4.

Dawson, S., Poquet, O., Colvin, C., Rogers, T., Pardo, A., & Gasevic, D. (2018). Rethinking learning analytics adoption through complexity leadership theory. Paper presented at the Proceedings of the 8th International Conference on Learning Analytics and Knowledge.

deMattos, P. C., Ribeiro Soriano, D., Miller, D. M., & Park, E. H. (2012). Decision making in trauma centers from the standpoint of complex adaptive systems. Management Decision, 50, 1549-1569. doi:10.1108/00251741211266688

Ellis, R. A., & Goodyear, P. (2019). The education ecology of universities: Integrating learning, strategy and the academy.

Endsley, M. R. (1995). Toward a theory of situation awareness in dynamic systems. Human Factors, 37(1), 32-64. doi:10.1518/001872095779049543

Endsley, M. R. (2001). Designing for situation awareness in complex systems. Paper presented at the Proceedings of the Second International Workshop on symbiosis of humans, artifacts and environment.

Endsley, M. R. (2016). Designing for situation awareness: An approach to user-centered design. Boca Raton, FL: CRC press.

Ferguson, R., Clow, D., Macfadyen, L., Essa, A., Dawson, S., & Alexander, S. (2014). Setting learning analytics in context: overcoming the barriers to large-scale adoption. Paper presented at the Fourth International Conference on Learning Analytics And Knowledge, Indianapolis, USA. doi: 10.1145/2567574.2567592

Gregor, S. (2006). The nature of theory in information systems. MIS Quarterly, 30(3), 611-642. doi:10.2307/25148742

Gregor, S., & Jones, D. (2007). The anatomy of a design theory. Journal of the Association for Information systems, 8(5), 312.

Hannon, J. (2013). Incommensurate practices: Sociomaterial entanglements of learning technology implementation. Journal of Computer Assisted Learning, 29, 168-178.

Herrington, J., McKenney, S., Reeves, T., & Oliver, R. (2007). Design-based research and doctoral students: Guidelines for preparing a dissertation proposal. Paper presented at the World Conference on Educational Multimedia, Hypermedia and Telecommunications 2007, Chesapeake, VA, USA.

Jones, D. T., Beer, C., & Clark, D. (2013). The IRAC framework: Locating the performance zone for learning analytics. Paper presented at the ASCILITE2013 Electric Dreams, Sydney. Conference publication retrieved from http://www.ascilite.org.au/conferences/sydney13/program/papers/Jones.pdf

Jones, D. T., & Clark, D. (2014). Breaking Bad to Bridge the Reality / Rhetoric Chasm. Paper presented at the ASCILITE2014 Rhetoric and Reality, Dunedin, New Zealand. Conference publication retrieved from http://ascilite2014.otago.ac.nz/

Liu, D. Y.-T., Bartimote-Aufflick, K., Pardo, A., & Bridgeman, A. J. (2017). Data-Driven Personalization of Student Learning Support in Higher Education. In A. Peña-Ayala (Ed.), Learning Analytics: Fundaments, Applications, and Trends: A View of the Current State of the Art to Enhance e-Learning (pp. 143-169). Cham: Springer International Publishing.

Macfadyen, L. P., & Dawson, S. (2012). Numbers Are Not Enough. Why e-Learning Analytics Failed to Inform an Institutional Strategic Plan. Journal of Educational Technology & Society, 15(3), 149-163.

Malikowski, S. R., Thompson, M. E., & Theis, J. G. (2006). External factors associated with adopting a CMS in resident college courses. Internet and Higher Education, 9(2006), 163-174. doi:10.1016/j.iheduc.2006.06.006

Marabelli, M., & Galliers, R. D. (2017). A reflection on information systems strategizing: the role of power and everyday practices. Information Systems Journal, 27(3), 347-366.

Plsek, P. E., & Greenhalgh, T. (2001). Complexity science: The challenge of complexity in health care. BMJ: British Medical Journal, 323, 625-628.

Reed, R., Fleming, J., Beer, C., & Clark, D. (2013). Project Initiation Document: Student Retention – Engagement and Intervention. Structural adjustment funding. CQUniversity.

Sanders, M., & George, A. (2017). Viewing the changing world of educational technology from a different perspective: Present realities, past lessons, and future possibilities. Education and Information Technologies, 22(6), 2915-2933. doi:10.1007/s10639-017-9604-3

Sein, M., Henfridsson, O., Purao, S., Rossi, M., & Lindgren, R. (2011). Action Design Research. MIS Quarterly, 35(1), 37-56.

Siemens, G., Dawson, S., & Eshleman, K. (2018). Complexity: A leader’s framework for understanding and managing change in higher education. Educause Review, 53(6), 27-42.

Snowden, D., & Boone, M. E. (2007). A Leader’s Framework for Decision Making. Harvard Business Review, 85(11), 68-76.

Spierings, A., Kerr, D., & Houghton, L. (2014). What drives the end user to build a feral information system? Feral Information Systems Development: Managerial Implications, 161-188.

Spierings, A., Kerr, D., & Houghton, L. (2016). Issues that support the creation of ICT workarounds: towards a theoretical understanding of feral information systems. Information Systems Journal, 27(6), 775 – 794. doi:https://doi.org/10.1111/isj.12123

Weick, K. E. (2012). Making sense of the organization: Volume 2: The impermanent organization (Vol. 2). West Sussex, United Kingdom: John Wiley & Sons.

Weick, K. E., Sutcliffe, K. M., & Obstfeld, D. (2005). Organizing and the process of sensemaking. Organization Science, 16, 409-421. doi:10.1287/orsc.1050.0133

Wise, A. F. (2014). Designing pedagogical interventions to support student use of learning analytics. Paper presented at the Proceedings of the fourth international conference on learning analytics and knowledge.

Wise, A. F., & Shaffer, D. W. (2015). Why Theory Matters More than Ever in the Age of Big Data. Journal of Learning Analytics, 2(2), 5-13.

Zimmermann, S., Rentrop, C., & Felden, C. (2014). Managing shadow IT instances–a method to control autonomous IT solutions in the business departments. Paper presented at the Americas Conference on Information Systems (AMCIS).

Are we to blame?

This post is an exercise in procrastination brought about by late afternoon writer’s block. I’m currently preparing a paper that examines a rare example of an institution-wide learning analytics implementation using a theoretical lens. The purpose of the paper is to contribute to the theoretical understanding of learning analytics implementation and to represent this as an initial set of design principles or heuristics for practitioners. However, I am noticing a fascinating irony in the research, an irony that is further reinforced by my experience with an enterprise-wide learning analytics implementation over the last five years.

Basically, I suspect the biggest barrier to organisation-wide learning analytics implementation is the organisation itself. I’m reminded of the oft cited quote:

“We have met the enemy and he is us”

(https://en.wikipedia.org/wiki/Pogo_(comic_strip))

Learning analytics, and particularly its implementation, seems to me to sit in an organisational no-man's-land. That it involves data and employs some information technology seems to see it pushed far too often in the direction of the IT department. But we know that this doesn’t work:

“All too frequently, LA is conceptualised as a technical solution to an education problem. As such oversight and management of LA is assigned to core administrative units (e.g. IT units) who establish the various rules and regulations guiding access to the data and adoption of the technologies”

(Dawson et al., 2018)

There seems to be a problem whereby we struggle to link what we know theoretically about learning analytics implementation, with how we approach implementation. For example:

  • We know that learning analytics is, by and large, an applied research field (Dawson, Gašević, Siemens, & Joksimovic, 2014)
  • We know that learning analytics is a multidisciplinary concept (Dawson et al., 2018)
  • We know that one-size-fits-all approaches are fundamentally flawed with learning analytics (Gašević, Dawson, Rogers, & Gasevic, 2016)
  • We know that our organizations are mired in technical, social and cultural challenges when it comes to learning analytics adoption (Macfadyen & Dawson, 2012)

So if we know these things as a sector, why is it that the gap between our research-based knowledge of learning analytics and our knowledge of how to implement learning analytics continues to grow (Colvin, Dawson, Wade, & Gasevic, 2017), and why do we see these mistakes repeated? I am wondering just how much of an impact our organisational structures / arrangements have on something like learning analytics, which needs to span our internal organisational boundaries? I also wonder if my anecdotal knowledge of the politicking that happens with learning analytics implementation across the sector, is somehow linked with this apparent homelessness?

If we are struggling to apply what we know when we conceptualise our learning analytics implementations, it follows that we will struggle to implement an approach that favours learning and adaptation; something that is needed while learning analytics remains generally undertheorised (Dawson, Mirriahi, & Gasevic, 2015).

Just a thought.

References

Colvin, C., Dawson, S., Wade, A., & Gasevic, D. (2017). Addressing the Challenges of Institutional Adoption. In C. Lang, G. Siemens, A. Wise, & D. Gasevic (Eds.), Handbook of Learning Analytics (Vol. 1, pp. 281-289). Australia: Society for Learning Analytics Research.

Dawson, S., Gašević, D., Siemens, G., & Joksimovic, S. (2014). Current state and future trends: A citation network analysis of the learning analytics field. Paper presented at the Proceedings of the fourth international conference on learning analytics and knowledge.

Dawson, S., Poquet, O., Colvin, C., Rogers, T., Pardo, A., & Gasevic, D. (2018). Rethinking learning analytics adoption through complexity leadership theory. Paper presented at the Proceedings of the 8th International Conference on Learning Analytics and Knowledge.

Gašević, D., Dawson, S., Rogers, T., & Gasevic, D. (2016). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success. The Internet and Higher Education, 28, 68-84.

Macfadyen, L. P., & Dawson, S. (2012). Numbers Are Not Enough. Why e-Learning Analytics Failed to Inform an Institutional Strategic Plan. Journal of Educational Technology & Society, 15(3), 149-163.

 

The tensions around learning analytics

This post is some thinking around my PhD, prompted by conversations and presentations at this year’s wonderful ALASI2018 conference, held recently in Melbourne.

In a recent post, I mentioned that technology, or solutions to problems, do not just spring into existence. They emerge from a complex network of interactions between people and technology (Hannon, 2013; Introna, 1996, 2007). This is especially true for learning analytics, which aims to help people better understand other people and their learning environments, using technology and, at best, incomplete data. So learning analytics does not just spring into existence; it results from cycles of interaction between people, technology (data), and educational objects (units, assessments, students, staff, problems and so on). A real example might help explain this better:

Some colleagues and I have been researching and tinkering with learning analytics since 2007. We noted that the information systems available to our teaching staff were woefully inadequate at providing the information and affordances that staff needed, when they needed them, during their day-to-day activities. Our university has high proportions of online, low-socioeconomic and first-in-family students, so our initial question around this information and affordance deficit was: how do we provide teachers with an evidence-informed approach to interacting with their students? We also wanted to make it much easier for teachers to access engagement and performance information about their students during the term, when it could be acted upon. This led to a series of research-development-feedback cycles that resulted in the EASICONNECT system back in 2014.

In the following video, Damo explains a little about EASICONNECT.

 

The EASI system is a basic risk/intervention style of learning analytics tool that allows academic staff to view their students’ activity and performance, organised around an indicator of success for each student. Staff can “nudge” (mailmerge) their students from the same web page, a facility that has proven very popular with teaching staff. To date, 1,077,732 EASI nudges have been delivered by teachers to 88.9% of all our HE students. While only 63% of units use EASI, and only 40% use it to nudge their students, that 40% reaches 88.9% of all our students (perhaps pointing to EASI’s utility for teachers with larger undergraduate classes). We also noted a significant increase in activity (on average) by students who were sent an engagement nudge. EASI was not developed by a vendor; it was developed locally using an incremental, user-centred approach.
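For readers who have not seen this style of system, the following is a minimal, hypothetical sketch (in Python) of the general shape of a risk-indicator-plus-nudge workflow: a crude success indicator derived from activity and assessment data, and a mailmerge-style message generated for each flagged student. The indicator formula, thresholds, field names and message template here are all assumptions for the sake of illustration; they are not EASI’s actual implementation.

# Hypothetical sketch of a risk-indicator + "nudge" mailmerge workflow.
# The indicator formula, thresholds, field names and template below are
# illustrative assumptions only -- NOT how EASI is actually implemented.

from dataclasses import dataclass
from string import Template


@dataclass
class Student:
    name: str
    email: str
    logins_last_fortnight: int   # simple LMS activity signal
    assessments_submitted: int   # assessments submitted so far
    assessments_due: int         # assessments due so far


def success_indicator(s: Student) -> str:
    """Classify a student as 'green', 'amber' or 'red' (assumed thresholds)."""
    submission_rate = (s.assessments_submitted / s.assessments_due
                       if s.assessments_due else 1.0)
    if s.logins_last_fortnight == 0 or submission_rate < 0.5:
        return "red"
    if s.logins_last_fortnight < 3 or submission_rate < 1.0:
        return "amber"
    return "green"


NUDGE_TEMPLATE = Template(
    "Dear $name,\n"
    "I noticed you have logged in $logins times in the last fortnight and "
    "submitted $submitted of $due assessments so far. Please get in touch "
    "if I can help.\n")


def build_nudges(students):
    """Return (email, message) pairs for students flagged amber or red."""
    nudges = []
    for s in students:
        if success_indicator(s) in ("amber", "red"):
            body = NUDGE_TEMPLATE.substitute(
                name=s.name, logins=s.logins_last_fortnight,
                submitted=s.assessments_submitted, due=s.assessments_due)
            nudges.append((s.email, body))
    return nudges


if __name__ == "__main__":
    cohort = [
        Student("Alex", "alex@example.edu", 0, 1, 2),  # low activity -> flagged
        Student("Sam", "sam@example.edu", 5, 2, 2),    # on track -> not flagged
    ]
    for email, message in build_nudges(cohort):
        print(email)
        print(message)

The point of putting the indicator and the nudge side by side is that the teacher sees the evidence and can act on it from the same place; in EASI’s case, that happens on the one web page during the teaching term.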

However, the point is that EASI did not just spring into existence back in 2014. It was the result of a whole range of interactions and adaptations before the formal implementation. Prior to, and during, EASI’s implementation, the developers were conducting what I would retrospectively suggest were cycles of interaction between people, technology and an educational problem, as per the following diagram.

Figure 1. Interaction Cycles

One of the workshops I attended at ALASI2018 discussed complexity leadership and the tension that exists between administrative/mainstream systems, processes and mindsets, and adaptive systems, processes and mindsets. Innovation and problem solving (the adaptive side) in organisations always develops a tension with established, mainstream ways of doing things, and this is an area where leadership is often lacking. The following diagram builds upon what was shown at the conference.

Figure 2. Adapted from presentation slides by University of South Australia

The key point here is that there is an area of tension when these small-scale innovations or solutions try to scale into the mainstream “business as usual” environment. This is where Siemens, Dawson and Eshleman (2018) say that leadership is required, and complexity leadership provides a model that can help. Knowing that this tension exists, and understanding its drivers, can potentially help us develop processes and policies that cater for this inevitable tension and allow us to move forward with learning analytics. I think this is a vitally important concept for learning analytics, where one-size-fits-all and single-lens approaches simply don’t work. It also fits with our experience with EASI in this area of tension between bespoke, evidence-informed approaches to learning analytics and the orthodox preference for commercial-off-the-shelf, vendor-supplied solutions.

The above diagram also helps articulate the continuum between complicated and complex (Snowden & Boone, 2007). The left-hand side is the domain of management, where managers strive for predictability and order, while risk, change, information flow and diversity are shunned. The right-hand side is the domain of leadership, where leaders are comfortable with change, accepting of failure, and strive to increase diversity and information flow (Freeburg, 2018). I would suggest that while organisations generally need both sides of this diagram, the left side can often override the right. In an era of cascading complexity and change, the adaptations required for prosperity, or even survival, will most likely come from the right side of the diagram (in my opinion).

From the perspective of learning analytics, I believe this diagram helps explain why learning analytics research is accelerating while learning analytics practice slowly pushes through this area of tension. From the perspective of my PhD, the question is: how does the meso-level practitioner, operating on the right-hand side, help the “coal-face” teachers navigate or work around this middle area of tension and conflict?

References

Freeburg, D. (2018). Leadership and innovation within a complex adaptive system: Public libraries. Journal of Librarianship and Information Science, 0(0), 0961000618810367. doi:10.1177/0961000618810367

Hannon, J. (2013). Incommensurate practices: Sociomaterial entanglements of learning technology implementation. Journal of Computer Assisted Learning, 29, 168-178.

Introna, L. D. (1996). Notes on ateleological information systems development. Information Technology & People, 9(4), 20-39.

Introna, L. D. (2007). Maintaining the reversibility of foldings: Making the ethics (politics) of information technology visible. Ethics and Information Technology, 9(1), 11-25.

Siemens, G., Dawson, S., & Eshleman, K. (2018). Complexity: A leader’s framework for understanding and managing change in higher education. Educause Review, 53(6), 27-42. Retrieved from https://er.educause.edu/articles/2018/10/complexity-a-leaders-framework-for-understanding-and-managing-change-in-higher-education

Snowden, D., & Boone, M. E. (2007). A Leader’s Framework for Decision Making. Harvard Business Review, 85(11), 68-76.

 

 

 

A little more on the single lens problem

In my previous post I talked about how a singular view and unilateral approach to learning analytics is actually a misinterpretation of the underlying system, and indeed of the concept of learning analytics itself. Universities tend to invest in learning and teaching systems based on orthodox, hierarchical and normative approaches modelled on the business world (Hannon, 2013). This business model, known as a “planning approach”, aims to achieve managerially planned organisational change (Hannon, 2013). Technological change projects employing this approach are reported to have failure rates of up to 70% in the business world, with comparable results in universities (Alvesson & Sveningsson, 2015; Malikowski, Thompson, & Theis, 2007). Note that failure does not mean that the systems are malfunctioning; it means that there is little evidence of significant impact upon teaching practices (Malikowski et al., 2007). This, I believe, is something very important for organisations to consider when it comes to learning analytics implementation.

The tendency for universities to focus on learning analytics from a technology-centred perspective overlooks the “sociomateriality” of technology systems more generally. The sociomaterial approach to practice acknowledges that practice is a complex entanglement of humans and technology (Hannon, 2013). In other words, technology does not miraculously spring into existence; it results from a complex and socially situated design process (Introna, 2007). To me, this sits at the heart of the single-lens issue. If learning analytics is conceptualised and enacted by a single entity, it is only representative of that particular entity’s view of the world, its foci and, more pragmatically, its performance indicators. From an organisational perspective I can see the attraction: here we have this complex and increasingly emotive thing called learning analytics. That’s data and some associated technology, isn’t it? Let’s throw it at that organisational unit over there to make it happen. This is a valid approach if we are talking about a system like finance, payroll or even a student information system, where the unknowns are known rather than unknowable. However, learning analytics, at its core, is a decision support system that helps humans better understand themselves, other humans and technology-based learning environments, based on what can only ever be absurdly incomplete data.

So if top-down approaches to learning analytics are not going to work, and bottom-up approaches are incompatible with how our organisations are structured and operationalised, what can we do? In my PhD I am proposing that meso-level practitioners, with their knowledge of the complex learning analytics concept and their local knowledge of the organisation, are a key part of the solution. The meso-level is intermediate between the coal-face and large-scale policy and institutional processes (Hannon, 2013). Much of what Siemens, Dawson, and Eshleman (2018) discuss in their recent Educause article resonates with me and with the meso-level practitioner concept. However, countering the current operating norms and their entrenched mindsets is a monumental challenge. Our mindsets around organisations and how they operate are so entrenched that even voicing alternatives is almost treated as blasphemy. Nevertheless, I hope that considering learning analytics from other perspectives might help move some of these mindsets and keep learning analytics practice in touch with rapidly advancing learning analytics research.

References

Alvesson, M., & Sveningsson, S. (2015). Changing organizational culture: Cultural change work in progress. Routledge.

Hannon, J. (2013). Incommensurate practices: Sociomaterial entanglements of learning technology implementation. Journal of Computer Assisted Learning, 29, 168-178.

Introna, L. D. (2007). Maintaining the reversibility of foldings: Making the ethics (politics) of information technology visible. Ethics and Information Technology, 9(1), 11-25.

Malikowski, S., Thompson, M., & Theis, J. (2007). A model for research into course management systems: Bridging technology and learning theory. Journal of Educational Computing Research, 36(2). doi:10.2190/1002-1T50-27G2-H3V7

Siemens, G., Dawson, S., & Eshleman, K. (2018). Complexity: A leader’s framework for understanding and managing change in higher education. Educause Review, 53(6), 27-42. Retrieved from https://er.educause.edu/articles/2018/10/complexity-a-leaders-framework-for-understanding-and-managing-change-in-higher-education