Help! My students have disappeared

This post is for those colleagues who are perhaps teaching online for the first time. There are plenty of good resources available for how you can quickly transition from face-to-face to online teaching.

Have a look at Griffith’s video series or the principles of online teaching from a national award-winning online educator.

There are many quality resources available that talk about student engagement in these online environments, and it is worth your time to search around for hints and tips that you can quickly enact with your own online class. However, I would like to draw your attention to a specific aspect of online teaching and student engagement that is easily overlooked when you are cognitively overloaded by current events.

“Student Sleeping in library and a book by Mandala” by hasnainyaseen6 is licensed under CC BY 2.0

With your face-to-face classes, you have the class in front of you and you intuitively know whether or not they are paying attention. Are they looking at you intently or are they staring off into space? Are they taking notes or are they doodling in their notebooks? In response to your ongoing sensing of their attentiveness, you adapt what you are doing and how you are doing it, right?

But…

In the online environment, your class becomes invisible to you. You can no longer see the glint in their eye or their vacant wish-I-was-somewhere-else expressions. If you are only using synchronous methods, such as video conferencing, chances are you can still sense their attentiveness much as you could before. But now that you have uploaded all of these files and videos for students to visit in their own time, how do you tell whether they are being accessed, and how do you know who has accessed them?

“vandevermooreandrea_99165_5786729_AVM 11” by KSRE Photo is licensed under CC BY 2.0

This is where it gets a little tricky. Many universities will have sophisticated systems for monitoring student engagement in their online environments, systems that will tell them who has been active and who hasn’t (search Google Scholar for Learning Analytics if you want to know more). Most schools and small institutions, however, will not have this level of sophistication. What you are able to do in this space will depend on the technology in use at your institution. Most learning management systems will allow you to closely track student accesses to your resources and will faithfully record every mouse click students make in these systems.

Important note: Mouse clicks and views of specific resources DO NOT equal student engagement. A click on a specific resource does not mean that it was processed or internalised. However, in the absence of anything else, student activity in online environments can serve as a useful indicator of student engagement when coupled with your professional judgement.

The following are just some rough thoughts on how you might monitor student engagement in your own online context. It is not exhaustive or specific but may help start you thinking about how you monitor your student engagement in this scary new online world.

  • Find out from your IT folk how you can access logs of student logins to your course or unit (a rough sketch of what this might look like follows this list).
  • Find out how you can see which of your students are accessing your course resources.
  • Link your specific resources to small formative tasks for your students. Give them a video or reading, then ask for a couple of sentences on what they thought about it or how it might relate to their learning.
  • Small formative quizzes are useful if you are using a Learning Management System like Blackboard or Moodle. Knowing who hasn’t attempted these quizzes is gold when it comes to figuring out which of your students need prompting.
  • Keep track of who is contributing and who is lurking in your discussion forums. If someone is neither lurking nor contributing, they might need prompting. Remember that lurking is not necessarily a bad thing.
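
If your learning management system lets you export an activity log, even a very rough script can surface the invisible students. The sketch below is purely illustrative and assumes a CSV export with "student" and "timestamp" columns; adjust the file name and column names to whatever your own system actually provides.

```python
# Minimal sketch: flag enrolled students with no logged activity in the last
# N days. The file name and the "student"/"timestamp" column names are
# assumptions; adapt them to your own LMS export.
import csv
from datetime import datetime, timedelta

def inactive_students(log_path, enrolled, days=7):
    cutoff = datetime.now() - timedelta(days=days)
    seen_recently = set()
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            # Assumes ISO-formatted timestamps, e.g. 2020-03-20T09:15:00
            if datetime.fromisoformat(row["timestamp"]) >= cutoff:
                seen_recently.add(row["student"])
    # Enrolled but not seen recently: these are the students to check on.
    return sorted(set(enrolled) - seen_recently)

enrolled = ["alice", "bob", "chen", "dana"]
print(inactive_students("activity_log.csv", enrolled, days=7))
```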

When it comes to student engagement in online environments, I remember hearing the following at some conference somewhere and it resonated with me:

Some students will pass no matter what you do. Some students will fail no matter what you do. The trick is identifying the students who will fail without your intervention.

Often with online learning environments, our systems and tools make it very easy to focus on engagement, and not necessarily disengagement. We need to be on the lookout for those invisible students who may need a little more help.

Learning analytics and the three-body problem

The three-body problem

In 1687, Sir Isaac Newton published his seminal work “Philosophiae Naturalis Principia Mathematica”, in which he described the motion of celestial bodies (Newton, 1987). Newton’s theory of gravity provided a means for precisely characterising complex orbital motion by establishing that celestial bodies exert a force on all other celestial bodies, a force proportional to the product of their masses and inversely proportional to the square of the distance between them (Newton, 1987). Newton’s work, particularly the Principia Mathematica, became a cornerstone of natural philosophy and remains one of the most important scientific works in human history (Smith, 2008). However, for mathematicians in particular, Newton’s work in the Principia Mathematica created a long-standing problem.

The problem became known as the three-body problem and has been described as the most celebrated of all dynamical problems (Barrow-Green, 1997). The three-body problem can be simply stated:

Three particles move in space under their mutual gravitational attraction; given their initial conditions, determine their subsequent motion

(Barrow-Green, 1997).

The essence of the three-body problem is that even if we know the initial position and momentum of three bodies, we cannot solve analytically for their subsequent motion (except in some highly specific and contrived situations). Between 1750 and 1900, over 800 publications related to the three-body problem were written by many of the distinguished mathematicians of that time (Barrow-Green, 1997), and no general solution has been found to this day.
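
To make the point concrete, a small numerical sketch can illustrate it: integrate three equal-mass bodies under Newton’s inverse-square law, then repeat the run with one starting coordinate nudged by one part in a million. With these arbitrary, purely illustrative starting conditions the two runs diverge noticeably; there is no general closed-form solution, and numerical prediction is exquisitely sensitive to the initial conditions.

```python
# Illustrative planar three-body integrator (equal masses, G = 1).
import numpy as np

def accelerations(pos, masses, eps=1e-9):
    """Newtonian inverse-square accelerations; eps softens near-collisions."""
    acc = np.zeros_like(pos)
    for i in range(len(masses)):
        for j in range(len(masses)):
            if i != j:
                r = pos[j] - pos[i]
                acc[i] += masses[j] * r / (r @ r + eps) ** 1.5
    return acc

def integrate(pos, vel, masses, dt=0.001, steps=20000):
    """Leapfrog (velocity Verlet) integration; returns final positions."""
    pos, vel = pos.copy(), vel.copy()
    acc = accelerations(pos, masses)
    for _ in range(steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = accelerations(pos, masses)
        vel += 0.5 * dt * acc
    return pos

masses = np.array([1.0, 1.0, 1.0])
pos = np.array([[-1.0, 0.0], [1.0, 0.0], [0.0, 0.5]])
vel = np.array([[0.0, -0.5], [0.0, 0.5], [0.1, 0.0]])

final_a = integrate(pos, vel, masses)

# Perturb one starting coordinate by one part in a million and run again.
pos_b = pos.copy()
pos_b[2, 0] += 1e-6
final_b = integrate(pos_b, vel, masses)

print("Divergence between the two runs:", np.linalg.norm(final_a - final_b))
```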

So what has this got to do with learning analytics?

Well, not much really, and I’m probably even guilty of committing a logical fallacy, reductio ad absurdum. It is also an apples/oranges comparison, but it does help to illustrate a point.

Consider the dominant management model and the dominant approach to technology adoption in our universities: Universities tend to be based on the Newtonian-machine management model where the focus is on hierarchical structures, rules-based culture, command, control and formal relationships (Cenere, Gill, Lewis, & Lawson, 2015; Goldspink, 2007). This mechanical model assumes that goals can be achieved through deliberate action based on knowledge of what has happened previously (Beer & Lawson, 2017). Put simply, we tend to think that a thorough analysis of what has happened previously can help us to predict what will happen next, and so we make our plans and design our systems based on the assumption that the past can inform the future.

Getting back to the three-body problem, we only have three entities, each with its own mass, position and velocity, and we are still unable to algorithmically characterise their respective positions in the future. If we have a class with only three students, we cannot begin to perceive or quantify the variables that will contribute to their success or otherwise. Even if we had the hypothetical ability to perceive these variables at the outset of their studies, it would have little bearing on the outcome or the journey to the outcome.

My point here is that techno-centric approaches to learning analytics, where the focus of attention is on historical data, are fundamentally flawed. I’m not sure we truly understand this yet as a sector. I’m still seeing people buying into the delusion that says predicting student success is simply a matter of investing in some IT infrastructure and applying some fancy-sounding algorithm. Investment in data infrastructure and exploring these algorithms is something we should be doing, no argument, but what about investing in the human side of learning analytics, the non-IT side? Even if we have the most amazing systems and the most amazing algorithms, what can we do about that student who hasn’t logged on for two weeks because they have been busy with their three jobs and complex family challenges? Do our policies and processes allow the flexibility required to respond to the range of situations that will be brought to light by our learning analytics systems? More broadly still, do our systems, processes and policy appropriately represent our specific cohort of students?

I’m a nerd and I get the fascination with IT systems and predictive algorithms with learning analytics, I really do. But we need to think about the assumptions that underpin our investments in learning analytics. Resources are finite, now more than ever, so we need a more balanced approach to how we conceptualise learning analytics in each of our organisations.

References

Barrow-Green, J. (1997). Poincaré and the three body problem. American Mathematical Society.

Beer, C., & Lawson, C. (2017). Framing attrition in higher education: A complex problem. Journal of Further and Higher Education, 1-12. doi:10.1080/0309877X.2017.1301402

Cenere, P., Gill, R., Lewis, M., & Lawson, C. (2015). Communication Skills for Business Professionals. Port Melbourne, Victoria, Australia: Cambridge University Press.

Goldspink, C. (2007). Rethinking Educational Reform: A Loosely Coupled and Complex Systems Perspective. Educational Management Administration & Leadership, 35(1), 27-50. http://dx.doi.org/10.1177/1741143207068219

Newton, I. (1987). Philosophiæ naturalis principia mathematica (Mathematical principles of natural philosophy). (Original work published 1687, London.)

Smith, G. (2008). Isaac Newton. https://plato.stanford.edu/archives/fall2008/entries/newton/: Metaphysics Research Lab, Stanford University.

 

Regional universities are & need to be different

The Vice Chancellor of my university recently wrote an article for ABC Opinion highlighting the neglect that regional Australia suffers. Of particular interest was the education gap between metropolitan and regional/remote areas whereby metropolitan citizens are much more likely to attend university than their regional counterparts (73% versus 47% when comparing Brisbane and Bundaberg school leavers). I wholeheartedly agree that regional Australian universities are getting a raw deal and I also believe that the problem is actually much worse than we currently appreciate. However, I also think that some of the blame rests with regional universities who could be better at embracing their different operating contexts.

With an attitude of resigned indifference, regional Australians appear to simply accept an inferior position on our nation’s economic, cultural and social ladder
(ABC Opinion, 2020)

Consider CQUniversity, which is based in regional Australia. CQUniversity has the largest footprint of any university in Australia with 26 locations across the nation (CQUniversity, 2019). CQUniversity is a “comprehensive” university offering a full range of vocational and higher education qualifications. Geographic distribution and the mixture of vocational and higher education qualifications make for an astonishingly complex learning, teaching and administrative environment, which is arguably unique in the Australian higher education context. However, it is the student cohort that really distinguishes CQUniversity from all other Australian universities. Vastly higher proportions of students from high-risk equity groups make the university particularly susceptible to performance funding models that use student attrition and retention as a performance measure. To me, this seems grossly unfair and a perpetuation of the disparity between metropolitan and regional areas that the ABC article suggests.


Students from equity groups face a number of structural challenges in accessing, participating and completing higher education, including geographical location, financial constraints, emotional factors and sociocultural incongruity. (Nelson et al., 2017)

There can be little doubt that regional universities are punching well above their weight when it comes to their holistic contribution to Australian society. That said, I also think that regional Australian universities can, and need to, do much better, even within the hegemony of the higher education sector. Given the differences in student cohorts and operating contexts between regional and metropolitan universities, it would seem reasonable to expect regional universities to think and operate differently when it comes to their learning, teaching and technology. But this does not seem to be the case, with broadly similar policies, processes, structures, infrastructure and, worst of all, thinking across the sector. Regional universities are different and they operate in a difficult environment; an environment that in many respects reinforces a homogeneity that is not suited to their context.

It is essential to develop a more nuanced understanding of the relationships that exist between social, cultural, financial and structural issues, demographic and equity characteristics, enrolment patterns, and the completion rates of cohorts at RUN universities to ensure that measures effectively target the lived complexity of diverse student populations (Nelson et al., 2017)

So how can Australian regional universities be better at being different? This is a vastly complex issue that has economic, cultural, political, ecological and educational linkages. It is a wicked problem (Head & Alford, 2008) in a transcontextual environment. Perhaps thinking more about the context of our operations and the context of the students we have, and less about what our larger metropolitan cousins are doing with regards to their structures, technologies, processes etcetera, would be a useful place to start? A different ontology & epistemology for a different context?

References

CQUniversity. (2019). CQUniversity Strategic Plan 2019–2023. Retrieved from https://www.cqu.edu.au/about-us/about-cquniversity/strategic-plan-2019-2023

Head, B., & Alford, J. (2008). Wicked problems: The implications for public management. Paper presented at the Presentation to Panel on Public Management in Practice, International Research Society for Public Management 12th Annual Conference.

Nelson, K., Picton, C., McMillan, J., Edwards, D., Devlin, M., & Martin, K. (2017). Understanding the completion patterns of equity students in regional universities. Retrieved from The National Centre for Student Equity in Higher Education (NCSEHE) website https://www.ncsehe.edu.au/publications/completion-patterns-of-equity-students-in-regional-universities.

 

The challenge of learning analytics implementation: lessons learned

The following is a paper that I presented at the 2019 ASCILITE conference in Singapore.

Abstract

Despite broad interest in learning analytics across the Australian Higher Education sector, there remain few examples of institution-wide implementations. Learning analytics implementation is currently under-theorised, with little knowledge of the complexities that mediate the systemic uptake required for an institution-wide implementation. It has been established that approaches to learning analytics that are exclusively top-down or bottom-up are insufficient for successful implementation across an enterprise. Drawing upon an award-winning and institution-wide learning analytics intervention that has been used across almost 5,000 unit offerings, this paper formulates an initial set of theory-informed design principles that can help learning analytics practitioners mediate the complexities of institution-wide implementation.

Keywords
Learning analytics, complexity, action design research, sensemaking, situation awareness, feral information systems, design principles, emergence

Introduction
Despite sector-wide interest in learning analytics, there are currently few institution-wide deployments at scale (Dawson et al., 2018; Ferguson et al., 2014). The deficit of whole-of-institution implementations continues to deny the sector a comprehensive understanding of the complexity of issues that mediate systemic uptake of learning analytics across an enterprise (Dawson, Mirriahi, & Gasevic, 2015). The same deficit applies to the theories and methodological approaches required for learning analytics implementation in real-world environments. Knowing what works, or otherwise, and why, provides potentially valuable generalisations or abstractions that can inform future learning analytics implementations. A team at a regional Australian university has been researching and experimenting with learning analytics for over 10 years and has developed an institution-wide learning analytics system. For deidentification purposes, the system that was developed will be called System X throughout this study. System X was developed by the team during 2014, has been used in 63% of the university’s offerings, and has facilitated communications with almost 90% of the university’s higher education students. While System X is a rare example of an institution-wide learning analytics implementation, its life beyond implementation has been beset with organisation-related challenges. Reflecting upon the design, development and operation phases of a learning analytics implementation like System X can provide valuable insights, which can contribute to a theory of implementation (Marabelli & Galliers, 2017; Sanders & George, 2017). This is especially important for learning analytics where successful, institution-wide implementations are currently rare.

Learning analytics is a field of research and practice that is relatively new and still evolving (Colvin, Dawson, Wade, & Gasevic, 2017). While there are many examples of theory-informed, empirical research around learning analytics, there is a shortfall of theoretical knowledge for how learning analytics can be operationalised in a given situation (Wise & Shaffer, 2015). Theory provides learning analytics practitioners with guidance on the variables to include in their models, how to interpret their results, how to make the results actionable and how to evaluate their work (Wise & Shaffer, 2015). The current shortfall of theory in the learning analytics literature around design and action precludes the broad recipes and principles that can help solve problems in specific contexts (Colvin et al., 2015). Theories for design and action are needed to provide “explicit prescriptions for how to design and develop an artefact, whether it is a technological product or a managerial intervention” (Gregor & Jones, 2007, p. 313). In addition to the absence of theory around implementation, there is a proliferation of commercially available learning analytics tools that are marketed as institution-wide solutions to complex problems related to learning and teaching (Dawson et al., 2015). In a vacuum of theoretical knowledge that affords informed scepticism, organisations are naturally drawn to solutions marketed as learning analytics (Dawson et al., 2018).

The problem of learning analytics adoption at the organisational level needs to be considered in relation to how these organisations operate, how they conceptualise learning analytics, and the problems they are looking for learning analytics to solve. A recent study identified two classes of universities based on their approach to learning analytics adoption: those that followed an instrumental approach, and those that followed an emergent approach (Dawson et al., 2018). Instrumental approaches were identified by top-down leadership and large-scale projects with a heavy focus on technological considerations, and were associated with limited staff uptake (Dawson et al., 2018). Emergent approaches were identified by bottom-up, strongly consultative processes that proved highly resistant to being scaled to the organisational level (Dawson et al., 2018). While System X’s adoption was predominantly based on an emergent approach, it was successfully scaled to the institutional level within an organisation with a stated preference for instrumental approaches to technology adoption.

This paper aims to use the journey of System X’s development and operation to unpack the theories, methods and heuristics that contributed across its lifecycle to date. It is hoped that the insight into the organisational realities associated with a learning analytics implementation can help universities address the deficit of institution-wide implementations, and help bridge the growing divide between learning analytics research and practice (Colvin et al., 2017; Dawson et al., 2015). This paper develops a set of design principles that can help learning analytics practitioners within universities employ emergent approaches to learning analytics implementation that can scale to the institutional level. In essence, these principles form a nascent design theory (Gregor, 2006), a type of theory that can be used to guide implementation across a variety of contexts, and can contribute to our theoretical understanding of learning analytics implementation. This paper aims to answer the following research question: What theoretically derived design principles can help practitioners employ an emergent approach to learning analytics implementation?

Methodology
This paper aims to retrospectively analyse an example of the emergent development of learning analytics that scaled to the institutional level. It applies a methodological approach based on Action Design Research (ADR) to determine the theoretical elements that contributed to the design, development and operation of the IT artefact. ADR is a design research method that conceptualises the research process as containing the inseparable and inherently intertwined activities of building the information technology (IT) artefact, intervening in the organisation, and evaluating it concurrently (Sein, Henfridsson, Purao, Rossi, & Lindgren, 2011). ADR is not intended to solve problems as might a software engineer, but to generate design knowledge and reflections by building and evaluating IT artefacts in authentic organisational settings (Sein et al., 2011). ADR removes the sharp distinction between IT artefact development and its use by organisational stakeholders that is often assumed with design research and design thinking (Sein et al., 2011). ADR reflects the premise that IT artefacts are shaped by the organisational context during their development and operation (Sein et al., 2011). Organisation-specific structures such as hardware, software, processes and policies impact upon, and are subtly ensconced in, the development and operation of an IT artefact. ADR’s encapsulation of the IT artefact within a real-world organisational context establishes an obvious link with learning analytics, whereby a primary challenge is how it can be implemented across an organisation. ADR suggests that technological rigor often comes at the expense of organisational relevance and acknowledges that IT artefacts emerge from interactions within the organisational context (Sein et al., 2011).

ADR consists of four broad, non-linear stages. The impetus for the first stage, Problem Formulation, is an identified problem perceived in practice by the researchers that represents a research opportunity based on existing theories or technologies. The problem is viewed as a knowledge creation opportunity at the intersection of the technological and organisational domains. The second stage of ADR is the building, intervention and evaluation (BIE) stage, whereby the problem framing and theoretical elements from stage one provide a platform for generating an IT artefact within an organisational context. The IT artefact is iteratively and concurrently built and evaluated under the mutual influence of the developing artefact and the real-world organisational context, generating reflections and learnings. The third stage of ADR parallels the first two stages and moves conceptually from building a solution in a specific context, to applying what has been learned to the broader class of problems. Concentrated effort is directed at what emerges from the evaluation and research processes, ensuring that contributions to knowledge are identified. ADR refers to this process as guided emergence, whereby the external, intentional intervention is brought together with the organic evolution that results from real-world operation (Sein et al., 2011). The fourth and final stage of ADR is the formalised learning that develops from the research process and can be represented by generalised outcomes or principles (Sein et al., 2011).

The intention of the ADR process is not necessarily to solve the problem in its entirety, but to generate knowledge that can be applied to the broader range of problems that the specific problem exemplifies. For this study, the broader class of problem relates to how an emergent approach can be taken with learning analytics implementation while still scaling to the organisational level. This study uses ADR to retrospectively analyse System X. System X’s journey has been divided into three chronological sections: explorations, formal development, and operation. Each of these three sections is described according to ADR in terms of its problem formulation, BIE, and reflections. Project plans, designer reflections, designer blog posts, project logs, emails and other empirical data sources are drawn upon to inform these sections. Following these three sections, this study reflects on the theoretical elements that emerge. These theoretical elements and reflections inform the main contribution of this study, a set of theoretically aligned design principles for learning analytics implementation.

Exploration and serendipity: 2008 – 2014

Problem formulation
While System X was established as a formal university project that officially began in 2014, its genesis was in the five years prior. During this time, the designers were conducting research around learning analytics, exploring patterns found in institutional datasets, and were exploring how these patterns could help understand and respond to learning and teaching related problems (Beer, 2008). How these patterns and data could help with student attrition and student engagement were specific problems the designers were investigating at the time, given a sector wide trend of increasing online enrolments and its negative impact on student retention (Beer, 2010b; Beer, Clark, & Jones, 2010). The designers were part of the central learning and teaching support unit who were tasked with supporting teaching staff with their learning and teaching. This provided the designers with broad perspectives on the problem based on their daily interactions with the teachers. Their perspectives were further informed by their technical knowledge, their experience with local information systems, and their knowledge of institutional policies and processes.

BIE
The System X designers conducted a number of investigations between 2008 and 2014, investigations that included the development of experimental IT artefacts that were applied in real-world contexts (Beer, 2009b, 2009c, 2010a). These investigations were often centered upon patterns found within the data, patterns that required further exploration to determine their usefulness and how they could be applied. A simple and early example was the development of a small artefact that retrospectively showed teachers how online students interacted with their unit sites compared with the grades their students received (Beer, 2009a). In each case the designers worked with teaching staff using informal cycles of evaluation centered upon the interventions. Together with the teachers, the designers were learning what worked, what did not work, and gathering knowledge and experience about why. Variations in student behaviours, the diversity of pedagogical contexts, the different mental models of the teachers, along with large variations in teachers’ technical and teaching experience, made it difficult for the designers to distil which data would be useful across all contexts. As the BIE cycles progressed, a common theme emerged from the feedback from academic staff. They wanted simple indicators of student activity in their teaching contexts, and the ability to monitor student activity to determine if the actions the teachers were taking during the term were impacting upon student activity and results (Beer, 2009d). In addition, teachers wanted the data to help them with a range of questions they had about their students depending on their teaching context. These questions included: which students have accessed the LMS site and when; how often are they accessing the LMS site; which students have failed this unit previously; what is their GPA; and so on.
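
The paper does not reproduce that early artefact, but the kind of retrospective comparison it describes is straightforward: group click counts by the final grade each student received and compare the averages. The sketch below is a hypothetical illustration only; the records and grade labels are invented.

```python
# Hypothetical sketch of the retrospective comparison described above:
# average LMS clicks per student, grouped by the final grade received.
# The records and grade labels below are invented for illustration.
from collections import defaultdict

records = [
    {"student": "s001", "grade": "HD", "clicks": 940},
    {"student": "s002", "grade": "P",  "clicks": 310},
    {"student": "s003", "grade": "F",  "clicks": 45},
    {"student": "s004", "grade": "HD", "clicks": 870},
    {"student": "s005", "grade": "F",  "clicks": 120},
]

clicks_by_grade = defaultdict(list)
for r in records:
    clicks_by_grade[r["grade"]].append(r["clicks"])

for grade in ("HD", "D", "C", "P", "F"):
    if clicks_by_grade[grade]:
        mean = sum(clicks_by_grade[grade]) / len(clicks_by_grade[grade])
        print(f"{grade}: {mean:.0f} clicks on average")
```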

During this period, the team’s supervisor, a Pro Vice Chancellor (PVC), was coordinating a large undergraduate science unit and asked if the learning analytics activities the designers were engaged with could help with the unit’s relatively high failure rate. Using the learnings from the previous BIE cycles, the designers provided the unit coordinator with weekly lists of all students in the unit, arranged by an algorithmically developed estimate of success (EOS). The EOS was a simple algorithm that combined each student’s academic history (GPA, prior fails, grades received, withdraw fails) with their current level of Moodle activity as indicated by clicks, and was modeled using previous offerings of this and other units. The unit coordinator used this list to proactively contact students who were showing as being at risk of failing. This process, along with other instructional design changes, contributed to a 7% reduction in the failure rate of the unit over the next two offerings, along with a small rise in student satisfaction survey results. The unit coordinator’s direct contextual experience and their influential position within the university led to the establishment of a formal, centrally-funded project to develop the concept into a system that could be used by all unit coordinators.
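
The EOS is described only at a high level, so the following is purely an illustrative sketch of the shape such a score might take: academic history combined with current activity, with students sorted so that those most at risk appear first. The field names, scales and weights are invented; the actual EOS was modelled from previous offerings rather than hand-weighted like this.

```python
# Hypothetical "estimate of success" (EOS) style score. Field names, the
# assumed 7-point GPA scale and the weights are invented for illustration.
from dataclasses import dataclass

@dataclass
class StudentRecord:
    student_id: str
    gpa: float            # assumed 0.0 - 7.0 scale
    prior_fails: int
    withdraw_fails: int
    clicks_this_week: int

def estimate_of_success(s: StudentRecord, cohort_median_clicks: int) -> float:
    """Return a rough 0-1 score; higher means more likely to succeed."""
    history = (s.gpa / 7.0) - 0.1 * s.prior_fails - 0.15 * s.withdraw_fails
    activity = min(s.clicks_this_week / max(cohort_median_clicks, 1), 1.0)
    return max(0.0, min(1.0, 0.6 * history + 0.4 * activity))

students = [
    StudentRecord("s001", gpa=5.5, prior_fails=0, withdraw_fails=0, clicks_this_week=120),
    StudentRecord("s002", gpa=3.1, prior_fails=2, withdraw_fails=1, clicks_this_week=4),
]
median_clicks = 80
# Sort so that the students most at risk appear first, as in the weekly lists.
for s in sorted(students, key=lambda s: estimate_of_success(s, median_clicks)):
    print(s.student_id, round(estimate_of_success(s, median_clicks), 2))
```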

Reflections and learnings
The organisational position of the designers afforded a perspective that was situated between university-wide learning and teaching policies, processes and management, and the learning and teaching coalface. Their roles, their history with technology at the institution, their interest in learning analytics, who they worked for and no small degree of serendipity associated with legacy access to institutional datasets, meant that the designers had access to data, and had the technical skills to manipulate it and represent it in different ways as the situation required. The trial-and-error cycles of development, and the close proximity the designers had with the teachers, led to an understanding of what data could help with what problem, and how it might help. The multiple perspectives of the designers allowed them to consider the problem in a different manner than would be afforded by a single perspective, such as a software developer for example. The circumstance whereby the team’s supervisor held an influential position with the university, and was also teaching a unit with a specific problem, was happenstance, yet pivotal in the establishment of a formal university project to expand on the initial concept.

The rise of the feral system: formal development – 2014

Problem formulation
At the prompting of the PVC, a formal university project began with the associated funding being allocated at the end of 2013. A project initiation document was prepared by the designers where the stated purpose of the overall project was to “help address student attrition by strengthening and focusing the interactions between academics and students” (Reed, Fleming, Beer, & Clark, 2013, p. 3). The broader project had multiple sub-components, one of which was System X, the learning analytics focused project. System X was aimed at helping with “the early identification of students who may be at-risk along with more effective targeting of student support for such students” (Reed et al., 2013, p. 6). A further requirement was added by the PVC, who specified that the system needed to be very easy to use with little to no training or guidance required. While earlier explorations provided the designers with some understanding of what was required at the unit level, moving beyond a small-scale intervention to an institution-wide IT artefact required a more formal approach due to the required investment in infrastructure, integration with other university systems, and consultation with other university departments. As such, the practical problems faced by the designers during this time were how to scale System X from a handful of units to a university-wide system, and how System X could be integrated with established university systems. These are problems represented in the wider learning analytics literature, whereby the transition from small, local learning analytics experimentation to institution-wide implementation is known to be difficult (Ferguson et al., 2014).

BIE
The formal development of System X began with the allocation of funding and the fulltime secondment of the three designers. The team of designers consisted of two teaching academics and a graphic designer, who all had web development skills. System X was unusual in the context of information systems procurement, in that it was developed in-house and outside of the information technology (IT) department. The conceptual shift from a small-scale experiment to the institutional scale, while superficially a technical exercise due to the learnings already developed from previous BIE cycles, required cycles of iteration beyond just technical iterations. For example, as staff used and became more experienced with System X, the designers noticed that the feedback staff were providing changed from functionality-related commentary to requests for additional features. As an example of this, an iteration of System X released early in 2014 provided unit coordinators with a mail-merge feature that allowed them to send personalised emails to groups of students. Feedback from unit coordinators suggested that an indication of changes in student behaviour after the email would be useful in terms of assessing the need for a follow up. This indicator was added to the Moodle weekly activity timeline in System X and meant that they could quickly identify changes in student behaviour after the email “nudge”. The gradual shift in teacher feedback as they used System X was found to align with previous research that showed staff usage of education systems in general became more refined as they gained experience with the system (Malikowski, Thompson, & Theis, 2006). The iterative approach taken by the designers catered for the reciprocal evolution of both the technology and the human users of the information provided.
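
The indicator described above is essentially a before/after comparison of student activity around the date of a nudge. A minimal sketch of such a comparison might look like the following; the data structure and the seven-day windows are assumptions for illustration, not a description of how System X actually computed it.

```python
# Minimal sketch: compare a student's average daily clicks in the week before
# a nudge with the week after it, as a quick indication of whether the nudge
# coincided with a change in behaviour. The data structure and seven-day
# windows are assumptions for illustration only.
from datetime import date, timedelta

def activity_change(daily_clicks, nudge_day, window=7):
    """daily_clicks: dict mapping date -> click count for one student."""
    before = [daily_clicks.get(nudge_day - timedelta(days=d), 0) for d in range(1, window + 1)]
    after = [daily_clicks.get(nudge_day + timedelta(days=d), 0) for d in range(1, window + 1)]
    return sum(after) / window - sum(before) / window

clicks = {date(2014, 5, 1) + timedelta(days=d): c
          for d, c in enumerate([2, 0, 1, 0, 0, 0, 0, 4, 6, 3, 5, 2, 7, 4])}
nudge = date(2014, 5, 8)
print(f"Average daily clicks changed by {activity_change(clicks, nudge):+.1f} after the nudge")
```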

Formal and informal consultations, and conversations with schools and discipline teams, contributed to System X’s evolution. Over 20 open discussions were conducted with various departments of the university during the first half of 2014. The ideas presented in these forums were collected by the designers with the intention of incorporating as many as possible in the available time. This was reflected in System X’s underlying modular design, which anticipated frequent change. However, the governance requirements associated with a funded institution-wide project often conflicted with the approach the designers employed. Detailed plans that included specific feature release dates and other detailed outcomes were required upfront, prior to the building process. In the case of System X, the project management framework used by the institution discouraged an evolutionary approach where the next stage of the project was uncertain, dependent on cycles of feedback from the teachers rather than predetermination. For a concept like learning analytics that was very new at that time, the absence of institution-wide implementation examples or recipes required an evolutionary and learning approach. From a methodological perspective, the design phase of System X differed from the exploration phase in that much of the effort became directed to the IT artefact, to achieve the required scale while still supporting adaptation. However, it could be argued that the approach was more focused on the educational intervention than the IT artefact, as the end-users were still integral to the ongoing design process, and there is evidence of reciprocal shaping whereby the end-users and the IT artefact continued to be shaped by each other.

Reflections and learning
The iterative approach taken with the development of System X supported the emergent development of the intervention, where the intervention consisted of both the IT artefact and the end users. The approach allowed the IT artefact and the end users to coevolve as the design was implemented into a complex organisational context. So while the overall project superficially conformed to the mandated top-down, plan-driven approach, the underlying development process was conducted with change and evolution in mind. Adopting a modular design from the outset afforded the ability for the IT artefact to change based on user feedback. The iterative and evolutionary approach also contributed to the problem of scale. Aside from the addition of several hundred units with variables that included pedagogical contexts, student cohorts and teachers, the technical design needed to be flexible enough to enable frequent change. The technical components required to support a flexible, iterative approach often conflicted with traditional enterprise implementation norms. So while the consultative approach and modular design allowed the IT artefact to adapt with the teachers as they became more experienced with the system, the design differed from IT procurement and architecture norms.

Maintenance and operation: 2015 – 2019

Problem formulation
The formal System X development project finished at the end of 2014 with the designers returning to their substantive positions in the central learning and teaching support unit, signaling the end of the development phase and the start of its maintenance phase. While the team’s supervisor had informally indicated System X would continue to be maintained by the designers moving forward, the supervisor’s retirement created a new set of unanticipated problems. The idea of an institutional IT system operating outside of the central IT department was unconventional, and associated with a myopic assumption of risk. How System X could continue to operate, never mind evolve, without senior-level advocacy, in an increasingly lean and homogeneous IT environment, became the core problem associated with this phase. This is a problem that links with a broader problem noted in the research literature, associated with issues that arise with systems that are developed outside of central IT departments, systems that are often referred to as shadow systems or feral systems (Behrens & Sedera, 2004; Spierings, Kerr, & Houghton, 2014, 2016).

BIE
While System X was unable to secure funding or workload allocation for maintenance, the designers continued to keep the system operating in its current form in addition to their normal duties. This included a number of non-trivial adaptations that were required to cater for upstream changes that impacted upon System X’s data ingestion processes. However, from an ADR perspective, the BIE cycle was severely constrained with the lack of allocated resources, a situation that contrasted with the design intent. Despite this, System X had developed into a tool that continued to prove useful for many teaching staff. At the time of writing System X has been used to view 63% (4,924) of the university’s higher education offerings while 39% (3,016) of these offerings used the personalised email (nudge) facility. These nudges were delivered to 89.7% (49,089) of the university’s higher education students over this period and usage continues to grow. For example, there were 108,523 nudges delivered in 2014 by 231 teaching staff whereas across 2018, 315,192 nudges were delivered by 429 teachers. A sentiment analysis of 1,208,762 nudge texts has shown that 61% of the nudges were worded positively, 30% used neutral language and 9% were deemed to be negatively worded. This aligns with the design intent whereby System X was not developed as a reliable predictive instrument, but as a tool of positive communication between teachers and students, with a focus on the students most at risk. While the usage of System X continues to grow, its long-term sustainability remains uncertain due to negative perceptions associated with its unorthodox implementation approach.
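
The paper does not say which sentiment analysis method was applied to the nudge texts, so the following is only a toy illustration of the general idea using a tiny lexicon-based classifier; the word lists and example messages are invented.

```python
# Illustrative lexicon-based sentiment classification of nudge texts.
# The actual method used is not described in the paper; the word lists and
# example messages here are invented to show the general idea only.
POSITIVE = {"great", "well", "congratulations", "good", "keep", "pleased"}
NEGATIVE = {"concerned", "behind", "risk", "fail", "missing", "worried"}

def classify(text):
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

nudges = [
    "Great start to the term, keep it up!",
    "I am concerned that you have not logged in and may be falling behind",
    "Please check the updated assessment schedule on the unit site.",
]
for n in nudges:
    print(classify(n), "-", n)
```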

System X was unable to secure funding for maintenance beyond the end of the development phase. That it was an IT system outside of central IT control proved to be an irreconcilable hurdle when funding was being sought for basic maintenance. Exacerbating the problem was tension with the IT department, which was concerned about what it considered to be an enterprise system operating within the organisation and outside of its direct control. A senior member of the IT department was quoted as saying “we don’t want another repeat of…System X”, referring to the approach taken in System X’s development. Ironically, the development approach taken by the design team had previously been recognised with a national learning and teaching award (Australian Office for Learning and Teaching, 2016). The IT department’s targets and performance indicators generated a different ontological perspective of System X to that of the designers and the teachers; a perspective that encapsulates different assumptions about how such work gets done (Ellis & Goodyear, 2019). This demonstrates how an exclusive focus on IT considerations can be incompatible with learning approaches and problem solving in complex environments when an IT artefact is involved (Macfadyen & Dawson, 2012). While teacher usage of System X continues to grow, the lack of resourcing and investment has prevented its evolution beyond what was available at the end of the design phase, a situation that contrasts with the design intent, where a fundamental premise was adaptation and change.

Reflections and learning
Arguably, the approach the designers had taken with the development of System X approximated an emergent BIE cycle when framed with ADR. The primary focus was on solving a problem at the intersection of the users, their learning and teaching context, and the IT artefact. However, from the perspective of normal IT procurement and implementation, the primary consideration is on technical matters relating to the IT artefact (Macfadyen & Dawson, 2012). This created an organisational tension around the IT artefact, a tension that still exists at the time of writing. As learning, teaching and technology become further entwined and interdependent, a focus on the IT artefact, coupled with rigid approaches to implementation, can become divorced from the problem context and the context of the human users, a situation that inherently limits the ability of organisations to adapt to emerging and complex challenges. Imposing one-size-fits-all approaches on changing and increasingly complex learning and teaching contexts makes the exploration and implementation of hybrid concepts like learning analytics exceedingly difficult. Learning what works, why and how with learning analytics is unlikely to emerge from rigid, plan-based approaches to implementation. However, the notion that a well-used and reliable system can be considered feral is represented in the research literature, and was a critical and unanticipated oversight by the design team.

Formalisation of learning
The following section reflects on the design, development and operational phases of System X from a theoretical perspective. It attempts to distil formal learnings from the System X phases into theoretical elements that can be translated into design principles that support emergent approaches to learning analytics implementation.

Bottom-up, middle-out and meso-level practitioners
System X was designed and developed by academic staff whose roles involved the provision of learning and teaching support to teachers, as well as contributing to policy, processes and learning and teaching systems. The teachers were embedded in the process of System X’s development, which helped ensure the process was grounded in the teachers’ lived experience (Beer & Jones, 2014). The designers were able to help the teachers adapt to the new technology, and could also adapt the technology based on its real-world use by the teachers. The designers’ roles and position between the top-down and the bottom-up allowed them to balance the requirements of the end-users (teachers) and the socio-material requirements of the organisation. This aligns with a theoretical construct known as meso-level practitioners (MLP). Much of the work of implementing learning and teaching related innovation happens at the meso-level, which sits between small-scale local interactions and large-scale policy and institutional processes (Hannon, 2013). MLP are well situated to mediate the tension between learning and teaching practice and the ambiguities associated with real-world technology change (Hannon, 2013). For learning analytics implementation, MLP is a theoretical perspective that can help balance the tension between the top-down and the bottom-up, the instrumental and the emergent. However, the MLP concept requires a blurring of the sharp distinction between traditional organisational boundaries.

Shadow systems
The problems that arose for System X as an institution-wide system relate to how it was developed outside of normal IT procurement processes and how it was perceived singularly as an IT system. The lack of funding for maintenance and the tension that developed with the IT department had detrimental impacts on the system. IT systems that are not under the control of an organisation’s IT department are often referred to as shadow systems (Spierings et al., 2014; Zimmermann, Rentrop, & Felden, 2014). These are systems that sit outside of the control of IT management and often develop as work-arounds for deficiencies in existing institutional systems (Spierings et al., 2014; Zimmermann et al., 2014). There are two perspectives on shadow systems in an enterprise environment: on one hand, they introduce innovation into an organisation and allow for flexibility in specific contexts; on the other hand, they increase heterogeneity and complexity (Spierings et al., 2014, 2016; Zimmermann et al., 2014). However, most contemporary universities follow a strategic approach to deciding what work gets done (Jones & Clark, 2014), and shadow systems are generally considered to be an undesirable phenomenon in these environments (Behrens & Sedera, 2004). While shadow systems are often viewed unfavourably, it has been argued that their presence is an indication of a gap between required business workflow and what the sanctioned systems are providing (Spierings et al., 2016). Managers or supervisors can often insulate shadow systems from the enterprise system proponents who seek to close or suppress them (Spierings et al., 2016). With the loss of a key advocate in a senior leadership position, System X was increasingly perceived as a shadow system, as it sat outside centralised IT management. Emergent approaches to learning analytics implementation require a shared organisational conceptualisation of the process as applied research, rather than a purely IT implementation process.

Complex adaptive systems (CAS)
The evolutionary approach taken in System X’s development acknowledged and supported the adaptation of the agents involved. The cyclical approach facilitated the reciprocal shaping of the IT artefact and the teachers within an organisational context. Plan-based approaches assume that there is sufficient knowledge about how to integrate the technology so that a recipe-based approach can be applied (Hannon, 2013). In the case of an emerging field like learning analytics, and in the absence of a critical mass of successful examples, there are currently no recipes to support a deterministic approach. Further to this, top-down and mechanical approaches assume that the system and its agents are stable and unchanging, an assumption that is fundamentally flawed in systems that involve humans (Beer, Jones, & Clark, 2012; Snowden & Boone, 2007). Reconceptualising learning and teaching as a complex system (Beer et al., 2012) or, more recently, applying principles of complexity leadership theory (Siemens, Dawson, & Eshleman, 2018), have been presented as theoretical foundations that can guide non-deterministic and emergent learning analytics implementation. Complexity contends that systems involving agents, such as humans, are constantly changing, often disproportionately, through the non-linear interactions between agents (Boustani et al., 2010; deMattos, Ribeiro Soriano, Miller, & Park, 2012; Plsek & Greenhalgh, 2001). For learning analytics implementation, complex adaptive systems (CAS) theory provides a conceptualisation of the system that describes the interdependency and mutability of the agents and actors operating within the system (Beer et al., 2012; Beer & Lawson, 2016; Dawson et al., 2018). The application of a CAS lens to learning analytics implementation provides a theoretical base for an emergent approach in a socio-technical system that is complexly and unpredictably entangled with other socio-technical systems.

Situation Awareness (SA)
The System X designers struggled to distil the types of data that the teachers required from the incredible volume of data available. Like most organisations, universities are collecting vast volumes of diverse data from their operations. The humans in these environments are exposed to increasing volumes of data, which has created a gap between the volume of data being produced and the data that the human needs to achieve their goals (Endsley, 2001). Situation awareness (SA) is a theory that helps to define the data that the human operator needs at a particular time (Endsley, 1995). Visibility over the elements interacting within their environment is crucial for decision-making, particularly as the complexity of our operating environments increases (Endsley, 1995). In essence, situation awareness is the operator’s internal model of the current state of their environment (Endsley, 2001). In the case of System X, although the designers had access to vast quantities of diverse data that could be provided to the teachers, situation awareness theory suggested limiting the data to key metrics related to the teachers’ tasks, and giving teachers the ability to filter the available information. It is vitally important that learning analytics be tethered to the learning design, and consequently the goals and expectations of the users (Wise, 2014). For learning analytics implementation in data-rich university environments, situation awareness provides a theoretical framework and design principles for filtering, focusing and centering the information on the goals of the human operators (Endsley, 2016).

Sensemaking (SM)
System X included the ability for the teachers to take action based on the information they were provided, along with an indicator that could help assess the subsequent impact of the action. Taking action and monitoring for a resultant change is a key property of a theoretical construct known as sensemaking. Sensemaking is the interplay of action and interpretation (Weick, Sutcliffe, & Obstfeld, 2005). The most basic question in sensemaking is “what’s going on here?”, closely followed by “what do I do next?” (Weick et al., 2005). Sensemaking is about the continual redrafting of an emerging story through cycles of action and interpretation (Weick et al., 2005). In other words, it is not enough to provide teachers with just situation awareness; it needs to be coupled with the ability for them to take action (Jones, Beer, & Clark, 2013). The ability for teachers to take action based on learning analytics data can often be overlooked in the face of increasingly sophisticated and attractive data analysis and visualisation tools. However, sensemaking is a critical diagnostic process that allows us to develop plausible interpretations when faced with ambiguous cues (Weick, 2012). These plausible interpretations are coupled with actions whose results further refine our understanding in a cyclical process. As learning environments and student lives become more complex and busy, our ability to make sense of situations based on what can only ever be fragmented information is becoming increasingly important. This would suggest that the provision of near real-time information that augments the human operators, coupled with the ability to take action, is a more appropriate starting point than detailed statistical analysis and sophisticated predictive modeling (Liu, Bartimote-Aufflick, Pardo, & Bridgeman, 2017). For learning analytics implementation, sensemaking provides a theoretical framework for the taking of action based on near real-time and incomplete information.

Design Principles
Design principles are intended to be reusable, evidence-based heuristics that can inform future development and implementation decisions across contexts (Herrington, McKenney, Reeves, & Oliver, 2007). However, learning analytics is a diverse field and its purpose will be driven by the context in which it is applied. Variations in pedagogical intent, learning design, available organisational technology, staff capability and capacity will influence the design, development and operation of the learning analytics systems (Wise, 2014). Similar to the learning analytics concept, design principles for learning analytics implementation cannot be everything for everyone. Instead, this study is proposing a set of initial design principles for meso-level practitioners engaged in learning analytics implementation, where the design principles are empirically derived and theory-informed.

The following principles are intended as an initial starting point for meso-level practitioners engaging in learning analytics implementation in higher education and represent the embarkation point of a longer journey. The concept of meso-level practitioners and the following design principles can potentially help bridge the divide between the polarised approaches that learning analytics implementations tend to take. These design principles offer a model of compromise that may help bridge these seemingly incompatible approaches to learning analytics implementation. Applying these principles in real-world contexts will refute or refine the principles and determine their applicability across multiple contexts. The application of these principles will also provide guidance on another under-theorised area of learning analytics: how to design the actual learning analytics artefact.

Design principles
Balance top-down and bottom up – Learning analytics implementation requires an emergent approach that balances top-down and bottom-up requirements and considerations. Balancing the ambiguities of the teachers’ lived experience with the organisational requirement for homogenization is the role of the meso-level practitioner.

Balance the socio-technical – Meaningful learning analytics requires equitable and contextual consideration of both the users and the technology. Recognise that effective learning analytics results from the complex interplay between humans, technology and context.

Consider learning analytics implementation to be applied research – Learning analytics implementation is a process of discovering what works, or otherwise, and why, in specific contexts. The objective is not to build an IT system, but to iteratively and methodically develop knowledge about what information the users require.

Allow for emergence – Outcomes from the learning analytics process emerge from complex interactions between humans, technology and information. Design for, expect and facilitate change.

Apply informed skepticism – Detailed plans, deterministic approaches and assumptions of certainty are incompatible with the complexities of learning analytics implementation.

Centre the learning analytics information around tasks and goals – Filter information to just what is needed to support specific tasks and goals in specific contexts. Resist the urge to provide additional information just because you can, or it is easy to do.

Link learning analytics with action – Provide affordances for action. Information and action are inseparable: understanding in complex environments results from combining information with the taking of action (a minimal sketch of this principle, and of centring information on tasks, follows this list).

Apply purpose-specific, theory-informed evidence-based practice – The purpose of specific learning analytics implementations will have related theoretical underpinnings that can help inform the implementation. Theory provides the implementation with guidance on a range of important functions including variable selection, model selection, data selection, result discrimination, result interpretation, actionable results and generalisability of results (Wise & Shaffer, 2015).
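
As a concrete illustration of two of the principles above (centring information on a task and linking it with action), the following is a minimal sketch only, in Python, using an invented access log, resource names and message wording; it is not drawn from System X. It filters activity data down to a single task-relevant question, who has not yet accessed this week’s key resource, and pairs the answer directly with a reviewable action.

```python
# A minimal sketch of "centre information around tasks" and "link learning
# analytics with action": filter activity data down to one task-relevant
# question and pair the answer with a concrete, reviewable action. The access
# log format, resource names and message wording are hypothetical.

from datetime import date

# Made-up sample data: (student_id, resource, date accessed).
access_log = [
    ("s101", "week-5-video", date(2020, 4, 6)),
    ("s102", "week-5-video", date(2020, 4, 7)),
    ("s101", "week-5-reading", date(2020, 4, 8)),
]
enrolled = ["s101", "s102", "s103"]


def not_yet_accessed(resource):
    """Answer one specific question rather than showing everything we could."""
    accessed = {student for student, res, _ in access_log if res == resource}
    return [s for s in enrolled if s not in accessed]


def draft_reminder(student_id, resource):
    """Pair the filtered information with an affordance for action."""
    return (f"To {student_id}: the '{resource}' resource underpins this week's "
            f"task - it would be worth a look before Friday's session.")


for student in not_yet_accessed("week-5-video"):
    print(draft_reminder(student, "week-5-video"))
```

Even a sketch this small keeps the teacher in the loop: the information is scoped to a single teaching task, and the suggested action is something the teacher reviews and sends rather than an automated intervention.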

Conclusion
The experience provided by System X comes at an opportune time for Higher Education, with many universities looking to learning analytics to help solve complex problems. Learning analytics is a relatively new concept in Australian Higher Education, and the reality of real-world implementation is proving to be difficult and complex. In many, if not most, of these cases, overly simplistic conceptualisations and mechanical approaches to implementation will limit the potential benefits in the longer term. Reconceptualising learning analytics implementation as cross-institutional applied research can help bridge the growing divide between learning analytics research and real-world practice, and lead to meaningful learning analytics implementations. Reframing our ontological conceptualisation of learning analytics implementation, from the design of an IT product to the co-design of a service that integrates IT and people, is a vast and under-acknowledged challenge that has been recognised more broadly (Ellis & Goodyear, 2019). The design principles developed by this study provide an initial starting point that can help universities develop more meaningful learning analytics implementations through emergent development approaches.

References
Australian Office for Learning and Teaching. (2016). Citation for outstanding contribution to student learning: The educational technology team. Government of Australia. Retrieved from https://docs.education.gov.au/node/41811

Beer, C. (2008). Is a data mine a gold mine? Retrieved from https://beerc.wordpress.com/2008/12/08/is-a-data-mine-a-gold-mine/

Beer, C. (2009a). LMS discussion forums and fully online students. Retrieved from https://beerc.wordpress.com/2009/11/14/lms-discussion-forums-and-fully-online-students/

Beer, C. (2009b). Quick Indicators Update. Retrieved from https://beerc.wordpress.com/2009/04/22/quick-indicators-update/

Beer, C. (2009c). Quick Indicators Update [Presentation to CQU on learning analytics and the indicators project]. Retrieved from https://beerc.wordpress.com/2009/07/28/quick-indicators-update-2/

Beer, C. (2009d). What is learner engagement? Retrieved from https://beerc.wordpress.com/2009/10/22/what-is-learner-engagement/

Beer, C. (2010a). Moodlemoot: Enabling comparisons of LMS usage across institutions, platforms and time. Paper presented at the Moodlemoot Australia 2010, Melbourne.

Beer, C. (2010b). Using the Indicators project data to identify at risk students. Retrieved from https://beerc.wordpress.com/2010/05/14/using-the-indicators-project-data-to-identify-at-risk-students/

Beer, C., Clark, K., & Jones, D. T. (2010). Indicators of engagement. Paper presented at the Australasian Society for Computers in Learning in Tertiary Education, Sydney. Conference publication retrieved from http://www.ascilite.org.au/conferences/sydney10/procs/Beer-full.pdf

Beer, C., & Jones, D. T. (2014). Three paths for learning analytics and beyond: Moving from rhetoric to reality. Paper presented at the Australasian Society for Computers in Learning in Tertiary Education: Rhetoric and Reality, Dunedin, New Zealand. Conference Publication retrieved from http://ascilite2014.otago.ac.nz/files/fullpapers/185-Beer.pdf

Beer, C., Jones, D. T., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. Paper presented at the Australasian Society for Computers in Learning in Tertiary Education: Future Challenges, Sustainable Futures, Wellington, New Zealand. Conference Publication retrieved from http://www.ascilite.org/conferences/Wellington12/2012/images/custom/beer%2ccolin_-_analytics_and.pdf

Beer, C., & Lawson, C. (2016). The problem of student attrition in higher education: An alternative perspective. Journal of Further and Higher Education, 1-12. doi:10.1080/0309877X.2016.1177171

Behrens, S., & Sedera, W. (2004). Why do shadow systems exist after an ERP implementation? Lessons from a case study. PACIS 2004 Proceedings, 136.

Boustani, M. A., Munger, S., Gulati, R., Vogel, M., Beck, R. A., & Callahan, C. M. (2010). Selecting a change and evaluating its impact on the performance of a complex adaptive health care delivery system. Clinical Interventions In Aging, 5, 141-148.

Colvin, C., Dawson, S., Wade, A., & Gasevic, D. (2017). Addressing the Challenges of Institutional Adoption. In C. Lang, G. Siemens, A. Wise, & D. Gasevic (Eds.), Handbook of Learning Analytics (Vol. 1, pp. 281 – 289). Australia: Society for Learning Analytics Research.

Colvin, C., Rogers, T., Wade, A., Dawson, S., Gašević, D., Buckingham Shum, S., . . . Corrin, L. (2015). Student retention and learning analytics: A snapshot of Australian practices and a framework for advancement. Canberra, ACT: Australian Government Office for Learning and Teaching.

Dawson, S., Mirriahi, N., & Gasevic, D. (2015). Importance of theory in learning analytics in formal and workplace settings. Journal of Learning Analytics, 2(2), 1-4.

Dawson, S., Poquet, O., Colvin, C., Rogers, T., Pardo, A., & Gasevic, D. (2018). Rethinking learning analytics adoption through complexity leadership theory. Paper presented at the Proceedings of the 8th International Conference on Learning Analytics and Knowledge.

deMattos, P. C., Ribeiro Soriano, D., Miller, D. M., & Park, E. H. (2012). Decision making in trauma centers from the standpoint of complex adaptive systems. Management Decision, 50, 1549-1569. doi:10.1108/00251741211266688

Ellis, R. A., & Goodyear, P. (2019). The education ecology of universities: Integrating learning, strategy and the academy.

Endsley, M. R. (1995). Toward a theory of situation awareness in dynamic systems. Human Factors, 37(1), 32-64. doi:10.1518/001872095779049543

Endsley, M. R. (2001). Designing for situation awareness in complex systems. Paper presented at the Proceedings of the Second International Workshop on symbiosis of humans, artifacts and environment.

Endsley, M. R. (2016). Designing for situation awareness: An approach to user-centered design. Boca Raton, FL: CRC press.

Ferguson, R., Clow, D., Macfadyen, L., Essa, A., Dawson, S., & Alexander, S. (2014). Setting learning analytics in context: overcoming the barriers to large-scale adoption. Paper presented at the Fourth International Conference on Learning Analytics And Knowledge, Indianapolis, USA. doi: 10.1145/2567574.2567592

Gregor, S. (2006). The nature of theory in information systems. MIS quarterly, 30(September, 2006), 611-642. doi:10.2307/25148742

Gregor, S., & Jones, D. (2007). The anatomy of a design theory. Journal of the Association for Information systems, 8(5), 312.

Hannon, J. (2013). Incommensurate practices: Sociomaterial entanglements of learning technology implementation. Journal of Computer Assisted Learning, 29, 168-178.

Herrington, J., McKenney, S., Reeves, T., & Oliver, R. (2007). Design-based research and doctoral students: Guidelines for preparing a dissertation proposal. Paper presented at the World Conference on Educational Multimedia, Hypermedia and Telecommunications 2007, Chesapeake, VA, USA.

Jones, D. T., Beer, C., & Clark, D. (2013). The IRAC framework: Locating the performance zone for learning analytics. Paper presented at the ASCILITE2013 Electric Dreams, Sydney. Conference publication retrieved from http://www.ascilite.org.au/conferences/sydney13/program/papers/Jones.pdf

Jones, D. T., & Clark, D. (2014). Breaking Bad to Bridge the Reality / Rhetoric Chasm. Paper presented at the ASCILITE2014 Rhetoric and Reality, Dunedin, New Zealand. Conference publication retrieved from http://ascilite2014.otago.ac.nz/

Liu, D. Y.-T., Bartimote-Aufflick, K., Pardo, A., & Bridgeman, A. J. (2017). Data-Driven Personalization of Student Learning Support in Higher Education. In A. Peña-Ayala (Ed.), Learning Analytics: Fundaments, Applications, and Trends: A View of the Current State of the Art to Enhance e-Learning (pp. 143-169). Cham: Springer International Publishing.

Macfadyen, L. P., & Dawson, S. (2012). Numbers Are Not Enough. Why e-Learning Analytics Failed to Inform an Institutional Strategic Plan. Journal of Educational Technology & Society, 15(3), 149-163.

Malikowski, S. R., Thompson, M. E., & Theis, J. G. (2006). External factors associated with adopting a CMS in resident college courses. Internet and Higher Education, 9(2006), 163-174. doi:10.1016/j.iheduc.2006.06.006

Marabelli, M., & Galliers, R. D. (2017). A reflection on information systems strategizing: the role of power and everyday practices. Information Systems Journal, 27(3), 347-366.

Plsek, P. E., & Greenhalgh, T. (2001). Complexity science: The challenge of complexity in health care. BMJ: British Medical Journal, 323, 625-628.

Reed, R., Fleming, J., Beer, C., & Clark, D. (2013). Project Initiation Document: Student Retention – Engagement and Intervention. Structural adjustment funding. CQUniversity.

Sanders, M., & George, A. (2017). Viewing the changing world of educational technology from a different perspective: Present realities, past lessons, and future possibilities. Education and Information Technologies, 22(6), 2915-2933. doi:10.1007/s10639-017-9604-3

Sein, M., Henfridsson, O., Purao, S., Rossi, M., & Lindgren, R. (2011). Action Design Research. MIS quarterly, 35(March 2011), 37 – 56.

Siemens, G., Dawson, S., & Eshleman, K. (2018). Complexity: A leader’s framework for understanding and managing change in higher education. Educause Review, 53(6), 27-42.

Snowden, D., & Boone, M. E. (2007). A Leader’s Framework for Decision Making. Harvard Business Review, 85(11), 68-76.

Spierings, A., Kerr, D., & Houghton, L. (2014). What drives the end user to build a feral information system? Feral Information Systems Development: Managerial Implications, 161-188.

Spierings, A., Kerr, D., & Houghton, L. (2016). Issues that support the creation of ICT workarounds: towards a theoretical understanding of feral information systems. Information Systems Journal, 27(6), 775-794. doi:10.1111/isj.12123

Weick, K. E. (2012). Making sense of the organization: Volume 2: The impermanent organization (Vol. 2). West Sussex, United Kingdom: John Wiley & Sons.

Weick, K. E., Sutcliffe, K. M., & Obstfeld, D. (2005). Organizing and the process of sensemaking. Organization Science, 16, 409-421. doi:10.1287/orsc.1050.0133

Wise, A. F. (2014). Designing pedagogical interventions to support student use of learning analytics. Paper presented at the Proceedings of the fourth international conference on learning analytics and knowledge.

Wise, A. F., & Shaffer, D. W. (2015). Why Theory Matters More than Ever in the Age of Big Data. Journal of Learning Analytics, 2(2), 5-13.

Zimmermann, S., Rentrop, C., & Felden, C. (2014). Managing shadow IT instances–a method to control autonomous IT solutions in the business departments. Paper presented at the Americas Conference on Information Systems (AMCIS).