Organisational silos – the challenge for learning analytics

Learning analytics research has identified two key challenges that limit the ability of universities to effectively embed learning analytics into their learning and teaching ecosystems (Dawson et al., 2018). These challenges are:
– The growing chasm between learning analytics research and organizational practice; and
– How to effectively develop and implement learning analytics across an organization

Many authors have contributed ideas for how these challenges might be resolved including:
– Alternative leadership models (Dawson et al., 2018; Hannon, 2013)
– Frameworks for understanding and actioning learning analytics implementation (Colvin et al., 2015; Drachsler & Greller, 2012; Wise & Vytasek, 2017)
– Theoretical contributions and the identification of specific learning analytics challenges (Beer, Jones, & Lawson, 2019; Beer, Jones, & Clark, 2012; Jones, Beer, & Clark, 2013; Selwyn, 2019)

Successful university learning analytics implementations require a complex integration of data, teaching and learning, and this integration is itself a significant challenge (Beer et al., 2019; Drachsler & Greller, 2012; Knight, Wise, & Chen, 2017; Wise & Jung, 2019). While some analytics can be used across multiple contexts, questions of interpretation, meaning making and action are inherently local and require intimate learning and teaching knowledge within specific contexts (Wise & Vytasek, 2017). Instrumental approaches to learning analytics, characterized by a technology focus and teleological processes, are associated with limited staff uptake of the learning analytics tools (Dawson et al., 2018). Emergent approaches, characterized by bottom-up and strongly consultative processes, are associated with issues of scalability and technical integration (Dawson et al., 2018).

Universities are corporate-like bureaucracies that are structured mechanistically with staff functionally grouped into manageable areas (Cilliers & Greyvenstein, 2012). These vertical divisions are often referred to as organizational silos (Cilliers & Greyvenstein, 2012; de Waal, Weaver, Day, & van der Heijden, 2019).

At best, silos offer a practical way for organizations to operate efficiently. At worst they create a silo mentality where departments do not want to exchange knowledge or information, hindering internal collaboration and organizational learning, thus preventing achievement of high performance and organizational sustainability

(de Waal et al., 2019)

These organizational silos represent a real challenge for learning analytics. Learning analytics has matured beyond a focus on developing tools and systems towards developing ways by which analytics can impact upon learning and teaching (Wise & Jung, 2019). This issue of impact is centred upon humans. Learning analytics forms part of a socio-technical system where human decision-making and action are considered as equally important as the technical components (Van Harmelen & Workman, 2012; Wise & Jung, 2019). However, our organizational structures have promoted a separation of roles and associated expertise, whereby learning and teaching expertise and technical expertise sit in different organizational silos. The following figure provides an oversimplified example of this separation of expertise.

Figure: the separation of knowledge and capability between silos

Further compounding the challenge illustrated above, these silos tend to operate very differently and apply different methodological approaches to their work. IT departments will typically follow a rigid, structured approach to development and system implementation that resembles the Waterfall Model (Dima & Maassen, 2018). From a concept or idea, requirements are gathered and documented, and an implementation plan is developed. An external developer or consultant is then engaged to enact the plan prior to user-acceptance testing, followed by the system entering official operation. On the other hand, learning and teaching environments are inherently complex social systems where the agents within the system coevolve, self-organise and learn (Beer et al., 2012; Mason, 2008a, 2008b). The interdependencies and mutability of the actors and agents within these complex systems preclude top-down and mechanical approaches to change, and instead require a process of organized inquiry to determine and understand the contributing variables, so as to uncover the path forward (Ellis & Goodyear, 2019).

Figure: the differing methodological approaches of IT and learning and teaching silos

So we have a concept (learning analytics) that requires the amalgamation of technology expertise and learning and teaching expertise. We also have organisational structures that, in some respects, encourage an unbalanced approach to learning analytics implementation. Implementations driven exclusively by learning and teaching will likely run into technical integration problems, scalability issues and a raft of internal political challenges (Beer, Jones, & Lawson, 2019). Implementations driven exclusively by IT departments will likely run into issues of staff and student uptake of the learning analytics tools. Adding to the problem, organisations will likely consider any post-implementation problem, not as a conceptualisation problem, but as a “training gap” that requires additional support resources and professional development sessions.

While I do not have the answers yet, part of my PhD is about how meso-level practitioners (Hannon, 2013) can navigate these murky waters in their own organisational context. I am exploring how the meso-level practitioner can promote a more balanced perspective of learning analytics, and what compromises meaningful learning analytics implementations will require given these organisational conceptions.

References

Beer, C., Jones, D., & Lawson, C. (2019). The challenge of learning analytics implementation: Lessons learned. Paper presented at the Personalised Learning. Diverse Goals. One Heart, Singapore.

Beer, C., Jones, D. T., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. Paper presented at the Australasian Society for Computers in Learning in Tertiary Education: Future Challenges, Sustainable Futures, Wellington, New Zealand. Conference Publication retrieved from http://www.ascilite.org/conferences/Wellington12/2012/images/custom/beer%2ccolin_-_analytics_and.pdf

Cilliers, F., & Greyvenstein, H. (2012). The impact of silo mentality on team identity: An organisational case study. SA Journal of Industrial Psychology, 38(2). doi:10.4102/sajip.v38i2.993

Colvin, C., Rogers, T., Wade, A., Dawson, S., Gašević, D., Buckingham Shum, S., . . . Corrin, L. (2015). Student retention and learning analytics: A snapshot of Australian practices and a framework for advancement. Canberra, ACT: Australian Government Office for Learning and Teaching.

Dawson, S., Poquet, O., Colvin, C., Rogers, T., Pardo, A., & Gasevic, D. (2018). Rethinking learning analytics adoption through complexity leadership theory. Paper presented at the Proceedings of the 8th International Conference on Learning Analytics and Knowledge.

de Waal, A., Weaver, M., Day, T., & van der Heijden, B. (2019). Silo-busting: overcoming the greatest threat to organizational performance. Sustainability, 11(23), 6860.

Dima, A. M., & Maassen, M. A. (2018). From Waterfall to Agile software: Development models in the IT sector, 2006 to 2018. Impacts on company management. Journal of International Studies, 11(2), 315-326.

Drachsler, H., & Greller, W. (2012). The pulse of learning analytics understandings and expectations from the stakeholders. Paper presented at the Proceedings of the 2nd international conference on learning analytics and knowledge.

Ellis, R. A., & Goodyear, P. (2019). The Education Ecology of Universities: Integrating Learning, Strategy and the Academy: Routledge.

Hannon, J. (2013). Incommensurate practices: Sociomaterial entanglements of learning technology implementation. Journal of Computer Assisted Learning, 29, 168-178.

Jones, D. T., Beer, C., & Clark, D. (2013). The IRAC framework: Locating the performance zone for learning analytics. Paper presented at the ASCILITE2013 Electric Dreams, Sydney. Conference publication retrieved from http://www.ascilite.org.au/conferences/sydney13/program/papers/Jones.pdf

Knight, S., Wise, A. F., & Chen, B. (2017). Time for change: Why learning analytics needs temporal analysis. Journal of Learning Analytics, 4(3), 7–17.

Mason, M. (2008a). Complexity theory and the philosophy of education. Educational Philosophy and Theory, 40, 15. doi:10.1111/j.1469-5812.2007.00412.x

Mason, M. (2008b). What Is Complexity Theory and What Are Its Implications for Educational Change? Educational Philosophy and Theory, 40(1), 35-49.

Selwyn, N. (2019). What’s the Problem with Learning Analytics? Journal of Learning Analytics, 6(3), 11–19.

Van Harmelen, M., & Workman, D. (2012). Analytics for learning and teaching. CETIS Analytics Series, 1(3), 1-40.

Wise, A. F., & Jung, Y. (2019). Teaching with analytics: Towards a situated model of instructional decision-making. Journal of Learning Analytics, 6(2), 53–69.

Wise, A. F., & Vytasek, J. (2017). Learning analytics implementation design. Handbook of learning analytics, 151-160.

Help! My students have disappeared

This post is for those colleagues who are perhaps teaching online for the first time. There are plenty of good resources available for how you can quickly transition from face-to-face to online teaching.

Have a look at Griffith’s video series or the principles of online teaching from a national award-winning online educator.

There are many quality resources available that talk about student engagement in these online environments. This is great advice and it is worth your time to search around for hints and tips that you can quickly enact with your own online class. However, I would like to draw your attention to a specific part of teaching online and student engagement that is easily overlooked when you are cognitively overloaded by current events.


“Student Sleeping in library and a book by Mandala” by hasnainyaseen6 is licensed under CC BY 2.0

With your face-to-face classes, you have the class in front of you and you intuitively know whether or not they are paying attention. Are they looking at you intently or staring off into space? Are they taking notes or doodling in their notebooks? In response to your ongoing sensing of their attentiveness, you adapt what you are doing and how you are doing it, right?

But…

In the online environment, your class becomes invisible to you. You can no longer see the glint in their eye or their vacant wish-I-was-somewhere-else expressions. If you are only using synchronous methods, such as video conferencing, chances are you can still sense their attentiveness much as you could before. But now that you have uploaded all of these files and videos that the students can visit in their own time, how do you tell if they are being accessed, and how do you know who has accessed them?


“vandevermooreandrea_99165_5786729_AVM 11” by KSRE Photo is licensed under CC BY 2.0

This is where it gets a little tricky. Many universities will have sophisticated systems for monitoring student engagement in their online environments, systems that will tell them who has been active and who hasn’t (search Google Scholar for Learning Analytics if you want to know more). Most schools and small institutions, however, will not have this level of sophistication. What you are able to do in this space will depend on the technology in use at your institution. Most learning management systems will allow you to closely track student accesses to your resources and will faithfully record every mouse click students make in these systems.

Important note: Mouse clicks and views of specific resources DO NOT equal student engagement. A click on a specific resource does not mean that it was processed or internalised. However, in the absence of anything else, student activity in online environments can serve as a useful indicator of student engagement when coupled with your professional judgement.

The following are just some rough thoughts on how you might monitor student engagement in your own online context. The list is not exhaustive or specific, but it may help you start thinking about how you monitor your student engagement in this scary new online world.

  • Find out from your IT folk how you can access logs of student logins into your course or unit.
  • Find out how to see which of your students are accessing your course resources (see the sketch after this list for one way to summarise such a log).
  • Link your specific resources to small formative tasks for your students. Give them a video or reading, then ask for a couple of sentences on what they thought about it or how it might relate to their learning.
  • Small formative quizzes are useful if you are using a Learning Management System like Blackboard or Moodle. Knowing who hasn’t attempted these quizzes is gold when it comes to figuring out which of your students need prompting.
  • Keep track of who is contributing and who is lurking in your discussion forums. If someone is neither lurking nor contributing, they might need prompting. Remember that lurking is not necessarily a bad thing.
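
For the technically inclined, here is a rough sketch of the kind of thing I mean: a few lines of Python that read a hypothetical CSV export of your course activity log and list the students who have gone quiet. The file name and column names (“student_id”, “timestamp”) are my assumptions; Moodle, Blackboard and friends label their exports differently, so you will need to adjust them to match what your system actually gives you.

    # A minimal sketch of turning an LMS activity log into a "who has gone quiet" list.
    # The CSV layout is an assumption: adjust the column names ("student_id",
    # "timestamp") and the file name to match whatever your LMS actually exports.
    import csv
    from collections import Counter
    from datetime import datetime, timedelta

    def flag_quiet_students(log_csv_path, roster, days=14):
        """Return the students in `roster` with no logged events in the last `days` days."""
        cutoff = datetime.now() - timedelta(days=days)
        recent_events = Counter()
        with open(log_csv_path, newline="") as log_file:
            for row in csv.DictReader(log_file):
                event_time = datetime.fromisoformat(row["timestamp"])  # e.g. "2020-03-20T10:15:00"
                if event_time >= cutoff:
                    recent_events[row["student_id"]] += 1
        return [student for student in roster if recent_events[student] == 0]

    # Example: students who have been invisible for a fortnight get a friendly check-in.
    quiet = flag_quiet_students("course_activity_log.csv", roster=["s0123456", "s0234567"])
    print(quiet)

Counting raw events like this is crude, which is exactly why the earlier caveat matters: treat the output as a prompt for your professional judgement, not a verdict.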

When it comes to student engagement in online environments, I remember hearing the following at some conference somewhere and it resonated with me:

Some students will pass no matter what you do. Some students will fail no matter what you do. The trick is identifying the students who will fail without your intervention.

Often with online learning environments, our systems and tools make it very easy to focus on engagement, and not necessarily disengagement. We need to be on the lookout for those invisible students who may need a little more help.

Learning analytics and the three-body problem

The three-body problem

In 1687, Sir Isaac Newton published his seminal work “Philosophiae Naturalis Principia Mathematica” in which he described the motion of celestial bodies (Newton, 1987). Newton’s theory of gravity provided a means for precisely characterising complex orbital motion by establishing that celestial bodies exert force on all other celestial bodies, where the force is inversely proportional to the square of the distance between the bodies (Newton, 1987). Newton’s work, particularly with Principia Mathematica, became a cornerstone of natural philosophy and remains one of the most important scientific works in human history (Smith, 2008). However, for mathematicians in particular, Newton’s work in Principia Mathematica created a long-standing problem.
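
In modern notation, the inverse-square relationship Newton established can be written as

    F = \frac{G m_1 m_2}{r^2}

where m_1 and m_2 are the masses of the two bodies, r is the distance between them, and G is the gravitational constant.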

The problem became known as the three-body problem and has been described as the most celebrated of all dynamical problems (Barrow-Green, 1997). The three-body problem can be simply stated:

Three particles move in space under their mutual gravitational attraction; given their initial conditions, determine their subsequent motion

(Barrow-Green, 1997).

The essence of the three-body problem is that even if we know the initial position and momentum of three bodies, we cannot write down a general closed-form solution for their subsequent motion (except in some highly specific and contrived situations). Between 1750 and 1900, over 800 publications related to the three-body problem were written by many of the distinguished mathematicians of that time (Barrow-Green, 1997), and no general, practically useful solution has emerged to this day.
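
To make this concrete, here is a rough sketch of my own (in Python, and very much a toy rather than anything from the cited works). It steps three bodies forward under the inverse-square law twice: once as given, and once with a single starting coordinate nudged by one part in a million. Stepping the system forward numerically is easy; the trouble is that the tiny nudge soon produces a noticeably different outcome.

    # A rough, illustrative sketch: integrate three bodies under Newtonian gravity
    # with a simple Euler step, then repeat with one starting coordinate nudged by
    # a tiny amount and compare where a body ends up. Masses, units and G are
    # arbitrary; the point is sensitivity, not astronomical accuracy.
    import math

    def simulate(positions, velocities, masses, dt=0.001, steps=10000, G=1.0):
        pos = [list(p) for p in positions]
        vel = [list(v) for v in velocities]
        for _ in range(steps):
            # Pairwise accelerations from F = G * m_i * m_j / r^2.
            acc = [[0.0, 0.0] for _ in masses]
            for i in range(len(masses)):
                for j in range(len(masses)):
                    if i == j:
                        continue
                    dx = pos[j][0] - pos[i][0]
                    dy = pos[j][1] - pos[i][1]
                    r = math.hypot(dx, dy)
                    a = G * masses[j] / r ** 3  # magnitude / r, so (dx, dy) supplies direction
                    acc[i][0] += a * dx
                    acc[i][1] += a * dy
            for i in range(len(masses)):
                vel[i][0] += acc[i][0] * dt
                vel[i][1] += acc[i][1] * dt
                pos[i][0] += vel[i][0] * dt
                pos[i][1] += vel[i][1] * dt
        return pos

    masses = [1.0, 1.0, 1.0]
    velocities = [[0.0, -0.3], [0.0, 0.3], [0.3, 0.0]]
    run_a = simulate([[-1.0, 0.0], [1.0, 0.0], [0.0, 0.5]], velocities, masses)
    run_b = simulate([[-1.0, 0.000001], [1.0, 0.0], [0.0, 0.5]], velocities, masses)
    print("difference in the first body's final position:", math.dist(run_a[0], run_b[0]))

How big the printed difference is depends on how long you run it, but for most starting conditions the gap between the two runs grows rapidly, and that sensitivity, rather than any shortage of computing power, is what makes long-range prediction so hard.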

So what has this got to do with learning analytics?

Well, not much really, and I’m probably even guilty of committing a logical fallacy: reductio ad absurdum. It is also an apples/oranges comparison, but it does help to illustrate a point.

Consider the dominant management model and the dominant approach to technology adoption in our universities: Universities tend to be based on the Newtonian-machine management model where the focus is on hierarchical structures, rules-based culture, command, control and formal relationships (Cenere, Gill, Lewis, & Lawson, 2015; Goldspink, 2007). This mechanical model assumes that goals can be achieved through deliberate action based on knowledge of what has happened previously (Beer & Lawson, 2017). Put simply, we tend to think that a thorough analysis of what has happened previously can help us to predict what will happen next, and so we make our plans and design our systems based on the assumption that the past can inform the future.

Getting back to the three-body problem, we have only three entities, each with its own mass, position and velocity, and we are still unable to algorithmically characterise their respective positions far into the future. If we have a class with only three students, we cannot begin to perceive or quantify the variables that will contribute to their success or otherwise. Even if we had the hypothetical ability to perceive these variables at the outset of their studies, it would have little bearing on the outcome or the journey to the outcome.

My point here is that techno-centric approaches to learning analytics, where the focus of attention is on historical data, are fundamentally flawed. I’m not sure we truly understand this yet as a sector. I’m still seeing people buying into the delusion that says predicting student success is simply a matter of investing in some IT infrastructure and applying some fancy-sounding algorithm. Investment in data infrastructure and exploring these algorithms is something we should be doing, no argument, but what about investing in the human side of learning analytics, the non-IT side? Even if we have the most amazing systems and the most amazing algorithms, what can we do about the student who hasn’t logged on for two weeks because they have been busy with their three jobs and complex family challenges? Do our policies and processes allow the flexibility required to respond to the range of situations that will be brought to light by our learning analytics systems? More broadly still, do our systems, processes and policy appropriately represent our specific cohort of students?

I’m a nerd and I get the fascination with IT systems and predictive algorithms with learning analytics, I really do. But we need to think about the assumptions that underpin our investments in learning analytics. Resources are finite, now more than ever, so we need a more balanced approach to how we conceptualise learning analytics in each of our organisations.

References

Barrow-Green, J. (1997). Poincaré and the three body problem: American Mathematical Soc.

Beer, C., & Lawson, C. (2017). Framing attrition in higher education: A complex problem. Journal of Further and Higher Education, 1-12. doi:10.1080/0309877X.2017.1301402

Cenere, P., Gill, R., Lewis, M., & Lawson, C. (2015). Communication Skills for Business Professionals. Port Melbourne, Victoria, Australia: Cambridge University Press.

Goldspink, C. (2007). Rethinking Educational Reform: A Loosely Coupled and Complex Systems Perspective. Educational Management Administration & Leadership, 35(1), 27-50. http://dx.doi.org/10.1177/1741143207068219

Newton, I. (1987). Philosophiæ naturalis principia mathematica (Mathematical principles of natural philosophy). London. (Original work published 1687)

Smith, G. (2008). Isaac Newton. In The Stanford Encyclopedia of Philosophy (Fall 2008 ed.). Metaphysics Research Lab, Stanford University. https://plato.stanford.edu/archives/fall2008/entries/newton/

 

Regional universities are & need to be different

The Vice Chancellor of my university recently wrote an article for ABC Opinion highlighting the neglect that regional Australia suffers. Of particular interest was the education gap between metropolitan and regional/remote areas whereby metropolitan citizens are much more likely to attend university than their regional counterparts (73% versus 47% when comparing Brisbane and Bundaberg school leavers). I wholeheartedly agree that regional Australian universities are getting a raw deal and I also believe that the problem is actually much worse than we currently appreciate. However, I also think that some of the blame rests with regional universities who could be better at embracing their different operating contexts.

With an attitude of resigned indifference, regional Australians appear to simply accept an inferior position on our nation’s economic, cultural and social ladder
(ABC Opinion, 2020)

Consider CQUniversity, which is based in regional Australia. CQUniversity has the largest footprint of any university in Australia with 26 locations across the nation (CQUniversity, 2019). CQUniversity is a “comprehensive” university offering a full range of vocational and higher education qualifications. Geographic distribution and the mixture of vocational and higher education qualifications makes for an astonishingly complex learning, teaching and administrative environment, which is arguably unique in the Australian higher education context. However, it is the student cohort that really distinguishes CQUniversity from all other Australian universities. Vastly higher proportions of students from high-risk equity groups make the university particularly susceptible to performance funding models that use student attrition and retention as a performance measure. To me, this seems grossly unfair and a perpetuation of the disparity between metropolitan and regional areas that the ABC article suggests.

Figure: regional versus national comparison

Students from equity groups face a number of structural challenges in accessing, participating and completing higher education, including geographical location, financial constraints, emotional factors and sociocultural incongruity. (Nelson et al., 2017)

There can be little doubt that regional universities are punching well above their weight when it comes to their holistic contribution to Australian society. That said, I also think that regional Australian universities can, and need to, do much better, even within the hegemony of the higher education sector. Given the differences in student cohorts and operating contexts between regional and metropolitan universities, it would seem reasonable to expect regional universities to think and operate differently when it comes to their learning, teaching and technology. But this does not seem to be the case, with broadly similar policies, processes, structures, infrastructure and, worst of all, thinking across the sector. Regional universities are different in a difficult environment; an environment that in many respects reinforces a homogeneity that is not suited to their context.

It is essential to develop a more nuanced understanding of the relationships that exist between social, cultural, financial and structural issues, demographic and equity characteristics, enrolment patterns, and the completion rates of cohorts at RUN universities to ensure that measures effectively target the lived complexity of diverse student populations (Nelson et al., 2017)

So how can Australian regional universities be better at being different? This is a vastly complex issue that has economic, cultural, political, ecological and educational linkages. It is a wicked problem (Head & Alford, 2008) in a transcontextual environment. Perhaps thinking more about the context of our operations and the context of the students we have, and less about what our larger metropolitan cousins are doing with regard to their structures, technologies, processes etcetera, would be a useful place to start? A different ontology & epistemology for a different context?

References

CQUniversity. (2019). CQUniversity Strategic Plan 2019-2023. CQUniversity. Retrieved from https://www.cqu.edu.au/about-us/about-cquniversity/strategic-plan-2019-2023

Head, B., & Alford, J. (2008). Wicked problems: The implications for public management. Paper presented at the Presentation to Panel on Public Management in Practice, International Research Society for Public Management 12th Annual Conference.

Nelson, K., Picton, C., McMillan, J., Edwards, D., Devlin, M., & Martin, K. (2017). Understanding the completion patterns of equity students in regional universities. Retrieved from The National Centre for Student Equity in Higher Education (NCSEHE) website https://www.ncsehe.edu.au/publications/completion-patterns-of-equity-students-in-regional-universities.