Organisational silos – the challenge for learning analytics

Learning analytics research has identified two key challenges that impact the ability of universities to effectively embed learning analytics into their learning and teaching ecosystems (Dawson et al., 2018). These challenges are:
– The growing chasm between learning analytics research and organisational practice; and
– How to effectively develop and implement learning analytics across an organisation.

Many authors have contributed ideas for how these challenges might be resolved, including:
– Alternative leadership models (Dawson et al., 2018; Hannon, 2013)
– Frameworks for understanding and actioning learning analytics implementation (Colvin et al., 2015; Drachsler & Greller, 2012; Wise & Vytasek, 2017)
– Theoretical contributions and the identification of specific learning analytics challenges (Beer, Jones, & Lawson, 2019; Beer, Jones, & Clark, 2012; Jones, Beer, & Clark, 2013; Selwyn, 2019)

Successful university learning analytics implementations require a complex integration of data, teaching and learning, and this integration is itself a key implementation challenge (Beer et al., 2019; Drachsler & Greller, 2012; Knight, Wise, & Chen, 2017; Wise & Jung, 2019). While some analytics can be used across multiple contexts, questions of interpretation, meaning making and action are inherently local and require intimate learning and teaching knowledge within specific contexts (Wise & Vytasek, 2017). Instrumental approaches to learning analytics, characterized by a technology focus and teleological processes, are associated with limited staff uptake of the resulting tools (Dawson et al., 2018). Emergent approaches, characterized by bottom-up and strongly consultative processes, are associated with issues of scalability and technical integration (Dawson et al., 2018).
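To make the portability point concrete, here is a minimal, hypothetical sketch (the data shape, function names and threshold are all my assumptions, not from any real system): the same click-count computation works identically for any course, but the threshold that makes a count meaningful has to come from someone who knows the local teaching context.

```python
from collections import defaultdict

def weekly_click_counts(activity_log):
    """Count LMS clicks per (student, week) from hypothetical event records.

    This part is portable: it can be computed identically for any course.
    """
    counts = defaultdict(int)
    for student_id, week in activity_log:
        counts[(student_id, week)] += 1
    return counts

def flag_low_engagement(counts, week, threshold):
    """Flag students whose clicks in a given week fall below a threshold.

    This part is not portable: a sensible threshold depends on the course
    design (fully online vs. mostly face-to-face, say), so interpretation
    and action remain local, human decisions.
    """
    return [sid for (sid, wk), n in counts.items() if wk == week and n < threshold]

# Hypothetical usage: the same computation everywhere, but the threshold must
# be chosen with intimate knowledge of the specific teaching context.
log = [("s1", 1), ("s1", 1), ("s2", 1), ("s3", 1), ("s3", 1), ("s3", 1)]
print(flag_low_engagement(weekly_click_counts(log), week=1, threshold=2))  # ['s2']
```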

Universities are corporate-like bureaucracies that are structured mechanistically, with staff functionally grouped into manageable areas (Cilliers & Greyvenstein, 2012). These vertical divisions are often referred to as organisational silos (Cilliers & Greyvenstein, 2012; de Waal, Weaver, Day, & van der Heijden, 2019).

At best, silos offer a practical way for organizations to operate efficiently. At worst they create a silo mentality where departments do not want to exchange knowledge or information, hindering internal collaboration and organizational learning, thus preventing achievement of high performance and organizational sustainability. (de Waal et al., 2019)

These organisational silos represent a real challenge for learning analytics. Learning analytics has matured beyond a focus on developing tools and systems towards developing ways by which analytics can impact learning and teaching (Wise & Jung, 2019). This question of impact is centred upon humans: learning analytics forms part of a socio-technical system where human decision-making and action are considered as important as the technical components (Van Harmelen & Workman, 2012; Wise & Jung, 2019). However, our organisational structures promote a separation of roles whereby learning and teaching expertise and technical expertise sit in different silos. The following figure provides an oversimplified example of this separation of expertise.

[Figure: an oversimplified example of the separation of learning and teaching knowledge and technical capability across organisational silos]

Further compounding the challenge, these silos tend to operate very differently and apply different methodological approaches to their work. IT departments will typically follow a rigid, structured approach to development and system implementation that resembles the Waterfall Model (Dima & Maassen, 2018): from a concept or idea, requirements are gathered and documented and an implementation plan is developed; an external developer or consultant is then engaged to enact the plan, followed by user-acceptance testing and the system entering official operation. On the other hand, learning and teaching environments are inherently complex social systems where the agents within the system coevolve, self-organise and learn (Beer et al., 2012; Mason, 2008a, 2008b). The interdependencies and mutability of the actors and agents within these complex systems preclude top-down, mechanical approaches to change; they instead require a process of organised inquiry to understand the contributing variables and uncover a path forward (Ellis & Goodyear, 2019).

[Figure: the differing methodological approaches of IT departments and learning and teaching areas]

So we have a concept (learning analytics) that requires the amalgamation of technical expertise and learning and teaching expertise, and we have organisational structures that, in some respects, encourage an unbalanced approach to its implementation. Implementations driven exclusively by learning and teaching areas will likely run into technical integration problems, scalability issues and a raft of internal political challenges (Beer, Jones, & Lawson, 2019). Implementations driven exclusively by IT departments will likely run into problems with staff and student uptake of the learning analytics tools. Adding to the problem, organisations will likely consider any post-implementation problem not as a conceptualisation problem, but as a “training gap” that requires additional support resources and professional development sessions.

While I do not have the answers yet, part of my PhD is about how meso-level practitioners (Hannon, 2013) can navigate these murky waters in their own organisational contexts. I am exploring how the meso-level practitioner can promote a more balanced perspective of learning analytics, and what compromises meaningful learning analytics implementations will require given these organisational conceptions.

References

Beer, C., Jones, D., & Lawson, C. (2019). The challenge of learning analytics implementation: Lessons learned. Paper presented at the Personalised Learning. Diverse Goals. One Heart, Singapore.

Beer, C., Jones, D. T., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. Paper presented at the Australasian Society for Computers in Learning in Tertiary Education: Future Challenges, Sustainable Futures, Wellington, New Zealand. Conference Publication retrieved from http://www.ascilite.org/conferences/Wellington12/2012/images/custom/beer%2ccolin_-_analytics_and.pdf

Cilliers, F., & Greyvenstein, H. (2012). The impact of silo mentality on team identity: An organisational case study. SA Journal of Industrial Psychology, 38(2). doi:10.4102/sajip.v38i2.993

Colvin, C., Rogers, T., Wade, A., Dawson, S., Gašević, D., Buckingham Shum, S., . . . Corrin, L. (2015). Student retention and learning analytics: A snapshot of Australian practices and a framework for advancement. Canberra, ACT: Australian Government Office for Learning and Teaching.

Dawson, S., Poquet, O., Colvin, C., Rogers, T., Pardo, A., & Gasevic, D. (2018). Rethinking learning analytics adoption through complexity leadership theory. Paper presented at the Proceedings of the 8th International Conference on Learning Analytics and Knowledge.

de Waal, A., Weaver, M., Day, T., & van der Heijden, B. (2019). Silo-busting: Overcoming the greatest threat to organizational performance. Sustainability, 11(23), 6860.

Dima, A. M., & Maassen, M. A. (2018). From Waterfall to Agile software: Development models in the IT sector, 2006 to 2018. Impacts on company management. Journal of International Studies, 11(2), 315-326.

Drachsler, H., & Greller, W. (2012). The pulse of learning analytics: Understandings and expectations from the stakeholders. Paper presented at the Proceedings of the 2nd international conference on learning analytics and knowledge.

Ellis, R. A., & Goodyear, P. (2019). The Education Ecology of Universities: Integrating Learning, Strategy and the Academy. Routledge.

Hannon, J. (2013). Incommensurate practices: Sociomaterial entanglements of learning technology implementation. Journal of Computer Assisted Learning, 29, 168-178.

Jones, D. T., Beer, C., & Clark, D. (2013). The IRAC framework: Locating the performance zone for learning analytics. Paper presented at the ASCILITE2013 Electric Dreams, Sydney. Conference publication retrieved from http://www.ascilite.org.au/conferences/sydney13/program/papers/Jones.pdf

Knight, S., Wise, A. F., & Chen, B. (2017). Time for change: Why learning analytics needs temporal analysis. Journal of Learning Analytics, 4(3), 7-17.

Mason, M. (2008a). Complexity theory and the philosophy of education. Educational Philosophy and Theory, 40, 15. doi:10.1111/j.1469-5812.2007.00412.x

Mason, M. (2008b). What Is Complexity Theory and What Are Its Implications for Educational Change? Educational Philosophy and Theory, 40(1), 35-49.

Selwyn, N. (2019). What’s the Problem with Learning Analytics? Journal of Learning Analytics, 6(3), 11-19.

Van Harmelen, M., & Workman, D. (2012). Analytics for learning and teaching. CETIS Analytics Series, 1(3), 1-40.

Wise, A. F., & Jung, Y. (2019). Teaching with analytics: Towards a situated model of instructional decision-making. Journal of Learning Analytics, 6(2), 53-69.

Wise, A. F., & Vytasek, J. (2017). Learning analytics implementation design. Handbook of Learning Analytics (pp. 151-160).
