This blog post represents an exemplar for my students undertaking Learning in a Digital Age as part of their Graduate Certificate of Tertiary Education. It is based on the EASI system in place at CQUniversity and attempts to recreate/revisit our thinking when we saw an opportunity to adopt and integrate an emerging technology into our learning and teaching context.
Introduction to learning analytics
The almost universal adoption of digital technologies in higher education has been accompanied by their capacity to store and track vast amounts of data on staff and student behaviour. The data captured by these digital systems can be analysed to improve decision making or to provide insight into the learning and teaching process. Learning analytics has been loosely defined as the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs (Gašević, Dawson, & Siemens, 2015). Learning analytics has been touted as a game-changer in higher education that can contribute to many areas within the academy.
Universities tend to take one of two broad trajectories with learning analytics: using it to help address student attrition and retention, or using it to contribute understanding to the learning and teaching process (Colvin et al., 2015). There is an argument to be made that learning analytics that contributes understanding to the learning and teaching process will also result in improved student attrition and retention. Despite their apparently complementary nature, however, there is evidence to suggest that these two trajectories are in fact diverging.
As an educational developer attached to a central learning and teaching support department of a regional Australian university, I have a learning and teaching context that is broader than is usual for a faculty academic. My role within a regional university with high proportions of low SES and online students means that I am constantly on the lookout for new ways of helping my university retain more students. Learning analytics is recognised as an approach that can help with student attrition and retention by providing improved visibility of students who, with the advent of digital classrooms, have become less visible to their teachers than they would be in face-to-face classrooms. This notion of the invisible student captures the fact that online students are not directly observable by their teachers, so alternative mechanisms are required to monitor student engagement in these online environments.
Evaluating learning analytics in this context
There is extant literature, and there are learning analytics projects, aimed at addressing student attrition through the early identification of ‘at risk’ students and the facilitation of subsequent interventions (Liu, Rogers, & Pardo, 2015). However, the use of learning analytics for student attrition and retention is not without criticism. Correlating variables in student behaviour with student success artificially isolates those variables from the real-world complexity of student life (Liu et al., 2015). It also promotes a deficit view of the student, in that being ‘at risk’ suggests there is ‘something wrong’ with the student, something that needs to be fixed (Liu et al., 2015; Macfadyen & Dawson, 2012). This raises questions of ethics and privacy around the intent behind the institutional use of student data (Prinsloo & Slade, 2015, 2017a, 2017b), such as might occur with learning analytics.
As with any evaluation of a new technology in a specific context, there are pros and cons. In this case there is evidence to suggest that learning analytics can help universities with their student retention, and many universities are investigating this approach. However, the research suggests that the focus has been on the variables that contribute to student success or failure, despite the absence of an established link between student success or failure and student attrition or retention; student lives are simply too complex to categorise in such a manner. As such, I would suggest an evolutionary approach that starts with the representation of student activity within the learning management system as a proxy indicator of student engagement: an approach that is less about the factors that contribute to student success and more about student activity.
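To make this concrete, here is a minimal sketch of how such a proxy indicator could be derived. It assumes a hypothetical CSV export of Moodle log records (moodle_log.csv, with student_id and timestamp columns); the file and column names are illustrative only and are not drawn from CQUniversity's actual EASI implementation.

```python
# A minimal sketch of deriving a proxy engagement indicator from raw
# Moodle activity logs. The file name and column names are assumptions
# for illustration; any LMS clickstream export with a student identifier
# and a timestamp per logged action would work the same way.
import pandas as pd

# Hypothetical export: one row per logged click/action in a unit's Moodle site.
logs = pd.read_csv("moodle_log.csv", parse_dates=["timestamp"])

# Count each student's actions per teaching week as a simple activity measure.
logs["week"] = logs["timestamp"].dt.isocalendar().week
weekly_activity = (
    logs.groupby(["student_id", "week"])
        .size()
        .rename("clicks")
        .reset_index()
)

# Average weekly clicks per student: a crude proxy for engagement,
# not a measure of learning in itself.
engagement = weekly_activity.groupby("student_id")["clicks"].mean()
print(engagement.sort_values().head(10))  # least active students first
```

Averaging weekly counts rather than totalling them avoids penalising students who enrolled late, but it remains a measure of activity, not of learning.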
Integration of the technology in my context
I believe that there is some potential for learning analytics to help unit coordinators better focus their attention in online classrooms. For example, there is anecdotal evidence to suggest that the most engaged students demand and receive a disproportionate amount of attention from their unit coordinators: they are active in forums, seek feedback on formative and summative activities, and ask many questions of their teachers. With the increasing class sizes associated with online classrooms, this can mean that the less engaged, and often lower achieving, students are overlooked or underserviced by their teachers.
My idea is to provide teaching staff with a view of their students’ activity in their Moodle sites. While the issues associated with clickstream information drawn from learning management systems are not insignificant, there appears to be an opportunity to take data that is already being collected and present it to teachers so as to highlight online students who may not be as engaged as others in their class. These students would otherwise be invisible to the busy, time-poor teacher who is struggling to keep abreast of the online forums and marking.
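As a rough illustration of what such a teacher-facing view might rest on, the sketch below flags students whose recent activity sits in the bottom quartile of the cohort, or who have no recent activity at all. It assumes the same hypothetical Moodle log export as the earlier sketch, and the quartile cut-off is an arbitrary choice for illustration, not a validated threshold or the approach EASI actually takes.

```python
# A minimal sketch of a "who might need attention" view for teaching staff,
# assuming the same hypothetical moodle_log.csv export as the earlier sketch.
# The bottom-quartile cut-off is an arbitrary illustrative choice.
import pandas as pd

logs = pd.read_csv("moodle_log.csv", parse_dates=["timestamp"])
logs["week"] = logs["timestamp"].dt.isocalendar().week

# Activity over the two most recent teaching weeks, per student.
current_week = logs["week"].max()
recent = logs[logs["week"] >= current_week - 1]
recent_clicks = recent.groupby("student_id").size()

# Flag students in the bottom quartile of recent activity...
threshold = recent_clicks.quantile(0.25)
low_activity = recent_clicks[recent_clicks <= threshold]

# ...and students with no logged activity at all in that window.
silent = set(logs["student_id"].unique()) - set(recent_clicks.index)

print("Low recent activity:", sorted(low_activity.index))
print("No recent activity:", sorted(silent))
```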
The following image is an example of the correlation between student activity in the Moodle learning management system and students' resulting grades at CQUniversity. While this is a nice, neat correlation, it hides a great deal of the underpinning complexity and diversity across the student cohort.
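The kind of relationship the image shows could be computed along the lines of the sketch below, which assumes hypothetical moodle_log.csv and grades.csv exports rather than the actual CQUniversity data. A single correlation coefficient is exactly the sort of neat summary that glosses over the cohort's diversity.

```python
# A minimal sketch of checking the activity-grade relationship, assuming
# hypothetical moodle_log.csv and grades.csv exports
# (grades.csv: student_id, final_grade). The result is descriptive only;
# it says nothing about why more active students tend to do better.
import pandas as pd

logs = pd.read_csv("moodle_log.csv")
grades = pd.read_csv("grades.csv").set_index("student_id")

# Total clicks per student across the term as the activity measure.
clicks = logs.groupby("student_id").size().rename("total_clicks")

combined = grades.join(clicks, how="inner")

# Spearman rank correlation is less sensitive to a handful of
# very high-activity students than Pearson.
print(combined["total_clicks"].corr(combined["final_grade"], method="spearman"))
```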
References

Colvin, C., Rogers, T., Wade, A., Dawson, S., Gašević, D., Buckingham Shum, S., & Fisher, J. (2015). Student retention and learning analytics: A snapshot of Australian practices and a framework for advancement. Retrieved from http://www.olt.gov.au/project-student-retention-and-learning-analytics-snapshot-current-australian-practices-and-framework

Gašević, D., Dawson, S., & Siemens, G. (2015). Let's not forget: Learning analytics are about learning. TechTrends: Linking Research & Practice to Improve Learning, 59(1), 64-71. doi:10.1007/s11528-014-0822-x

Liu, D. Y.-T., Rogers, T., & Pardo, A. (2015). Learning analytics: Are we at risk of missing the point? Paper presented at the 32nd ASCILITE Conference.

Macfadyen, L. P., & Dawson, S. (2012). Numbers are not enough: Why e-learning analytics failed to inform an institutional strategic plan. Journal of Educational Technology & Society, 15(3), 149-163.

Prinsloo, P., & Slade, S. (2015). Student privacy self-management: Implications for learning analytics. Paper presented at the Fifth International Conference on Learning Analytics and Knowledge, Poughkeepsie, New York. Retrieved from https://pdfs.semanticscholar.org/09d7/56d7a66f002f5c06b05237c3fc162b61a653.pdf

Prinsloo, P., & Slade, S. (2017a). Big data, higher education and learning analytics: Beyond justice, towards an ethics of care. In Big Data and Learning Analytics in Higher Education (pp. 109-124). Springer International Publishing.

Prinsloo, P., & Slade, S. (2017b). An elephant in the learning analytics room: The obligation to act. Paper presented at LAK '17: The Seventh International Learning Analytics & Knowledge Conference.