I have been following with great interest the proceedings of an open and free course offered by the Technology Enhanced Knowledge Research Institute at Athabasca University. The course, Introduction to Learning and Knowledge Analytics (LAK’11), is centered on the use of data to inform and improve the education process. This is something we have been working on for some time with the Indicators project, and we are watching developments in this space closely.
In 2007/2008 the curriculum design and development team at CQUniversity was responsible for supporting one of the institution's learning management systems (LMS), Blackboard 6.3. We noticed that the Blackboard backend database had a table called ACTIVITY_ACCUMULATOR that recorded every staff and student click within the system. It did not take long to realize that this table held a wealth of information: if we correlated its data with student results, we could look for features and patterns of behavior that correlated with student success. It should also be noted that CQUniversity recently (Term 1, 2010) moved to a single Moodle LMS, and all of the following charts come from this system.
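As a minimal sketch of the idea (not the Indicators project's actual code), the analysis amounts to counting each student's clicks in an activity-accumulator-style log and pairing those totals with final grades. The field names and data here are invented for illustration:

```python
# Sketch: derive per-student hit counts from a click log and pair them
# with final grades. All names and values are hypothetical.
from collections import Counter

# Hypothetical extract from an activity-accumulator-style table:
# one row per click, keyed by student id.
activity_log = [
    {"user_id": "s1", "event": "COURSE_ACCESS"},
    {"user_id": "s1", "event": "CONTENT_ACCESS"},
    {"user_id": "s2", "event": "COURSE_ACCESS"},
    {"user_id": "s1", "event": "CONTENT_ACCESS"},
]

# Final results, e.g. pulled from the student records system.
grades = {"s1": "HD", "s2": "F"}

# Total hits per student, then (grade, hits) pairs for later analysis.
hits = Counter(row["user_id"] for row in activity_log)
hits_by_grade = [(grades[s], n) for s, n in hits.items()]
```

In practice this join would be done in SQL against the LMS database, but the shape of the computation is the same: one aggregate per student, correlated against the result they achieved.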
CQUniversity has a quite complex learning and teaching environment with three distinct student cohorts (distinct in the way they interact with the LMS): on-campus students studying at several regional campuses throughout Queensland; international students based mainly at the Brisbane, Sydney and Melbourne campuses; and flex students, who are enrolled via distance education and whose interactions with their courses occur mainly online. The following chart shows the difference in hit-counts on the LMS against grade for these three cohorts.
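The aggregation behind a chart like this can be sketched as averaging LMS hits per grade band, computed separately for each cohort. The data below are invented purely to show the shape of the computation:

```python
# Sketch: average LMS hits per (cohort, grade) group. Values are made up.
from collections import defaultdict

students = [
    {"cohort": "on-campus", "grade": "HD", "hits": 320},
    {"cohort": "on-campus", "grade": "F",  "hits": 40},
    {"cohort": "flex",      "grade": "HD", "hits": 610},
    {"cohort": "flex",      "grade": "F",  "hits": 95},
]

# Group hit counts by (cohort, grade), then average each group.
groups = defaultdict(list)
for s in students:
    groups[(s["cohort"], s["grade"])].append(s["hits"])

avg_hits = {key: sum(v) / len(v) for key, v in groups.items()}
```

Charting `avg_hits` per cohort is what makes the cohort differences visible: the same grade band can sit at very different hit counts depending on how a cohort uses the LMS.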
This chart highlights an important issue with regard to the use of learning analytics: context. As you can see, CQUniversity has three cohorts of students who use the LMS in different ways, and they therefore have to be considered separately if the results are to be interpreted with any degree of accuracy. To further illustrate the importance of context, the following chart compares the same grade-versus-LMS-hits data for flex students across two courses in the same term. The courses are delivered in a single discipline by the same school in the same faculty.
As you can see, the results for these two courses are vastly different, despite both being delivered by staff in a common degree program. To me, this highlights the importance of context in the interpretation of learning analytics data, and it leads me to another question: where is the output from learning analytics best targeted?
Again, context is very important. At CQUniversity, courses are typically designed by course coordinators and delivered by those coordinators together with other teaching staff across the various campuses. If the course coordinators are designing the course, in terms of both pedagogy and the facilitating technological features, it seems reasonable to assume that their conception of learning and teaching, and their experience with the modes of course delivery, influence the final design of the course and therefore the behavior of the students within it. With the Indicators project, and in our context, it is the course coordinators and the students who, we believe, are most likely to benefit from our explorations into learning analytics. David expands on how learning analytics can assist staff and students in this post.
I mention context here because I am hearing folk talk about applying AI and learning analytics with a view to "replacing the human". To me, this overlooks the fundamental complexity of learning and teaching, or indeed the complexity of anything involving human beings. Thoughts?