David’s recent blog post talks about institutional eLearning using TPACK as a lens. A brief conversation with David yesterday resulted in an idea for using TPACK to analyse how universities are applying learning analytics.
David’s post talked about the three facets of TPACK:
• Technological knowledge – how to use technologies.
• Pedagogical knowledge – how to teach.
• Content knowledge – knowledge of what the students are meant to be learning.
TPACK suggests that the most effective eLearning results when these knowledge types are combined.
University information systems collect an amazing array of data that can be used to inform and enhance learning and teaching. The process of analysing this collected data to enhance learning and teaching is broadly known as learning analytics.
One of the things that was plainly evident to me from the many conversations at the ASCILITE 2012 and Southern SOLAR flare conferences last year was that many, if not most, universities are looking to learning analytics to improve their student retention rates. CQUniversity is no different: we are also looking at how our collected data can contribute to a reduction in our student attrition statistics.
Our paper to ASCILITE last year pointed out some likely problems that universities will face when attempting to implement learning analytics in any meaningful way, not the least of which is the problem of organisational silos. David’s post hints at this when he says that technological knowledge is typically housed within the institutional IT division; pedagogical knowledge within the central learning and teaching division; and content knowledge within the faculties.
From what I am hearing, learning analytics projects in universities are mostly encapsulated within the institution’s IT division, and it is not hard to understand why. From a senior manager’s perspective, learning analytics is about data from IT systems, and data falls into the domain of the institution’s IT division. On the surface this seems a logical choice, but I would suggest it is far from ideal because it fails to include context.
As we pointed out in our ASCILITE paper, the interpretation of learning analytics data is almost impossible without reference to the context from which it was taken, due to its non-causal nature. Then there is the problem that learning analytics data tends to produce clear patterns at the macro levels (institution/school/faculty) and seemingly random patterns at the micro levels (course/student group/student). Based on our five years of experience with a learning analytics project, any one of these problems precludes the (useful) application of learning analytics by any single organisational entity within a university.
I am hoping that TPACK can help me explain things a little better. For example, we know that attrition is a complex beast and the reasons for student attrition vary greatly from student to student. Given the diversity of reasons why students drop out of their university studies, an approach to addressing student attrition based entirely on technology would seem woefully inadequate. Like effective eLearning, using learning analytics to address student attrition needs to draw on pedagogical and content knowledge as well as technological knowledge. It’s the intersection of these three knowledge areas that will be most effective.
Where to from here
My blog post from last week gave some insight into how we are trying to address student attrition by providing teaching academics with better information within their learning and teaching context. I am currently expanding this trial and trying to move away from the overly technical approaches used to date. You will notice from the previous post that once a student at risk of failing has been identified, we provide the teaching academic with the ability to email the student as a way of conducting an initial intervention. TPACK would suggest that this does not go far enough in providing the teaching academic (or the students) with pedagogical suggestions on what to do next. So I am now thinking about how to include pedagogical advice or suggestions in the system for the next version, which is due to start next term.
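To make the idea concrete, here is a minimal sketch of what attaching pedagogical advice to risk indicators might look like. Everything in it is an assumption for illustration: the indicator names, the suggestion wording and the function name are invented, not taken from our actual system.

```python
# Hypothetical mapping from risk indicators to pedagogical suggestions.
# Indicator names and advice text are invented for illustration only;
# they are not drawn from the real system.
SUGGESTIONS = {
    "no_recent_logins": (
        "Consider a personal email pointing the student to this week's "
        "activities, rather than a generic reminder."
    ),
    "low_forum_participation": (
        "Try posing a direct, low-stakes question in the course forum and "
        "inviting the student by name to respond."
    ),
    "missed_assessment": (
        "Open an extension conversation and point the student to assessment "
        "support resources before the next due date."
    ),
}

def suggest_interventions(indicators):
    """Return pedagogical suggestions for the risk indicators raised for a student."""
    return [SUGGESTIONS[i] for i in indicators if i in SUGGESTIONS]

# Example: a student flagged for two indicators gets two concrete suggestions.
for advice in suggest_interventions(["no_recent_logins", "missed_assessment"]):
    print("-", advice)
```

The point is not the lookup table itself but where the knowledge lives: the advice text is pedagogical knowledge, and something like this would need to be authored with the central learning and teaching people, not by IT alone.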
One example is that the system in its current form only considers the student within a single course. Some students have next to no Moodle activity across all of the courses they are currently attempting, and the intervention such a student requires exceeds the responsibility of any single teaching academic. So perhaps there is an opportunity to involve the student support area as an alternative to a simple course-based intervention. Additionally, the teaching academics need more intervention options than the mail-merge facility, not to mention a mechanism for getting some of this data out to the students so they better appreciate their situation.
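A rough sketch of the cross-course idea, under stated assumptions: the record shape, the click-count measure and the cut-off value are all invented for illustration, and real Moodle activity data would need far more care than this.

```python
# Hypothetical sketch: flag students whose Moodle activity is low in *every*
# course they are attempting, so the case can be routed to student support
# rather than left to a single teaching academic.
# Record shape and threshold are assumptions, not the real system's schema.

# Each record: (student_id, course_code, click_count_this_term)
activity = [
    ("s001", "course-1", 0),
    ("s001", "course-2", 2),
    ("s002", "course-1", 57),
    ("s002", "course-2", 0),   # quiet in one course, active in another
]

LOW_ACTIVITY = 5  # clicks; an assumed cut-off for illustration only

def needs_support_referral(records, threshold=LOW_ACTIVITY):
    """Return students whose activity is below the threshold in all of their courses."""
    per_student = {}
    for student, course, clicks in records:
        per_student.setdefault(student, []).append(clicks)
    return sorted(student for student, counts in per_student.items()
                  if all(c < threshold for c in counts))

print(needs_support_referral(activity))  # → ['s001']
```

Note the design point: a student who is quiet in one course but active elsewhere (s002) stays with the course-level intervention, while a student who is quiet everywhere (s001) is the one whose situation goes beyond any single teaching academic.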
Any comments or suggestions would be warmly welcomed.