The Indicators project is an analytics project that has been running at CQUniversity for several years now. Interest in analytics appears to be booming at the moment, with a large number of universities initiating projects. For me, a worrying trend is the IT-centric approach that universities are taking. To explain my concerns, we first need to look at some of the definitions around analytics. The following table from Siemens (2011) broadly outlines some of these definitions.
| Type of analytics | Level or object of analysis | Who benefits |
| --- | --- | --- |
| Learning analytics | Course-level: social networks, conceptual development, discourse analysis, "intelligent curriculum" | Learners, faculty |
| Learning analytics | Departmental: predictive modeling, patterns of success/failure | Learners, faculty |
| Academic analytics | Institutional: learner profiles, performance of academics, knowledge flow | Administrators, funders, marketing |
| Academic analytics | Regional (state/provincial): comparisons between systems | Funders, administrators |
| Academic analytics | National and international | National governments, education authorities |
Educational data mining is concerned with developing methods for exploring the unique types of data that come from educational settings and using those methods to better understand students and the settings in which they learn.
Academic analytics marries statistical techniques and predictive modeling with the large data sets collected by higher education institutions (HEIs), including those collected by the LMS. Academic analytics has been described as business intelligence for HEIs and is focused on the needs of the institution, such as recruitment, retention and pass rates (Open University, 2012).
Learning analytics is more specific than academic analytics in that it focuses exclusively on the learning process (Siemens & Long, 2011) and is often grounded in learning and teaching theories (Open University, 2012). Applications that feed analytics back into the learning environment arguably fit the learning analytics definition.
As an educational technologist who has been tinkering with analytics for the last few years, I get concerned when I hear IT companies saying things like:
“By analyzing student data and getting down to ever finer detail, educators and administrators using these analytic systems gain a much deeper understanding of the student, enabling the decision-makers to anticipate the next stage, the next need, specific performance challenges, and even potential outcomes, and guide the affected individual to the right action for a given situation.”
I get concerned about these things for a number of reasons, but two in particular stand out.
Analytics as an IT thing.
Based on my observations, there appears to be an underlying assumption that analytics belongs with IT departments. The rise of managerialism in higher education has meant that organizational structures are decomposed into specialized units with rigid command-and-control processes. This has led to institutional silos that limit and inhibit cross-unit information sharing, cooperation and collaboration: IT belongs to the IT department, student administration belongs to the student administration section, and so on. While I see a large role for IT in educational data mining, academic analytics and learning analytics, the interpretation and application of analytics information has to involve the educators, as per the learning analytics section of George's table above. Individual courses are highly contextual: the patterns of behavior that students exhibit will be quite different from course to course.
Assumptions of causality.
A danger exists where correlations found within analytics data are treated as universal constants, leading to decision making that assumes patterns in the data are reproducible. IT companies and IT departments, and to a certain extent management, like high-level abstract data, averages and summaries. Performance-based analytics data reflects exhibited student behavior, and it is unlikely that this behavior will be repeated from one course to the next due to the ever-changing context. A lot of the correlations we have found with the Indicators project are quite distinct at the institutional or even the departmental level. These correlations are rarely as distinct when looking at individual courses or students, as the following figures show; they merely compare the number of clicks distance students made within Moodle course sites against their resulting grades.
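The point can be sketched with synthetic data: when most of the clicks-and-grades variation lies *between* courses rather than within them, a pooled institution-wide correlation looks strong while the per-course correlations hover near zero. Everything below (course counts, noise levels, the grading rule) is invented purely for illustration; it is not Indicators project data.

```python
# Synthetic illustration (invented numbers, not Indicators data) of why a
# correlation visible at the institutional level can wash out per course.
import random
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(1)
courses = []
all_clicks, all_grades = [], []
for _ in range(10):                        # ten hypothetical course sites
    # Courses differ in how click-heavy they are and how they grade, so
    # most of the clicks-grades relationship lives *between* courses.
    base_clicks = random.uniform(50, 500)
    base_grade = 40 + base_clicks / 10
    clicks, grades = [], []
    for _ in range(30):                    # thirty distance students each
        clicks.append(base_clicks + random.gauss(0, 40))
        # Within a course, a student's grade is nearly unrelated to clicks.
        grades.append(base_grade + random.gauss(0, 15))
    courses.append((clicks, grades))
    all_clicks.extend(clicks)
    all_grades.extend(grades)

institutional_r = pearson(all_clicks, all_grades)
course_rs = [pearson(c, g) for c, g in courses]
print(f"institution-wide r = {institutional_r:.2f}")
print(f"per-course r range = {min(course_rs):.2f} .. {max(course_rs):.2f}")
```

Running this shows a clearly positive pooled correlation alongside per-course correlations scattered around zero: the same aggregation effect that makes institutional patterns look far more reliable than they are for any one course.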
Siemens, G. (2011). Learning and Knowledge Analytics. Retrieved November 1, 2011, from http://www.learninganalytics.net/?p=131