Analytics is not an IT thing

The Indicators project is an analytics project that has been running at CQUniversity for several years now. Interest in analytics appears to be booming at the moment, with a large number of universities instigating projects. For me, a worrying trend is the IT-centric approach that many universities are taking. To explain my concerns, we need to first have a look at some of the definitions around analytics. The following table from Siemens (2011) broadly outlines some of these definitions.

| Type of Analytics | Level or Object of Analysis | Who Benefits? |
| --- | --- | --- |
| Learning Analytics / Educational data mining | Course-level: social networks, conceptual development, discourse analysis, “intelligent curriculum” | Learners, faculty |
| Learning Analytics / Educational data mining | Departmental: predictive modeling, patterns of success/failure | Learners, faculty |
| Academic Analytics | Institutional: learner profiles, performance of academics, knowledge flow | Administrators, funders, marketing |
| Academic Analytics | Regional (state/provincial): comparisons between systems | Funders, administrators |
| Academic Analytics | National and international | National governments, education authorities |

Educational data mining is concerned with developing methods for exploring the unique types of data that come from educational settings and using those methods to better understand students and the settings in which they learn.

Academic analytics marries statistical techniques and predictive modeling with the large data sets collected by higher education institutions (HEIs), including those collected by the LMS. Academic analytics has been described as business intelligence for HEIs and is focused on the needs of the institution, such as recruitment, retention and pass rates (Open University, 2012).
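To make that definition a little more concrete, here is a minimal, hypothetical sketch of the sort of pass/fail prediction that "business intelligence for HEIs" usually implies. Everything in it (the feature names, the numbers and the outcome rule) is invented for illustration; it is not any institution's actual model or data.

```python
# A hedged sketch of the kind of predictive modelling the academic analytics
# definition describes: estimating pass/fail risk from LMS activity. All
# feature names, numbers and the outcome rule are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500

# Hypothetical per-student features an LMS might yield.
X = np.column_stack([
    rng.poisson(60, n),       # course-site clicks
    rng.poisson(8, n),        # forum posts
    rng.integers(0, 14, n),   # days active in the first fortnight
])

# Synthetic pass/fail labels loosely tied to activity, purely for the demo.
y = (X @ np.array([0.01, 0.05, 0.1]) + rng.normal(0, 0.5, n) > 1.6).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# "At-risk" style output: estimated probability of passing for unseen students.
print(model.predict_proba(X_test)[:5, 1].round(2))
print("Held-out accuracy:", round(model.score(X_test, y_test), 2))
```

The point of the sketch is simply that this style of modelling treats the institution as the unit of analysis, which is exactly where my concerns below begin.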

Learning analytics is more specific than academic analytics as it is focused exclusively on the learning process (Siemens & Long, 2011) and is often based on learning and teaching theories (Open University, 2012). Applications that apply analytics within learning environments arguably fit the learning analytics definition.

As an educational technologist who has been tinkering with analytics for the last few years, I get concerned when I hear IT companies saying things like:

“IBM is a leader in metrics and uses learning analytics to gauge learning effectiveness, drive learning recommendations, and aid in decision making.”

or this

“By analyzing student data and getting down to ever finer detail, educators and administrators using these analytic systems gain a much deeper understanding of the student, enabling the decision-makers to anticipate the next stage, the next need, specific performance challenges, and even potential outcomes, and guide the affected individual to the right action for a given situation.”

I get concerned about these things for a number of reasons, but two in particular stand out.

Analytics as an IT thing.

Based on my observations, there appears to be an underlying assumption that analytics belongs with IT departments. The rise of managerialism in higher education has meant that organizational structures are based on decomposition into specialized units with rigid command and control processes. This has led to institutional silos that limit and inhibit cross-unit information sharing, cooperation and collaboration. For example, IT belongs to the IT department, student administration belongs to the student admin section, and so on. While I see a large role for IT in educational data mining, academic analytics and learning analytics, the interpretation and application of analytics information has to involve the educators, as per the learning analytics rows of George’s table above. Individual courses are highly contextual, in that the patterns of behavior that students exhibit will be quite different from course to course.

Assumptions of causality.

A danger exists when correlations found within analytics data are treated as universal constants, leading to decision making that assumes patterns in the data are reproducible. IT companies and IT departments, and to a certain extent management, like high-level abstract data, averages and summaries. Performance-based analytics data reflects exhibited student behavior, and it is unlikely that this behavior will be repeated from one course to the next due to the ever-changing context. A lot of the correlations we have found with the Indicators project are quite distinct at the institutional or even the departmental levels. These correlations are rarely as distinct when looking at individual courses or students, as per the following figures, which simply compare the number of clicks distance students made within Moodle course sites against their resulting grades.
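To illustrate the point (rather than reproduce the Indicators project's actual code or data), here is a small hypothetical sketch of how a single institution-wide correlation between clicks and grades can mask very different course-level relationships. The course names, click counts and grades are all made up.

```python
# A minimal, hypothetical sketch of why an institution-level correlation
# between clicks and grades can look convincing while the course-level
# picture varies widely. All data below is simulated.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

frames = []
# Each invented course gets a different strength of click-grade relationship.
for course, slope, noise in [("COURSE_A", 0.08, 5), ("COURSE_B", 0.01, 15), ("COURSE_C", 0.04, 10)]:
    clicks = rng.integers(0, 400, size=120)                                   # simulated Moodle clicks per distance student
    grades = (40 + slope * clicks + rng.normal(0, noise, 120)).clip(0, 100)   # simulated final grade (%)
    frames.append(pd.DataFrame({"course": course, "clicks": clicks, "grade": grades}))

df = pd.concat(frames, ignore_index=True)

# One institution-wide number hides the variation...
print("Pooled correlation:", round(df["clicks"].corr(df["grade"]), 2))

# ...that shows up as soon as the same statistic is computed per course.
for course, g in df.groupby("course"):
    print(course, round(g["clicks"].corr(g["grade"]), 2))
```

The aggregate figure tells you something about the institution; it tells you much less about what a particular teacher should do in a particular course.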

References

Siemens, G. (2011). Learning and Knowledge Analytics. Retrieved 1 November 2011, from http://www.learninganalytics.net/?p=131


2 thoughts on “Analytics is not an IT thing”

  1. Hi Colin, you make a number of excellent points in your analysis of the analytics business. I would have to agree that the actual course design has a great deal to do with student online interaction. A course that is set up purely to transmit information, particularly where that information is contained in readings from texts, is likely to have low interaction. If the course instructors place little value or emphasis on online interaction via discussion forums, the interaction will be even lower. Other factors can influence interaction. In a Masters course I undertook, the aggressive interaction style of the course coordinator killed the communication in the LMS, but the communication continued between students underground via email. It is my understanding that this experience convinced some students to switch institutions.

    We tend to believe that high levels of LMS activity are desirable given the correlation between activity and performance, but too much interaction can cause student and instructor fatigue.
    As with all things, balance is important, and high-quality studies based on what appear to be effective and sustainable courses are, in my view, the answer.

    All the best from the top of the Australian Mainland.

    1. G’day Scot

      Good to hear from you again. I guess from an analytics perspective where we are trying to provide staff and students with information that they can use to inform their decision making, context is king. The story told by ‘clickstream’ analytics data varies greatly depending on the level of analysis, discipline, student cohort and on and on and on. Our current thinking is to embed the analytics data at the point and time of need where the folk operating in that context can:
      a. make sense of the data based on their knowledge of the context
      b. make decisions based on a range of inputs of which the data is only one
      c. monitor the data for change as decisions are made.

      Col.
