This “waffle” was conceived after watching a video that one of my colleagues distributed. It raised some interesting points that could potentially relate to the Indicators project. Another video on the same topic can be found at TED Talks.
Student engagement is well represented in the literature and can be summarised as the behavioural intensity and emotional quality of a person’s active involvement during a task (Reeve, Jang, Carrell, Jeon, & Barch, 2004). We know that students’ engagement in the learning process is a key performance indicator (KPI) when trying to predict student success, and some of the Indicators project’s data tends to confirm this. Universities essentially provide students with an environment that facilitates interactions between the students, the course content and the course instructor. Students are also given opportunities to reflect upon what they have learned in the form of summative and formative assessment tasks and activities. So I guess you could summarise learning as a series of cycles, each containing a process of interacting and reflecting, where engagement is the student’s degree of participation in these processes. Knowledge creation 101, I guess.
Is the process any different for teaching staff in terms of improving their practice? They still require feedback and reflection in order to improve their practice, don’t they? If learning from a student’s perspective comprises interactions and reflection, then I don’t see how this is vastly different for the teacher, or for any worker in any trade if it comes to it. I am often surprised at how little credence is given to the role of the teacher in a learning situation. Given that the teacher is usually responsible for developing the course content and course activities, and facilitates the social discourse, I’d suggest that they are the keystone upon which the improvement of teaching and learning rests. Trigwell’s (2001) model of teaching seemingly confirms this.
So I am saying that staff engagement in the process of improving teaching and learning is important, yet it seems to me to be under-represented in both the literature and in organisations. For example, a search on Google Scholar for the term “staff engagement” returns 1,650 results, while a search for “student engagement” returns 33,400 results. I find this interesting: how do you achieve student engagement without engaged staff? OK, I realise it is vastly more complex than this, and I tend to over-generalise/over-simplify, but can you see my point? Teaching staff are very important when you are trying to improve teaching and learning at a university and, like the students, they require feedback and opportunities for reflection in order to improve.
A possibly related and interesting point raised by the video mentioned previously was that the stick and carrot approach to managing workers works only when the task is mechanical or algorithmic in nature (by stick and carrot I mean rewarding the good and ignoring, or punishing, the bad). As soon as a task requires cognitive function, the stick and carrot approach no longer works. This means that for workers involved in tasks that are more than rudimentarily cognitive in nature (such as teaching), reward no longer works as a motivator for improvement and creativity. The video suggests that, in this situation, worker motivation is instead linked to three factors:
- Autonomy. The ability to be self-directed.
- Mastery. The desire to improve.
- Purpose. The sense of purpose in what they do.
So where am I going with this and how does it relate to the Indicators project? The Indicators project is broadly about maximising the potential of data that is already being collected by the university, for the purpose of improving teaching and learning. That is, data that is currently either not being used or not being used to its full potential. We know that learning management systems (LMS) and student administration systems contain a wealth of data that can be scrutinised for correlations relating to teaching and learning. Information such as student activity counts versus results from previous offerings can be given to the teacher, at the point of need, in order to provide them with a point of reflection on how their current student cohort is performing. This could give the teacher a sense of ownership over the evaluation of their course in context. So it’s not a stick or a carrot, but more a vessel for autonomy and mastery, and it allows the academic to analyse their LMS course during the course of a term.
The use of data mining and data analysis is nothing new to universities, and most have some sort of business intelligence unit dedicated to the extraction and analysis of corporate data. Where they have gone wrong, in my opinion, is that these units are often focused on providing data specifically for management. Don’t get me wrong: strategic data on enrolment numbers, student demographics, fail rates, pass rates and so on is excellent and highly useful information for everyone working for a university, and possibly links university staff with the greater sense of purpose mentioned before. However, there is a single point upon which the university depends: the teacher. They are the “point of contact” between the university’s “product” and the student. I’d argue that teachers need ‘tactical’ data as well as ‘strategic’ data.
CQUniversity recently produced an academic dashboard that provides teachers with an interface into the university’s strategic data. They define the dashboard as an interface that displays an organisation’s strategic data and trends. It is an excellent tool that provides an enormous amount of information to teaching staff and gives them an idea of how their courses relate to other courses and programs in terms of fail rates, pass rates, campus performance and so on.
However, while it may have a limited ability to identify potential issues with a specific course or program, it doesn’t help the teacher rectify the problem. A nice analogy is that strategic data is required to fight the war, whereas tactical data is required to fight the battles that make up the war. I’m not arguing that strategic data is more or less important than tactical data, but rather that both are critical elements of data-driven decision-making for universities.
Another point of difference between the Indicators project and business intelligence data relates to the point of need. Strategic data can generally be viewed in isolation from the day-to-day activities of staff and students; it cannot generally be used in the day-to-day running of a particular course. The Indicators data works best when placed at the point of need. Using the example of the Indicators ‘at risk’ student system: we know the average level of activity, at this point in the term, for students who received passing grades in previous offerings of this course. This is compared with the activity levels of current students to give the teacher some ‘tactical’ information that can be used proactively to address potentially lagging students. The point of need for this ‘tactical’ data is, in the case of CQUniversity, the Moodle LMS.
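The ‘at risk’ comparison described above is simple enough to sketch. The following is a minimal, hypothetical illustration only — the data structures, function name and 50% threshold are my assumptions, not the Indicators project’s actual schema or algorithm. It flags current students whose LMS activity so far falls well below the average activity accumulated by passing students at the same point in previous offerings:

```python
def flag_at_risk(current_activity, passing_avg_to_date, threshold=0.5):
    """Flag students whose activity count so far this term is well below
    the average accumulated by passing students at the same point in
    previous offerings.

    current_activity    -- {student_id: LMS activity count so far}
    passing_avg_to_date -- average activity of previously passing students
                           at the same week of term
    threshold           -- fraction of that average below which we flag
                           (an illustrative assumption)
    """
    cutoff = passing_avg_to_date * threshold
    return sorted(sid for sid, count in current_activity.items()
                  if count < cutoff)

# Example: passing students had averaged 120 clicks by this week of term.
activity = {"s001": 150, "s002": 40, "s003": 95, "s004": 10}
print(flag_at_risk(activity, passing_avg_to_date=120))  # → ['s002', 's004']
```

In practice the historical average would be derived from LMS activity logs for prior offerings, and the output would surface inside the course itself (the Moodle LMS, in CQUniversity’s case) rather than in a separate report — that placement is the “point of need” argument above.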
So business intelligence areas tend to produce abstracted, strategic data. I believe that teachers also need data that is tactical and contextual.
Reeve, J., Jang, H., Carrell, D., Jeon, S., & Barch, J. (2004). Enhancing students’ engagement by increasing teachers’ autonomy support. Motivation and Emotion, 28(2).
Trigwell, K. (2001). Judging university teaching. The International Journal for Academic Development, 6(1), 65–73.