A strange result

The folk involved in the Indicators project are always on the lookout for interesting correlations between student activity on the learning management system (LMS) and their resulting grade. One of the first correlations we found was the following, which shows a relationship between the number of clicks students make on the LMS and their resulting grade. It also shows the number of clicks each grade group made on discussion forums. Note that 'flex students' in the CQUniversity context typically refers to students who are pursuing their studies via distance education, which is often facilitated via the Moodle LMS.

The ‘Y’ axis is the number of clicks (hits) on Moodle while the ‘X’ axis is the different grades the students received. While the sample size is somewhat small by Indicators project standards (n=5872), it does show that, on average, the more students use the LMS, the better their results. Of course, like most of the correlations we have found, the pattern is quite distinct when averaged over large sample sizes, but becomes far less clearly defined when the number of students being sampled is reduced.

One of the things I am particularly interested in is the patterns around student utilization of LMS discussion forums. At a basic level there are three different types of interaction in any online learning situation:

  • Interactions with content. This is represented by clicks on web pages, links and files such as PDF and PPT.
  • Interactions with other learners. Often this is represented by forum posts, replies and reads amongst the student group.
  • Interactions with the instructor. This is also often represented by forum posts, replies and reads between the students and the instructor.
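The three interaction types above could, in principle, be derived from the raw Moodle click logs. Here is a minimal sketch of such a classification; the module names and role fields are my assumptions for illustration, not the actual Moodle log schema or the Indicators project's code.

```python
# Sketch: classifying raw LMS click events into the three interaction
# types described above. Module names and role labels are assumptions;
# a real Moodle log would need its own mapping.

CONTENT_MODULES = {"resource", "page", "url", "file"}  # content hits
FORUM_MODULES = {"forum"}                              # forum hits


def classify_event(module, actor_role, target_role):
    """Return 'content', 'learner-learner', 'learner-instructor', or 'other'."""
    if module in CONTENT_MODULES:
        return "content"
    if module in FORUM_MODULES:
        # A forum event counts as learner-instructor if either end is an instructor.
        if "instructor" in (actor_role, target_role):
            return "learner-instructor"
        return "learner-learner"
    return "other"


print(classify_event("page", "student", None))           # content
print(classify_event("forum", "student", "student"))     # learner-learner
print(classify_event("forum", "student", "instructor"))  # learner-instructor
```

With each hit tagged this way, the per-grade ratios discussed later fall out of a simple group-and-count.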

The following chart shows the relationship between the number of flex student clicks within the LMS discussion forum and their resulting grade. Note that n=8928.

As you will notice, the pattern is similar to what we saw before with the overall hits on the LMS; the more the flex student interacts with the LMS discussion forums, the better their result. To expand on this further, the following chart shows the relationship between the number of forum posts and replies the flex students make and their resulting grade.

Again the trend is fairly clear. The students receiving the higher grades have a tendency to make more posts on the LMS discussion forums. None of this should really come as any great surprise to anyone involved in online education. There is ample research that shows the link between student engagement and their resulting grades and one of my earlier research projects suggested that there is some value in using hits on the LMS as a proxy measure for student engagement. This is based on the fact that online students are ‘invisible’ to the instructor in that the instructor cannot see the ‘glint in their eye’ and this makes it hard to determine if their motivation is waning.

Over the weekend, I was trying to develop a script that produced ratios of the three engagement types I mentioned earlier, just to see what correlations, if any, were present. One of the first results from the script is in the following chart and shows something that is both very interesting and also quite perplexing. It shows what percentage of each flex student grade group's hits are on the discussion forum.
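The ratio calculation itself is straightforward: for each grade group, divide forum hits by total hits. A minimal sketch follows; the per-grade numbers are invented placeholders for illustration, not the project's actual data.

```python
# Minimal sketch of the per-grade ratio described above: what proportion
# of a grade group's total LMS hits were discussion forum hits.
# The hit counts below are made up for illustration only.

def forum_hit_ratio(total_hits, forum_hits):
    """Percentage of all hits that landed on discussion forums."""
    return 100.0 * forum_hits / total_hits if total_hits else 0.0


# hypothetical per-grade totals: grade -> (total hits, forum hits)
hits_by_grade = {
    "Fail":   (120, 36),
    "Pass":   (260, 52),
    "Credit": (420, 84),
    "HD":     (900, 225),
}

for grade, (total, forum) in hits_by_grade.items():
    print(f"{grade}: {forum_hit_ratio(total, forum):.1f}% forum hits")
```

Note that with placeholder numbers like these, a failing group can show a *higher* forum proportion than a pass group even while making fewer forum clicks in absolute terms, which is exactly the perplexing pattern described below.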

Why on earth do the failing students spend more energy (for want of a better word) on discussion forums than pass or even credit students? Sure, the number of clicks the failing students make on discussion forums is less than both pass and credit students, but in terms of the proportion of their hits that are forum based, they exceed the pass and credit students. So, given that we have shown in the past that instructor interaction with students on discussion forums has a significant influence on the level of effort online students make, I compared this result against students from courses where instructors demonstrated high levels of forum activity and students from courses where instructors demonstrated low levels of forum activity.

Interestingly, aside from showing the same pattern as the previous chart where failing students exhibited a higher proportion of forum hits than passing and credit students, this chart also suggests that higher instructor activity on the forums leads to proportionally higher student activity on the forums. Not a great surprise I guess but I am really curious as to the dip in the chart lines when looking at student forum activity in relation to their overall activity. Any ideas anyone?


5 thoughts on “A strange result”

  1. I was actually wondering today how my practice (as a student taking four Moodle courses) might impact these types of correlations. Don’t think it explains the funny results above, but maybe.

    First, what I do.

    I’ve subscribed to all my forums. i.e. I get the messages in email. I read most of them there. The only times I actually visit the forums is if I want to reply, or odd other times.

    I assume there’s a chance that this practice could throw out your hits calculations. I don’t think the version of Blackboard we used had the option to get them as email. Did it?

    In connection with your findings. Perhaps the failing students didn’t subscribe? So they have to spend more time visiting the sites.

    Not sure this applies to your problem, but at least it might be something to think about.

  2. Well, I am witnessing student behaviors in my (blended, music) course this year and last that are consistent with these data.

    Broadly, I’m seeing three typical patterns of student response:

1) The very bright, highly engaged students start by accessing content. They then leap into the forums, often starting new threads that introduce new content (“The tutorial on rhythm today was really interesting. It made me think of this band that writes songs in 11/8 and 9/4; here’s the YouTube link”). You’d think these students would have the highest content/interaction ratio: except they can’t help themselves; they see learning as an interactive process and they keep coming back to the forums to reply to other students, particularly on threads they have started, which pushes up their interaction score. These will be the HD/D students.

2) The middle group of students are content-driven. They access the site to access the “information”, thus go straight for the content hits. They will then post to the forum, usually once, because it’s assessed, and their posts are very much parroting what has previously been said, and rarely engage with the arguments or ideas of other students. Interestingly, given what I just said, their posts are almost always replies – in that they hit the “reply” button, and almost never start a new thread. But they are not replies in a genuine sense. These students have the highest content/interaction ratio, and their forum scores probably overstate the interaction that has actually taken place. These students are the Pass students.

3) The fail students find the content intimidating, uninteresting, or incomprehensible. They often bypass the content on the site, but go straight to the forum where they will be most visible. They will post replies to other students, but these typically are short, entirely subjective and uninformed by the issues. These students have low overall hits on the site, but as they have bypassed the content entirely, they have a high proportion of their hits in the forums – either because they can’t/won’t grasp the content; or because they are calculating that they get the biggest assessment reward for the least effort by going to the forum where they think I’m more likely to notice them. These are the Fail students.

    1. Thanks Jonathan

A lot of what you say makes sense and resonates with my experience at CQUniversity. This, in conjunction with the fact that students can subscribe to Moodle forums via email as David mentioned, creates a complex environment if you wish to use Moodle statistics for predictive purposes.

  3. Col, the email thing probably reduces the value of the hits count. But posts and replies still have to be done in the forum.

Picking up on something that Jonathan said, I wonder if you can distinguish any differences between the posts and replies of students in each grade. The simplest measure would simply be length. What about mapping the average length of posts/replies for each grade?

    A more complex approach might be to pass the posts through some sort of automatic complexity check, must be a few of those around.
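The two measures suggested in this comment could be sketched as follows. This is a toy illustration under my own assumptions (the post texts and grade labels are invented); the "complexity check" here is just average words per sentence, a much cruder proxy than the automated readability tools the commenter has in mind.

```python
# Sketch of the measures suggested above: average post length per grade,
# plus a crude complexity proxy (average words per sentence).
# Posts and grades below are invented placeholders.

import re
from collections import defaultdict


def avg_length_by_grade(posts):
    """posts: iterable of (grade, text). Returns grade -> mean word count."""
    totals = defaultdict(lambda: [0, 0])  # grade -> [word_sum, post_count]
    for grade, text in posts:
        totals[grade][0] += len(text.split())
        totals[grade][1] += 1
    return {g: s / n for g, (s, n) in totals.items()}


def words_per_sentence(text):
    """Very rough complexity proxy: average words per sentence."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return len(text.split()) / len(sentences) if sentences else 0.0


posts = [
    ("HD", "The tutorial raised an interesting point. I compared it with last week's reading."),
    ("Pass", "I agree with the above."),
]
print(avg_length_by_grade(posts))
print(words_per_sentence(posts[0][1]))
```

A production version would swap `words_per_sentence` for an established readability index and pull the posts straight from the Moodle forum tables.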
