Online Student Engagement

Preface.

The following is a piece of writing that I submitted as part of a Masters project. Apologies for the formatting, some of which was lost in the transfer to the Web.

Introduction.

A great deal has been written about student engagement and its importance to universities. Despite the absence of a universally accepted definition of what constitutes engagement, it has been linked to undergraduate academic achievement, student attrition, student retention, student motivation and institutional success. Clearly defining engagement and identifying its measurable components can assist universities in their efforts to improve student engagement. Identifying indicators of student engagement allows universities a degree of measurability that can be used to inform and improve existing practices. This is especially true as students increasingly enrol in courses that are predominantly delivered online, without face-to-face interactions with their teachers and peers.

The widespread uptake of learning management systems by universities has fundamentally changed the environment within which online students engage with their studies. The change in learning environments has also led to changes in the ways that students are engaging with course resources, teaching staff and each other. Distance learning via learning management systems can occur without face-to-face contact between students and teachers and this can mean that traditional measures of student engagement such as class attendance are impossible to gauge (Douglas & Alemanne, 2007). However, learning management systems accumulate vast amounts of data on student behaviour that can be used to inform and improve online student engagement.

This study seeks a broad definition for engagement that is contextually appropriate for a multimodal university like CQUniversity. It then unpacks the definition into component parts that are compared with a well-known model for good practice in higher education to assist in assuring the definition’s validity. The study then reports on the initial exploration of existing institutional data sources, such as the learning management system, as vehicles for providing indicators of student engagement in online, undergraduate education. A range of variables that influence student engagement in online courses are identified from the literature and assessed against institutional data to determine their influence on student engagement online.

Engagement

In higher education, engagement has become a catch-all term most commonly used to describe a compendium of behaviours characterizing students (Krause, 2005). It has even been suggested that student engagement could be used as an indicator of institutional teaching quality (Kuh, 2001). Furthermore, it has been said that at a certain level of analysis, engagement is taken to provide a singularly sufficient means of determining whether students are engaging with their study and university learning community in ways likely to promote high-quality learning (Krause & Coates, 2008). But what is engagement and how can it be measured? Bulger, Mayer, Almeroth and Blau (2008) state that measuring engagement and its link to learning is challenging, especially when the term engagement is so often used broadly to describe a range of behaviours that learners exhibit. An investigation into what engagement is, and the factors that influence it, is required before metrics for its measurement can be determined.

Most of the research into measuring student engagement prior to the widespread adoption of online, or web-based, classes concentrated on the simple measure of attendance (Douglas & Alemanne, 2007). While class attendance is a crude measure, in that it is only ever indicative of participation and does not necessarily consider the quality of that participation, it has nevertheless been found to be an important variable in determining student success (Douglas, 2008). However, it could be said that class attendance is used as a metric for engagement simply because it is one of the few indicators of engagement that are visible, or external to, the student. For example, student motivation is often linked closely with engagement and has been defined as an internal state or condition that activates behaviour and gives it direction (Huitt, 2001). Participation could be seen as an indicator of behaviour activated by a student's motivation, and is measurable in online education, albeit with the same limitations concerning the quality of the participation. While participation is evidently an important aspect of student engagement, engagement is a broad construct that encompasses more than just participation.

Defining Engagement

Stovall (2003) suggests that engagement is defined by a combination of students’ time on task and their willingness to participate in activities. Krause and Coates (2008) say that engagement is the quality of effort students themselves devote to educationally purposeful activities that contribute directly to desired outcomes. Additionally, Chen, Gonyea and Kuh (2008) say that engagement is the degree to which learners are engaged with their educational activities and that engagement is positively linked to a host of desired outcomes, including high grades, student satisfaction, and perseverance. Other studies define engagement in terms of interest, effort, motivation and time-on-task, and suggest that there is a causal relationship between engaged time, that is, the period of time in which students are completely focused on and participating in the learning task, and academic achievement (Bulger et al., 2008).

A basic tenet of the research into engagement is that students’ activity, involvement and effort in their learning tasks are related to their academic achievement. While there does not appear to be a single definition for engagement, the following definition represents an aggregation of the literature. Engagement is seen to comprise active and collaborative learning, participation in challenging academic activities, formative communication with academic staff, involvement in enriching educational experiences, and feeling legitimated and supported by university learning communities (Coates, 2007, p. 122). This definition suggests that engagement is the amalgamation of a number of distinct elements including active learning, collaborative learning, participation, communication among teachers and students, and students feeling legitimated and supported. While it is not possible to provide universally accepted interpretations for the elements that comprise the definition, it is possible to provide an overview of their meanings.

Active Learning

Active learning is generally defined in the literature as any instructional method that engages students in the learning process, requiring them to perform meaningful learning activities and think about what they are doing (Prince, 2004). It has also been described as the process of talking, writing, relating to and reflecting on what is being learned, rather than passively receiving information (Chickering & Gamson, 1987). The core components of active learning are student activity and engagement in the learning process (Prince, 2004).

Collaborative Learning

Collaborative learning, as the phrase implies, recognizes that learning is collaborative and social, not competitive and isolated. “Working with others often increases involvement in learning. Sharing one’s own ideas and responding to others’ reactions sharpens thinking and deepens understanding” (Chickering & Gamson, 1987, p. 2). Prince (2004) defines collaborative learning as any instructional method in which students work together in small groups toward a common goal. Some authors have suggested that collaborative learning encompasses cooperative learning, which has been described as a structured form of group work where students pursue common goals while being assessed individually. Prince (2004) refers to collaborative learning and cooperative learning as two distinct entities with different philosophical roots. In either case the core element is the emphasis on student interactions rather than learning as a solitary activity (Prince, 2004). Communication between students and between staff and students is a fundamental requirement for collaborative learning (Veerman & Veldhuis-Diermanse, 2001).

Learning Community

Linked with collaborative learning and communication is the remaining element of the Coates (2007) engagement definition, which suggests that students need to feel legitimated and supported by their university learning community. A broad interpretation defines community as the result of interaction and deliberation by people brought together by similar interests and common goals (Rovai, 2002). This is especially important in a distance-learning context, as dropout rates tend to be higher in distance education programs than in face-to-face programs (Rovai, 2002). It has also been theorized that students will increase their levels of satisfaction and the likelihood of persisting in a college program if they feel involved and develop relationships with other members of the learning community (Tinto, 1993, cited in Rovai, 2002). Others have said that feelings of community are known to significantly affect online learning performance and that community is an essential part of successful online education (Black, Dawson, & Priem, 2008). It is clear from the literature that participating in a learning community is an important part of online education and, subsequently, an important part of the engagement definition.

While the component parts of the Coates (2007) engagement definition have been identified and explained, comparing these components with a recognised framework of good practice in education can assist in ensuring the validity of the definition. The seven principles framework by Chickering and Gamson (1987) is closely associated with student engagement and aspects of the Australasian Survey of Student Engagement were developed using the seven principles framework (Australasian Survey of Student Engagement, 2009; Macquarie University, 2009).

Seven Principles

Chickering and Gamson’s (1987) seven principles of good practice in undergraduate education have been referred to as a guiding light for quality undergraduate education and represent a philosophy of student engagement (Puzziferro-Schnitzer, 2005). The seven principles are also contextually appropriate for CQUniversity, as they are listed as part of the 2010 learning and teaching management plan and form part of a strategy for increasing student enrolments and retention (CQUniversity, 2009). The following table illustrates the alignment between the Coates (2007) definition of student engagement and Chickering and Gamson’s (1987) seven principles of good practice in undergraduate education.

Table 1. Alignment of Coates’ (2007) definition of engagement and Chickering and Gamson’s seven principles of good practice in undergraduate education

Active and collaborative learning:
  2. Develops reciprocity and cooperation among students.
  3. Uses active learning techniques.

Formative communication with academic staff:
  1. Encourages contact between students and faculty.

Involvement in enriching educational experiences:
  5. Emphasizes time on task.
  6. Communicates high expectations.

Feeling legitimated and supported by university learning communities:
  1. Encourages contact between students and faculty.
  2. Develops reciprocity and cooperation among students.
  4. Gives prompt feedback.

The Coates (2007) Definition

The Coates (2007) definition of engagement and its constituent components is generally representative of the literature and even forms part of the Australasian Survey of Student Engagement, in which more than half of all Australian and New Zealand universities participate (Australasian Survey of Student Engagement, 2009). The three main elements of the engagement definition, active learning, collaborative learning and learning community, are well represented in the literature and provide a granular breakdown of what constitutes engagement. Understanding the component parts of engagement can assist in identifying elements that can then be measured. How these components are measured depends on the learning environment within which student engagement occurs, as the learning environment facilitates the interactions that learning requires.

Learning Environments

Because the method of course delivery defines the environment in which students engage with their learning, it is a key consideration when discussing student engagement. Some courses are delivered face-to-face, some via a blend of online and face-to-face, and others fully online. In a traditional face-to-face class, students attend lectures and tutorials, and can participate in learning activities in the presence of the instructor and their peers. A fully online course is typically delivered via the Internet, with all the interactions between the learners, content and instructors facilitated by web-based technologies, while blended courses use a mix that involves face-to-face teaching augmented by web or online components.

The learning environment, including an online learning environment, encompasses the systems and dynamics that facilitate and enable student engagement (Coates, 2006). It is reasonable to assume that the learning environment will influence how students engage with their learning. Aside from the learning environment’s influence on the design, building and delivery of courses (Coates, James, & Baldwin, 2005), the demographic of the students choosing online environments for their courses can also be a factor that influences engagement. Dutton, Dutton, and Perry (2002) state that online students are older and are less likely to be enrolled in traditional undergraduate programs, but are more likely to be lifelong learning students. They go on to say that online students are more likely to have job or childcare responsibilities, longer average commutes to campus, and more experience with computers (Dutton et al., 2002, p. 17). All of these factors can influence the level of student engagement in learning environments, including online learning environments.

As distance learning using web delivery is the fastest growing segment of postsecondary education, it is important to evaluate its effect on learner engagement (Chen et al., 2008). Distance education via the web is typically delivered through enterprise-wide learning management systems, which have become integral to university teaching and learning environments (Rankine, Stevenson, Malfroy, & Ashford-Rowe, 2009). Learning management systems are software systems that synthesize the functionality of computer-mediated communications software and online methods of delivering course activities and materials (Jennings, 2005). Coates (2005) states that learning management systems influence engagement and that research into their effect on engagement is still in its infancy.

Learning management systems

Learning management systems (LMS) are at the forefront of the online technologies making a serious impression on patterns of learning and teaching in higher education (Coates, 2006). LMS, also commonly referred to as course management systems (CMS) and virtual learning environments (VLE), are becoming ubiquitous at universities around the world, adding a virtual dimension to even the most traditional campus-based institution (Coates et al., 2005). In a relatively short time they have become perhaps the most widely used educational technology in higher education, only ranking behind the Internet and common office applications (West, Waddoups, & Graham, 2006). They are being used for presenting online or technology-enhanced classes and it has been said that they influence pedagogy and therefore engagement by presenting default formats that are designed to guide instructors toward creating courses in certain ways (Lane, 2009). If LMS are affecting pedagogy, then they are likely to be affecting student study habits, learning and engagement (Coates et al., 2005).

Whilst LMS have the potential to influence student engagement, research into how they do this is largely in its infancy and is often based on assumptions about campus learning environments (Coates, 2006). It has been argued that the rapid adoption of LMS has occurred in a vacuum of research into their teaching and learning effectiveness (Lopes, 2008). Most, if not all, of the interactions enabled by the LMS are asymmetric, in that the student is responsible for logging in and engaging with course material without prompting or instruction. This means that students who require substantial instructor direction may have problems with an environment that demands a certain level of self-discipline (Douglas & Alemanne, 2007), and this could conceivably influence their confidence and motivation, both of which can influence their level of engagement.

Others have questioned how the LMS is influencing students’ confidence and motivation for learning and their understanding of the significance of what they have learned, and have even said that LMS are encouraging increasingly independent and perhaps isolated forms of study (Coates et al., 2005). This seemingly supports research suggesting that rates of attrition for online students range from 20% to 50% higher than for on-campus students (Dawson, Macfadyen, & Lockyer, 2009). This is possibly because LMS can affect the way students explore and contextualize learning resources, as well as the way they receive summative and formative feedback. While the degree to which LMS are affecting student engagement in universities is not clear, the importance of engagement is established in the literature, and further research into measuring engagement within LMS is therefore warranted in order to identify and address inhibitors that LMS place on engagement. Fortunately, LMS collect extensive data on how staff and students are using the systems, and this could be invaluable for universities endeavouring to improve student engagement through its measurement and monitoring.

Academic Analytics

A fortunate effect of the almost ubiquitous adoption of LMS as a solution for online course delivery in universities is their ability to track and store vast amounts of data on student and designer behaviour (Heathcote & Dawson, 2005). Typically, an LMS records all actions made by users once they are logged into the system, and this data is subsequently stored in an associated database. The process of analysing institutional data captured by an LMS for decision-making and reporting purposes is called academic analytics (Campbell & Oblinger, 2007), and it has been shown that analysis of captured LMS data is directly relevant to student engagement and the evaluation of learning activities, and can usefully answer other important questions (Dawson & McWilliam, 2008).

The quantity and diversity of data accessible to higher education institutions are now making it possible to exploit more fully the potential of academic analytics in order to inform a range of key activity within the academy, from strategic decision-making to instructor teaching practices. The challenge for higher education institutions is no longer simply to generate data and make it available, but rather to readily and accurately interpret data and translate such findings into practice (Dawson & McWilliam, 2008).

While there is growing interest, there is minimal research into how information generated by university systems can be harnessed in the design, delivery and evaluation of learning and teaching practices (Beer, Jones, & Clark, 2009). It has also been said that although academic analytics cannot measure learning, it does allow researchers to assess trends such as student engagement and the relationship between LMS use and grade performance, and other things that may provide credible proxies for actual learning, or at least interesting indicators of learning (Caruso, 2006). This project uses the process of academic analytics to identify some of these “indicators of learning” as they apply to student engagement. While LMS accumulate vast quantities of data on staff and student behaviours within the system, they often lack appropriate tools for extracting and reporting on the captured data, as well as tools for interpreting meaning from it (Dawson & McWilliam, 2008).

The fundamental measure of student experience with an LMS is the degree to which students use the system (Caruso, 2006). This aligns with the historical precedent of class attendance being used as a metric for measuring face-to-face student engagement (Douglas & Alemanne, 2007). In a face-to-face learning environment, quantifying every student utterance and action is almost impossible in a large class, whereas an LMS enables every mouse click by every student to be automatically tracked for analysis at a later date. In this sense the LMS actually expands on what was available in a face-to-face learning situation. However, with often thousands of users, this generates enormous quantities of data that must be aggregated and analysed against a backdrop of educational effectiveness in order to be meaningful.
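
As a concrete illustration of the kind of data involved, an LMS activity log typically amounts to one row per click. The simplified schema below is a hypothetical sketch for illustration only; Blackboard's actual table structure differs.

    -- Hypothetical, simplified activity log: one row per mouse click.
    -- Names and types are illustrative, not Blackboard's actual schema.
    CREATE TABLE activity_log (
        user_id    INTEGER      NOT NULL, -- who clicked
        course_id  INTEGER      NOT NULL, -- which course offering
        location   VARCHAR(255),          -- area of the course site that was clicked
        event_time TIMESTAMP    NOT NULL  -- when the click occurred
    );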

Research methodology

This study into engagement has been enabled by a broader, CQUniversity sponsored project called the Indicators project, which is looking at ways that data captured by an LMS can be used by an institution to improve teaching and learning (Beer et al., 2009). Data from three separate systems (figure 1) has been summarised and aggregated into a homogenised database that facilitates the speedy querying of previously disconnected data.



Figure 1. The Indicators project’s data sources.

The Blackboard LMS used in this study was commissioned at CQUniversity in 2004 and has recorded almost every staff and student click within the system over this time. There were 4722 courses delivered via the Blackboard LMS between 2004 and Term 1 of 2009, of which 2674 were undergraduate courses containing one or more flex students. In the context of CQUniversity, a flex student is a student who is studying via distance without significant face-to-face instruction. The focus of this study is on these 2674 courses, in which undergraduate flex students’ typical source of interaction with their instructors, peers and instructional material is the Blackboard LMS.

It is proposed that by restricting the scope of this study to undergraduate flex students, variables in online student behaviour, such as the influence of face-to-face instruction, are reduced in order to produce more accurate results. The other two systems identified in figure 1 relate to the unified student administration system used at CQUniversity, which provides information on each student, such as grades, age and gender.

The Blackboard activity database recorded the user, course and location within the course of each and every mouse click that occurred within the system. A significant amount of processing was required to group these activities, firstly by course and then by student. Two methods were used to extract and aggregate the data used in this study. The first involved structured query language (SQL) scripts executed on the Indicators database, and the second used Perl scripts where more elaborate query processing was required. For instance, Perl scripts were used to automate the generation of aggregated data that required tens of thousands of SQL transactions. Using a mix of SQL and Perl, data from the original sources, such as the Blackboard activity database and the student administration system, was aggregated into two levels: course and student.

The course level data contains overview information on activity in each of the courses targeted by this study and includes data such as student numbers, staff numbers, discussion board activity, the number of files as well as the year and term of the course offering. The student level data contains details of each of the 91284 students’ activities within each of these courses and includes data such as the number of clicks on the course site, the number of visits, posts and replies to discussion forums, student grades, ages and nationalities.
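
As an illustration of this aggregation step, a query along the following lines could roll the raw click log up into the student-level data just described. All table and column names here are assumptions made for the sketch; the actual extraction also drew on the student administration system and used Perl where the processing was more elaborate.

    -- Sketch: one summary row per student per course offering.
    -- activity_log and enrolment are hypothetical table names.
    SELECT a.course_id,
           a.user_id,
           COUNT(*) AS click_count,  -- total clicks on the course site
           COUNT(DISTINCT CAST(a.event_time AS DATE)) AS visit_days,  -- rough proxy for visits
           e.grade,
           e.age,
           e.gender
    FROM   activity_log a
    JOIN   enrolment e
           ON  e.user_id = a.user_id
           AND e.course_id = a.course_id
    GROUP  BY a.course_id, a.user_id, e.grade, e.age, e.gender;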

CQUniversity’s Blackboard LMS is typical of most LMS in the way it records information on staff and student activity within the system. The quantity and diversity of data available make it possible to exploit more fully the potential of academic analytics in order to inform teaching and learning (Dawson & McWilliam, 2008). This paper is an exploration of how data captured by CQUniversity’s LMS can help inform teaching and learning in relation to engagement and its influencing factors. However, academic analytics and the data itself have some serious limitations that require consideration.

Limitations of the data and the methodology used

There are significant limitations to what quantitative evaluations of LMS data can tell us (Heathcote & Dawson, 2005), and the quantitative approach used in this study can only demonstrate patterns within the data, not causation. For example, student grade is not necessarily indicative of learning but of the student meeting assessment criteria, which may or may not be a measure of effective learning. Similarly, class attendance has been used as an indicator for engagement in face-to-face classes (Douglas & Alemanne, 2007), but it does not indicate the quality of engagement or even learning. The same holds true for measuring student participation by click-count within an LMS. While the number of clicks can be measured, it is impossible to determine the learning that has occurred as a result of the clicks.

In a complex educational setting there is an interplay of many variables, which places significant constraints on what a purely quantitative analysis of data can achieve (Beer et al., 2009). However, it has also been shown that such analysis is directly relevant to student engagement and can provide useful information (Dawson & McWilliam, 2008). So while the analysis of LMS activity data is potentially useful, in that it can help reveal patterns and relationships, it does not tell the whole story as to the value or significance of these patterns (Seifert, 2004). This is important to note: as this study analyses archival data from an LMS, and given the limitations inherent in this approach, any correlations or results should be interpreted as indicative rather than absolute.

Despite the limitations inherent in the data and in the approach taken, this study does afford some practical advantages. It uses existing sets of data that typically are not used by institutions and are often purged at regular intervals to reduce storage requirements (Beer et al., 2009). Additionally, the visualization of student LMS usage can be automated so as to provide teaching staff and administrators with both live and longitudinal representations of how students are engaging with their LMS hosted courses. The first step in making use of LMS usage data is to establish a baseline of student activity within the LMS, which can then be used to highlight variables that cause deviations from this baseline.
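
By way of example, and assuming a student_summary table holding one row per student per course (as in the aggregation sketched earlier, with year and term columns added), such a baseline might be as simple as the average click count per student in each term:

    -- Sketch: a longitudinal baseline of student LMS activity, per term.
    SELECT year,
           term,
           AVG(click_count) AS avg_clicks_per_student
    FROM   student_summary
    GROUP  BY year, term
    ORDER  BY year, term;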

LMS indicators and engagement.

The core components of active learning are student activity and engagement in the learning process (Prince, 2004). It could be expected that a student who is actively involved in their LMS hosted course would visit the course more frequently, and for longer periods of time, than students who are less engaged. It would seem reasonable to expect that, generally, students who are more actively involved in their courses should receive higher grades than students who are not demonstrating active involvement in their courses.

As an example of how LMS data might be indicative of active learning, the following table (table 2) and the associated figure (figure 2) display the click counts of 91284 online undergraduate students grouped by grade. They include the number of students as well as the average number of clicks and the standard deviation for each grade group. The assumption made here is that click count is an indicator of student participation, and participation has been said to be an important predictor of engagement and student success (Prince, 2004).

Table 2. Online undergraduate student clicks grouped by grade.

Grade Student Count Click Count average Standard Deviation Percentage of total cohort
HD 8777 782.39 1095.61 9.62%
D 19180 714.91 924.12 21.01%
C 21128 562.89 718.52 23.15%
P 17584 437.04 585.45 19.26%
F 24615 176.51 311.12 26.97%

Figure 2. Average student clicks on the LMS grouped by grade

Table 2 and figure 2 show a general correlation between the number of clicks students make within the LMS and their resulting grade, across a large sample of 91284 online undergraduate students. However, the standard deviations are high, indicating a large spread in the click counts contributing to each mean, which would be expected from such a large sample across a diverse range of courses. While the degree of variation in the data is considerable, it is mitigated somewhat by the sheer size of the sample and, in any case, matters less than the trend the data indicates (figure 2): a correlation between participation (clicks) and academic achievement (grade). Additionally, given the large sample size (n=91284), the results were determined to be statistically significant. It is this correlation between participation and academic achievement that is the focus of this study, based on the hypothesis that the number of clicks a student makes within a course offering is an indicator of engagement that can, to some extent, be validated against their grades.
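
For reference, figures of the kind shown in table 2 could be produced by a grouping query of roughly the following shape. The student_summary table remains a hypothetical stand-in, and the name of the standard-deviation function varies between database engines:

    -- Sketch: average clicks and spread for each grade group, as in table 2.
    SELECT grade,
           COUNT(*)            AS student_count,
           AVG(click_count)    AS avg_clicks,
           STDDEV(click_count) AS stddev_clicks
    FROM   student_summary
    GROUP  BY grade
    ORDER  BY avg_clicks DESC; -- roughly HD down to F, given clicks fall with grade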

Figure 2 indicates that, on average, the more engaged students are, as indicated by their click count, the better their academic achievement. However, learning is more complex and diverse than a representation of student clicks on a web site can possibly demonstrate. Although data detailing student activity linked with academic achievement potentially provides a useful indicator of student engagement, there are a host of variables that can influence the number of clicks students make within an LMS. While the following sections analyse some of these variables, they are by no means a comprehensive list of factors that can influence LMS participation.

Variables influencing student participation in online courses

There are a host of factors identified in the literature that can influence the way that students participate in online courses. These include a broad range of factors such as teacher participation in discussion forums, course design, class size, age and gender (Vrasidas & McIsaac, 1999). Other factors, such as students’ prior experience with computer-mediated communication, cannot be measured using LMS data alone but are known to contribute to student engagement in LMS hosted courses (Vrasidas & McIsaac, 1999). The following sections show how LMS data can demonstrate changes in student participation rates based on the influence of various criteria. They show how teacher discussion board participation, course design, class size, student gender and age are all variables that can influence the rate of student participation in LMS courses.

Teacher Participation in LMS Discussion Forums

Collaborative learning, cooperative learning and learning communities are all underpinned by interactions facilitated by communication. The main mechanism through which an LMS facilitates communication is the class discussion forum. The number of posts and replies that students make on the discussion forums is quantifiable and, while this could be seen as an indicator of engagement, it does not include students who visit the forums without making posts or replies. It has been argued that students’ participation in class discussion forums can be used as a predictor of student sense of community (Dawson et al., 2009). The first of the seven principles suggests that good practice encourages contact between students and faculty (Chickering & Gamson, 1987), and other research has suggested that there is a significant correlation between student motivation and participation in LMS discussion forums (Dawson et al., 2009). This is reinforced by Black et al. (2008), who state that collaboration between students and online teachers is necessary to effectively cultivate a thriving online community. A simple indicator of student-faculty contact is the presence of staff posts and replies in LMS hosted discussion forums. The following data was extracted from the LMS based on whether or not the teaching staff for the courses involved made posts or replies to the class discussion forum. It shows the number of clicks made by students in courses with and without staff contributions to discussion forums.

Table 3. Student grades and average clicks for discussion forums with staff posts and replies (n=45424).

Grade Student Count Click Count average by students Standard Deviation
HD 3996 1145.44 1281.77
D 10083 962.69 1048.92
C 11076 744.09 820.96
P 8907 582.11 684.61
F 11362 245.66 384.16

Table 4. Student grades and average clicks without staff posts and replies on discussion forums (n=30856).

Grade Student Count Click Count average by students Standard Deviation
HD 3251 224.03 405.39
D 6076 270.34 465.43
C 6688 248.01 385.67
P 5740 207.46 325.15
F 9101 77.92 145.76

Figure 3. Student average grades and clicks with and without staff participation in discussion forums.

The 45424 students in courses where the teaching staff made one or more posts or replies to the discussion forums appear to have a distinctly higher average number of clicks in each grade group than the 30856 students whose teaching staff did not use the discussion forums. Additionally, the failure rate for students in courses with staff discussion board participation, as calculated from tables 3 and 4, was 25%, as opposed to 29.5% for students in courses where teaching staff did not participate in the discussion forums. The limitations inherent in academic analytics data and the de-contextualized representation of what transpired within these courses prevent a definitive causal relationship from being established. However, it is an indicator, or clue, that suggests the value of LMS discussion forums to distance students.

According to the results in figure 3, student engagement in courses where the staff contributed to discussion forums is higher than in courses where the staff did not. This aligns with the first of the seven principles, that good practice encourages contact between students and faculty (Chickering & Gamson, 1987), and with the engagement definition, which suggests that collaborative learning and a sense of learning community are both factors influencing student engagement (Coates, 2007).
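
As a sketch of how such a comparison could be drawn from the data, the query below splits students by whether any teaching staff posted to their course's discussion forums. The forum_post table and its poster_role column are hypothetical stand-ins for the real forum data:

    -- Sketch: average clicks per grade group, split by staff forum participation.
    SELECT s.grade,
           CASE WHEN f.course_id IS NULL
                THEN 'no staff posts'
                ELSE 'staff posts' END AS staff_forum_use,
           COUNT(*)           AS student_count,
           AVG(s.click_count) AS avg_clicks
    FROM   student_summary s
    LEFT JOIN (SELECT DISTINCT course_id
               FROM   forum_post
               WHERE  poster_role = 'staff') f
           ON f.course_id = s.course_id
    GROUP  BY s.grade,
              CASE WHEN f.course_id IS NULL
                   THEN 'no staff posts'
                   ELSE 'staff posts' END;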

Course Design

It has been said that the structure of an online course influences the degree of interaction and participation exhibited by students (Vrasidas & McIsaac, 1999).

The following table and chart compare the average click count versus grade data for three undergraduate courses before and after the courses underwent a redesign by an instructional designer. The author was involved in the redesign of these courses; the teaching staff, assessment and underlying resources did not change as a result. However, the overall course structures changed significantly, as activities were replaced with group work components introduced to facilitate and encourage more interaction between students, and between students and teaching staff.

Table 5. Undergraduate flex student click count. Pre and post course redesign.

Pre-Course Redesign Post-Course Redesign
Grade Number of Flex Students Click Count average Number of Flex Students Click Count average
HD 48 729.81 11 2884.18
D 81 578.51 36 1892.64
C 57 475.05 42 1503.43
P 50 560.16 32 957.97
F 50 155.37 20 628.00

Figure 4. Click count versus grade averages pre and post course redesign.

Figure 4 shows a significant contrast between the participation rates of students in the three courses before and after the course redesign. The participation rate for students in each grade group increased significantly following the implementation of the new course designs. From the perspective of a purely systems-based enquiry, figure 4 tends to indicate that course design elements such as group work and enhanced student interaction have an impact on the way students engage with their online course.

Class size

Class size has long been recognized as a factor that influences student engagement and student achievement, although its influence in online learning environments is unclear (Hewitt & Brett, 2005; Vrasidas & McIsaac, 1999). In the case of CQUniversity, it is also difficult to determine the effect of class size, as courses containing online undergraduate students are often also delivered to face-to-face students, and this pollutes data showing the influence of class size on engagement for online undergraduate students.

The average number of flex students in undergraduate courses at CQUniversity is 34, and the following table and chart show the click count versus grade averages for courses with more and fewer than the average number of flex students.

Table 6. Average student clicks grouped by grade for courses below and above the average flex class size.

Below average class size Above average class size
Grade Number of Flex Students Click count average Number of Flex Students Click count average
HD 2759 499.50 8061 1096.58
D 5034 477.83 17471 966.12
C 5018 362.37 19142 756.97
P 3724 222.83 16349 604.60
F 3171 127.03 14927 392.45

Figure 5. Average student clicks grouped by grade based on class size.

The results in figure 5 are surprising, in that the literature generally points towards class size having a negative impact on engagement (Vrasidas & McIsaac, 1999); if LMS participation is representative of engagement, smaller classes would be expected to show higher levels of participation than larger classes. However, figure 5 shows that the larger the number of flex students within a course, the higher the average rate of participation. A potentially significant influence on the accuracy of these results is that most of the courses sampled also contained students other than online students, so the class size measured is not representative of the entire class. This has an undetermined effect on the results and is another variable that requires further investigation. While it seems unlikely that an increase in class size would positively influence student engagement, figure 5 demonstrates that class size does have an effect on student engagement and therefore requires consideration if LMS participation is to be used as an indicator of student engagement.
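
A split of the kind used for table 6 could be computed as below, using the overall average flex enrolment as the threshold. The course table and its flex_students column are assumptions of the sketch:

    -- Sketch: average clicks per grade for courses below and above
    -- the average flex class size (34 in this study).
    SELECT grade,
           size_band,
           COUNT(*)         AS student_count,
           AVG(click_count) AS avg_clicks
    FROM  (SELECT s.grade,
                  s.click_count,
                  CASE WHEN c.flex_students >
                            (SELECT AVG(flex_students) FROM course)
                       THEN 'above average size'
                       ELSE 'below average size' END AS size_band
           FROM   student_summary s
           JOIN   course c ON c.course_id = s.course_id) t
    GROUP  BY grade, size_band;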

Student Gender

Research has suggested that gender influences student engagement in online courses, generally because online female students have less available study time than face-to-face female students or even online male students (Blum, 1999). The following tables (tables 7 and 8) and figure 6 display the average student click count grouped by grade for male and female online undergraduate students.

Table 7. Male student participation versus grade (n=9182).

Grade Student Count Click Count average Standard Deviation Percentage
HD 1035 719.33 984.64 11.3%
D 1989 696.19 846.71 21.7%
C 2028 569.24 663.32 22.1%
P 1600 383.54 517.90 17.4%
F 2460 158.87 254.93 26.8%

Table 8. Female student participation versus grade (n=19812).

Grade Student Count Click Count average Standard Deviation Percentage
HD 2153 968.86 1131.21 10.9%
D 4479 894.24 1037.18 22.6%
C 4737 709.36 806.86 23.9%
P 4024 587.79 690.02 20.3%
F 4419 224.78 380.58 22.3%

Figure 6. Rates of participation. Male and Female (n=28994)

While the trend indicating a correlation between the number of clicks a student makes within the LMS and their resulting grade is still evident, the male students sampled generally recorded fewer clicks on the LMS than the female students in each grade grouping. The male students also had a higher failure rate than the female students, 26.8% against 22.3%, and showed less variation between the distinction and high distinction grades. Figure 6 indicates that, generally, female students are more engaged in their online courses than their male counterparts, and this is somewhat validated by their lower failure rate. Based on these results, gender appears to have some influence on student engagement. However, further research is again required to ascertain the true meaning of the correlation exposed by the data.

Student age

There is research suggesting that the millennial generation are the most computer literate of all the generations and that they are most comfortable learning from web-based tools such as an LMS (Nicholas, 2008). The following table and chart show the LMS click count for 26743 students, grouped by grade, for four different age groups. The millennial generation generally refers to students born after 1981 (Nicholas, 2008) and is represented in the under 20 and 20-30 categories. At the time of writing, not all student ages were available to the author, so the following table shows the breakdown into age categories for the 26743 undergraduate flex students whose ages were available.

Table 9. Sample sizes for each student age category.

Grade Ages < 20 Ages 20-30 Ages 30-40 Ages > 40
HD 148 1210 975 768
D 334 3023 1667 1328
C 503 3445 1566 1152
P 483 3249 1057 772
F 453 3001 1006 603
Total 1921 13928 6271 4623


Figure 7. Click count versus Grade by Student Age

Figure 7 generally continues to demonstrate the linear trend between LMS participation and academic achievement for students in differing age brackets. However, there are several features of this figure that raise questions for future research. The under 20 students achieving a credit grade used the LMS more than the distinction students in the same age bracket, while the over 40 students who received a distinction grade used the LMS more than the high distinction students in that bracket.

Additionally, it has been argued that the net generation are more comfortable with online learning in general (Nicholas, 2008), yet they do not appear to use the LMS as much as students in the higher age brackets. This is potentially an important consideration for universities that are increasingly choosing LMS for their online course delivery, partly based on an assumption that younger students expect to use advanced technologies as part of their learning (Coates et al., 2005). However, while the data can illuminate correlations such as this, further research is required to identify the causal relationships at work.
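
The age brackets used in table 9 could be produced with a bucketing query like the one below. How the boundary ages (20, 30, 40) are assigned to brackets is an assumption of the sketch:

    -- Sketch: student counts per grade and age bracket, as in table 9.
    SELECT grade,
           CASE WHEN age < 20 THEN 'Ages < 20'
                WHEN age < 30 THEN 'Ages 20-30'
                WHEN age < 40 THEN 'Ages 30-40'
                ELSE 'Ages > 40' END AS age_band,
           COUNT(*) AS student_count
    FROM   student_summary
    WHERE  age IS NOT NULL -- ages were not available for all students
    GROUP  BY grade,
              CASE WHEN age < 20 THEN 'Ages < 20'
                   WHEN age < 30 THEN 'Ages 20-30'
                   WHEN age < 40 THEN 'Ages 30-40'
                   ELSE 'Ages > 40' END;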

Discussion

Teacher participation, course design, class size, student gender and student age are just a small subset of the factors that can influence online student engagement. As an example of how misleading measuring student engagement using only LMS data can be, a student's experience with technology may lead them to download course documents to their local computer rather than access them every time they study. From the perspective of the LMS data, they have visited the documents only once during the term, whereas other students who click on the documents at every visit would appear more engaged. Course discipline, course paradigm development, student motivation, the teacher's experience and the teacher's conception of learning and teaching are just some of the other factors that can influence students' engagement in an LMS hosted course. Nevertheless, based on the data exposed by this project, there are some potentially valuable ways in which the data can be utilized for institutional advantage.

Student engagement data from the LMS can be presented to students for informational and motivational purposes. If students can be shown the degree of effort required to pass a particular course, matched with an indication of their own effort to date, it may lead to enhanced effort by the student. This concept is currently being trialled at Purdue University and has reportedly led to a significant increase in student engagement in courses where the effort tracking system is in effect (Purdue University, 2009). The longitudinal study of captured LMS data can also be used in a multitude of ways.
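
As a sketch of the underlying idea only (not Purdue's actual Signals implementation), a simple effort indicator could compare each student's clicks with the average for their course offering:

    -- Sketch: a crude "effort to date" indicator per student,
    -- relative to the average click count in their course.
    SELECT s.user_id,
           s.course_id,
           s.click_count,
           c.avg_clicks,
           s.click_count / NULLIF(c.avg_clicks, 0) AS relative_effort
    FROM   student_summary s
    JOIN  (SELECT course_id,
                  AVG(click_count) AS avg_clicks
           FROM   student_summary
           GROUP  BY course_id) c
           ON c.course_id = s.course_id;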

The study of LMS feature adoption by teaching staff over time is important, as it is not the provision of LMS features but their uptake that really determines their educational value (Coates et al., 2005). It has been shown that teaching staff tend to adopt LMS features along a continuum, beginning with content dissemination features and moving over time to more complex features such as quizzes and evaluation surveys as they gain experience with the new teaching medium (Malikowski, Thompson, & Theis, 2007). This can assist with training and development when new LMS are adopted by universities (Beer et al., 2009). The features adopted by teaching staff, such as the discussion forums shown in figure 3, have an effect on student engagement as they form part of the environment in which engagement occurs. The criticality of teaching staff in LMS hosted courses, as indicated in figure 3, potentially highlights the most puissant use of LMS data: aiding teacher reflection.

The importance of teacher reflection is well known, and it has been said that improving teaching practice cannot be achieved without the teacher converting their teaching experience into knowledge through the process of reflecting upon their practice (McAlpine & Weston, 2004). Longitudinal LMS data allows the teacher to visualize online student behaviours over time and can also show the influence of course design changes and specific LMS features on student engagement. This can potentially provide teaching staff with a new tool through which they can reflect upon their practices and the effect those practices are having on student engagement.


Conclusion

Figure 8. Overview of this study into online student engagement

Based on the increasing need for universities to assess their efforts in improving student engagement, this study has suggested a broad definition of engagement that aligns with an established model of educational effectiveness in undergraduate education. It showed that measuring engagement is difficult and that learning environments affect the ways that students engage. Online learning environments are becoming increasingly common, and learning management systems are the typical vessel for online course delivery.

The study looked at how data captured by learning management systems can potentially be used by the academy for measuring, informing and improving student engagement. Variables such as course design, teacher participation, class size, student gender and student age were identified as factors that influence student engagement, though considerable further research is required before their influence can be fully understood.

This study into student engagement has shown that while universities are not significantly utilizing captured learning management system data, it has the potential to become an important tool for teacher reflection by demonstrating to teaching staff how their course activities and resources are being used by online students.
References:

Australasian Survey of Student Engagement. (2009). Engaging Students for Success. Camberwell, Victoria: Australian Council for Educational Research.

Beer, C., Jones, D., & Clark, K. (2009). The indicators project identifying effective learning: adoption, activity, grades and external factors. Paper presented at ASCILITE 2009. Retrieved from http://indicatorsproject.wordpress.com/2009/10/09/the-indicators-project-identifying-effective-learning-adoption-activity-grades-and-external-factors/

Black, E. W., Dawson, K., & Priem, J. (2008, 17 March 2008). Data for free: Using LMS activity logs to measure community in online courses. Internet and Higher Education, 11, 65-70.

Blum, K. D. (1999). Gender Differences in Asynchronous Learning in Higher Education: Learning Styles, Participation Barriers and Communication Patterns. Journal of Asynchronous Learning Networks, 1(3), 20.

Bulger, M. E., Mayer, R. E., Almeroth, K. C., & Blau, S. D. (2008). Measuring Learner Engagement in Computer-Equipped College Classrooms. Journal of Educational Multimedia and Hypermedia, 17(2), 129-143.

Campbell, J. P., & Oblinger, D. G. (2007). Academic Analytics. Educause Article.

Caruso, J. B. (2006). Measuring Student Experiences with Course Management Systems [Electronic Version]. Educause, 2006, from http://net.educause.edu/ir/library/pdf/ERB0619.pdf

Chen, P.-S. D., Gonyea, R., & Kuh, G. (2008). Learning at a distance [Electronic Version]. Journal of online education, 4. Retrieved October 2009, from http://innovateonline.info/index.php?view=article&id=438&action=login

Chickering, A. W., & Gamson, Z. F. (1987). Seven Principles of Good Practice in Undergraduate Education [Electronic Version]. AAHE Bulletin, from http://tls.vu.edu.au/learning_and_teaching/guidelines/VU4/Chickering%20and%20Gamson%201987%20VU%204.pdf

Coates, H. (2006). Student Engagement in Campus-based and Online Education. Retrieved 23rd October 2009, from http://www.cqu.eblib.com.ezproxy.cqu.edu.au/EBLWeb/patron/

Coates, H. (2007). A model of online and general campus-based student engagement. Assessment & Evaluation in Higher Education, 32(2), 121-141.

Coates, H., James, R., & Baldwin, G. (2005). A critical examination of the effects of learning management systems on university teaching and learning. Tertiary education and management, 11(2005), 19-36.

CQUniversity. (2009). Draft Learning and Teaching Management Plan 2010 [Electronic Version], 4 November 2010. Retrieved 4th November 2010, from http://www.cqu.edu.au/academic_board/ecab/2009/November%2009/ECAB%20041109%20Item%209%20Draft%20LT%20Management%20Plan%20v4%20041109.pdf

Dawson, S., Macfadyen, L., & Lockyer, L. (2009). Learning or performance: Predicting drivers of student motivation. Paper presented at the Same places, different spaces. Proceedings ascilite Auckland 2009, Auckland.

Dawson, S., & McWilliam, E. (2008). Investigating the application of IT generated data as an indicator of learning and teaching performance. Queensland University of Technology and the University of British Columbia: Australian Learning and Teaching Council.

Douglas, I. (2008, 14 December 2008). Measuring Participation in Internet Supported Courses. Paper presented at the 2008 International Conference on Computer Science and Software Engineering, Wuhan, China.

Douglas, I., & Alemanne, N. D. (2007). Measuring Student Participation and Effort. Paper presented at the International Conference on Cognition and Exploratory Learning in Digital Age, Algarve, Portugal.

Dutton, J., Dutton, M., & Perry, J. (2002). How Do Online Students Differ From Lecture Students? Journal of Asynchronous Learning Networks, 6(1), 20.

Heathcote, L., & Dawson, S. (2005). Data Mining for Evaluation, Benchmarking and Reflective Practice in a LMS. E-Learn 2005: World conference on E-Learning in corporate, government, healthcare and higher education.

Hewitt, J., & Brett, C. (2005). The relationship between class size and online activity patterns in asynchronous computer conferencing environments. Computers in Education, 49(2007), 13.

Huitt, W. (2001). Motivation to Learn [Electronic Version]. Educational Psychology Interactive. Retrieved 31st October 2009, from http://chiron.valdosta.edu/whuitt/col/motivaton/motivate.html

Jennings, D. (2005). Virtually Effective: The Measure of a Learning Environment [Electronic Version]. Emerging Issues in the Practice of University Learning and Teaching. Retrieved 1st November 2009, from http://www.aishe.org/readings/2005-1/jennings-Virtually_Effective.html

Krause, K.-L. (2005, 21-22 September 2005.). Understanding and promoting student engagement in university learning communities. Paper presented at the Sharing Scholarship in Learning and Teaching: Engaging Students, James Cook University, Townsville.

Krause, K.-L., & Coates, H. (2008). Students’ engagement in first-year university. Assessment & Evaluation in Higher Education, 33(5), 493-505.

Kuh, G. D. (2001). Assessing What Really Matters to Student Learning. Inside the national survey of student engagement. [Electronic Version]. Retrieved 22nd October 2009, from http://cpr.iub.edu/uploads/Assessing_What_Really_Matters_To_Student_Learning_(Kuh,%202001).pdf

Lane, L. M. (2009). Insidious Pedagogy: How course management systems affect teaching [Electronic Version], 14, from http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2530/2303

Lopes, V. M. (2008). Course Management Systems and Campus-Based Learning. Seneca College.

Macquarie University. (2009). Student Engagement Principles [Electronic Version]. Retrieved 15th November 2009, from http://www.mq.edu.au/ltc/pdfs/Engagement_Principles.pdf

Malikowski, S., Thompson, M., & Theis, J. (2007). A model for research into course management systems: bridging technology and learning theory. Journal of Educational Computing Research, 36(2), 24.

McAlpine, L., & Weston, C. (2004). Reflection: issues related to improving professors’ teaching and students’ learning. Instructional Science, 28(5), 363-385.

Nicholas, A. J. (2008). Preferred Learning Methods of the Millennial Generation. International Journal of Learning, 15(6), 8.

Prince, M. (2004). Does Active Learning Work? A Review of the Research. Journal of Engineering Education, 93(3), 8.

Purdue University. (2009). Signals tells students how they’re doing even before the test.   Retrieved 26th November 2009, from http://news.uns.purdue.edu/x/2009b/090827ArnoldSignals.html

Puzziferro-Schnitzer, M. (2005). Managing Virtual Adjunct Faculty: Applying the Seven Principles of Good Practice. Paper presented at Distance Learning Administration 2005. Retrieved from http://www.westga.edu/~distance/ojdla/summer82/schnitzer82.htm

Rankine, L., Stevenson, L., Malfroy, J., & Ashford-Rowe, K. (2009). Benchmarking across universities: A framework for LMS analysis. Paper presented at the Ascilite 2009. Same places, different spaces. from http://www.ascilite.org.au/conferences/auckland09/procs/rankine.pdf

Rovai, A. (2002). Building Sense of Community at a Distance. International Review of Research in Open and Distance Learning, 3(1).

Seifert, J. W. (2004). Data Mining: An Overview. Congressional Research Service. Retrieved from http://www.fas.org/irp/crs/RL31798.pdf

Stovall, I. (2003). Engagement and Online Learning [Electronic Version]. UIS Community of Practice for E-Learning. Retrieved October 2009, from http://otel.uis.edu/copel/EngagementandOnlineLearning.ppt

Veerman, A., & Veldhuis-Diermanse, A. E. (2001). Collaborative learning through computer-mediated communication in academic education. Paper presented at European Perspectives on Computer Supported Collaborative Learning: Euro-CSCL, Maastricht McLuhan Institute.

Vrasidas, C., & McIsaac, M. S. (1999). Factors influencing interaction in an online course. American Journal of Distance Education, 13(3), 22-36.

West, R. E., Waddoups, G., & Graham, C. R. (2006). Understanding the experiences of instructors as they adopt a course management system. Educational Technology Research and Development, 55(1), 1-26.
