The Indicators Project.
The Indicators project proposes to use statistical analysis of previously untapped data sources to assist in the design and evaluation of courses and programs at CQUni in a framework defined by the seven principles of good practice in higher education. The project will attempt to gather data from sources such as Learning Management System logs and tallies, administration system data and web server logs in order to devise a system that will assist in the evaluation and design of courses by identifying patterns of behavior in staff and students in the CQUni context. It is also hoped that we can, via a process of comparative analysis, identify aspects of online courses that may require attention based on patterns of staff and student behavior when compared to other courses within the CQUni context.
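As a very rough sketch of what "untapped data sources" like web server logs can yield, the snippet below tallies page hits per course from raw log lines. The log format shown (Apache-style combined log) and the assumption that the course code is the first URL path segment are illustrative only, not the actual CQUni server configuration:

```python
import re
from collections import Counter

# Hypothetical Apache-style log lines; the course code is assumed to be
# the first path segment (e.g. /COIT11134/...). This layout is an
# illustrative assumption, not the real CQUni setup.
LOG_LINE = re.compile(r'\S+ \S+ \S+ \[[^\]]+\] "GET (\S+) HTTP/\d\.\d" \d{3} \d+')

def hits_per_course(log_lines):
    """Tally page hits per course code from raw web server log lines."""
    counts = Counter()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m:
            parts = m.group(1).strip("/").split("/")
            if parts and parts[0]:
                counts[parts[0]] += 1
    return counts

sample = [
    '1.2.3.4 - s123 [11/Sep/2006:14:42:00 +1000] "GET /COIT11134/forum HTTP/1.1" 200 5120',
    '1.2.3.4 - s123 [11/Sep/2006:14:43:10 +1000] "GET /COIT11134/notes HTTP/1.1" 200 2048',
    '5.6.7.8 - s456 [11/Sep/2006:14:44:00 +1000] "GET /EDED20491/forum HTTP/1.1" 200 1024',
]
print(hits_per_course(sample))  # Counter({'COIT11134': 2, 'EDED20491': 1})
```

Per-course tallies like these are the raw material for the comparative analysis described above: a course whose counts diverge sharply from comparable courses may warrant a closer look.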
The understanding is that there is no magic bullet for course evaluation (Oliver & Conole, 1998), and the quantitative basis of the Indicators project has limitations that mean it should not be viewed in isolation from other evaluation methods, but it may serve as an early indicator of possible issues. As Dawson & Heathcote (2005) state:
“A systems scan of designer and user behavior within (the LMS) can never describe in full how designers and users are engaging with the use of online environments for teaching and learning. Policy interventions, staff development activities and discipline culture all contribute to shaping designer and user behavior within the online environment. Therefore, utilizing a systems view to codify designer and user behavior is ‘indistinct’, but can play in the refinement, ratification and benchmarking of broader evaluation strategies”.
On top of this we have multiple cohorts of students, such as flex, on-campus and international campus students. Using an instrument to measure the online interactions of an on-campus student, who has the opportunity to regularly liaise with teaching staff face-to-face, is obviously not going to generate accurate data when compared to a student whose engagement is wholly online. However, we are able to filter out on-campus students, and the contrast between the flex and on-campus students will itself produce additional data that can be analysed.
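The cohort filtering described above might be sketched as follows. The record shape (student id, cohort label, hit count) and the cohort names are assumptions for illustration; the real data would come from the administration and LMS systems:

```python
from statistics import mean

# Hypothetical activity records: (student_id, cohort, lms_hit_count).
# Cohort labels and the hit-count field are illustrative assumptions.
records = [
    ("s1", "flex", 120),
    ("s2", "flex", 95),
    ("s3", "on-campus", 40),
    ("s4", "on-campus", 55),
    ("s5", "international", 80),
]

def mean_hits_by_cohort(rows):
    """Average LMS hit count per cohort, so flex and on-campus
    students can be measured and contrasted separately rather
    than pooled into one misleading figure."""
    by_cohort = {}
    for _sid, cohort, hits in rows:
        by_cohort.setdefault(cohort, []).append(hits)
    return {c: mean(v) for c, v in by_cohort.items()}

print(mean_hits_by_cohort(records))
```

Keeping the cohorts separate means the flex figures are not dragged down by on-campus students who do much of their interacting face-to-face, while the gap between the cohorts becomes a data point in its own right.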
Although the project tends to focus on the pedagogical implications of the data, the results should hold some interest for technical staff as well. Data such as learner distribution over time and connections from inside the campus will be available, alongside the usual measures such as hit counts. Other simple pedagogical measurements include student/staff posting and reply ratios, as indicated by Burr (2003):
“where active collaboration among learners is occurring, an increase in the ratio of replies to postings will be observed; conversely where the forum is being used as a simple question and answer resource, the ratio of replies to posts is anticipated to be approximately a 1:1 ratio”.
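Burr's reply-to-posting ratio is straightforward to compute once forum posts are labelled with their parent. The (post_id, parent_id) representation below is a hypothetical simplification of what an LMS forum table would actually hold:

```python
def reply_post_ratio(posts):
    """Ratio of replies to top-level postings in a forum.

    `posts` is a list of (post_id, parent_id) pairs, with parent_id None
    for a top-level posting. Per Burr, a ratio near 1:1 suggests simple
    question-and-answer use, while a higher ratio suggests active
    collaboration among learners.
    """
    top_level = sum(1 for _pid, parent in posts if parent is None)
    replies = len(posts) - top_level
    return replies / top_level if top_level else 0.0

# Hypothetical forum: one posting with three replies, one with one reply.
forum = [
    (1, None), (2, 1), (3, 1), (4, 2),
    (5, None), (6, 5),
]
print(reply_post_ratio(forum))  # 2.0 -> more collaborative than plain Q&A
```

A single number like this is exactly the kind of "indicator" the project is after: cheap to compute across every course, and a prompt for closer (qualitative) inspection rather than a verdict in itself.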
The next step in the project is to complete a literature review and design the web instrument based on the findings of the review and the requirements of the anticipated users, e.g. curriculum designers and faculty staff. The following factors need to be linked clearly to the project.
The seven principles.
- encourages contact between students and faculty,
- develops reciprocity and cooperation among students,
- encourages active learning,
- gives prompt feedback,
- emphasizes time on task,
- communicates high expectations, and
- respects diverse talents and ways of learning.
Online learner interactions
- Learner – Content -> Cognitive presence.
- Learner – Instructor -> Teaching presence.
- Learner – Learner -> Social presence.
- “The fundamental idea underlying engagement theory is that students must be meaningfully engaged in learning activities through interaction with others and worthwhile tasks. While in principle, such engagement could occur without the use of technology, we believe that technology can facilitate engagement in ways which are difficult to achieve otherwise. So engagement theory is intended to be a conceptual framework for technology-based learning and teaching.” (Kearsley & Schneiderman, 1999).
Burr, L. (2003). The nature of interaction within an online computer mediated environment. Published thesis, Charles Sturt University.
Dawson, S. & Heathcote, E. (2005). Data Mining for Evaluation, Benchmarking and Reflective Practice in a LMS. E-Learn 2005: World conference on E-learning in corporate, government, healthcare & higher education, Vancouver, Canada.
Kearsley, G. & Schneiderman, B. (1999). Engagement theory: A framework for technology-based learning and teaching. Originally at http://home.sprynet.com/~gkearsley/engage.htm. Retrieved 14:42, 11 September 2006 (MEST) from Google cache.
Oliver, M. & Conole, G. (1998). Evaluating Communication and Information Technologies: A toolkit for practitioners. Active Learning, 8, 3-8.