My institution recently implemented minimum service standards for courses being developed on the new learning management system (LMS). This has proven somewhat controversial and, rightly or wrongly, some folk have raised concerns about the way it has been implemented. Some of their concerns align with those that David mentioned, and some simply do not appreciate the uninvited oversight. Either way, their discomfort with the minimum standards is a serious concern for the curriculum design group, as the group is seen as somewhat responsible for the standards and our role relies heavily on our relationship with the teaching staff. This got me thinking about how the need for minimum service standards arose and what was broken that led to them.
The minimum service standards were developed from the Seven Principles of Good Practice in Undergraduate Education (Chickering & Gamson, 1987), with the idea that they would serve as a guide or benchmark for staff developing courses during the rollout of the new LMS. This was based on data extracted from the previous LMS that showed the following:
Of the 417 courses considered, 35% did not use discussion forums, 78% did not use virtual groups, 21% had no documents, 89% did not use quizzes, and 13% did not receive any hits at all (Tickle, Muldoon & Tennent, 2009).
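Figures like these are straightforward to derive once course activity data has been exported from an LMS. As an illustrative sketch only (the record layout and field names here are hypothetical, not the actual data behind the Indicators project), a tally of courses making no use of each feature might look like:

```python
# Illustrative sketch: computing "percentage of courses not using a
# feature" from per-course LMS activity records. The record format
# below is hypothetical, not the actual Indicators project data.

def feature_usage(courses):
    """Return the percentage of courses with zero activity per feature."""
    total = len(courses)
    features = ["forums", "groups", "documents", "quizzes"]
    return {
        f: round(100 * sum(1 for c in courses if c.get(f, 0) == 0) / total)
        for f in features
    }

# Toy data: counts of items or hits per feature for each course.
courses = [
    {"forums": 12, "groups": 0, "documents": 8, "quizzes": 0},
    {"forums": 0,  "groups": 0, "documents": 5, "quizzes": 2},
    {"forums": 3,  "groups": 1, "documents": 0, "quizzes": 0},
    {"forums": 7,  "groups": 0, "documents": 9, "quizzes": 0},
]

print(feature_usage(courses))
```

The point is less the code than the fact that such summaries are cheap to produce once the raw usage data is actually made available to someone.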
Considering the following chart from the Indicators project, which shows a distinct correlation between instructor presence on discussion forums and the rate of engagement by students studying online, I find the figure of 35% of courses without discussion forums somewhat disturbing.
Note that the longitudinal results from the Indicators data show this figure evolving over the lifetime of the previous LMS, and while they show the rate of discussion forum usage rising steadily over time, they highlight an important point that folk in our unit have been speaking about for some time: the absence of data available to teachers and decision makers around teaching and learning. Perhaps if data such as that exposed by the Indicators project had been available to teachers and administrators, the new LMS rollout may not have required minimum service standards.
There appears to me to be some alignment between universities, their LMSs, and what Snowden says about the US Army in some of his podcasts. He says the Army was exceptionally efficient at collecting data but woefully ineffective at putting it to any constructive use. Universities accumulate a vast amount of data on students via their student administration systems and the LMS, which records a wealth of data on how staff and students use the system. I guess the difficult question is how universities can make use of the data that they accumulate.
I doubt there is a university around that does not have some sort of business intelligence unit creating complex data cubes of students, courses and performance for use by administrators. While business intelligence units provide a useful service to the organization in this context, they arguably do not directly assist the core university business of teaching and learning. I suspect there are several reasons for this, not least the complexity of meaning within the data and questions about who owns it. Whatever the reasons, a fundamental problem as I see it is the absence of good data for people involved in teaching and learning, whether they are administrators, support staff, or teaching staff.
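For readers unfamiliar with the term, a "data cube" is essentially a count or measure aggregated along several chosen dimensions at once. A minimal sketch of the idea, with hypothetical LMS click records and made-up field names, might be:

```python
# Illustrative sketch of the kind of roll-up a business intelligence
# "data cube" provides: aggregating raw LMS activity records along
# chosen dimensions. Record layout and names are hypothetical.

from collections import defaultdict

def roll_up(records, dims):
    """Count records grouped by the chosen dimension fields."""
    cube = defaultdict(int)
    for r in records:
        cube[tuple(r[d] for d in dims)] += 1
    return dict(cube)

records = [
    {"course": "CS101",  "role": "student", "action": "forum_post"},
    {"course": "CS101",  "role": "staff",   "action": "forum_post"},
    {"course": "CS101",  "role": "student", "action": "quiz_attempt"},
    {"course": "HIS200", "role": "student", "action": "forum_post"},
]

# Slice along different dimensions, as a cube allows.
print(roll_up(records, ["course"]))          # total activity per course
print(roll_up(records, ["course", "role"]))  # staff vs student activity
```

The same mechanics that serve administrators could just as easily answer a teacher's question about their own course, which is rather the point being made above.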
Chickering, A. W., & Gamson, Z. F. (1987). Seven Principles of Good Practice in Undergraduate Education [Electronic Version]. AAHE Bulletin. Retrieved July 12, 2009, from http://tls.vu.edu.au/learning_and_teaching/guidelines/VU4/Chickering%20and%20Gamson%201987%20VU%204.pdf
Tickle, K., Muldoon, N. & Tennent, B. (2009). Moodle and the institutional repositioning of learning and teaching at CQUniversity. In Same places, different spaces. Proceedings ascilite Auckland 2009. http://www.ascilite.org.au/conferences/auckland09/procs/tickle.pdf