2. How it began http://farm4.static.flickr.com/3196/2804845277_1829975f23_o.jpg
3.
4. Who is involved? http://farm3.static.flickr.com/2257/3534516458_48e4e8595f_b.jpg
5. Who to blame for the Indicators Project Ken Clark (FABIE) Colin Beer (Curriculum Design Unit) Thanks also to: Rolley Tickner, Nathaniel Fitzgerald-Hood and David Jones
14. Staff engagement and activity http://farm1.static.flickr.com/96/224456132_12a1791aea_o.jpg
15. Data sources http://farm4.static.flickr.com/3045/2920562020_e808543f0b_o.png
16.
17.
18. http://www.flickr.com/photos/zigazou76/3636704534/ “ Utilising a systems view to codify designer and user behaviour is ‘indistinct’, but can play in the refinement, ratification and benchmarking of broader evaluation strategies.” (Heathcote & Dawson 2005) Limitations!
19. Ongoing use of the process of reflection is essential for building knowledge, and increasing knowledge increases one’s ability to use reflection effectively and to develop as a teacher . (McAlpine & Weston, 2004) http://farm3.static.flickr.com/2017/2206733790_ac6328b1c8_b.jpg
20. A model for research into course management systems that equally considers technical features and research into how people learn. (Malikowski et al., 2007) The Malikowski Model
CDDU was looking at the Blackboard backend database with a view to providing some automation. We realised that recorded activity within Blackboard wasn't being purged as it should have been, leaving a huge amount of staff and student activity going back to 2004.
To provide academics with tools for reflection on their courses, and to develop a model by which we can benchmark the LMS against the seven principles.
SKIP
It's a longitudinal study that we hope to repeat within the incoming LMS. The model we're developing is independent of any particular LMS or institution.
There are two parts to the project. The first is teacher reflection: providing teaching staff with a tool via which they can reflect upon the statistical performance of their course.
The second is to develop a model via which we can measure the effectiveness of the LMS against the seven principles of good practice. This can be used to compare one LMS to another.
We've considered a variety of frameworks, such as:
Dawson and Heathcote split the LMS into four user/designer activity domains:
- Communication and collaboration – chat rooms, forums, email archives
- Information dissemination – files, media (such as video and audio), links to websites or readings, text
- Information design/navigation – dynamic menus/navigation bars, pages and page styles, topics and templates
- Assessment – assignment upload areas, multiple choice, short answer, scenario, quizzes
They examined only one element of the domain, communication and collaboration, and compared use across faculties.
Three categories in the CMS research model:
- Transmitting course content – files, news and gradebook
- Evaluating students and creating class discussions
- Evaluating courses, instructors and creating adaptive instruction
But the main framework we are developing is based on the seven principles of good practice in undergraduate education.
Using this framework we can present indicative data on student engagement: when, how and what their activities are focusing on, and how their activities are reflected in their grades.
Data can also be gathered on staff activity and the effect this has on student grades and engagement.
Our project is based on two primary data sources: PeopleSoft student results, and the Blackboard backend database that records every click within the system. Normally this is purged regularly, but in our case this hasn't been happening, leaving us with a great opportunity for research.
We've taken both of these data sources and aggregated them into a single location that automatically updates on a weekly basis. A good chunk of the work so far on the Indicators has been to get this working smoothly.
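The aggregation step described above can be sketched roughly as joining per-student grade records with per-student click counts. This is a minimal illustration only: the record shapes, field names and join key are assumptions, not the project's actual schema.

```python
from collections import defaultdict

def aggregate(results, activity):
    """Join per-student grades with per-student hit counts.

    results:  one dict per final grade (hypothetical PeopleSoft-style rows)
    activity: one dict per recorded click (hypothetical Blackboard-log rows)
    Both are keyed here by (student_id, course_code).
    """
    hits = defaultdict(int)
    for row in activity:                      # tally every recorded click
        hits[(row["student_id"], row["course_code"])] += 1

    merged = []
    for row in results:                       # one output row per grade record
        key = (row["student_id"], row["course_code"])
        merged.append({
            "student_id": row["student_id"],
            "course_code": row["course_code"],
            "grade": row["grade"],
            "hits": hits.get(key, 0),         # zero if no recorded activity
        })
    return merged
```

In practice this kind of join would run as a scheduled database job rather than in-memory Python, but the shape of the weekly merge is the same.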
This table gives you some idea of the scale of the stored data. Just about every click made in the system has been recorded and is held in the database.
Before we go on I need to stress that the quantitative data has some serious limitations. It doesn't account for how learners or staff are engaging in the LMS course; for example, more tech-savvy students may download a PDF from Blackboard once rather than click on it on every visit to the site, which will skew the data. It is currently limited to Blackboard only; we are trying to get similar access to the Moodle backend database. The main message: results are indicative, NOT absolute. With that in mind...
The first of the two broad goals of the Indicators project is to provide teaching staff with tools and information that they can use to reflect on their courses. This came about through some research we are currently writing up based on the Malikowski model.
The Malikowski model looks closely at LMS feature adoption and, in particular, the complexity of the learning objectives that the various LMS features try to address. We have some problems with parts of the model that don't necessarily fit our context, but it has given us a way of looking at LMS feature adoption over time. Three categories of the model arguably relate to the complexity of the learning goals the various LMS features are meant to address: information dissemination; class discussion; and evaluating students. A further category relates to evaluating courses and instructors as well as adaptive instruction, but the evidence for these is vague and is something we haven't yet looked at in detail.
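Measuring adoption per Malikowski category amounts to counting, for each year, how many courses used at least one feature in each category. The sketch below shows one way to do that; the feature-to-category mapping and the row format are illustrative assumptions, not the project's actual data model.

```python
# Hypothetical mapping from LMS feature to Malikowski-style category.
CATEGORY = {
    "file": "information dissemination",
    "announcement": "information dissemination",
    "gradebook": "information dissemination",
    "forum": "class discussion",
    "quiz": "evaluating students",
}

def adoption_by_year(usage_rows):
    """usage_rows: (year, course_id, feature) tuples, one per feature a
    course used at least once. Returns {(year, category): course_count},
    counting each course at most once per category per year."""
    seen = set()
    counts = {}
    for year, course, feature in usage_rows:
        cat = CATEGORY.get(feature)
        if cat is None or (year, course, cat) in seen:
            continue                      # unknown feature, or already counted
        seen.add((year, course, cat))
        counts[(year, cat)] = counts.get((year, cat), 0) + 1
    return counts
```

Run over several years of logs, the resulting counts give the kind of adoption-over-time table the findings below refer to.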
Our preliminary findings reinforce those of the original researchers and suggest that information dissemination features of the LMS are quickly adopted and also quickly level out, as you can see in the table for 2007, 2008 and 2009.
The features that the model suggests fit with more advanced learning objectives, indicated by the three columns on the right-hand side, are adopted more gradually as teaching staff gain experience with teaching via an LMS. However, it's worth noting that forum adoption at CQUniversity is exceptional: forum usage is quite high and is still growing. This particular aspect of the Indicators research is still in progress and we hope to publish in the next few weeks. What it does solidify for us is how pivotal the teacher's role is, even in LMS-based courses, because deeper, more active learning depends on teachers adopting and supporting the LMS features that enable it. To that end, we intend to provide teaching staff with tools and information that may assist their course delivery. So we developed the Indicators web site.
What you're seeing here is a screen capture of the Indicators web site showing student activity over time. Most LMS provide this information, but we have taken it a step further by allowing the overlay of grade information and distinguishing between student cohorts. Plans for the future include Moodle support and even performing some automated analysis on the data before it's presented to the teacher.
Another example of how we are using the Indicators data is CDDU's Moodle assistant, whereby staff transitioning to Moodle can enter their course and receive some general information and a map based on how their course is structured. This can make the task of visualizing their course design a little easier and may reduce the pain of the transition. Let's have a look at some of the more global data.
This is the average number of hits on the course site for each student grade, across a sample of over 300,000 student course units. As you'd expect, the more hits on the course site, the better the student's grade tends to be. Do you think this would be the same for staff? The more hits by the staff member, the better the average student result?
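The average-hits-per-grade figure behind this plot is a simple grouped mean. A minimal sketch, assuming each record carries a grade and a hit count per student course unit (field names are illustrative):

```python
from collections import defaultdict

def average_hits_by_grade(rows):
    """rows: dicts with a 'grade' and a 'hits' count per student
    course unit. Returns {grade: mean hits}."""
    totals = defaultdict(lambda: [0, 0])      # grade -> [hit_sum, row_count]
    for row in rows:
        t = totals[row["grade"]]
        t[0] += row["hits"]
        t[1] += 1
    return {grade: s / n for grade, (s, n) in totals.items()}
```

The same grouping, keyed additionally by cohort or campus, produces the flex/on-campus breakdowns on the following slides.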
Here is the same plot but with the flex students added as a separate line, shown in green. I guess you'd also expect that flex students would have more hits on the course website than average.
This is average hits across all grades but separated by campus. Flex students have twice the hits of on-campus students and three times the hits of AIC students.
One of my pet interests is online communities and how we can better facilitate communities across a wholly online cohort. Average forum visits overall are shown in blue and flex students in green. We have plans to conduct further research into forum use in the LMS, building on work by Dawson and Heathcote that looks at staff/student post/reply ratios to determine the level of conversation occurring within a forum.
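The post/reply ratios mentioned above could be computed per forum roughly as follows. This is a hedged sketch only: the post-record shape ('author_role', 'is_reply') is an assumption for illustration, not the Blackboard log format.

```python
def conversation_ratios(posts):
    """posts: dicts with 'author_role' ('staff' or 'student') and a
    boolean 'is_reply'. Returns (staff/student ratio, reply/original
    ratio); a higher reply ratio suggests more conversation."""
    staff = sum(1 for p in posts if p["author_role"] == "staff")
    student = sum(1 for p in posts if p["author_role"] == "student")
    replies = sum(1 for p in posts if p["is_reply"])
    originals = len(posts) - replies
    staff_ratio = staff / student if student else float("inf")
    reply_ratio = replies / originals if originals else float("inf")
    return staff_ratio, reply_ratio
```

A forum dominated by unanswered original posts would show a reply ratio near zero, whereas a genuinely conversational forum would show replies outnumbering originals.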
How would you interpret this result? The average number of files per course is increasing.
Institutionally, we can gauge ROI on the LMS investment and improve T&L by identifying patterns of behaviour in the data captured by the LMS.
We needed a critical lens that we could use to evaluate the mountains of data we have accumulated.
Fitting with CQUniversity's teaching and learning strategy, we chose the seven principles of good practice in undergraduate education by Chickering and Gamson. The next part is to develop a model whereby statistical analysis of the whole LMS can be used to determine how we are delivering on the seven principles.