In the last 5 years, there has been a rise in what we might call “large-scale digital learning experiments.” These take the form of centralized courses, vendor-created courseware, online homework systems, MOOCs, and free-range learning platforms. If we mine the research, successes, and failures coming out of these experiments, what can we discover about designing better digital learning experiences and technology for learning?
1. Learning at Scale
Using Research To Improve Learning Practices
and Technology for Teaching
Maria H. Andersen, Ph.D.
Faculty, Westminster College, Utah
25. The impact of findability on student motivation, self-efficacy, and perceptions of online course quality
Authors: B. Simunich, D. Robins & V. Kelly
Published: 2015
Study size: n=81; students had to complete 7 tasks, navigating either a well-constructed course (according to Quality Matters standards) or a “broken” course
27. 7.00x Introduction to Biology: The Secret of Life, MITx on edX – Course Report, 2013 Spring
Authors: D. Seaton, J. Reich, S. Nesterko, et al.
Published: 2014
Study size: 38k signups, 3k certified
(Seaton et al., 2014)
29. 6.00x Introduction to Computer Science and Programming, MITx on edX – 2012 Fall
Authors: S. Features, P. Grimson, J. Guttag
Published: 2014
Study size: 84.5k signups, 5.7k certified
30. Introduction to Computer Science and Programming
(Features et al., 2014)
31. 14.73x The Challenges of Global Poverty, MITx on edX – 2013 Spring
Authors: P. Black, P. White
Published: 2014
Study size: 40k signups, 4.6k certified
33. Examining Engagement: Analysing Learner
Subpopulations in MOOCs
Authors: R. Ferguson, D. Clow
Published: 2015
Study size: 34k signups, 7k full participants, 4 MOOCs
36. Raise your hand if you use discussion boards with your students.
Keep your hand raised if you think it’s easy to get students to participate in these without points.
Raise your hand if you tell your students or children to be careful sharing on the Internet.
41. Comments in MOOCs: who is doing the
talking and does it help?
Authors: B. Swinnerton, S. Hotchkiss & N.P. Morris
Published: 2017
Study size: 25k active learners, 8k commenters, 5k students who
completed a pre-course survey (roughly half commenters
and half non-commenters)
45. Predicting Student Retention in Massive Open
Online Courses using Hidden Markov Models
Author: G. Balakrishnan
Published: 2013
Study size: 30k enrolled students in “Software as a Service” MOOC
49. Comments in MOOCs: who is doing the
talking and does it help?
Authors: B. Swinnerton, S. Hotchkiss & N.P. Morris
Published: 2017
Study size: 25k active learners, 8k commenters, 5k students who
completed a pre-course survey (roughly half commenters
and half non-commenters)
57. Examining Engagement: Analysing Learner
Subpopulations in MOOCs
Authors: R. Ferguson, D. Clow
Published: 2015
Study size: 34k signups, 7k full participants, 4 MOOCs
59. Samplers: visit, but only briefly (commented in some courses)
Strong Starters: complete the first week, but then drop out
Returners: made it through 2 weeks, but then drop out
Midway Dropouts: dropped out around assessment 3 or 4
Nearly There: made it through ¾ of the course, then dropped
Late Completers: completed the course, but submitted things late
Keen Completers: completed the course, mostly on time (>80%)
(Ferguson & Clow, 2015)
61. Deconstructing Disengagement: Analyzing Learner
Subpopulations in Massive Open Online Courses
Authors: R. Kizilcec, C. Piech, E. Schneider
Published: 2013
Study size: 94k active learners in 3 computer science MOOCs
71. Your click decides your fate: Inferring Information Processing and Attrition Behavior from MOOC Video Clickstream Interactions
Authors: T. Sinha, P. Jermann, N. Li, et al.
Published: 2014
Study size: 66k enrolled, 36.5k interacted with videos; 48 video lectures (producing 10 GB of JSON data) from “Functional Programming in Scala” run on Coursera
72. 10 GB of JSON video click data
Play (Pl)
Pause (Pa)
SeekFw (Sf)
SeekBw (Sb)
ScrollFw (SSf)
ScrollBw (SSb)
RatechangeFast (Rf)
RatechangeSlow (Rs)
(Sinha et al., 2014)
Most of us began our experience as educators here. We were students. And then maybe teachers. Possibly we still are. We have deep roots in classroom experience. In fact, you would be hard-pressed to find someone without “classroom” as their first and dominant exposure to learning methodology. Classroom learning is so ingrained in us that it is almost impossible to escape its gravitational force.
Teaching a small private in-person course is a bit like having a townhouse in a row of townhouses. As the instructor, it is your job to keep up the house. Each instructor takes care of their own house. Granted, sometimes one of the townhouses in the row might be in disrepair, but we try to just walk quickly by that one …
Teaching an online course forces a perspective shift. Instead of being in the same space with the students, you are looking in at each student working in their own little office space. And from the learner side, it sometimes feels like a leaderless learning experience. Where is the instructor? [pause] There he is. In all seriousness though, isn’t that what it feels like? We’ve gone from being in the same space with learners to peeking in from the outside?
I was dropped headfirst, like many of you, into the experience of designing a learning experience on a much larger scale. We went from looking in at students sitting in their offices to managing learning on the scale of a small city. Massive Open Online Courses hit higher education with much hoopla, forcing many professors, institutions, and Silicon Valley to “confront” online learning for the very first time. Now the instructor was really just a replicant – packaged up and delivered to the learner, wherever they might be.
Finally, I want to draw attention to the skyscrapers of accredited large-scale CBE providers. (CBE is competency-based education, for those of you new to the term.) These courses can contain thousands of students, who start and end on their own schedule, work at their own pace, and are “serviced” by a team of educators (evaluators, coaches, and content experts).
And with each iteration of teaching an online course, discomfort with the status quo in classroom learning increases. And perhaps, if you are like me, you find that you also have growing unease about the richness and depth of learning in the ONLINE experience.
After watching all of this progression from multiple perspectives, I began to wonder … what have we actually learned? We’ve looked at learning through a statistical sample size larger than we have ever had before. What do we actually know now? Should we be adjusting anything down the line in the smaller courses? By the way, there will be a quiz.
The purpose of showing you this data and relating these experiences (often from at-scale courses and MOOCs) is not because I want to encourage you all to go out and do these things. It is because for the first time in history, we have extremely large data sets that tell us about natural eLearning engagement and persistence patterns.
How many times a year do you get a chance to “start” courses? For most of you, it is 2-3 times. Every time, you carefully rework your syllabi to ensure the best possible outcome. What happens a week or two later? You wish you’d done it differently. And now you have to wait six months.
My first role outside of Academia was to launch Canvas Network, a MOOC platform built on Canvas.
In MOOC-land, I was able to launch a course, watch several hundred (or thousand) students wander through the “startup materials,” and then iterate for courses launching the next week. Every week we discovered lessons about what worked and what didn’t. And then we did it over with a new course. It was a bit like the movie Groundhog Day.
Not only were we starting new courses every week, but they were really LARGE courses, so we were able to see patterns of interaction quickly.
It turned out there was a problem with course design. Let me ask you if you recognize the pattern on this slide.
It’s IKEA. And when you are in IKEA, it is very difficult to imagine the big picture map of the store. All we can do is follow the path.
As instructors, we can see the whole plan from above.
Learners are in the maze, and they can’t see the big picture.
Wouldn’t it be nice to have a transporter to get right to where you want to go?
It’s helpful to give students their own visual anchors to the course, a way to navigate via landmarks.
And speaking of findability, why can’t students find information in the syllabus? This is the universal instructor look of frustration at answering a question whose answer is clearly provided in the syllabus.
But the problem is that syllabi have become like End-User License Agreements. Who reads them? When you hand a student a 10-page, single-spaced document full of text, much of it in legalese provided by the administration, there is good reason they don’t read it.
Same thing for attachments. It is more difficult to download and read attachments on mobile devices. Sometimes it’s not even obvious that there IS an attachment. If you want to guarantee nobody reads it, make it an attachment in the LMS.
If, on the other hand, you want the Syllabus to get read, make it in a clickable FAQ format.
And, for the administration side, simply convert the clickable FAQ to a printed document to submit for permanent paperwork.
But don’t just take my word for it. There have actually been studies. Here’s a small-n study on eLearning in particular: navigation is a huge frustration point for students. But also, there are thousands and thousands of studies and hundreds of books on the topic of usability in software experiences (which is essentially what we deliver through eLearning platforms).
Next, let’s talk about deadlines, and the frequency of deadlines. We’re going to look at data from three MOOCs chosen more or less at random, because all of them show the same kind of pattern.
Exams are weeks 5, 9, and 13. See the spikes?
Another study really brings home the kinds of activities that bring students back into the LMS to engage.
The red lines mark the weekly emails. Activity drops to HALF between emails/deadlines. The email/deadline sucks the learners back in for another cycle through another week. Don’t feel bad about frequent deadlines and announcements.
If you want students to engage with your course only once a week, send out one announcement/email or make one assignment deadline a week. That will pretty much guarantee it. However, if you want engagement 2-3 times a week, you’ll need more than one.
We have taught the younger generations of students to NOT put information on the Internet. It will affect admission to college. It will affect scholarships. It will affect your career. And so a whole app ecosystem developed around information NOT lasting on the Internet. Is it at all surprising that our brand-new freshmen are hesitant to participate in discussion boards (on the Internet)?
We actually need to help students get back on the Internet superhighway with professional behavior in things like discussions.
So let’s talk about the learning community and things like discussions.
The strange thing about online courses, especially self-paced ones, is that they are a very lonely experience for learners. Because of the asynchronous nature of much of the interaction with the course, the interaction (if it exists) is built primarily around content. We see this not only in CBE courses but also in MOOCs, which are increasingly being designed to function self-paced and sans instructor.
“Superposters” (creators) make more than 2 comments.
Boxplots show the majority of commenters fall in the range from 2-4 comments for the whole course. This gives us some sense of natural non-incentivized participation rate. If you want participation/interaction, you will have to incentivize it.
Students who don’t participate in discussion boards are more likely to drop a course.
Participating in online discussions correlates with persistence.
Even lurking is an activity that correlates with persistence. Too bad it’s near impossible to get analytics on who has viewed threads, huh?
Here’s a great irony: our most social-media-savvy students (the 18-to-20-somethings) posted less (by median) than the older students. These students are used to a world where conversation disappears – impermanence.
Teach students to actually make unique and eye-catching discussion subject lines.
Drive the lurkers back in to see what another student said that was so great.
Use several discussion prompts instead of just one discussion prompt.
Consider breaking the class into smaller groups and having each group complete the discussion. Smaller groups mean more opportunity to participate in a meaningful way, not just rephrase what others have said.
Raise your hand if any of the following happened to you in the spring semester:
Got sick
Broke up with a partner
Sick kids at home
New baby at home
Car broke down
Really big deadline at work
Keep your hand raised if you found it was easy to keep up with your job while these things were going on.
From week to week in the courses, researchers tracked the student movement between categories.
These are the students that are on track from week to week.
One way learners move through the course is On-track to Out, another is On-track to Auditing.
Another path is On-track to behind, to Out/Audit.
Pay particular attention to the students who “fall out” each week. EVERY WEEK, students fall out of the course.
Once you get left behind in a digital course, you can’t catch back up. You not only have to do the work for the week you missed, but also the work of the week you are currently in. It’s double work, in the week after you’ve had a crisis.
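The week-to-week movement described above is, in effect, a Markov chain over engagement states. Here is a minimal Python sketch of how even modest weekly attrition compounds; the states mirror the study’s categories, but the transition probabilities are invented for illustration and are NOT numbers from any of the papers.

```python
# States: on_track, behind, out. Transition probabilities are
# hypothetical, chosen only to show the compounding effect.
transitions = {
    "on_track": {"on_track": 0.85, "behind": 0.10, "out": 0.05},
    "behind":   {"on_track": 0.15, "behind": 0.35, "out": 0.50},
    "out":      {"on_track": 0.00, "behind": 0.00, "out": 1.00},
}

def step(dist):
    """Advance the population distribution by one week."""
    new = {state: 0.0 for state in transitions}
    for state, p in dist.items():
        for nxt, t in transitions[state].items():
            new[nxt] += p * t
    return new

# Start with everyone on track, then simulate 8 weeks.
dist = {"on_track": 1.0, "behind": 0.0, "out": 0.0}
for week in range(1, 9):
    dist = step(dist)
    print(f"week {week}: on_track={dist['on_track']:.2f}, out={dist['out']:.2f}")
```

With only a 5% direct weekly dropout rate, the majority of the population has still fallen out by week 8, which is the shape of the curves these MOOC studies keep reporting.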
Let’s talk about video. There’s a lot of research we could talk about here, but I want to share a particular study that I found fascinating.
These codes were put into n-grams, and those n-grams were clustered together into behavior groups. Let’s look at a couple.
These behavioral action categories were then collected to infer the students’ perception of the video lecture segment. Compare the two regions where the information processing index is high.
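To make the method concrete: the pipeline turns each learner’s stream of click codes into overlapping n-grams, and it is the resulting n-gram profiles that get clustered into behavior groups. Here is a minimal Python sketch of the n-gram step, using the event codes from the earlier slide; the sample session itself is hypothetical.

```python
from collections import Counter

def ngrams(events, n=3):
    """Slide a window of size n across a sequence of click-event codes."""
    return [tuple(events[i:i + n]) for i in range(len(events) - n + 1)]

# Hypothetical clickstream for one learner on one video segment,
# using the study's codes (Pl=Play, Pa=Pause, Sb=SeekBw, Sf=SeekFw).
session = ["Pl", "Pa", "Sb", "Pl", "Pa", "Sb", "Pl", "Sf", "Sf", "Pl"]

profile = Counter(ngrams(session, 3))
# Frequent pause-and-rewind trigrams like (Pa, Sb, Pl) suggest careful
# re-watching (high information processing); runs like (Pl, Sf, Sf)
# suggest skimming ahead. Profiles like this, aggregated per learner,
# are what the clustering step groups.
print(profile.most_common(2))
```

The actual study worked from 10 GB of raw JSON events, but the core transformation is this small.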
One final thought on this. Which gets more watches on the Internet? The perfectly well-scripted video? Or the one where something unexpected happens?
Finally, I want to talk about the lesson that convinced me to walk away from a perfectly good job. Cold turkey.
The Sleeping, Eating, Bonding problem. Dede, C. (2008). Theoretical perspectives influencing the use of information technology in teaching and learning. International Handbook of Information Technology in Primary and Secondary Education. J. Voogt and G. Knezek. New York, Springer: 43-59.
The Sleeping, Eating, Bonding problem.
From Dede’s text: “learning is a human activity quite diverse in its manifestations from person to person, and even from day to day”
For us to imagine that we are going to A/B test learning into submission is a farce. Creating a learning experience that results in “sticky” learning is not the same thing as tweaking an interface to get people to buy a product.
What if, 20 years from now, some students have never experienced anything but online learning? What if that’s your child? Are they better off for it? Or worse? I think we would all agree that our goal is not just to achieve parity between online learning and classroom learning, but to surpass it.