The literature review found that research on K-12 online learning has generally fallen into two categories: potential benefits and challenges. However, most studies to date have been descriptive rather than rigorous examinations. Research on student performance in online versus face-to-face settings has produced mixed results, but these studies often fail to account for differences in the types of students in each setting. More rigorous research is needed that considers student and teacher characteristics and implements design-based research approaches to better understand the effectiveness of online learning.
4. Literature Reviews
1. Rice (2006) – Journal of Research on Technology in Education
2. Barbour & Reeves (2009) – Computers & Education
3. Cavanaugh, Barbour, & Clark (2009) – International Review of Research in Open and Distance Learning
5. What does the literature say?
• “based upon the personal experiences of those involved in the practice of virtual schooling” (Cavanaugh et al., 2009)
• described the literature as generally falling into one of two categories: the potential benefits of and the challenges facing K-12 online learning (Barbour & Reeves, 2009)
6. What about research?
• “a paucity of research exists when examining high school students enrolled in virtual schools, and the research base is smaller still when the population of students is further narrowed to the elementary grades” (Rice, 2006)
7. Is this a problem?
“indicative of the foundational descriptive work that often precedes experimentation in any scientific field. In other words, it is important to know how students in virtual school engage in their learning in this environment prior to conducting any rigorous examination of virtual schooling.” (Cavanaugh et al., 2009)
8. What does the research say?
1. Comparisons of student performance based upon delivery model (i.e., classroom vs. online)
2. Studies examining the qualities and characteristics of the teaching/learning experience
– characteristics of online learners
– supports provided to online learners
– issues related to isolation of online learners (Rice, 2006)
1. Effectiveness of virtual schooling
2. Student readiness and retention issues (Cavanaugh et al., 2009)
10. Student Performance
• performance of virtual and classroom students in Alberta was similar in English and Social Studies courses, but classroom students performed better overall in all other subject areas (Ballas & Belyk, 2000)
11. Student Performance
• over half of the students who completed FLVS courses scored an A in their course and only 7% received a failing grade (Bigbie & McCarroll, 2000)
• students in the six virtual schools in three different provinces performed no worse than the students from the three conventional schools (Barker & Wendel, 2001)
12. Student Performance
• FLVS students performed better on a non-mandatory assessment tool than students from the traditional classroom (Cavanaugh et al., 2005)
• FLVS students performed better on an assessment of algebraic understanding than their classroom counterparts (McLeod et al., 2005)
14. Students and Student Performance
• Ballas & Belyk (2000): performance of virtual and classroom students was similar in English & Social Studies courses, but classroom students performed better in all other subject areas; however, the participation rate in the assessment among virtual students ranged from 65% to 75%, compared to 90% to 96% for the classroom-based students
• Bigbie & McCarroll (2000): over half of the students who completed FLVS courses scored an A in their course and only 7% received a failing grade; however, between 25% and 50% of students had dropped out of their FLVS courses over the previous two-year period
15. Students and Student Performance
• Cavanaugh et al. (2005): FLVS students performed better on a non-mandatory assessment tool than students from the traditional classroom; the authors speculated that the virtual school students who did take the assessment may have been more academically motivated and naturally higher achieving students
• McLeod et al. (2005): FLVS students performed better on an assessment of algebraic understanding than their classroom counterparts; the authors speculated that the results were due to the high dropout rate in virtual school courses
17. The Students
• the vast majority of students enrolled in VHS Global Consortium courses were planning to attend a four-year college (Kozma, Zucker & Espinoza, 1998)
• “VHS courses are predominantly designated as ‘honors,’ and students enrolled are mostly college bound” (Espinoza et al., 1999)
18. The Students
The preferred characteristics include the highly motivated, self-directed, self-disciplined, independent learner who could read and write well, and who also had a strong interest in or ability with technology (Haughey & Muirhead, 1999)
19. The Students
• “only students with a high need to control and structure their own learning may choose distance formats freely” (Roblyer & Elbaum, 2000)
• IVHS students were “highly motivated, high achieving, self-directed and/or who liked to work independently” (Clark et al., 2002)
20. The Students
• the typical online student was an A or B student (Mills, 2003)
• 45% of the students who participated in e-learning opportunities in Michigan were “either advanced placement or academically advanced” students (Watkins, 2005)
22. Student Reality???
• the two courses with the highest enrollment of online students in the US are Algebra I & Algebra II (Patrick, 2007)
• the largest proportion of growth in K–12 online learning enrollment is with full-time cyber schools (Watson et al., 2008)
• many cyber schools have a higher percentage of students classified as “at-risk” (Klein, 2006)
• at-risk students are defined as those who might otherwise drop out of traditional schools (Rapp, Eckes & Plucker, 2006)
35. Problematic Research
• Online Course Design – Barbour (2005; 2007): 7 principles of effective online course content for adolescent learners; based on interviews with teachers and course developers at a single virtual school, with no verification of whether the interviewees’ perceptions were actually effective, or any student input at all for that matter
• Online Teaching – DiPietro et al. (2008): 37 best practices in asynchronous online teaching; based on interviews with teachers at a single virtual school selected by the virtual school itself, and the teachers’ beliefs were not validated through observation of the teaching or student performance
41. Assistant Professor
Wayne State University, USA
mkbarbour@gmail.com
http://www.michaelbarbour.com
http://virtualschooling.wordpress.com
Editor's notes
Benefits = Expanding educational access; Providing high-quality learning opportunities; and Allowing for educational choice. Challenges = Student readiness issues and retention issues.
American Journal of Distance Education (United States) - 8 US
Journal of Distance Education (Canada) - 4 Cdn / 1 Aus
Distance Education (Australia) - 2 Aus / 4 US
Journal of Distance Learning (New Zealand) - 1 NZ / 1 Cdn / 1 US-Cdn
Last five years - 24 articles out of a total of 262 related to K-12 distance education
This is actually quite normal
The common link between both assessments is the preoccupation of researchers with comparing student performance in an effort to show the effectiveness of online learning.
As research comparing student performance between face-to-face and online environments is both the common theme and, by far, the dominant theme... Let's take a closer look at this body of research.
Canadian province of Alberta - online students do as well as classroom students in Social Studies and English, classroom students better in other areas
In their two-year evaluation, Bigbie and McCarroll found that more than 50% of Florida Virtual students got As in their courses and very few students failed. In examining 6 virtual schools in 3 Canadian provinces, Barker and Wendel found that online and classroom students performed the same.
In a NCREL funded study, Cavanaugh and her colleagues found online students in Florida performed better than classroom students Similarly in another NCREL funded study, McLeod and his colleagues found online students in Florida performed better in algebra
But does this really tell the full story???
Ballas & Belyk had dramatically differing participation rates - how would the 20%-30% missing from the online group have scored? Bigbie & McCarroll had a significant drop-out rate in the online courses - how would the results have differed had those students stayed enrolled?
Cavanaugh and her colleagues speculated that the online students were simply better students McLeod and his colleagues speculated their results were due to the fact that weaker students had dropped out of the online course
Let's examine who the literature says is enrolled in K-12 online learning...
First year evaluation of VHS - majority are planning to attend a four-year college. Second year evaluation - most are honors students and college bound.
Highly motivated, self-directed, self-disciplined, independent learners who could read and write well, and had a strong interest in or ability with technology
Need to control and structure their learning; highly motivated, high achieving, self-directed, independent workers.
A or B students. Half are academically advanced or AP students.
However, is that really the description of all K-12 online learning students?
Supplemental - algebra. Full-time - higher proportion of at-risk students.
The research is based upon the best and the brightest.
However, we know from practice that this does not reflect all or even the majority of K-12 online learners. So the population of students the research focuses on is one of the main limitations of the usefulness (and even the believability) of much of that research.
Another problem is what we measure...
1. Correlation does not equal causality
2. Single studies measure if there is a difference between two groups beyond chance
Need for meta-analysis...
A K-12 online learning example from earlier: Cathy took 16 individual studies and combined the results to determine an overall effect size.
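The kind of synthesis described in this note - pooling individual study results into one overall effect size - can be sketched as a simple fixed-effect, inverse-variance meta-analysis. The effect sizes and variances below are hypothetical placeholders, not the actual 16 studies that were combined.

```python
# Fixed-effect meta-analysis sketch: each study's effect size (Cohen's d)
# is weighted by its precision (the inverse of its variance), and the
# pooled estimate is the weighted mean. Hypothetical numbers only.

studies = [
    # (effect size d, variance of d)
    (0.30, 0.04),
    (-0.10, 0.02),
    (0.15, 0.05),
    (0.05, 0.03),
]

weights = [1.0 / var for _, var in studies]
pooled_d = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)

print(f"pooled effect size: {pooled_d:.3f}")
```

With these made-up inputs the pooled estimate comes out near 0.056 - the point being that a handful of mixed positive and negative findings can pool to an effect size far smaller than any single study suggests.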
Things that hurt student learning
0.15 - The amount a student would increase simply from being a year older and a year wiser / maturity
0.25 - The amount student learning increases based upon an average teacher
0.40 - The magic number... If it doesn't reach beyond 0.4, it likely isn't worth it. Some scholars have argued for thresholds as high as 0.6 or 0.8. Recall earlier I mentioned three different meta-analyses related to K-12 online learning.
Cavanaugh (2001) - developmental effects
Cavanaugh et al. (2004) - reverse effects
Means et al. (2009) - online = teacher effects; blended = developmental effects + teacher effects
In fact, if you look at many of the factors that proponents of K-12 online learning trumpet, most have little impact on student learning beyond what an average teacher and the normal process of aging would have. So, what do Hattie's findings tell us?
Good teachers and the act of teaching well can have significant impacts. Some design and delivery lessons are applicable to K-12 online learning: direct instruction, mastery learning, worked examples, concept mapping, setting goals. But this is just the research on student performance - what about the other research?
Most of the remaining research is also problematic, primarily due to methodological limitations and overreaching. Barbour - principles of effective online course design based upon interviews with teachers at a single virtual school. DiPietro et al. - best practices in online teaching based upon interviews with teachers at a single virtual school.
Which naturally leads to the question of how should we be doing educational research when it comes to K-12 online learning?
Begins with the involvement of the local stakeholders in identifying the challenges to be addressed and the interventions to be used. In addition to trying to solve the local problem, there is a focus on the development of theory to explain what occurs in the local context. Finally, there are multiple cycles of analysis and revisions to ensure that changes become part of the routine of those involved in the system. Probably the only example or closest example of DBR in the K-12 online learning field is the Virtual High School Global Consortium.
The VHS Global Consortium was established through a five-year, $7.4 million federal grant; as such, there was an expectation for evaluations and research. This research was conducted with the VHS staff as a full participant (i.e., being involved in identifying the issues that needed to be examined, assisting in the design and completion of the research, implementing the recommendations, and then repeating the process to ensure the recommendations had the desired outcomes). As a result of these cycles of inquiry that examined a variety of problems in this specific context, along with the close relationship between VHS staff and the SRI International evaluation team in the design of both the virtual school and the evaluations, much of what is still known about virtual schools comes from this refined approach. It is also worth noting that the VHS has not only continued to operate since the end of the initial federal funding, but has thrived. In fact, the VHS is one of the largest and most successful virtual schools, and one of the few virtual schools that doesn't rely upon direct government funding.
It all depends. If all we have is a single method that we use to design, deliver, and support online learning, how is that really different from the teacher who lectures every single day, expecting their students - who are frantically taking notes - to keep up? The way we design, deliver, and support online learning targeted to at-risk students must be different from online learning targeted to AP students. One of the best ways to figure out how to do that, and do it well, is to have our research focus on local challenges through methodologies like design-based research.