This presentation was given at the OpenCourseWare Consortium Global Meeting in May 2011. It describes some of the results from an evaluation project initiated by Open.Michigan in September 2010. Full results can be found at tinyurl.com/omevaluation.
Measuring Our Impact: The Open.Michigan Initiative
1. Measuring our Impact: The Open.Michigan Initiative (CC: BY-NC-SA, Choconancy1, Flickr) Emily Puckett Rodgers, Open Education Coordinator, Open.Michigan. OpenCourseWare Consortium Global Meeting 2011. Except where otherwise noted, this work is available under a Creative Commons Attribution 3.0 License. Copyright 2011 The Regents of the University of Michigan.
5. Evaluation Design. “Through a combination of existing and new qualitative (including surveys, interviews and user feedback) and quantitative (including web analytics, published resources) data, we aim to measure Open.Michigan’s impact on the University of Michigan campus and on the broader open education landscape, as well as its progress toward overall objectives and mission.” Phases: Mission and Objectives; Environmental Scans; Review; Implementation; Document Study and Review; Analysis; Organizational analyses; Research development and deployment.
12. Strategic Vision Open.Michigan’s Strategic Vision Open.Michigan enables University of Michigan faculty, students, staff and others to share their educational resources and research with the world. Open.Michigan’s efforts contribute to two primary goals: to sustain a thriving culture of sharing knowledge at U-M to provide comprehensive public access to all of U-M’s scholarly output http://tinyurl.com/openmichiganvision
46. Ghana Emergency Medicine Collaborative. “Having the big block M on a lot of high quality produced teaching modules that are made available to the world… is mission consistent, so what’s the business that we’re in here? We create and distribute knowledge.” -Paul Courant, “Why Open is Important” interview
47. Impact: Surveys. CTools survey: 40.7% of instructors and 75.6% of students had never heard of OCW. http://tinyurl.com/ctools2010survey
63. Advisory committee. In the next three years, Open.Michigan will:
Produce more and richer content as OER with the various campus units; improve modularity, instructional design, and accessibility of U-M OER
Increase the visibility and discoverability of U-M resources through a combination of marketing and metadata
Draw participants from more parts of campus to expand its disciplinary coverage
Ensure OER production is an embedded part of academic life on campus
>2007 pilot; 2008 launch
>UMMS (Office of Enabling Technologies) and School of Information
>2008 launch; UMMS with the goal of making the first 2 years of medical school content openly licensed
>Small staff (currently 4 full time); needed student support (a volunteer-centric approach)
>Large research institution with ~40,000 students and a very distributed academic system across units; 3 campuses. UMMS: 1,100 students; LSA: 19,000
>Start using Google Analytics and begin doing systematic, consistent surveys and evaluations of our work.
>Phase One: Define the mission and objectives of Open.Michigan
>Phase Two: Conduct an environmental scan of other evaluations in open education
>Phase Three: Complete an inventory and review of internal documents; develop analyses such as PEST and SWOT
>Phase Four: Formulate and implement strategy; develop survey tools
>Phase Five: Distribute surveys; analyze data and synthesize results
>Analysis of production, impact, and external knowledge of the open materials created
>Other orgs doing similar things (government, institutional, and nonprofit orgs; OCW & OER)
>Qualitative: questionnaires, surveys, and interviews
>Quantitative: web analytics
>Analyzed: the production of open material, the use (impact) of open material, and knowledge and opinion of the open materials among those outside the providing organization itself.
Overall:
>>ISKME case study frameworks and “Adoption and Use of Open Textbooks”
>>CIC Open Access awareness environmental scan, 2010
>>Open Learning Initiative activities
>>Open Learning Network activities
>>Atkins, Brown, et al., “A Review of the Open Educational Resources (OER) Movement: Achievements, Challenges and New Opportunities,” Hewlett report, 2007
>>Surveys, analytics, follow-up, and logs
>NSSE report on student engagement
>CIBER report on social media use
>OER/OCW-specific evaluations:
>>Tufts: site users who were also faculty members reported that Tufts OCW “positively affects their teaching practices by providing additional teaching materials, by enabling them to integrate Tufts materials into their courses, by increasing their knowledge levels in certain areas and impact how course materials are developed by emphasizing instructional technology.”
>>MIT: Self-reports and analyses showed that educators, students, and self-learners alike use OCW and are satisfied by the content available; the same survey showed high impact on their learning goals
>>Wikiwijs (Dutch government OER): peer review and recommendation system
>Vision document: outlines a formalized, explicit mission and goals for Open.Michigan
>Priorities for future growth and a three-year plan
>Created in tandem with determining the outcomes that we could measure
Future growth:
>Increase support for OER production
>Expand and improve OER offerings
>Add value to content and services
>3 platforms have hosted our content over the past few years:
1) eduCommons: hosted all our OER there
2) /education: site-integrated (static Open.Michigan pages) eduCommons instance of OER
3) OERbit: everything is hosted on our Drupal instance, including static pages and OER; our wiki and blog are also measurable
>Look at all Google Analytics reports to get the full picture of growth.
>Referring sites over the past 3 years: Wikipedia, YouTube, Facebook
>First year: local medical gateways and large entities like Creative Commons and the OCWC
>Second year: traffic from international sites in China and India, as well as social sites like Facebook and Twitter
>Third year: more traffic from a variety of sites, including the OCWC, other OER sites like Einztein, and project sites
5/3/10-5/3/11: 77,808 visits
>Over 12,000 are from the Ann Arbor area; over 31,000 are internationally based
>Site visits: coming from 127 countries, with China, India, and Great Britain the largest sources
>Downloads: 1,231 (last year); syllabi and PDFs
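The visit counts above can be turned into shares of total traffic with a little arithmetic. A minimal sketch, using the approximate figures from the Google Analytics report cited above:

```python
# Approximate visit counts from Google Analytics, 5/3/10-5/3/11
total_visits = 77_808
ann_arbor = 12_000        # visits from the Ann Arbor area (approximate)
international = 31_000    # internationally based visits (approximate)

ann_arbor_share = ann_arbor / total_visits
international_share = international / total_visits

print(f"Ann Arbor: {ann_arbor_share:.1%} of visits")        # ~15.4%
print(f"International: {international_share:.1%} of visits") # ~39.8%
```

So roughly 15% of traffic is local and about 40% international, which is consistent with the 127-country spread noted above.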
>Steadily grown in resources: most in SI and UMMS, where we have the strongest ties.
>M1/M2 sequences being published at a regular rate (32% published as of late 2010; must get permission and clear material); now piloting syncing audio from lectures to our cleared slides, with good results.
>dScribes to date: 69 graduate students, 2 undergraduate students, 1 librarian, 1 staff member, 1 visiting scholar
>dScribes have assisted Open.Michigan and 41 faculty in publishing 39 courses
>Investment (financial resources, staff, admin support) devoted to OER production and development
>Have good horizontal investment and some support from senior administration, but need more
Examples of support:
>SI (2008-2011): we are a preferred employer; invited to participate in local events to recruit students; use their space; heavily recruit from their cohort for employees and volunteers
>Library (2010-2011): coordinate activities, co-facilitate events, participate in local activities with them
>MERLOT (2011): joint collaboration on grant-funded projects
>Global Reach (2010-2011): share information and facilitate each other’s activities
>Survey results, support: respondents were ambivalent about participating in Open.Michigan efforts (mostly in the neutral or “agree” categories)
>Surveys based on the environmental scan of other evaluations, the CTools survey, and our own community’s interests
>CTools survey
>>U-M’s Sakai LMS
>>2007-2010 included OCW/Open.Michigan questions
>Our survey
>>All students, all faculty, and staff units that support teaching and learning
>>About 20 questions; measured sharing habits (do you share, with whom do you share), awareness of and use of OER, social comparison (I would share “if”), and support of Open.Michigan
>>Awareness, Value, Use
2010 results:
>Mostly neutral in their responses to OER use and impact (students and instructors)
>Administered via email
>Staff included: technology, instruction, and design. Staff respondents were heavily from the libraries, which may have heard of us more than other staff units across campus. Of the surveys, the staff sample is the one we are least confident in for sample size and response rate.
Topics:
>Sharing habits (what, with whom, why)
>Awareness of Open.Michigan and OER
>Use of OER (why, how, with whom, create or use?)
>Impact of OER (in teaching, learning, or professional development)
>Support of OER efforts and Open.Michigan
Awareness
Students
>94% of students don’t know about Open.Michigan
>84% of students have never heard of OER
>Validity testing: haven’t heard of OER (1,409) vs. heard of/used/created OER (256): significantly more students have not even heard of OER than those who have heard of, used, or created OER combined.
Faculty
>82% of faculty don’t know about Open.Michigan
>79% of faculty have never heard of OER
>>121 never heard of; 73 heard of/used/created
Staff
>79% of staff don’t know about Open.Michigan
>55% of staff have never heard of OER
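The validity check above (1,409 students who had not heard of OER vs. 256 who had heard of, used, or created it) can be illustrated with a one-sample z-test against an even 50/50 split. This is a sketch using only the counts reported above, not the project's original analysis:

```python
import math

never_heard = 1409    # students who had not heard of OER
heard_or_more = 256   # students who had heard of, used, or created OER
n = never_heard + heard_or_more

p_hat = never_heard / n  # observed share who had never heard of OER
# z-statistic for H0: the true share is 0.5 (an even split)
z = (p_hat - 0.5) / math.sqrt(0.25 / n)

print(f"n = {n}, never heard = {p_hat:.1%}, z = {z:.1f}")
```

With about 85% of the 1,665 respondents in the "never heard" group, z lands far above the usual 1.96 cutoff, consistent with the slide's claim of a significant difference.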
>Faculty share lecture slides 79% of the time; syllabi 63% of the time
>Staff share reading resources 72% of the time; lecture slides 32% of the time; otherwise they share data, administrative information, or communications
>Students mostly share on Google Docs, websites, or Dropbox, posting to closed networks or openly online
Lecture slides, syllabi, and assignments are the items most used/viewed/adapted by faculty and students.
Faculty
>Viewing how other faculty in my area are approaching materials (32.4%)
>Encourage colleagues to publish on Open.Michigan (43.9%)
>45.8% use course materials themselves
>37.5% publish their course materials on Open.Michigan
>>Used lecture slides from individual lectures in all three categories: viewing, downloading, and adapting
>>Use of OER as a database of resources
>>Syllabi and assignments were viewed more often than they were downloaded or adapted
>>Assignments were adapted almost as many times (8) as they were viewed (13)
Students
>Previewing prospective courses in depth before I register (35%)
>56.5% would use course material from Open.Michigan
>49.3% would encourage other students to use course material on Open.Michigan
>22.2% would support the publishing process
>>Students were heavier on the use/viewing side of OER use and lighter on the adapting side
>>Individual lecture slides (1st by far), syllabi (2nd by a good margin), and assignments (4th most popular)
>E-textbooks proved more popular among students than databases of resources
>Most popular OER: individual lecture slides (the most popular among both groups), syllabi, and assignments
>Indeed making an impact, but at a mid-level of administrative support and within the broader national/international open education landscape.
>Still need to work on education about the benefits of OER use and production in classroom and learning settings.
>Still need to work on getting a simple, clear message across to the faculty, students, and staff who create learning resources and who use them.
>Positive results from dScribes, who use OER more, understand copyright law better, and understand how to use open licenses. They have also reported learning results (being better prepared for courses and tests, or learning how to study better through this process) (dScribe exit interviews 07-08, medical survey).
>This is what the people want, the set of actionable items. Use words like: free, shareable, adaptable, attributable, public. Not: “open educational resources” or “distributable.”
Staff
>Staff did not show a preference for one kind of OER use over another. They do, however, appear to view OER much more than they download and/or adapt them.
>Overall, what we’re finding aligns with the vision statement we created.
>Future growth: increase support for OER production; expand and improve OER offerings; add value to content and services
>Ongoing evaluation: tailored to the interests and needs of our community
>Based on feedback from the surveys: focus on delivering a clear, consistent message and on making OER that is directly useful to our community
>Closing the loop: continue analysis, determine whether our assessments are making a difference, and make sure we implement changes grounded in community interests and needs
>Awareness of Open.Michigan and OER
>Refine survey questions / make them simpler
>The biggest comment and feedback was “I still don’t know what you do or what OER is!” (mostly from students)
>Scope our impact research closer to individual communities, and thus use focus groups
>Not just a research unit, but a practice and research unit: balance good thought with action
Use of our data:
>Google Analytics: developing a module in OERbit to show this information to others
>Aggregate survey results: available on request