Images accompanying this talk come from a possibly gold-bearing river somewhere in Montana.
In the Library we were getting curious about student success and whether the library was making any contribution to it at all. We were aware of Open Data being released by the University of Huddersfield based on activity, such as loans, recorded by their Library Management System. This curiosity was shared on a national scale, with JISC prepared to fund a number of projects to investigate this very question. So, like the gold-panning guys, we decided to get stuck in. How hard could it be?
Huddersfield proposed a hypothesis to test: “There is a statistically significant correlation across a number of universities between library activity data and student attainment”. Their own data analysis seemed to show that this was likely, but looking at data from a variety of institutions should show whether there was something unique about Huddersfield or the way they were collecting the data.
A number of libraries wanted to partner with Huddersfield in investigating this question. In the end seven were selected, based mainly on their confidence in being able to supply a variety of data.
Like Mrs. Beeton’s Rabbit Stew recipe, which famously begins “First catch your rabbit..”, our first task was to catch our data. In our case, this turned out to be easier than we expected. We were able to get student success data from the people who held it, and add to it information from Eduserv on Athens usage and from the Library Management System on loans. We also have an access management control gate in the main library and a computer network used in all the library branches. A student ID number was common to each of these data sets, which made it easy to merge the different spreadsheets into one; we used Microsoft Excel’s VLOOKUP function to do this. Once we had all the data on one spreadsheet we could anonymise it by deleting the student identifier column. Each partner in the project sent the data it could collate to the University of Huddersfield, where a researcher ran a number of statistical tests to determine whether the findings had any degree of significance. At DMU we also tried to visualise the data for ourselves using charts, to see if we could spot any patterns.
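For anyone curious what that merge-and-anonymise step looks like outside Excel, here is a minimal sketch in Python. The student IDs, loan counts, and degree results are invented toy data standing in for the real spreadsheets; the logic is the same VLOOKUP-style join on the common student ID, followed by dropping the identifier column before the data leaves the institution.

```python
# Hypothetical sample data standing in for the real spreadsheets,
# each source keyed on the common student ID number.
attainment = {"S001": "2:1", "S002": "First", "S003": "2:2"}
loans      = {"S001": 42, "S002": 77, "S003": 12}
eresources = {"S001": 130, "S002": 310}  # S003 has no Athens logins recorded

# VLOOKUP-style merge: one row per student, pulling the matching value
# from each dataset (defaulting to 0 where there is no match, much as
# VLOOKUP would return #N/A).
merged = []
for sid in attainment:
    merged.append({
        "student_id": sid,
        "degree_result": attainment[sid],
        "loans": loans.get(sid, 0),
        "eresource_logins": eresources.get(sid, 0),
    })

# Anonymise by deleting the student identifier column before sharing.
anonymised = [{k: v for k, v in row.items() if k != "student_id"}
              for row in merged]
```

The key point is that the identifier is only needed during the join; once the rows are linked, it can be stripped and the shared data carries no direct way back to an individual student.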
After importing the data into PASW (the successor to SPSS), Huddersfield sent us box plot charts (http://en.wikipedia.org/wiki/Box_plot) like this.
We looked at the averages by faculty shown in charts like this. Any suggestions on the causes of the differing results here?
This graph produced the most discussion among us. Does it show that women work harder or that male reading strategies are more efficient?
There is a statistically significant relationship between both book loans and e-resource use and student attainment, and this holds across all of the universities in the study that provided data in these areas. In some cases the relationship was stronger than in others, but the statistical testing shows that you can believe what you see when you look at our graphs and charts!