1. Extending the reach and learning new skills: IL the web conferencing way
Juanita Foster-Jones and Katharine Reedy
2. The Open University Library
• Serving 8000 dispersed Associate Lecturers (ALs)
• Online resources embedded in courses
The challenge: supporting ALs in their development
3. Why web conferencing?
• Something that would replicate the face-to-face experience
• Interactivity is key
• Ability to demonstrate online services
• Distance becomes immaterial
4. Core functionality
• Application share
• Whiteboard
• Survey
• Agenda builder
• Voting buttons
• Voice over IP
• Text chat
A demonstration…
5. What we were investigating
• Whether the medium was appropriate
• What skills the trainers needed
• Cost-benefit comparison to face-to-face
• What the users thought
6. Designing sessions
Early sessions
• Limited interactivity
• Limited use of tools
–Application share
–Text chat
–Voting buttons
Later sessions
• Increased interactivity
• Hands-on practice
• Increased use of tools
–Application share with mark-up
–Drawing tools
–Surveys
7. Impact of learning design
[Chart: Excellent/Good ratings for the Introduction, Blogs and Wikis, and Overall sessions]
9. Reactions: System
Rating Centra e-meeting
[Chart: Excellent/Good/Fair/Poor ratings for ease of use, suitability for delivering training, and supporting interaction]
10. Reactions: Presenter
Rating the trainer
[Chart: Excellent/Good/Fair/Poor ratings for response to questions, presentation skills, handouts, and pace/level]
11. Learning
“I now have a plan of action to utilise the information and skills demonstrated which will help develop my research skills.” [AL]
“I found it particularly useful that Helen helped me to ‘cut through the jargon’…and I now feel more like having a browse to find some interesting articles!” [Student]
12. Behaviour: 6-month feedback
• Low response rate: 13 ALs in total
• 45% reported a noticeable or significant change in confidence using library resources
• Change in practice
–“I’m now very proactive in advertising the library’s resources to my students.” [AL 1]
–“More focused advice to students.” [AL 2]
–“More likely to use the library resources than before now that I know my way around a bit.” [AL 3]
13. Impact on the practitioner
• Supporting reflective practice – peer observation
• Action learning
• Developing training skills
Kolb learning cycle
14. The future
• Benchmarking
• Roll out
• Helpdesk – look at potential for student support
The goal: extending our reach to our users… wherever they are
[2 mins KJR]
We initially conducted a scoping report looking at online tutorials (e.g. Safari), animations created with tools such as Captivate and Camtasia, and web conferencing.
But these produced a “lesser” experience for the user. Tutorials like Safari are self-directed, and there is no-one to turn to when you have questions.
Animations offer only surface learning; they are suitable for quick “how to” guides, but lack true interactivity, i.e. a dialogue.
Web conferencing offers the chance to have a real-time conversation, demonstrate services to users, and offer limited hands-on practice. Based on the scoping study we went with Centra e-meeting for our initial pilot.
[10 mins JFJ discuss and demo at same time]
What is Centra? Short explanation at this point before describing in detail (NB I’ve put the 3 slides at the end that show the interface)
Application share: allows you to share a program on your desktop, or ask a participant to share a program on their desktop. You have master control of this, but you can give a participant control of the application share so that you can watch them perform a task.
Whiteboard: allows you to insert a whiteboard for collaborative activities such as mind mapping or making notes
Survey tool: to set up multiple-choice questions to evaluate understanding/prior knowledge
Voting buttons: to check awareness, gauge responses
Voice over IP: presenters and participants can talk to each other using their PC
Text chat: this can be used as a backup for participants if they have no audio. Encourages interaction from less confident members
Agenda Builder: the ability to create an agenda using PowerPoint slides, placing markers before the meeting for interactive components such as application share. This facilitates preparation and sharing of content
Show a demo of application share of library site
Demo whiteboard & survey
Text chat
[1 min JFJ]
At the start of the pilot we had no idea how it would work. Whilst we could see the tool had good functionality, how practical would it be to use, how easy would it be to create sessions, and what skills would we need? It definitely felt a bit like “blue sky” gazing.
Key to this was having a go, and collecting evaluation data so we could see what was working, the impact, etc.
[2 mins JFJ]
Early sessions, e.g. an introduction to library services, were limited in terms of interactivity.
The main interaction was users communicating questions to the presenter through VOIP or text chat. I also used voting buttons to check they were still with me.
This reflects the need for trainers to gain confidence within the medium. We needed to get used to managing the environment and the users within it before doing anything too clever. Learning how to deliver meant slowing down the pace, ensuring instructions were clear, and keeping tabs on two or three areas of the screen at the same time.
It requires multitasking on a grand scale – like patting your head and rubbing your stomach at the same time – so preparation and rehearsal are absolutely key. You need to be aware of what could go wrong and have troubleshooting advice ready. Flexibility is needed so as not to be flummoxed by the technology, or by the variety of participants’ setups, for example, some without microphones who could only communicate via text chat.
But we wanted to see if we really could replicate a face-to-face session with hands-on practice, so we designed a “Blogs and wikis” session where we got users to create a wiki page within the session.
<play recording?/demo insert URL – probably not time?>
[1 min KJR]
What was interesting to note was the impact the learning design made on users’ perception of whether the tool supported interaction. In the evaluation we compared two sessions: a student-based introduction to library services and a blogs and wikis session. The introduction session was talk and demo; the blogs and wikis session had hands-on activity.
You can see here that on the student session most rated the interaction as good, yet on the blogs and wikis session it was split between excellent and good. The final column shows the average rating for all sessions.
So from this we can deduce that how we design our sessions, in terms of the interactions we build in, does impact on the user experience. In summary – perhaps not surprisingly – people liked an interactive session more than just watching a demonstration.
[1 min KJR]
Our evaluation looked at both the system and the presenter.
Kirkpatrick (1994, p.21) identifies four levels by which you can evaluate training programmes: reaction, learning, behaviour and results. This pyramid shows what these levels represent, and to what extent we can measure these within this pilot.
Reaction: participants’ immediate reaction, measured via “happy sheets”, recordings and text chats.
Learning: the extent to which attitudes are changed, knowledge increased or skills improved. Measured via qualitative data from “happy sheets”, recordings and text logs from sessions.
Behaviour: behaviour changed as a result of training. This is dependent on the desire to change, and on having support and rewards for changing. Six-month follow-up questions are sent to participants to gauge change in practice.
Results: to what extent has the training produced results, e.g. an increase in productivity? Given that library training delivers “soft skills” it is difficult to identify specific results.
[1 min KJR]
‘Level 1 - Reactions’ provided feedback on people’s impressions immediately after the session.
The evaluation on the “happy sheets” covered two areas: use of the Centra system itself to determine its suitability for delivering training, and evaluation of the presenter, to identify whether they were competent in leading sessions in this medium.
As you can see the system was rated very positively by attendees
[1 min KJR]
The trainers were rated as excellent, which, given that they were delivering in a new medium with the added complication of technical issues, is no small achievement.
There is an anomaly in the data for the question on handouts. 16 users didn’t complete this question, despite the fact that within each session there were documents available to download which were in fact the handouts. For future evaluations this question should be reworded so that people understand which application within the technology this refers to.
In addition the sheets asked whether participants would recommend the event to a colleague. 86% of all respondents said definitely, with 12% responding possibly. This supports initial feedback within sessions where participants either said or recorded in text chat that they found the session “excellent”, “good” or “useful”.
[1 min KJR]
We received a number of comments from tutors and students on the happy sheets and from the recordings of sessions that indicated learning “The extent to which attitudes are changed, knowledge increased or skills improved,” had occurred. Here are just a couple of examples.
[1 min]
We asked ALs to rate whether their confidence had improved and also whether they had changed their practice. This is what we got
From this we have limited evidence that we have satisfied three of Kirkpatrick’s four levels.
It is not easy to get people to respond after 6 months [we have also tried this method of evaluation with face-to-face training], so we are now trying after 3 months to see if this elicits a better response.
[5 mins KJR]
It made me think about learning design – how to make a session as interactive as possible within the constraints of the medium (checking at regular intervals that people haven’t dozed off or lost the will to live…), and trying to make use of all the features and functionality of the software to add variety.
Worked together with Juanita to develop the training session I led and she sat in on a couple of practices as well as the actual session itself. Afterwards she provided structured feedback on how the session had gone and what could be done differently in future – very useful.
Knock-on effect on face-to-face training programme, for example, trying to evaluate training in a consistent way so that we can make meaningful comparisons between remote and F2F training. Also led to reflection on good practice in learning design – start with what people know and build on that, check understanding, allow time for people to reflect on and evaluate what they have learnt
Finally, it was very good fun (despite the odd technical blip) and stimulating to learn to use the system and start to investigate its potential for delivering training and support to remote users.
Summarise issues and benefits:
Issues
Lack of visibility – can’t read non-verbal cues
Slowness of pace – have to make sure everyone can hear, keep checking understanding
Group size – not suitable for more than 10 people at once
Time constraints – intense concentration needed, for trainer and participants, which means max session length of about 1.5 hours and can cover less ground due to slower pace. Not so great for hands-on
Accessibility – been addressed with some specially written guidelines?
Benefits
More cost-effective than face-to-face training, as so much time is lost in travelling if we send someone to where people are.
Can offer to students as well as ALs (currently the focus is on training tutors, so they can then pass on the info to their students)
Has enabled us to develop good practice in holding e-meetings
Providing opportunity for reflective practice for trainers
Has helped us to focus on good learning design
Benchmarking
Applying Kirkpatrick’s model to the evaluation data from last year, we have been able to create average scores for each of our criteria. This is the benchmark we are using to compare all future sessions against. We are then using this as a performance indicator, setting ourselves a target that 75% of sessions should equal or improve on the previous year’s scores. Thus we can assess the quality of sessions, and continue to build on what we have achieved so far.
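The benchmarking calculation described above can be sketched as follows. This is a minimal illustration only: the criteria names and all score values here are hypothetical, not the actual pilot data.

```python
# Sketch of the benchmarking approach: average last year's evaluation
# scores per criterion to form the benchmark, then check what proportion
# of this year's sessions equal or improve on it (target: 75%).
# All criteria and numbers below are illustrative, not the pilot data.

def benchmark(previous_scores):
    """Average score per criterion across last year's sessions."""
    return {c: sum(vals) / len(vals) for c, vals in previous_scores.items()}

def meets_target(benchmarks, new_sessions, target=0.75):
    """True if at least `target` fraction of new sessions equal or
    improve on the benchmark for every criterion."""
    ok = sum(
        1 for session in new_sessions
        if all(session[c] >= benchmarks[c] for c in benchmarks)
    )
    return ok / len(new_sessions) >= target

# Hypothetical worked example
previous = {"ease_of_use": [3.5, 3.8, 3.6], "interaction": [3.0, 3.4, 3.2]}
marks = benchmark(previous)
sessions = [
    {"ease_of_use": 3.7, "interaction": 3.3},
    {"ease_of_use": 3.8, "interaction": 3.5},
    {"ease_of_use": 3.4, "interaction": 3.0},
    {"ease_of_use": 3.9, "interaction": 3.4},
]
print(marks, meets_target(marks, sessions))
```

In this sketch a session only counts towards the 75% target if it matches or beats the benchmark on every criterion; a per-criterion indicator would be an equally reasonable reading of the approach.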
Rollout
Other colleagues are being trained, and the skills and good practice will be transferable.
Helpdesk
There is potential for supporting distance learners in other ways, and perhaps extending the use of web conferencing to ALs for their tutorials. For example, at present we have to diagnose problems by phone without being able to see the person’s screen – this could allow them to show us the problem and we could grab a screenshot of what’s going wrong.
The next step
The OU has chosen Elluminate, a similar system, and the experience with Centra will be useful when Elluminate is rolled out for use with the OU’s VLE. Although the OU ultimately opted for a different system to Centra, Elluminate is similar in many ways (mention a couple of key differences that relate to the OU’s particular situation?) and the sessions already designed should be reusable with Elluminate, as the emphasis has been on good learning design. [we may have to leave this out… because the monsters]
What is Centra – what does it look like? (use only if demo fails)
Point out agenda – and explain that it has a tool that enables you to create content in advance, including putting placeholders for interactions such as whiteboards, application shares etc.
Just point out quickly ways attendees can interact with presenter
Note that the web page being shared is displayed in the center frame. On the left are the leader, participants and agenda for the session. The leader is sharing their desktop
This shows the markup tools on an application share. Presenters can use these to highlight sections of the screen