Engaging learners in eLearning, and retaining their interest, has been an ongoing challenge for eLearning managers, while the effectiveness of classroom training depends heavily on the trainer's personality. eLearning providers have been using game-based learning themes with interactive elements driven by input devices such as a mouse or touchpad, but this limits engagement to the learner's eyes and hands. Certain concepts require hands-on practice and more interaction than a laptop or mobile device can provide.
Experiential learning games help trainers deliver far more engaging and standardized sessions, and learners can practice skills in a close-to-real simulated environment both during and after the learning session. The Kinect system identifies individual players through body gestures and voice recognition, and you can use this capability to create experiential learning games that take engagement to new levels. Participants will see a live demonstration of motion-sensing learning games in real learning situations. You'll explore ways to implement such experiential learning games for your own business needs, making learning, and the post-learning consolidation and retention process, much more engaging and effective.
In this session, you will learn:
1. About motion sensing technology and how to apply it to learning
2. The advantages of experiential learning games
3. When experiential learning games are most effective
4. Samples of Kinect-based solutions
5. Challenges in developing these solutions
1. Using Xbox Kinect for Delivering Experiential Learning Games
Kinectify Your Digital Learning Initiatives
Presented by: Manish Gupta, CEO, G-Cube
www.gc-solutions.net
2. Agenda
• Motion-Aware Learning Experiences
• Motion and Learning in Your Organization
• Kinect-Based Learning Solutions
• Core Aspects of Building Kinect-Based Learning Experiences
• Other Motion-Tracking Options
4. Motion Awareness
• Triggered via visuals, sounds, and electrical and mechanical activity
• Detecting motion
– Motion detection for security and lighting
• Directional motion
– Directional motion detection for whole objects, such as detecting approaching or receding vehicles
• Granular motion detection
– Parts of an object, from facial expressions and finger movements to full-body motion
5. Motion-Aware Learning Experiences
• Inferring from motion in learning
– The ability to understand what motion represents while learning is in progress
– Could be a mixture of many things:
• Eye tracking (focus of interest, blink velocity, tiredness, etc.)
• Facial expressions (smiling, frowning, bored)
• Head position (looking at or away from the screen)
• Inferences from other body positions
– Leading to insights into concentration, focus, degree of engagement, etc., that can be used to modify learning environments dynamically
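The eye-tracking signals listed above can be combined into a single engagement signal. A minimal sketch, following the Galley (2001) observation cited in this deck's notes (rising blink rate plus falling blink velocity and eyelid openness indicate tiredness); the function name, inputs, and baseline values are illustrative assumptions, not part of any tracker's API:

```python
def tiredness_score(blink_rate_hz, blink_velocity, eyelid_openness,
                    baseline_rate=0.3, baseline_velocity=1.0,
                    baseline_openness=1.0):
    """Crude tiredness estimate in [0, 1] from eye metrics, each
    normalized against a per-user baseline (thresholds illustrative)."""
    rate_factor = min(blink_rate_hz / baseline_rate, 2.0) / 2.0          # rising blink rate -> tired
    velocity_factor = 1.0 - min(blink_velocity / baseline_velocity, 1.0)  # falling blink velocity -> tired
    openness_factor = 1.0 - min(eyelid_openness / baseline_openness, 1.0) # drooping eyelids -> tired

    return (rate_factor + velocity_factor + openness_factor) / 3.0

# An alert learner at baseline scores lower than a drowsy one.
alert = tiredness_score(0.3, 1.0, 1.0)
drowsy = tiredness_score(0.6, 0.5, 0.6)
```

A score like this could feed the "modify learning environments dynamically" step, for example by prompting a break when the score stays high.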
6. Motion-Aware Learning Experiences
• Inferring from motion for learning
– The ability to infer that the right motions are being learned as part of the learning
– Could be a mixture of many things:
• A set of physical movements to accomplish an outcome (e.g. dancing)
• The degree of speed, force, or pressure employed in the action (e.g. learning a sport)
– Leading to insights into how well the physical acts embodied in the learning are being replicated by the learner
7. Motion-Aware Learning Experiences
• Inferring from motion through learning
– The ability of the learning environment to change based on the learner's motions as she progresses through the learning
– Could be a mixture of many things:
• Mimicking real-world responses in simulated environments (e.g. aircraft control, safety drills)
• Changing the context and difficulty of learning based on the quality of the learner's response (e.g. surgery)
• Adapting to trainees with disabilities (e.g. sign-language interfaces)
– Leading to learning environments that adapt to the learner's progress or special needs
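The "changing the context and difficulty" idea above can be sketched as a small policy function. A minimal sketch, assuming a hypothetical 0–1 response-quality score produced elsewhere (e.g. from motion-matching); the level names and thresholds are illustrative:

```python
def next_difficulty(current, response_quality,
                    levels=("intro", "standard", "advanced")):
    """Move the learner up or down one difficulty level based on the
    quality (0-1) of the last motion response; thresholds illustrative."""
    i = levels.index(current)
    if response_quality >= 0.8 and i < len(levels) - 1:
        return levels[i + 1]   # consistently good motion: raise difficulty
    if response_quality < 0.4 and i > 0:
        return levels[i - 1]   # struggling: ease off
    return current             # otherwise stay at the current level
```

One-step moves (rather than jumping straight to the hardest level) keep the adaptation gradual, which matters when the "response" is a physical skill being built up.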
8. Activity
Have you experienced or used motion in any of the ways described so far?
- In learning (motion when learning is in progress)
- For learning (motion as an integral part of the learning)
- Through learning (when learning environments adapt to your motion)
If yes, what has your experience been with the effectiveness of these motion-aware learning experiences?
Note: Please include both technological and human-observable motion.
10. Motion & Learning in Your Organization
• Some obvious statements
– Your e-learning could be made far more engaging and fulfilling if you knew how your users were experiencing it physically
– A lot of learning that involves physical motion can now be digitally experienced and assessed at varying levels of detail
– Learning environments can adapt to users' physical actions, making them more immersive and thereby increasing learning effectiveness
• Some obvious caveats
– No technology (yet) can completely simulate real-world touch and feel, or real-world complexity
– Technologies such as eye tracking may be expensive to procure and implement in their current state
– Some of these technologies may raise "big brother" privacy concerns
11. Motion in Learning for the Sales Function
Presentation skills for salespeople:
• Body language
• Verbal communication
• Facial expressions
12. Motion in Learning for the Operations Function
• Standard operating procedures
• Safety operations
• Operating machines
• On-ground training (aircraft signaling, medical procedures)
13. Motion in Learning for the Recruitment Function
• Psychometric profiling
– Present lifelike immersive scenarios
– Full voice and body-language analytics
• Induction programs
– Immersive virtual exploration of the company
– Foreign-location orientation
14. Motion in Learning for the People Function
• Employee wellness initiatives
– Fitness training programs
– Competitive and collaborative fitness-based events
– Gamification
• Self-defense programs
15. Motion in Learning for the Customer
Customer-facing solutions:
• Clothes shopping
• Spectacles or hairstyles
• Games for discounts
• Shopping recommendations
16. Activity
Take the function(s) you are involved in supporting in your organization. Conjure up as many examples as you can of employing motion in learning for your organization.
Critics: come up with as many "yeah, but…" statements as you can.
18. Affordances
• Gesture-based computing
• Facial recognition
• Speech recognition using the MS Speech engine
• Object reconstruction
• Community (Xbox Live)
• Integrated real-time video
• Content (Kinect-enabled experiences)
19. Introducing Kinect
• Color VGA video camera: aids facial recognition and other detection features by detecting three color components: red, green, and blue
• Depth sensor: an infrared projector and a monochrome sensor work together to "see" the room in 3D regardless of lighting conditions
• Multi-array microphone: an array of four microphones that can isolate the players' voices from the noise in the room
• Skeletal tracking: detects and tracks 48 points on each player's body, mapping them to a digital reproduction of that player's body shape and skeletal structure, including facial details
22. Speech
• Speech recognition is one of the key functionalities of the Kinect API
• The Kinect sensor's microphone array is an excellent input device for speech-recognition-based applications
• It provides better sound quality than a comparable single microphone and is much more convenient to use than a headset
• Managed applications can use the Kinect microphone with the Microsoft.Speech API, which supports the latest acoustical algorithms
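On the application side, a recognizer typically reports a phrase plus a confidence value, and the app maps that to a command. A minimal sketch of that consuming logic in Python (the command phrases, confidence threshold, and function name are illustrative assumptions, not the Microsoft.Speech API itself):

```python
# Hypothetical voice-command table for a learning game.
COMMANDS = {"start drill": "begin", "pause": "pause", "show score": "score"}

def handle_recognition(phrase, confidence, threshold=0.7):
    """Map a recognized phrase to an app command, rejecting
    low-confidence recognitions (threshold is illustrative)."""
    if confidence < threshold:
        return None  # too uncertain: ask the learner to repeat
    return COMMANDS.get(phrase.lower())  # None for out-of-grammar phrases
```

Rejecting low-confidence results matters in a training room, where background chatter would otherwise trigger spurious commands; the multi-array microphone's noise isolation helps, but a threshold is still a sensible safety net.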
23. Kinect Interaction
• Identification and tracking of the primary interaction hand of up to two users
• Detection services for the user's hand location and state
• Grip and grip-release detection
• Press detection
• Hover, select, wave, and other standard interactions to minimize the learning curve
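Grip and grip-release events naturally compose into a drag gesture. A minimal state-machine sketch in Python; the event strings and class name are illustrative, standing in for whatever hand-state values the interaction layer actually reports:

```python
class GripDragTracker:
    """Tiny state machine turning grip / release hand-state events
    into completed drag gestures (event names are illustrative)."""

    def __init__(self):
        self.dragging = False
        self.path = []

    def update(self, hand_state, position):
        """Feed one tracking frame; returns a gesture tuple on release."""
        if hand_state == "grip":
            self.dragging = True
            self.path = [position]          # start recording the drag
        elif hand_state == "release" and self.dragging:
            self.dragging = False
            start, end = self.path[0], position
            return ("drag", start, end)     # completed drag gesture
        elif self.dragging:
            self.path.append(position)      # intermediate hand movement
        return None

tracker = GripDragTracker()
```

Keeping the full path (not just the endpoints) leaves room for richer checks later, such as scoring how smoothly a learner moved an object in an assembly exercise.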
24. Kinect Fusion
• Kinect Fusion provides 3D object scanning and model creation using a Kinect for Windows sensor
• The user can paint a scene with the Kinect camera and simultaneously see, and interact with, a detailed 3D model of the scene
• Kinect Fusion can run at interactive rates on supported GPUs, and at non-interactive rates on a variety of hardware
25. Face Tracking SDK
• The Face Tracking SDK's face tracking engine analyzes input from a Kinect camera, deduces the head pose and facial expressions, and makes that information available to an application in real time
• For example, this information can be used to render a tracked person's head position and facial expression on an avatar in a game or communication application, or to drive a natural user interface (NUI)
26. Latest SDK 1.8
• New background removal: an API removes the background behind the active user so that it can be replaced with an artificial background
• Realistic color capture: a new Kinect Fusion API scans the color of the scene along with the depth information, so it can capture the color of an object along with its three-dimensional (3D) model
• Improved tracking robustness: this algorithm makes it easier to scan a scene
• HTML interactions: allows developers to use HTML5 and JavaScript to implement Kinect-enabled user interfaces
• Adaptive UI: build an application that adapts itself depending on the distance between the user and the screen, from gesturing at a distance to touching a touchscreen
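The adaptive-UI point boils down to choosing an interaction mode from the user's distance to the screen. A minimal Python sketch; the distance bands and mode names are illustrative assumptions, not values from the SDK:

```python
def ui_mode(distance_m):
    """Pick an interaction mode from the user's distance to the
    screen, in meters (band boundaries are illustrative)."""
    if distance_m < 0.5:
        return "touch"      # close enough to use the touchscreen
    if distance_m < 2.0:
        return "cursor"     # mid-range: hand-cursor pointing
    return "gesture"        # far away: broad body gestures only
```

In practice you would also add hysteresis at the band boundaries so the UI does not flicker between modes as the user shifts weight, but the core decision is this simple.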
28. Design Considerations
Choose your use cases carefully
• Embedding physical motion in learning can be a daunting task for most regular corporate learning, because we are so used to traditional eLearning and classroom modes
• Kinect's most obvious use cases are those where physical motion is part of the training; at this point, gross limb movements are tracked better than fine-grained movements (e.g. fingers)
• Building algorithms to do intelligent things (and not just skeletal tracking or facial recognition) can get very complex very quickly
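A concrete example of why gross motion is the easier target: a "both hands raised" check needs only a comparison of a few skeletal joint positions. A minimal sketch, assuming a hypothetical dict of joint name to (x, y, z) coordinates with y pointing up; the joint names and margin are illustrative, not the SDK's identifiers:

```python
def hands_above_head(joints, margin=0.05):
    """Detect a gross 'both hands raised' pose from skeletal joint
    positions (joint names and margin are illustrative)."""
    head_y = joints["head"][1]
    return (joints["hand_left"][1] > head_y + margin and
            joints["hand_right"][1] > head_y + margin)

raised = {"head": (0.0, 1.6, 2.0),
          "hand_left": (-0.3, 1.8, 2.0),
          "hand_right": (0.3, 1.8, 2.0)}
lowered = {"head": (0.0, 1.6, 2.0),
           "hand_left": (-0.3, 1.0, 2.0),
           "hand_right": (0.3, 1.0, 2.0)}
```

A finger-pinch check, by contrast, would need reliable per-finger tracking the sensor does not provide well, which is exactly the gross-versus-fine trade-off the slide describes.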
29. Design Considerations
3D Worlds, Gamification, Serious Games and Simulations (GSGS)
• Typically these will involve some components from 3D sets and a serious-games / simulation-based component
• Using GSGS in motion-based learning environments increases effectiveness and engagement levels
• Learners are more motivated in competitive settings, and there are also avenues for rich collaboration between players
• Bodily immersion in the gaming environment increases levels of engagement
30. Design Considerations
Leverage the Human Interface
• Too often, the most-used actions in an educational solution may be simple "point and press" actions
• Kinect provides a substantially different interface for performing actions; use natural actions as far as possible
• Read the Kinect Human Interface Guidelines for a detailed look at how the interface should be designed
31. Deployment Considerations
• Space: the sensor requires around 4 ft of distance from the user, and the user should not feel cramped executing the actions
• Privacy: users may feel "odd" or "silly" performing physical actions in front of the screen
• Personalization: you would need to manage quick identification of users on shared Xbox installations and map them to the corporate directory
• LMS: you can still integrate with an LMS if it supports web services for communication, or by using the Tin Can API
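The Tin Can (Experience) API route works by POSTing JSON "statements" of the form actor–verb–object to a learning record store. A minimal sketch of building such a statement in Python; the learner address and activity URL are illustrative placeholders, while the verb URI is a standard ADL verb:

```python
import json

def xapi_statement(learner_email, verb_id, verb_name,
                   activity_id, activity_name):
    """Build a minimal Tin Can (xAPI) statement recording a
    motion-based learning activity (IDs here are illustrative)."""
    return {
        "actor": {"mbox": f"mailto:{learner_email}"},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
        },
    }

stmt = xapi_statement(
    "learner@example.com",
    "http://adlnet.gov/expapi/verbs/completed", "completed",
    "http://example.com/activities/safety-drill", "Safety Drill Simulation")
payload = json.dumps(stmt)  # POST this to the LRS statements endpoint
```

Because the statement is plain JSON over HTTP, the Kinect application can report results without the SCORM packaging a traditional LMS would require, which is why Tin Can fits standalone installations well.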
33. Leap Motion
• The Leap Motion Controller senses your hands and fingers and follows their every move
– Useful for learning interactivities where minute hand movements are to be tracked (e.g. working with small instruments)
– Mold, stretch, or bend 3D objects
– Take things apart and put them together
– Interact with content using hand movements
34. Tobii
• Tobii is one of the leaders in eye tracking and gaze interaction. This technology makes it possible to know exactly where users are looking, which can enable various applications:
– From point-and-click to look-and-do
– Interacting with content without distractions
– Can be used as assistive technology
– Can be used to evaluate users' content-interaction patterns in controlled groups
35. Summary
1. Kinect and other gesture-friendly devices are exciting innovations and have the potential to augment our current learning solutions
2. Top use cases will be in areas where physical motion is demanded as part of the training, or where serious games can be built that involve bodily motion
3. However, Kinect-based learning solution development is not easy, and the design needs to be extremely strong
Notes and references:
- http://www.willatworklearning.com/2009/02/eye-movement-studies-should-we-be-using-them-for-learning-design.html
- http://elearnmag.acm.org/featured.cfm?aid=1833511
- http://i-know.tugraz.at/papers/adele-a-framework-for-adaptive-e-learning-through-eye-tracking
- http://i-know.tugraz.at/wp-content/uploads/2008/11/70_adele.pdf
"Blink velocity and frequency together with the eyelids' degree of openness can provide information on the user's tiredness level. Increasing tiredness is said to be indicated by increasing blink rate, decreasing blink velocity and decreasing degree of openness" [Galley 2001].