Presentations from 8 July 2015 CDE Innovation Network event. For more information see: https://www.gov.uk/government/news/cde-innovation-network-event-with-uk-defence-solutions-centre
Page 22
Modern rapid integration approaches offer improved exploitation potential
Integration and customer specific tailoring
Open technical architecture and standards
International training market
Applications and components
Page 23
Modern rapid integration approaches offer improved exploitation potential
Integration and customer specific tailoring
Open technical architecture and standards
International training market
IPR holders
Applications and components
Prime integrators' IPR
Open IPR
Technology and application providers' IPR
Page 24
Modern rapid integration approaches offer improved exploitation potential
Layered approach with open architecture is key to:
• consistency for customers
• access to latest investments for prime integrators
• technology providers matching capability to customer need
Individuals / small teams / larger teams / cross-service / coalition
Basic training, specific training, refresher/skills-maintenance, pre-deployment familiarisation & assessment, “mission preparation”
Enhancing existing training capabilities, as well as for inclusion in future training systems
Local benefit, but also wider benefit across defence.
Shared investment (“buy once, use many”)
Good, experienced, effective instructors make a huge difference, but rare & getting rarer.
How do we augment them, so they can spend more time delivering training rather than "wrangling" the training system?
Allow them to personalise the training to the trainee without creating additional workload?
Evidence to support their “trainers’ intuition”. Already doing some of this, where it is simple & straightforward (e.g. gunnery training – “gunnery module” for C-IOS)
All projects that involve experimentation with humans will need to undergo some form of ethics evaluation. Typically that includes any training-related trials where you will be gathering information on human response.
Ethical issues around personal data for Challenge 1 – e.g. heart rate, skin resistance, etc., for different tasks for a named individual
The protocol will need to be detailed by completing the ethics application form found on the MOD Research Ethics Committee (MODREC) website shown below. A break point should be included after Milestone 1.
Warning: MODREC approval will likely take longer than anticipated, especially if you haven't been through it before.
Ethical considerations for Challenge 2 – if proposing experiments on people
Suggest a phased approach:
Milestone 1: Gaining ethics approval for the project, including delivery of the research protocols.
Milestone 2: Proposed research that will be carried out subject to gaining ethics approval.
Reduce instructor cognitive loading
Provide trainees with accurate information on their performance of tasks and maintenance of competencies.
An example of this could be a feedback loop which evolves and responds to the trainee’s performance, keeping the trainee in a carefully maintained optimum training zone.
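The feedback loop described above can be sketched in code. This is a minimal illustration, not any MOD system: the class name, thresholds and difficulty scale are all assumptions chosen to show the idea of nudging task difficulty so the trainee's recent success rate stays inside an "optimum training zone".

```python
from collections import deque

class AdaptiveTrainer:
    """Illustrative sketch only - names and thresholds are hypothetical."""

    def __init__(self, low=0.6, high=0.85, window=10):
        self.low, self.high = low, high      # target success-rate band
        self.scores = deque(maxlen=window)   # rolling window of recent outcomes
        self.difficulty = 1.0                # arbitrary difficulty scale

    def record(self, success: bool) -> float:
        """Log one task outcome and return the adjusted difficulty."""
        self.scores.append(1.0 if success else 0.0)
        rate = sum(self.scores) / len(self.scores)
        if rate > self.high:                 # too easy -> raise difficulty
            self.difficulty += 0.1
        elif rate < self.low:                # too hard -> ease off, floor at 0.1
            self.difficulty = max(0.1, self.difficulty - 0.1)
        return round(self.difficulty, 2)
```

A run of consistent successes pushes difficulty up; a run of failures eases it back, keeping the trainee stretched but not overwhelmed.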
You need to think about both the human and technology perspectives in your proposal.
Not looking for a closed solution; a framework for gathering, analysing, storing & displaying training-related information; use of a “plug-in” structure so that data sources & analysis modules can change/grow over time.
Examples – being configured to enable access to specific local data availability/structures/formats; introduction of a new convoy performance “scoring module” for a new type of training system; new output modules can be added in, to allow others to exploit the information from the original training system (both for the individual, in future phases of training, but also to get a better understanding of the trainee ‘population’ over time, trends in Instructional effectiveness, etc.)
Early insights into an individual's training needs. Low additional workload on the instructor to deliver individually customised training.
Mapping between training objectives / measures / immediate performance / outcomes.
e.g. Provide instructors with insight on how the individual is performing compared to peers – current group & historical. Identify trends, early indicators, focus training where it will have the most benefit?
Important that it is meaningful to the instructor, easy to understand, credible/believable, easy to act upon.
Measurable – progress compared to plan/instructor’s interventions; accuracy; rate/speed; etc.
e.g. for driver training, how well does the trainee follow the 'optimal' path (where 'optimal' could be the instructor's path, the average of the other trainees, a theoretical optimal "driving line", etc.). Use of physiological measures (attached or remote sensing) to track the trainee's stress levels, arousal, engagement, etc., with the task. Identify "learning style" preferences.
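One of the measurables from the driver-training example can be sketched directly. This is an assumption-laden illustration (the function name and sample data are made up): mean lateral deviation of a trainee's driven path from a reference "optimal" line, with both paths sampled at matching points.

```python
import math

def mean_path_deviation(trainee, reference):
    """Average Euclidean distance between paired (x, y) samples."""
    assert len(trainee) == len(reference), "paths must be sampled alike"
    dists = [math.dist(p, q) for p, q in zip(trainee, reference)]
    return sum(dists) / len(dists)

reference = [(0, 0), (1, 0), (2, 0)]        # instructor's 'optimal' line
trainee   = [(0, 0.5), (1, 0.5), (2, 0.5)]  # trainee runs 0.5 m wide throughout
print(mean_path_deviation(trainee, reference))  # 0.5
```

Tracking this number over repeated runs gives the instructor a simple, credible trend line rather than a one-off impression.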
We are told that “it isn’t possible to replicate the feeling of live training in simulation”.
Why not?
“Suspension of disbelief” is often cited; what can we do, with evidence to support it, to close that “gap”?
e.g. using unrelated games to develop competencies. How to overcome the "face credibility" issues – "it doesn't look like my 'office', so it can't be of any use to me."
I will now hand over to Ed Frankland from UK DSC…
“Oculus Rift” or other low-cost, contemporary VR … unless there is a new ‘spin’, say, in conjunction with some novel haptic technology, that provides additional capability above & beyond each on their own