This document outlines a 5-step process for using data to develop a data-driven learning and development strategy:
1. Benchmark current performance to establish goals and measure improvements.
2. Identify training needs by talking to stakeholders and understanding learner wants and needs.
3. Use collected data to inform the strategic direction of learning based on business and learner needs.
4. Implement training programs and evaluate their performance based on goals and metrics.
5. Engage in continuous improvement by testing changes, collecting more data, and refining the strategy over time.
2. A bit about me
• Over 5 years' experience in the learning technologies industry
• Worked at Kineo, Mind Click/Learning Pool, Litmos Heroes and now Thrive
• Data-driven marketer; used data to create huge results in marketing and sales pipelines
3. "More than 60% of HR professionals say using data analytics in workforce decisions is important, but only 27% actually do it."
www.thrivelearning.com
5. What are our training goals?
Let's be honest…
…it all boils down to business improvements.
• Better leaders = happier, more engaged employees
• Product training = more confident, knowledgeable staff
• Compliance/E&D/H&S training = safer workplace for all employees
6. Why the sudden appetite for change?
• Greater pressures on businesses to improve
• 35% Millennial workforce, 46% by 2020
• Dramatic declines in employee engagement and learner connection
• Distinct shifts in company culture (or at least expectations of change)
9. Ways to use data in L&D
• Iterate and test to improve approaches (establish pillars of success through benchmarking)
• Target and segment audiences for maximum impact (engagement data)
• Prove ROI and L&D effectiveness – prove your value, don't become obsolete
Only 27% of employees fully trust the business they work for…
10. Data-driven L&D strategy process
1. L&D goals identification – Benchmark; Metrics/KPIs; Outline desired outcomes
2. Identify training needs – Target audiences; Understand learners and 'market'
3. Use data to develop strategic plans – Action plans
4. Implement plan – 'Campaign' brief (training implementation plan)
5. Evaluate – Tracking; Reports; Continuous optimisation framework
11. Collect
Learner data:
• Employee surveys/interviews/happy sheets
• Pre- and post-assessments
• LMS data
• Google Analytics
Business data:
• NPS scores
• Sales figures
• Management/leadership surveys
• Staff turnover/HR data
Visualise/Store:
• Surveys = Typeform (free)
• Spreadsheets (pivot tables are your BFF). Smart graphs are also great.
• Google Analytics
• Klipfolio (data visualisation tool) or Visualizefree.com (free)
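To show what the pivot-table idea looks like in practice, here is a minimal sketch in plain Python. The survey export, column layout and scores are all invented for illustration; a spreadsheet or pandas would do the same job:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical happy-sheet export: (department, course, score out of 5)
responses = [
    ("Sales", "Product basics", 4),
    ("Sales", "Product basics", 5),
    ("Sales", "GDPR", 2),
    ("Support", "Product basics", 3),
    ("Support", "GDPR", 4),
    ("Support", "GDPR", 3),
]

# Pivot: average score per (department, course) pair
cells = defaultdict(list)
for dept, course, score in responses:
    cells[(dept, course)].append(score)

pivot = {key: round(mean(scores), 2) for key, scores in cells.items()}
for (dept, course), avg in sorted(pivot.items()):
    print(f"{dept:<8} {course:<15} {avg}")
```

Grouping like this immediately surfaces which audiences rate which training poorly, which is exactly the kind of segmentation the engagement-data bullet above is pointing at.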
12. Step 1. Benchmarking
• Define your status quo – what does good look like in your business?
• Establish baseline expectations to clearly measure and monitor changes in performance
• Provide your strategies and goals with much-needed context
13. Why do we benchmark?
• Identify and prioritise specific areas of opportunity
• Understand your learners' needs better
• Identify your strengths and weaknesses
• Set goals and performance expectations based on present performance
• Monitor your performance and effectively manage change
• Understand learner behaviour to become even more effective
14. How do you benchmark?
• Identify what you want to benchmark
• Identify existing learning and explore trends
• Outline objectives and goals
• Develop an action plan
• Monitor results and adjust accordingly
… both QUALITATIVE AND QUANTITATIVE data are required
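To make the "monitor results" step concrete: a benchmark is simply a stored baseline you compare later readings against. A minimal sketch, using invented engagement-survey scores on a 1-10 scale:

```python
from statistics import mean

# Hypothetical engagement survey scores (1-10) for one team
baseline_scores = [5, 6, 4, 5, 6, 5]   # captured before the programme
current_scores = [6, 7, 6, 5, 7, 6]    # captured again after six months

baseline = mean(baseline_scores)
current = mean(current_scores)
change_pct = (current - baseline) / baseline * 100

print(f"Baseline: {baseline:.2f}")
print(f"Current:  {current:.2f}")
print(f"Change:   {change_pct:+.1f}%")
```

The point is not the arithmetic but the discipline: without the baseline row, the "current" number has no context and no story to tell.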
15. Step 2: Identify training needs
• Talk to key stakeholders and learners
• Define audiences or 'personas'
• Understand learners and your 'market'
• What do your employees want/need?
• How can you deliver learning which meets those expectations?
16. Step 3: Use data to develop strategic plans
• Use the data captured to define the strategic direction of L&D, considering:
• What the business needs
• What learners need
• What learners want
17. Step 4 and 5: Implement and evaluate
• Track performance of training programmes, aligned with goals and KPIs
• Use engagement data to learn what learners like, want and connect with. Do more of what works, less of what doesn't.
• Install Google Analytics
• This process never stops – continuous improvement cycles over time
• A/B test where you can (and never, ever stop)
19. This is just the beginning…
• You have to start collecting data for it to be impactful
• Don't be afraid to fail
• Start small – don't bite off more than you can chew
• Make sure you include your learners in this dialogue
Editor's notes
Housekeeping:
Webinar will be recorded and mailed out early next week to all registrants. Slide deck will also be shared
Please pop any questions in the chat/Q&A area and I will answer them at the end
You’ve probably not heard of me – but I just wanted to explain a little about who I am and my experience.
I'm massively passionate about data and have found my marketing to be much more effective when I use it to inform my decisions. I have driven millions of pounds of sales pipeline, improved customer engagement and advocacy, greatly increased lead generation, minimised lead churn and ultimately run extremely successful marketing campaigns and email marketing.
WHY? Apart from my charming and weird accent, data and strategy are the main culprits here.
Companies are increasingly pursuing data-driven talent decisions. Whether that's to anticipate and remediate skills gaps, eliminate bias in hiring or performance and rewards decisions, or leverage business scenario planning to ultimately determine the workforce mix.
A PwC report surveyed over 1,200 business and HR leaders. The survey findings highlight the need for organisations to invest in digital tools to drive people decisions. We see this as a 'no regrets' move in preparing for the future. But this requires the baseline data to be accurate, and the challenge today is that jobs don't reflect what people do. Many companies don't have accurate data on who does what and where, and few have an inventory of their people's skills for development purposes. This is where using data and analytics can make a real difference.
In my experience, data has been the difference between making an impact and driving real change and nothing really being affected. Gut instinct just isn’t enough anymore and clearly we need better skills. So where do we start?
https://globenewswire.com/news-release/2018/11/13/1650171/0/en/Organisations-are-not-doing-enough-to-prepare-for-the-future-of-work-finds-PwC-report.html
When I started to think about planning this webinar, I thought a lot about the purpose of L&D. I wrote a huge (somewhat ranty) article a while back on LinkedIn about data needing context – it’s purposeless otherwise. If you collect data, but you don’t know why, what for or indeed what your goals or reason for collecting this data are, the data really means very little.
So, I wanted to start with some context. I wanted to reflect on what our purpose in L&D truly is and indeed, what is the purpose of training? What is our context?
POLL
OUR PURPOSE!
The exponentially growing global economy, fast-paced business world and ultra-modern technological advancements are compelling everyone, from small companies to big corporations, to increase their client base and grow their business.
- LEADERS = lower turnover, improved morale, reduced onboarding costs
- PRODUCT = more sales
- COMPLIANCE = fewer lawsuits, a safer and more inclusive workplace for employees
So it all boils down to driving better value for the business from the human resources you have. Of course, there are more fluffy, nice-feeling sentiments towards developing your employees. But don't tell me you would be delivering AML, GDPR courses etc. if there wasn't a threat of legal action.
So – L&D is all about delivering value. But how on earth is value presently being proven? How does L&D even know how to add value without data? I've always been bamboozled that L&D isn't expected to prove their value, much like marketing is. Marketing functions scrabble to join the dots between investment and returns (i.e. if I pay for ads on Google, I want to know how many people clicked, how many converted and indeed my ROI – how many became customers as a consequence. This takes time; data like this is not gained overnight). Isn't it mad that L&D makes massive investments with little emphasis on the impact of these training programmes? Did the business benefit? Did the learner benefit? Who even knows?
I cast my mind back to when I first joined the LT industry in 2013. My first LT event was plagued with mobile learning. Two years later, gamification. Let me ask you, would you have really attended a webinar on data analytics even two years ago? Probably not. But data is not a trend.
So why the sudden appetite for change?
- Greater pressures on businesses to improve - L&D knows they need to keep up but they don’t know where to start. Data helps to contextualise requirements, better understand the current learning landscape and understand your learners.
- More and more millennials at work - Millennials are 35% of the workforce. By 2020 they'll be 46% of the working population. They expect something different from other generations – and these changes are a knock-on effect of more and more Gen Y entering the workforce. I know you've heard all the analogies: 80% of web content consumed will be video next year. Personalised experiences. The internet of things is working hard to drive personalised experiences, and learners now expect this level of interaction everywhere. That means work too.
- Huge, notable and dramatic declines in learner connection (take some stats out of the learner experience report) – According to our learner engagement report, nearly 60% of 18-34 year olds contemplated leaving their role last year, only 27% of learners fully trust the business they work for, and 52% of learners do not find their present training engaging. Of that 52%, 62% also considered quitting in the past year.
Company cultures are changing - (maybe show the modern learner infographic) - transient workforces, WFH. Typical models just aren't conducive to modern ways of working. Expectations of autonomy (now we fix our own problems, find our own solutions) etc…
Clearly a lot needs to change – so where on earth do you start?
First of all – let's talk about how we presently deliver training programmes. They're very linear, with a clear beginning and end point – training is delivered in sequential, curriculum-based patterns. Our CTO talked about this in a blog recently: we're taking academic training models and trying to apply them to L&D, which doesn't often work. And then when the training is done, what then? It is forgotten about. Maybe in 5 years it will be revisited for a revamp… but is this really the right way?
We're stuck in compliance mindsets… the only goal is to complete. No wonder we haven't bothered ourselves with data – it's had no context in the traditional way we deliver learning. The irony is that completions don't align with most of the L&D goals we discussed initially (improving business value – barring the compliance models).
Over the past decade, we marketers have been accumulating software, tools and lots of data in an effort to modernise and digitise our organisations.
I started thinking about this in my world of marketing. Why do I collect and use data, and how might these same standards apply to L&D? Why is marketing so similar to L&D? We're trying to make people do stuff they don't really want to do – we're trying to change mindsets and behaviours.
So being the millennial that I am, I googled marketing campaigns. Every single image showed a cyclic model.
And then I thought - well geez, how many L&D programmes are cyclic? How many times have L&D released a piece of learning and measured it and then said, well shit that didn't work. Let’s try something else.
Nope - it’s popped out into the ether, and never to be seen again. Once done, trapped in the deep deep recesses of your dusty old LMS. Where’s the iteration? Where’s the desire to make things better?
In the years I’ve spent working at LT companies, the need for training is always very clear and explicit to our customers. “We need some product training.” “We need some leadership training.”
But often these needs are completely baseless - they come from other key stakeholders, sales managers, C-suite etc expressing a clear need for training which is likely just based on subjective opinion and perhaps gut feel.
This iterative model is, in my opinion, a new way of looking at organisational training – and capturing and analysing data is the first step in that journey. So, let's use a marketing campaign approach to create a much more structured approach to learning, using data collection, analysis and subsequent iterations. So where do we start?
First of all – I think it’s important to understand the practical applications of data captured before we begin discussing how it’s captured and indeed what to do with it when it is.
Talk to bullet points
This isn't an exhaustive list by any means, but it shows you three very clear and distinct data streams which all ultimately help to improve L&D's approach to delivering training.
POLL
How many of you presently use data to influence your L&D strategies?
I have adapted a traditional digital marketing strategy process for L&D – it highlights a clear order of events to not only capture data, but organise and use it in a way which will drive impact for learners, L&D and the business. I know this looks linear, but in reality steps 3, 4 and 5 should be in a continuous cycle of iteration. So let's examine what a data-driven strategy process could look like in practice.
Identify goals – this is where your data collection begins and benchmarking will be a massive part of that. This is where you set your status quo and create clear goals and agendas for your L&D activities. Provides direction and aligns efforts.
This is your context – what are your training requirements, and how do they align with your goals? What do your learners want vs what key stakeholders need? Understand your 'market' – what's your learning landscape? What technologies do you have at your disposal? How can you use this tech to meet your goals?
After benchmarking and identification of training needs, you will have a huge amount of qualitative and quantitative data to help create clear strategies and action plans which work to serve your goals and meet business needs.
Implementation and iteration stages – other types of data collected here include engagement data, implementation plans etc
This is about orchestration. It's being adopted as a mantra by successful CMOs, with their sales and customer success colleagues, to connect internal AND external tech, processes, resources and data to better deliver customer experiences. This means more commitment to connecting it all, and a greater focus on how to draw insights and action from the plethora of customer data now at our disposal. With this in check, we can selectively add and bet on the innovative initiatives that differentiate our brand and business. L&D is exactly the same.
So, how do we bring all of these important assets together to create a more high-performing organisation?
I just wanted to showcase really quickly, before I go into step 1, in a succinct slide, the sorts of data you can be collecting and what you can do with it.
Not exhaustive – just some starting points for you to consider from the varied data sources at your disposal.
POLL: Do you presently benchmark training efforts in your business?
In the lead up to this webinar, I had a few registrants highlight some of the things they wanted to learn about when it came to data in L&D – and I really felt that there was a clear lack of understanding about the role data can play. It’s an intimidating endeavor.
Many of them asked me things like: How can I prove people learned something? If I deliver leadership training, how can I tell if someone becomes a better leader or manager? How do you know who was a good leader before? How can you measure who learned the most? These questions really got me thinking – measuring the impact of training is really hard, but not impossible. It goes back to what I said earlier, which is all around context.
This is why YOU MUST start with benchmarking before anything else. It's nigh on impossible to measure whether something has had an impact if you don't know where you started vs where you ended up. So for leadership programmes, before you implement them, measure the performance of your leaders. Surveys, staff sentiment etc. This data alone is useful for the business to have some context on its performance, but when you do this again in 6 months or a year, after the training – if you have aligned your training to clear goals – then you will be able to correlate training with performance. BENCHMARKING has to be the baseline. If you can't get this sort of data, launch training programmes in a smaller batch first, which should make collecting benchmark data easier.
NOT INSTANT GRATIFICATION – remember when I mentioned my campaigns earlier, I have a sales cycle of anywhere between 3-9 months, so seeing results takes time. You may need to take some time to better understand how long it takes in your own context.
KPIs – you don't have to put a number against them; 'improve' is a valid goal. They don't have to be constrained.
This is a critical starting point for any L&D department which is serious about using data for business improvements.
Talk to bullets.
I bet you are all thinking, well this is great Ashley but I have no idea where to start with benchmarking. I’ll show you where to start.
BULLET 1
Where do you want to measure and monitor performance in L&D? People think strategy is this huge, complex thing, but really it's just a structured, organised approach to delivering something.
Some starting ideas could be:
* Overall training performance (perhaps correlated with business performance), i.e. are your learners finding training valuable/useful? (LEARNER IMPACT)
* Impact of training – again, how can you measure the impact of training? Learner sentiment and feedback, performance reviews, surveys (BUSINESS IMPACT)
Going back to the earlier queries someone asked me: how do you measure the effectiveness of a leadership programme, for example? You need to understand the current state of your leaders – so undertaking research to understand their current performance is critical. Things like annual surveys, or even delivering surveys before and after leadership training. Remember not to change more than one variable, as then the results become inconclusive (A/B testing).
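The before-and-after survey idea above can be sketched in a few lines of Python. The ratings are invented for illustration; a real programme would want larger samples and an unchanged survey instrument between waves:

```python
from statistics import mean, stdev

# Hypothetical 1-10 confidence ratings from the same managers,
# collected before and after a leadership programme (paired by person)
before = [4, 5, 6, 4, 5, 3, 6, 5]
after = [6, 6, 7, 5, 7, 5, 6, 7]

# Per-person change is more informative than comparing the two group averages,
# because it controls for who happened to answer each wave
deltas = [a - b for b, a in zip(before, after)]
avg_delta = mean(deltas)

print(f"Average per-person change: {avg_delta:+.2f}")
print(f"Spread of changes (stdev): {stdev(deltas):.2f}")
```

A consistently positive average delta with a small spread is the kind of evidence that correlates the training with a change in performance, rather than relying on gut feel.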
Identify existing learning and explore trends - Evaluate existing training performance (LMS data/learner feedback/staff turnover/staff sentiment etc). You need both business data and L&D data here, as well as quantitative learner feedback data. Identify areas which are performing well, identify any trends in training requests (ie could requests for training in the sales team signify a wider training problem, quality of learning etc)
Outline objectives - After the results of the analysis have been interpreted and communicated to the appropriate people, goals should be established – these should be concrete, attainable and in line with your corporate strategy. List all goals in your L&D strategy. Consider SMART goals
Develop an action plan - Define specific, concrete actions to be taken on how to meet objectives.
Monitor results and adjust action plan accordingly - Continuously monitor the results of the benchmarking efforts and ensure the action plans are consistently applied.
Once you have clear benchmarking data, you can see which areas of the business are performing well (and which are not). Off the back of your action plans and benchmarking exercises, you should clearly be able to understand and define clear training needs to fill skills gaps, upskill employees or even enhance current methods of training to drive better business performance.
When I am creating campaigns in marketing, I think really hard about who I am creating them for. Your data will help inform and identify clear training needs, and then these will translate into the training you create/deliver - your ‘campaigns’ as it were.
The finding from your benchmarking will help align and focus your strategies and approaches.
After collecting business data, training performance data and more, you should be in a really great state to develop clear strategies. Your benchmarking will have allowed you to better understand where you are as a business, what is performing well and poorly and this should all tie into your overall strategy. Your qualitative and quantitative data will allow you to better align your strategies with both the requirements of the business and learner, all the while allowing you to monitor and measure performance over time.
Business wants:
More productive, effective staff
Lower turnover
Increased sales
More connected, engaged employees
Learners want:
Learning to reflect their outside experiences
Personalised interactions, modern learning
Bite-sized engagements rather than long, lengthy courses
Learners NEED:
To feel confident, knowledgeable and secure in their role
To feel a part of something – Millennials have said they would take a massive pay cut to work at organisations where they feel PURPOSEFUL
To be upskilled – to feel like the organisation is investing in them
Do what you can to create more personalized experiences – experiences which reflect the consumer-grade experiences your learners are having outside of work.
Some systems aren’t going to give you the data you need to get things like engagement stats. Our LXP provides loads of data and even automates a lot of this, ie allows you to serve content to learners based on their specified preferences/likes/wants etc and gives you pre-generated reports on what’s being most effective.
Engagement stats are a huge enhancer here as they will allow you to much more quickly affect change based on learners’ behavior, but you can still capture and analyse data which will better inform you.
A/B testing can be extremely effective – especially if you have already benchmarked. Subject lines in push emails, colours of web pages, types of content (i.e. does a PDF or a video perform better, etc.). TEST, TEST, TEST. Get more data and improve further. Feed the results into your action plans and ensure your objectives are constantly applied.
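For anyone wanting to go beyond eyeballing the two numbers, a standard two-proportion z-test tells you whether an A/B difference is likely to be real rather than noise. A minimal sketch with the standard library; the open counts are invented:

```python
from math import sqrt, erf

def ab_significance(opens_a, sends_a, opens_b, sends_b):
    """Two-proportion z-test: is variant B's open rate different from A's?
    Returns (rate_a, rate_b, two_sided_p_value)."""
    p_a, p_b = opens_a / sends_a, opens_b / sends_b
    # Pooled rate under the null hypothesis that A and B perform the same
    pooled = (opens_a + opens_b) / (sends_a + sends_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the normal CDF, built from math.erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, p_value

# Hypothetical: push-email subject line A vs subject line B
rate_a, rate_b, p = ab_significance(opens_a=90, sends_a=600,
                                    opens_b=132, sends_b=600)
print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  p = {p:.4f}")
```

A small p-value (conventionally below 0.05) is your licence to roll out the winning variant; a large one means keep testing before you change the action plan.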
But even if you don't have a more modern learning platform, your traditional LMS will have data you can use. Installing GA will allow you to see what pages people are visiting, and its easy-to-use interface means you can clearly see which content, or pages, are most effective based on page dwell time, bounce rate etc. Things like Klipfolio or other data visualisation tools will help you create really clear dashboards, and these will be good when presenting data to the C-suite.
Shifting business wide mindsets is not an easy task
C-Level still favours gut decisions (you’ll need data to help your arguments about using data, ironic, I know). Over 60% of business influencers trust cognition and gut instinct to make decisions and nearly 60% rely on gut feelings to make work decisions. I worked with a CEO once who refused to acknowledge the data I had that completely contradicted his stance on what we should do in marketing. My data wasn’t good enough to convince him initially, but after 6 months of declines in performance, he started listening.
Ways to convince your C-suite that data is better:
Use data to convince them of data – highlight areas of poor and high performance – correlate with learner sentiment/trust/turnover or any other compelling data that signifies why training is affecting the bottom line.
How is L&D demonstrating impact to C-suite? – Want more budget? Prove impact and prove value.
So how do you convince these people? Touch on a mix of emotion (e.g. anxiety, excitement), intuition (e.g. assumptions, challenges) and cognition/data (e.g. rationale, logic).
This is just the beginning – this will start to help establish a data-driven approach to your L&D functions and give you a clear baseline, or springboard, to action from. The key here is learning from your present (and past) efforts and doing what you can to improve on them through innovation, new approaches, new tech and more. Don’t be afraid to fail – learn from your mistakes and use data, rather than gut instinct to help you do things better.
You have to start collecting data for it to be impactful. At first it might feel like you are collecting in vain, but the reality is, once you have past data to compare yourself against, things start to get much more interesting.
Like, for example, Thrive has been around 2 months. I look like a hero at the moment – woo, loads of site visits, increasing month on month, loads of new leads. But when you start from zero, of course everything looks great. The really great stuff with data comes when you get quarter-on-quarter comparisons, or indeed YoY.
Small ways to start:
- Before you launch your next piece of learning, measure learner sentiment on the subject at hand. Measure whether it changes X months after implementation (benchmarking).
- Analyse your current performance – identify your top 5 and bottom 5 performing pieces of content. Why are they performing well or poorly?
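Finding your top and bottom performers is a one-liner once you have the export. A sketch with made-up course titles and completion rates:

```python
# Hypothetical LMS export: (course title, completion rate)
courses = [
    ("GDPR essentials", 0.91),
    ("Sales onboarding", 0.43),
    ("Leadership 101", 0.67),
    ("Health & safety", 0.88),
    ("Product deep dive", 0.35),
    ("Customer empathy", 0.72),
    ("Excel basics", 0.58),
]

# Rank by completion rate, best first, then slice off each end
ranked = sorted(courses, key=lambda c: c[1], reverse=True)
top_3, bottom_3 = ranked[:3], ranked[-3:]

print("Top performers:", [title for title, _ in top_3])
print("Needs attention:", [title for title, _ in bottom_3])
```

Completion rate is just one proxy – swap in dwell time, survey scores or whatever your benchmarking says matters – but the ranking exercise itself is a very cheap way to start the "why is this performing poorly?" conversation.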
Things to remember:
Don’t change too many variables (talk to A/B testing) otherwise you won’t be able to correlate performance changes to learning.
Don't bite off more than you can chew – starting by analysing your entire training catalogue is going to be a real headache. Don't be afraid to draw a line under some stuff, maybe by recency (nothing more than 2 years old).
Make sure you include your learners in this dialogue – their sentiment and feedback is going to be the real deciding factor here (but also bear in mind that people don't always know what they want – see the stat about people not wanting personalised ads yet still engaging with them).
- Unfortunately I cannot give you a more prescriptive model than this, because each business's requirements are unique and what works for one will likely not work for another. This really is trial and error, so get comfortable with failing – it's really the only way to learn.