Paris Video Tech - 1st Edition: Dailymotion - Improving the user experience with analytics

In this talk, we explain how, at Dailymotion, we use our analytics to optimize our video delivery as well as our player/media engine.

  1. User Experience – Streaming Analytics
  2. Introduction : Dailymotion Facts
     • 100 million video views each day
     • 60% desktop - 31% mobile - 8% tablet - 1% TV
     • 97% VOD - 3% live
     • worldwide (France = ~30%)
     • 69% HTML5 - 31% Flash
  3. Goal : What is user experience ?
     • Increasing user engagement => raising revenue
     • Multi-dimensional : loading, engagement, rebuffering, video quality
     • User experience is context sensitive
       • device
       • content
  4. Goal : Improving user experience
     • How to implement the data pipeline ?
     • How to understand what is going on ?
       • user quality metrics
       • video quality metrics
       • network quality metrics
     • How to improve user experience ?
       • optimize delivery
       • optimize player
  5. Data pipeline : Architecture Overview
     Player events -> Data aggregation -> Visualization
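
As a rough illustration of what flows through such a pipeline, here is a minimal TypeScript sketch of a per-view player event and one aggregation step; the field names and the per-CDN grouping are assumptions made for the example, not Dailymotion's actual schema.

    // Minimal sketch of a player event and an aggregation step
    // (assumed schema, not Dailymotion's actual pipeline).
    interface PlayerEvent {
      videoId: string;
      country: string;        // e.g. "KR"
      streamType: "recorded" | "live";
      tech: "native" | "hls.js" | "flashls";
      cdn: string;
      watchedSec: number;     // seconds actually watched
      rebufferingNb: number;  // number of rebuffering events
      rebufferingSec: number; // total time spent rebuffering
      seekNb: number;
      levelAvg: number;       // average quality level played
    }

    // Aggregate raw events into per-CDN metrics for later visualization.
    function aggregateByCdn(events: PlayerEvent[]) {
      const out = new Map<string, { views: number; watchedSec: number; rebufferingNb: number }>();
      for (const e of events) {
        const agg = out.get(e.cdn) ?? { views: 0, watchedSec: 0, rebufferingNb: 0 };
        agg.views += 1;
        agg.watchedSec += e.watchedSec;
        agg.rebufferingNb += e.rebufferingNb;
        out.set(e.cdn, agg);
      }
      return out;
    }
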
  6. Data visualization : Heatmap example
     • real-time activity
     • watched > 0
     • 1 month
  7. Data pipeline : Basic Rules
     • More data beats better models - avoiding overfitting
     • Better data beats more data - cleaning outliers
     • The 80/20 rule
     • P-value - measuring uncertainty
  8. Data pipeline : Choosing key metrics
     • Choosing metrics => process not deterministic
     • User engagement : played, watched, watched ratio ?
     • Rebuffering event => waitTime > X ms ?
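
These candidate metrics could be derived from raw player data roughly as follows; the field names and the 500 ms stall threshold are illustrative assumptions (the slide deliberately leaves the X open).

    // Illustrative metric definitions (field names and the 500 ms threshold
    // are assumptions for the sketch, not values from the talk).
    interface ViewSample {
      durationSec: number;   // length of the video
      watchedSec: number;    // time the user actually watched
      waitTimesMs: number[]; // observed stall durations during playback
    }

    // "Watched ratio": fraction of the video the user actually watched.
    const watchedRatio = (v: ViewSample) =>
      v.durationSec > 0 ? v.watchedSec / v.durationSec : 0;

    // Count a stall as a rebuffering event only above a threshold, since
    // very short waits are usually invisible to the user.
    const rebufferingNb = (v: ViewSample, thresholdMs = 500) =>
      v.waitTimesMs.filter((w) => w > thresholdMs).length;
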
  9. Data analysis : latency / CDN
     • CDN comparison
     • Routing optimization
     • Country : KR
     • Stream type : recorded
  10. Data analysis : kbps / CDN
      • CDN comparison
      • Routing optimization
      • Country : KR
      • Stream type : recorded
  11. Data analysis : seekNb / engagement
      • seekNb
      • negative correlation
      • stream type : recorded
      • 1 month
  12. Data analysis : buffering ratio / engagement
      • buffering measure choice
      • negative correlation
      • stream type : recorded
      • 1 month
  13. Data analysis : rebufferingNb / engagement
      • rebufferingNb
      • negative correlation
      • stream type : recorded
      • 1 month
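
Slides 11-13 all report a negative correlation between a playback metric and engagement. A plain Pearson coefficient over per-view pairs is one way to quantify that; this is a generic sketch, not necessarily the statistic used in the talk.

    // Pearson correlation between a quality metric (e.g. rebufferingNb) and
    // an engagement metric (e.g. watched ratio), one pair per view.
    function pearson(xs: number[], ys: number[]): number {
      const n = xs.length;
      const mean = (a: number[]) => a.reduce((s, v) => s + v, 0) / a.length;
      const mx = mean(xs);
      const my = mean(ys);
      let cov = 0, vx = 0, vy = 0;
      for (let i = 0; i < n; i++) {
        cov += (xs[i] - mx) * (ys[i] - my);
        vx += (xs[i] - mx) ** 2;
        vy += (ys[i] - my) ** 2;
      }
      return cov / Math.sqrt(vx * vy);
    }

    // A value close to -1 would match the observation on this slide:
    // more rebuffering events, lower engagement.
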
  14. Data analysis : level avg
      • quality switch
      • ABR algorithm
      • stream type : recorded
      • 1 month
  15. State of ABR - stream tech comparison - VoD
      rebufferingNb, percentage per tech worldwide
      • native : 83.6%
      • hls.js : 89.4%
      • flashls : 90.6%
  16. State of ABR - stream tech comparison - live
      rebufferingNb, percentage per tech worldwide
      • native : 70.4%
      • hls.js : 73.6%
      • flashls : 80.6%
  17. Data-Driven Development : ABR Algorithm
      • introduce a history parameter to bandwidth estimation in hls.js (inspired from …)
      • ABR is now based on two bandwidth moving averages
        • a fast one : adapting down quickly
        • a slow one : adapting up more slowly
      • bw estimate = min(fast, slow)
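
A minimal sketch of the fast/slow idea, using exponentially weighted moving averages over per-fragment bandwidth samples; it is simplified relative to the real hls.js estimator, and the 3 s / 9 s half-lives are only illustrative defaults.

    // Sketch of a dual moving-average bandwidth estimator (simplified).
    class Ewma {
      private estimate = 0;
      private totalWeight = 0;
      constructor(private readonly halfLifeSec: number) {}

      sample(durationSec: number, bandwidthBps: number): void {
        const alpha = Math.pow(0.5, durationSec / this.halfLifeSec);
        this.estimate = alpha * this.estimate + (1 - alpha) * bandwidthBps;
        this.totalWeight += durationSec;
      }

      value(): number {
        // Correct the bias of starting the average from zero.
        const zeroFactor = 1 - Math.pow(0.5, this.totalWeight / this.halfLifeSec);
        return zeroFactor > 0 ? this.estimate / zeroFactor : 0;
      }
    }

    class BandwidthEstimator {
      private fast = new Ewma(3); // short half-life: reacts quickly to drops
      private slow = new Ewma(9); // long half-life: ramps up cautiously

      sample(durationSec: number, bytes: number): void {
        const bps = (8 * bytes) / durationSec;
        this.fast.sample(durationSec, bps);
        this.slow.sample(durationSec, bps);
      }

      // bw estimate = min(fast, slow): drop quality quickly, raise it slowly.
      estimate(): number {
        return Math.min(this.fast.value(), this.slow.value());
      }
    }
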
  18. ABR magic numbers
      • hls.js uses default values for these fast and slow averages
      • Are these magic numbers suitable for our use case ?
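
Those defaults are exposed through the hls.js configuration (the abrEwma* options), which is what makes the per-segment overrides in the next slide's A/B test possible. This is only a sketch: the option names should be double-checked against the hls.js API docs, and the values shown are taken from one of the combinations tested later in the deck.

    // Sketch: overriding the ABR moving-average half-lives via the hls.js
    // config (e.g. the s=9, f=4 VoD and s=9, f=5 live combinations).
    import Hls from "hls.js";

    const hls = new Hls({
      abrEwmaFastVoD: 4,  // half-life (s) of the fast average for VoD
      abrEwmaSlowVoD: 9,  // half-life (s) of the slow average for VoD
      abrEwmaFastLive: 5, // same pair for live streams
      abrEwmaSlowLive: 9,
    });
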
  19. A/B testing ABR
      • define 20 traffic segments, each using a different config
      • enable in production …

      Iteration 1       fast average   slow average
      control group     0              0
      test group 1      0              1
      test group 2      0              2
      ...               1              1
      test group 18     1              9
      test group 19     1              10
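
One way to split traffic into such segments is a stable hash of the user (or view) id, so each user always gets the same config; the hash and the partial config list below are illustrative only.

    // Sketch: deterministically assigning a view to one of 20 ABR config
    // segments. Segment 0 is the control group.
    interface AbrConfig { fast: number; slow: number }

    const segments: AbrConfig[] = [
      { fast: 0, slow: 0 }, // control group: history disabled
      { fast: 0, slow: 1 },
      { fast: 0, slow: 2 },
      // ... remaining combinations from the table above, up to
      { fast: 1, slow: 10 },
    ];

    function bucket(userId: string, nbSegments = segments.length): number {
      // Simple stable string hash so a given user always lands in the same group.
      let h = 0;
      for (const c of userId) h = (h * 31 + c.charCodeAt(0)) >>> 0;
      return h % nbSegments;
    }

    const config = segments[bucket("some-user-id")];
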
  20. A/B testing ABR
      • wait for enough samples (~1 million per group)
      • compare key metrics
        • rebuffering rate
        • rebuffering ratio
        • user engagement
        • average quality, quality switches
      • iterate / circle around the best samples
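
Tying back to the "P-value" rule on slide 7, whether a test group really beats the control on rebuffering rate can be checked with a standard two-proportion z-test; a generic sketch, not necessarily the exact test used in the talk.

    // Sketch: two-proportion z-test comparing the rebuffering rate (share of
    // views with >= 1 rebuffering) between the control group and a test group.
    function twoProportionZ(
      rebuffA: number, viewsA: number, // control group
      rebuffB: number, viewsB: number, // test group
    ): number {
      const pA = rebuffA / viewsA;
      const pB = rebuffB / viewsB;
      const pooled = (rebuffA + rebuffB) / (viewsA + viewsB);
      const se = Math.sqrt(pooled * (1 - pooled) * (1 / viewsA + 1 / viewsB));
      return (pB - pA) / se; // |z| > ~1.96 <=> p < 0.05 (two-sided)
    }

    // With ~1 million views per group, even small real differences between
    // configs become statistically detectable.
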
  21. State of ABR - stream tech comparison - VoD
      number of rebuffering, percentage per tech worldwide
      • native : 83.6%
      • hls.js : 89.4%
      • flashls : 90.6%
      • hls.js, s=15, f=4 : 90.7%
      • hls.js, s=9, f=4 : 90.2%
  22. State of ABR - stream tech comparison - live
      number of rebuffering, percentage per tech worldwide
      • native : 70.4%
      • hls.js, s=0, f=0 : 73.6%
      • flashls : 80.6%
      • hls.js, s=9, f=5 : 79.3%
      • hls.js, s=7, f=5 : 74.7%
  23. Number of level switches - live (chart comparing hls.js, s=0, f=0 with hls.js, s=9, f=5)
  24. Next data-driven improvements
      • network delivery
        • use streaming metrics to rank CDNs per region / ISP
        • redirect streams to the best CDN based on past history
      • transcoding
        • A/B test different fragment durations
      • media engine / player optimization
        • start rendition
        • progressive fragment parsing (Fetch API)
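
A rough sketch of what "rank CDNs per region / ISP and redirect to the best one" could look like, reusing per-CDN aggregates like those in the earlier pipeline sketch; the scoring formula and weights are assumptions.

    // Sketch: rank CDNs for a (region, ISP) pair from recent streaming
    // metrics and pick the best one for the next stream.
    interface CdnStats {
      cdn: string;
      avgKbps: number;          // observed average throughput
      rebufferingRatio: number; // share of playback time spent rebuffering
    }

    function pickCdn(
      history: Map<string, CdnStats[]>,
      region: string,
      isp: string,
    ): string | undefined {
      const stats = history.get(`${region}:${isp}`) ?? [];
      // Higher throughput is better; rebuffering is heavily penalized.
      const score = (s: CdnStats) => s.avgKbps * (1 - 10 * s.rebufferingRatio);
      return stats.slice().sort((a, b) => score(b) - score(a))[0]?.cdn;
    }
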
  25. tommy.nacass@dailymotion.com
      guillaume.dupontavice@dailymotion.com
      https://github.com/dailymotion/hls.js
  26. thanks!
