1. Streaming Video to HTML
Jeff Tapper, Digital Primates, @jefftapper
2. Who am I?
• Senior Consultant at Digital Primates
  – Building next generation client applications
• Built video applications for many of the most attended streaming events
• Developing Internet applications for 17 years
• Author of 12 books on Internet technologies
3. Agenda
• Video and the Internet today
• Understanding HTTP Streaming
• What are the streaming options without a plugin?
• Understanding MediaSource Extensions (MSE)
• What is DASH?
• Making it work in a browser
• Questions
4. Video is dominating the Internet
• Desktop: video makes up 50% of traffic at peak periods
  – notably 30% from Netflix and 11% from YouTube
• Mobile: video traffic is growing exponentially
[Charts: traffic share on the fixed Internet vs. the mobile Internet]
5. HTTP Adaptive Streaming
[Diagram: Media Capture & Encoding → Media Origin Servers → HTTP Cache Servers → Client Devices]
1. Split the video into small segments
2. Encode each segment at multiple bitrates
3. Make each segment addressable via an HTTP URL
4. Client makes the decision on which segment to download
5. Client splices the segments together and plays back
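Step 4 above can be sketched as a simple bitrate picker: given the measured throughput, the client chooses the highest rendition it can sustain. This is an illustrative sketch, not the player's actual algorithm; the bitrate ladder and the safety factor are made-up values.

```javascript
// Hypothetical sketch of step 4: pick the highest bitrate the
// measured bandwidth can sustain, with a safety margin.
// The ladder values and safetyFactor are illustrative, not from dash.js.
function pickBitrate(availableBitrates, measuredKbps, safetyFactor) {
  // Sort ascending so we can walk up the ladder.
  var sorted = availableBitrates.slice().sort(function (a, b) { return a - b; });
  var chosen = sorted[0]; // always fall back to the lowest rendition
  for (var i = 0; i < sorted.length; i++) {
    if (sorted[i] <= measuredKbps * safetyFactor) {
      chosen = sorted[i];
    }
  }
  return chosen;
}
```

For example, with a ladder of 300/800/1500/3000 kbps, a measured 2000 kbps and a 0.8 safety factor, the picker settles on the 1500 kbps rendition.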
6. HTTP Streaming Landscape
• Apple’s HTTP Live Streaming (HLS)
• Microsoft’s Smooth Streaming
• Adobe’s HTTP Dynamic Streaming (HDS)
• And many more…
7. The challenge
• Most agree that HTTP Streaming is the most efficient choice
• Different devices support different streaming protocols
• No one standard is currently supported ubiquitously
• Results in media being served in several different formats to support the broadest range of devices
8. What do browsers support?
• Unfortunately, progressive download is the only ubiquitously supported option
• Different browsers support different video codecs
  – H.264
  – WebM
  – etc.
• Safari (iOS and Mac OS only) natively supports HLS
• MediaSource Extensions in Chrome (and soon others)
9. MediaSource Extensions (MSE)
• MSE allows pieces (segments) of media to be handed directly to the HTML5 video tag’s buffer
• This enables HTTP Streaming in HTML
• Not universally supported, yet
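A minimal sketch of the MSE idea: feature-detect MediaSource, then (in a browser) append downloaded segment bytes to a SourceBuffer. The detection function takes an injected global object so it can be exercised outside a browser; the commented-out browser snippet and its codec string are illustrative.

```javascript
// Hypothetical sketch: detect MSE support. `global` is injected so the
// check can run outside a browser (pass `window` in a real page).
function hasMediaSourceSupport(global) {
  return typeof global.MediaSource === 'function' ||
         typeof global.WebKitMediaSource === 'function'; // older prefixed builds
}

// In a browser, this is roughly how segments reach the <video> tag:
//   var ms = new MediaSource();
//   video.src = URL.createObjectURL(ms);
//   ms.addEventListener('sourceopen', function () {
//     var sb = ms.addSourceBuffer('video/mp4; codecs="avc1.4d401f"');
//     sb.appendBuffer(downloadedSegmentBytes); // a Uint8Array of media data
//   });
```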
10. What is MPEG-DASH?
• DASH – Dynamic Adaptive Streaming over HTTP
• International open standard, developed and published by ISO
• Addresses both simple and advanced use cases
• Enables highest-quality multiscreen distribution and efficient dynamic adaptive switching
• Enables reuse of existing content, devices and infrastructure
• Attempts to unify HTTP Streaming under a single standard
11. DASH and codecs
• The DASH specification is codec agnostic
• Any existing or future codec can work with DASH
• The DASH manifest describes which codec is used
  – Different codecs store the actual video data differently
12. DASH264
• H.264 is the dominant format today
• Many vendors and service providers are committed to supporting/enabling DASH264
• Provides support for today’s requirements such as DRM
• H.264 is backed by rigorous testing and conformance
13. DASH Industry Forum
• Addressing the dramatic growth of broadband video by recommending a universal delivery format that provides end users with the best possible media experience by dynamically adapting to changing network conditions
14. DASH Industry Forum
• Objectives:
  – promote and catalyze market adoption of MPEG-DASH
  – publish interoperability and deployment guidelines
  – facilitate interoperability tests
  – collaborate with standards bodies and industry consortia in aligning ongoing DASH standards development and the use of common profiles across industry organizations
• Over 65 members
• Visit http://dashif.org for more information
• Released the DASH/264 standard
16. How to play a DASH stream
• Download the manifest
• Parse the manifest
• Determine the optimal bandwidth for the client
• Initialize for that bandwidth
• Download a segment
• Hand the segment to MSE
• Check bandwidth to determine if a change is necessary
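The steps above can be sketched as a loop. Every dependency (downloader, parser, quality picker, buffer) is injected, so none of these names are the real dash.js API; the sketch only illustrates the order of operations.

```javascript
// Hypothetical sketch of the playback loop. `deps` supplies all I/O so
// the ordering of the steps is the only thing this code asserts.
async function playDashStream(url, deps) {
  var manifestText = await deps.download(url);                  // 1. download manifest
  var manifest = deps.parse(manifestText);                      // 2. parse manifest
  var quality = deps.pickQuality(manifest);                     // 3-4. pick bandwidth, initialize
  while (true) {
    var segment = await deps.downloadSegment(manifest, quality); // 5. download segment
    if (segment === null) break;                                 // no segments left: stop
    deps.appendToBuffer(segment);                                // 6. hand segment to MSE
    quality = deps.pickQuality(manifest);                        // 7. re-check bandwidth
  }
}
```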
17. Understanding DASH structure
• Three types of files:
  – Manifest (.mpd)
    • XML file describing the segments
  – Initialization file
    • Contains headers needed to decode bytes in segments
  – Segment files
    • Contain playable media
• Includes:
  – 0…many video tracks
  – 0…many audio tracks
18. DASH Manifest
• The manifest contains:
  – A root node
  – 1 or more Periods
    • Periods contain 1 AdaptationSet per video stream
    • Periods contain 1 AdaptationSet per audio stream
  – AdaptationSets contain:
    • ContentComposition nodes (for each video or audio track)
    • 1 or more Representation nodes
      » Each Representation describes a single bitrate
      » Representations contain data on finding the actual segments
      » There are different ways a Representation can describe segments
19. Describing Representations
• SegmentBase
  – Describes a stream with only a single segment per bitrate
  – Can be used for byte-range requests
• SegmentList
  – Contains a specific list of SegmentURL entries (individual HTTP requests for media data)
  – Can be used for byte-range requests
• SegmentTemplate
  – Defines a known URL for the fragment, with wildcards resolved at runtime to request segments (see bbb.mpd)
  – Alternatively, can specify a list of segments based on duration
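The SegmentTemplate case can be illustrated with a small wildcard resolver. The `$RepresentationID$` and `$Number$` placeholders follow the DASH template convention; the example URL and values are made up.

```javascript
// Hypothetical sketch: resolve a DASH SegmentTemplate URL by
// substituting the $RepresentationID$ and $Number$ wildcards.
// (Assumes each wildcard appears at most once in the template.)
function resolveSegmentUrl(template, representationId, segmentNumber) {
  return template
    .replace('$RepresentationID$', representationId)
    .replace('$Number$', String(segmentNumber));
}
```

For example, `resolveSegmentUrl('video/$RepresentationID$/seg-$Number$.m4s', '720p', 42)` yields `video/720p/seg-42.m4s`, which the client then fetches over plain HTTP.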
24. Tools used by dash.js
Core player:
• Q – asynchronous handling with promises
• Dijon – DI / IoC
• Jasmine – unit tests
Web site:
• jQuery – DOM manipulation
• Flat-UI – UI elements
• Flot – charting
• Kendo – components
25. Class structure
• The player is divided into two main packages.
• streaming – contains the classes responsible for creating and populating the MediaSource buffers. These classes are intended to be abstract enough for use with any segmented stream (such as DASH, HLS, HDS and MSS).
• dash – contains the classes responsible for making decisions specifically related to DASH.
26. streaming package
27. MediaPlayer.js
• Exposes the top-level functions and properties to the developer (play, autoPlay, isLive, abr quality, and metrics).
• The manifest URL and the HTML video object are passed to the MediaPlayer.
28. Context.js
• The dependency mapping for the streaming package.
• The context is passed into the MediaPlayer object, allowing different MediaPlayer instances to use different mappings.
29. Stream.js
• Loads/refreshes the manifest.
• Creates SourceBuffers from the MediaSource.
• Creates BufferManager classes to manage the SourceBuffers.
• Responds to events from the HTML video object.
• For a live stream, the live edge is calculated and passed to the BufferController instances.
30. Debug.js
• Convenience class for logging methods.
• The default implementation just uses console.log().
• An extension point for tapping into logging messages.
31. BufferController.js
• Responsible for loading fragments and pushing the bytes into the SourceBuffer.
• Once play() has been called, a timer is started to check the status of the bytes in the buffer.
• If the amount of time left to play is less than Manifest.minBufferTime, the next fragment is loaded.
• Records metrics related to playback.
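The timer check described above amounts to comparing the buffered time ahead of the playhead against minBufferTime. A minimal sketch, with function and parameter names of my own (not dash.js internals):

```javascript
// Hypothetical sketch of the BufferController timer check: load the
// next fragment when the seconds buffered ahead of the playhead fall
// below the manifest's minBufferTime, and never while a load is pending.
function shouldLoadNextFragment(currentTime, bufferedEnd, minBufferTime, isLoading) {
  if (isLoading) return false;          // one fragment request at a time
  var secondsAhead = bufferedEnd - currentTime;
  return secondsAhead < minBufferTime;  // buffer is running low: load more
}
```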
32. FragmentLoader.js
• Responsible for loading fragments.
• Loads requests sequentially.

ManifestLoader.js
• Responsible for loading manifest files.
• Returns the parsed manifest object.
33. AbrController.js
• Responsible for deciding if the current quality should be changed.
• The stream metrics are passed to a set of ‘rules’.
• Methods:
  – getPlaybackQuality(type, data)
    • type – the type of the data (audio/video)
    • data – the stream data
34. DownloadRatioRule.js
• Validates that fragments are being downloaded in a timely manner.
• Compares the time it takes to download a fragment to how long it takes to play out a fragment.
• If the download time is considered a bottleneck, the quality will be lowered.
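The comparison the rule makes can be sketched as a ratio of play-out time to download time; a ratio below 1 means downloads cannot keep up. This is an illustrative reduction of the idea, not the rule's actual code, and the step-down-by-one policy is an assumption.

```javascript
// Hypothetical sketch of the DownloadRatioRule idea: if a fragment
// takes longer to download than it takes to play, downloading is the
// bottleneck, so step the quality index down (0 is the lowest).
function downloadRatioDecision(fragmentDurationSec, downloadTimeSec, currentQuality) {
  var ratio = fragmentDurationSec / downloadTimeSec;
  if (ratio < 1 && currentQuality > 0) {
    return currentQuality - 1; // can't keep up: lower the quality
  }
  return currentQuality;       // downloads are keeping up: stay put
}
```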
35. InsufficientBufferRule.js
• Validates that the buffer doesn’t run dry during playback.
• If the buffer runs dry continuously, it likely means the player has a processing bottleneck (video decode time is longer than playback time).
36. LimitSwitchesRule.js
• Watches for competing rules to avoid constant bitrate switches.
• If two or more rules are causing switches too often, this rule will limit the switches to give a better overall playback experience.
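One simple way to express "too often" is a sliding time window: veto any further switch once the recent window already contains the maximum allowed switches. The windowing approach and thresholds below are my own illustration, not the rule's actual implementation.

```javascript
// Hypothetical sketch of the LimitSwitchesRule idea: allow a switch
// only if fewer than maxSwitches happened within the last windowMs.
function allowSwitch(switchTimestamps, nowMs, windowMs, maxSwitches) {
  var recent = switchTimestamps.filter(function (t) {
    return nowMs - t <= windowMs; // keep only switches inside the window
  });
  return recent.length < maxSwitches;
}
```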
37. dash package
38. DashContext.js
• Defines dependency mappings specific to the dash package:
  – Parser
  – Index Handler
  – Manifest Extensions
39. DashParser.js
• Converts the manifest to a JSON object.
• Converts duration and datetime strings into number/date objects.
• Manages inheritance of fields.
  – Many fields are inherited from parent to child nodes in DASH.
  – For example, a BaseURL can be defined on the <MPD> node, and all <Representation> nodes inherit that value.
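The BaseURL example above can be sketched as a lookup that falls back through the ancestor chain. The function and its shape are illustrative, not the DashParser's actual API.

```javascript
// Hypothetical sketch of DASH field inheritance: a node without its
// own value for a field inherits the nearest ancestor's value.
// `ancestors` is ordered root-first, e.g. [mpdNode, periodNode].
function resolveInherited(field, node, ancestors) {
  if (node[field] !== undefined) return node[field];
  // Walk from the nearest ancestor (end of array) up to the root <MPD>.
  for (var i = ancestors.length - 1; i >= 0; i--) {
    if (ancestors[i][field] !== undefined) return ancestors[i][field];
  }
  return undefined; // field not defined anywhere on the chain
}
```

So a Representation with no BaseURL of its own picks up the one declared on the MPD, while a Representation that declares its own keeps it.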
40. DashHandler.js
• Responsible for deciding which fragment URL should be loaded.
• Methods:
  – getInitRequest(quality) – returns an initialization request for a given quality, if available.
  – getSegmentRequestForTime(time, quality) – returns a fragment URL to load for a given quality and a given time. Returns a Stream.vo.SegmentRequest object.
  – getNextSegmentRequest(quality) – returns the next fragment URL to load. Assumes that getSegmentRequestForTime() has already been called.
  – getCurrentTime(quality) – returns the time for the last loaded fragment index.
41. DashHandler.js (cont’d)
• Uses the available information in the manifest (SegmentList, SegmentTemplate, SegmentBase).
• When using a single, non-fragmented MP4 file, the SIDX box will be loaded to determine byte ranges for segments.
42. Flow
1. Create the Context and MediaPlayer instances.
   var context = new Dash.di.DashContext(),
       player = new MediaPlayer(context);
2. Initialize the MediaPlayer and set the manifest URL.
   player.startup();
   player.setIsLive(false);
   player.attachSource(manifest_url);
3. Attach the HTML video element.
   video = document.querySelector(".dash-video-player video");
   player.autoPlay = true;
   player.attachView(video);
43. Flow (cont’d)
2. Call play() on the MediaPlayer (if autoPlay = false).
3. The Stream object is created and initialized with the manifest URL.
4. The manifest is loaded and then parsed.
5. MediaSource, SourceBuffers, and BufferControllers are created.
   – One BufferController is created per stream type (usually video and audio).
6. The duration of the MediaSource is set to the duration of the manifest (or infinity for a live stream).
7. If the stream is live, the live edge is calculated.
8. play() is called on the HTML video element.
9. The BufferManager instances create a timer. When the timer ticks, the state of the buffers is checked.
44. BufferManager.validate()
1. Check to see if the buffers need more data.
   • Must be in a playing state.
   • Must not already be loading data.
   • Must require more data to be buffered:
     amountBuffered < manifest.minBufferTime
2. If automatic ABR is enabled, check to see if the bitrate should be changed.
   • Ask the AbrController for the new quality.
   • Rules determine which bitrate to change to.
3. On initial playback, seeking, or a bitrate change, load the initialization fragment (if available).
45. BufferManager.validate() (cont’d)
4. Ask the IndexHandler for the next fragment request.
   • If seeking, pass the seek time to the IndexHandler.
   • Otherwise, ask for the ‘next’ fragment.
   • Pass the bitrate to the IndexHandler.
6. The IndexHandler returns a SegmentRequest indicating what action the BufferManager should take next.
   • “download” – download and append the fragment to the buffer.
   • “stall” – wait, because the IndexHandler is not ready.
   • “complete” – signal that the stream has completed playback.
7. Repeat.
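Dispatching on the SegmentRequest's action can be sketched as a switch. The action strings come from the slide; the handler names and the request object's shape are assumptions for illustration.

```javascript
// Hypothetical sketch of acting on a SegmentRequest. The three action
// strings ("download", "stall", "complete") are from the flow above;
// the `handlers` callbacks are illustrative stand-ins.
function handleSegmentRequest(request, handlers) {
  switch (request.action) {
    case 'download': return handlers.download(request.url); // fetch and append to buffer
    case 'stall':    return handlers.stall();               // IndexHandler not ready: wait
    case 'complete': return handlers.complete();            // stream finished playback
    default: throw new Error('Unknown action: ' + request.action);
  }
}
```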
46. Resources
• GPAC
  – http://gpac.wp.mines-telecom.fr
  – Provides baseline test streams
  – Provides a baseline player
• MP4Parser
  – http://code.google.com/p/mp4parser/
  – Open-source Java project
  – Allows for display of the contents within boxes
• DASH Industry Forum
  – http://www.dashif.org
  – Test vectors
  – Reference player