MPEG Augmented Reality Tutorial


I presented this tutorial at the Web3D 2012 conference. It presents MPEG's position on AR and the technologies currently used, and explains how to set up AR applications.


  1. MPEG Augmented Reality Tutorial
     Web3D Conference, August 4-5, Los Angeles, CA
     Marius Preda, MPEG 3DG Chair, Institut Mines TELECOM
     http://www.slideshare.net/MariusPreda/mpeg-augmented-reality-tutorial
  2. MPEG Augmented Reality Tutorial: Topics of the day
     – What is MPEG?
     – The MPEG offer in the Augmented Reality field
     – MPEG-A Part 14: Augmented Reality Reference Model
     – MPEG-A Part 13: Augmented Reality Application Format
  3. MPEG Augmented Reality Tutorial: Topics of the day (the agenda of slide 2, repeated before each section)
  4. What is MPEG?
     – A suite of ~130 ISO/IEC standards
     – Coding/compression of elementary media:
       – Audio (MPEG-1, 2 and 4)
       – Video (MPEG-1, 2 and 4)
       – 2D/3D graphics (MPEG-4)
     – Storage and transport:
       – MPEG-2 Transport
       – File Format (MPEG-4)
       – Dynamic Adaptive Streaming over HTTP (DASH)
     – Hybrid (natural & synthetic) scene description, user interaction (MPEG-4)
     – Metadata (MPEG-7)
     – Media management and protection (MPEG-21)
     – Sensors and actuators, virtual worlds (MPEG-V)
     – Advanced user interaction (MPEG-U)
     – Media-oriented middleware (MPEG-M)
     – More ISO/IEC standards under development for:
       – 3D Video, 3D Audio
       – Coding and delivery in heterogeneous environments
       – …
  5. What is MPEG? Involvement, approach, deployment
     – A standardization activity continuing for 24 years
       – Supported by several hundred companies/organisations from ~25 countries
       – ~500 experts participating in quarterly meetings
       – More than 2300 active contributors
       – Many thousands of experts working in companies
     – A proven manner of organizing the work to deliver useful and used standards
       – Developing standards by integrating individual technologies
       – Well-defined procedures
       – Subgroups with clear objectives
       – Ad hoc groups continuing coordinated work between meetings
     – MPEG standards are widely referenced by industry
       – 3GPP, ARIB, ATSC, DVB, DVD Forum, BDA, ETSI, SCTE, TIA, DLNA, DECE, OIPF, …
     – Billions of software and hardware devices built on MPEG technologies
       – MP3 players, cameras, mobile handsets, PCs, DVD/Blu-ray players, STBs, TVs, …
  6. MPEG Augmented Reality Tutorial: Topics of the day (agenda repeated; see slide 2)
  7. MPEG technologies related to AR (1992-1999)
     – 1992/4: MPEG-1/2 (AV content)
     – 1997: VRML
     – 1998: MPEG-4 v.1
       – Part 11 - BIFS: binarisation of VRML; extensions for streaming, for server commands and for 2D graphics; real-time augmentation with audio & video
       – Part 2 - Visual: 3D mesh compression, face animation
     – 1999: MPEG-4 v.2
       – Part 2 - Visual: body animation
     – First form of broadcast signal augmentation
  8. MPEG technologies related to AR (2003-2011): a rich set of 3D graphics representation and compression tools
     – 2003: MPEG-4 Part 16 - AFX: a rich set of 3D graphics tools; compression of geometry, appearance, animation
     – 2005: AFX 2nd Edition: animation by morphing, multi-texturing
     – 2007: AFX 3rd Edition: WSS for terrain and cities, frame-based animation
     – 2011: AFX 4th Edition: scalable complexity mesh coding
  9. MPEG technologies related to AR (2003-2011): the same timeline as slide 8, adding
     – 2004: MPEG-4 Part 16 - X3D Interactive Profile
     – 2009: MPEG-4 Part 25 - compression of third-party XML (X3D, COLLADA)
 10. MPEG technologies related to AR (2011 onwards): a rich set of sensors and actuators
     – 2011: MPEG-V - Media Context and Control, 1st Edition: sensors and actuators, interoperability between virtual worlds; MPEG-U - Advanced User Interface
     – 2012: MPEG-V 2nd Edition: GPS, biosensors, 3D camera
     – 201x: MPEG-H: 3D Video (compression of video + depth), 3D Audio; CDVS: feature-point based descriptors for image recognition
 11. Main features of MPEG AR technologies
     – All AR-related data is available from MPEG standards
     – Real-time composition of synthetic and natural objects
     – Access to:
       – remotely/locally stored BIFS / compressed 2D/3D mesh objects
       – streamed real-time BIFS / compressed 2D/3D mesh objects
     – Inherent object scalability (e.g. for streaming)
     – User interaction & server-generated scene changes
     – Physical context:
       – captured by a broad range of standard sensors
       – affected by a broad range of standard actuators
 12. MPEG vision on AR: the MPEG AR Browser
     – Point to a URL: no need to download a new application for each context
     – The browser:
       – retrieves the scenario from the internet
       – starts video acquisition
       – tracks objects
       – recognizes objects from visual signatures
       – recovers the camera pose
       – gets streamed 3D graphics
       – composes new scenes
       – gets inputs from various sensors
       – offers an optimal AR experience by constantly adapting interaction possibilities and objects from a remote server
     – Industry:
       – maximize the number of customers through MPEG-compliant authoring tools and browsers
       – no need to develop a new application for each use case and device platform
 13. MPEG vision on AR (diagram): an authoring tool produces the content, which is compressed with MPEG-4/MPEG-7/MPEG-21/MPEG-U/MPEG-V and downloaded by an MPEG player.
 14. Architecture (diagram): the AR Player, driven by an AR file or stream and by the user, connects to local and remote sensors & actuators (each facing its real-world environment) and to media servers and service servers.
 15. MPEG ongoing work on AR
     – ISO/IEC 23000-14 Augmented Reality Reference Model: WD stage, collaborating with SC24/WG9, ARStandards, OGC, Khronos, Web3D
     – ISO/IEC 23000-13 Augmented Reality Application Format: CD stage, based on MPEG standards
 16. MPEG Augmented Reality Tutorial: Topics of the day (agenda repeated; see slide 2)
 17. Augmented Reality Reference Model: WD2.0 content
     – Viewpoints: Enterprise (community objectives); Information and Computational (abstract/design); Engineering and Technology (implementation/development)
     – Glossary
     – Use cases: Guide, Create, Play
 18. Augmented Reality Reference Model: Enterprise viewpoint, global architecture and actors
     – Actors: Telecommunication Operator (TO), End-User (EU), Device Manufacturer (DM), Middleware/Component Provider (MCP), AR Tools Creator (ARTC), AR Experience Creator (AREC), Assets Creator (AC), Assets Aggregator (AA), Online Middleware/Component Provider (OMCP), AR Service Provider (ARSP)
     – Components they act on: the Local/Remote Context, the AR Document, the AR Player, the User, the Media Servers and the Service Servers
 19. Augmented Reality Reference Model: Information viewpoint
     – Local/Remote Context: Scene/Real World (raw image, sensed data, virtual camera view, detected features, area of interest/anchors); Device Context (device capabilities); Location of Device (location, orientation)
     – AR Player: Presentation/Augmentation (registration, styling/complexity); Spatial Models (coordinate reference systems, (geo)location, projections, coordinate conversion); Spatial Filtering (e.g. range); Tracking objects (markers, marker-less); User Input (query, manipulation of presentation, topics of interest, preferences)
     – Digital Assets: presentation data, trigger/event rules, accuracy based
 20. Augmented Reality Reference Model: Computational viewpoint (diagram: numbered interactions among the Local/Remote Context, the AR Document, the AR Player, the User, and the Media and Service Servers)
 21. Augmented Reality Reference Model: Computational viewpoint (second diagram: the same components with a different ordering of the numbered interactions)
 22. Augmented Reality Reference Model: Engineering viewpoint
     – Local/Remote Context: camera, microphone, accelerometer, compass, GPS, …
     – AR Player: AR Document, Application Engine, Rendering Engine, Display (A/V/H), …
     – User, Media Servers, Service Servers
 23. Augmented Reality Reference Model: Glossary
 24. Augmented Reality Reference Model: Use cases
 25. Augmented Reality Reference Model: How to contribute? Use Trac: http://wg11.sc29.org/trac/augmentedreality/
 26. MPEG Augmented Reality Tutorial: Topics of the day (agenda repeated; see slide 2)
 27. MPEG-A Part 13 ARAF: 3 components (scene, sensors/actuators, media)
     – A set of scene graph nodes/protos as defined in MPEG-4 Part 11
       – Existing nodes: audio, image, video, graphics, programming, communication, user interactivity, animation
       – New standard PROTOs: Map, MapMarker, Overlay, ReferenceSignal, ReferenceSignalLocation, CameraCalibration, AugmentationRegion
     – Connection to sensors as defined in MPEG-V
       – Orientation, position, angular velocity, acceleration, GPS, geomagnetic, altitude
       – Local camera sensor
     – Compressed media
 28. MPEG-A Part 13 ARAF Scene: 63 XML elements (node/proto names as in MPEG-4 BIFS / XMT)
     – Elementary media
       – Audio: AudioSource, Sound, Sound2D
       – Image and video: ImageTexture, MovieTexture
       – Textual information: FontStyle, Text
       – Graphics: Appearance, Color, LineProperties, LinearGradient, Material, Material2D, Rectangle, Shape, SBVCAnimationV2, SBBone, SBSegment, SBSkinnedModel, MorphShape, Coordinate, TextureCoordinate, Normal, IndexedFaceSet, IndexedLineSet
     – Scene related information (spatial and temporal relationships): AugmentationRegion, Background, Background2D, CameraCalibration, Group, Inline, Layer2D, Layer3D, Layout, NavigationInfo, OrderedGroup, ReferenceSignal, ReferenceSignalLocation, Switch, Transform, Transform2D, Viewpoint, Viewport, Form
     – Dynamic and animated scene: OrientationInterpolator, ScalarInterpolator, CoordinateInterpolator, ColorInterpolator, PositionInterpolator, Valuator
     – Programming: Script
     – Communication and compression: BitWrapper, MediaControl
     – User interactivity: InputSensor, SphereSensor, TimeSensor, TouchSensor, MediaSensor, PlaneSensor
     – Maps: Map, MapOverlay, MapMarker
     – Terminal: TermCap
 29. MPEG-A Part 13 ARAF Scene: the distance between ARAF and X3D is 32 XML elements (same element table as slide 28)
 30. MPEG-A Part 13 ARAF Scene:: Reference Signal (marker tracking example)
     – A reference image is used as the marker; here a player card showing "Name: Park Chu-Young, Position: FW, Team: Arsenal FC"
     – A 3D graphic is synchronized with the movement of the marker image
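     A minimal sketch of how such a marker-tracking scene could be assembled from the protos introduced on the next slides, assuming X3D-style ProtoInstance syntax; the asset names and DEF identifiers are illustrative, not taken from the standard:

     <Group>
       <!-- Tracker: detects the reference image in the camera stream and exposes its pose -->
       <ProtoInstance name="ReferenceSignalLocation" DEF="TRACKER">
         <fieldValue name="source" value="hw://camera/back"/>            <!-- live camera feed (URL scheme from slide 38) -->
         <fieldValue name="referenceResources" value="player_card.jpg"/> <!-- hypothetical marker image -->
         <fieldValue name="enabled" value="true"/>
       </ProtoInstance>
       <!-- Augmentation: a 3D model whose pose a Script keeps in sync with
            TRACKER.translation / TRACKER.rotation (onTranslationChanged / onRotationChanged) -->
       <Transform DEF="AUGMENTATION">
         <Inline url="footballer.mp4"/>                                  <!-- hypothetical compressed 3D asset -->
       </Transform>
     </Group>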
 31. MPEG-A Part 13 ARAF Scene:: Reference Signal
     <ProtoDeclare name="ReferenceSignal" locations="org:mpeg:referencesignal">
       <field name="source" type="Strings" vrml97Hint="exposedField" stringArrayValue=""/>
       <field name="referenceResources" type="Strings" vrml97Hint="exposedField" stringArrayValue=""/>
       <field name="enabled" type="Boolean" vrml97Hint="exposedField" booleanValue="false"/>
       <field name="detectionHints" type="Strings" vrml97Hint="exposedField" stringArrayValue=""/>
       <field name="onInputDetected" type="Integer" vrml97Hint="eventOut"/>
       <field name="onError" type="Integer" vrml97Hint="eventOut"/>
     </ProtoDeclare>
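     A hedged usage sketch (X3D-style ProtoInstance syntax; the resource name, hint value and DEF identifier are assumptions): the proto watches a camera stream for a reference image and reports detections through its eventOuts.

     <ProtoInstance name="ReferenceSignal" DEF="DETECTOR">
       <fieldValue name="source" value="hw://camera/back"/>        <!-- camera stream to analyse (URL scheme from slide 38) -->
       <fieldValue name="referenceResources" value="poster.jpg"/>  <!-- hypothetical target image -->
       <fieldValue name="detectionHints" value="image"/>           <!-- illustrative hint value -->
       <fieldValue name="enabled" value="true"/>
     </ProtoInstance>
     <!-- onInputDetected (Integer eventOut) reports which entry of referenceResources was recognised;
          it would typically be ROUTEd to a Switch.whichChoice or to a Script that reveals the augmentation. -->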
 32. MPEG-A Part 13 ARAF Scene:: Reference Signal Location
     <ProtoDeclare name="ReferenceSignalLocation" locations="org:mpeg:referencesignallocation">
       <field name="source" type="Strings" vrml97Hint="exposedField" stringArrayValue=""/>
       <field name="referenceResources" type="Strings" vrml97Hint="exposedField" stringArrayValue=""/>
       <field name="enabled" type="Boolean" vrml97Hint="exposedField" booleanValue="false"/>
       <field name="detectionHints" type="Strings" vrml97Hint="exposedField" stringArrayValue=""/>
       <field name="translation" type="Vector3Array" vrml97Hint="exposedField" vector3ArrayValue=""/>
       <field name="rotation" type="Rotations" vrml97Hint="exposedField" rotationArrayValue=""/>
       <field name="onInputDetected" type="Integer" vrml97Hint="eventOut"/>
       <field name="onTranslationChanged" type="Integer" vrml97Hint="eventOut"/>
       <field name="onRotationChanged" type="Integer" vrml97Hint="eventOut"/>
       <field name="onError" type="Integer" vrml97Hint="eventOut"/>
     </ProtoDeclare>
 33. MPEG-A Part 13 ARAF Scene:: Augmentation Region (diagram: a broadcaster defines an augmentation region in its content; User A receives augmentation from AR service provider A, User B from AR service provider B)
 34. MPEG-A Part 13 ARAF Scene:: Augmentation Region
     <ProtoDeclare name="AugmentationRegion" locations="org:mpeg:augmentationregion">
       <field name="source" type="Strings" vrml97Hint="exposedField" stringArrayValue=""/>
       <field name="2DRegion" type="Vector2Array" vrml97Hint="exposedField" vector2ArrayValue=""/>
       <field name="arProvider" type="Strings" vrml97Hint="exposedField" stringArrayValue=""/>
       <field name="enabled" type="Boolean" vrml97Hint="exposedField" booleanValue="false"/>
       <field name="translation" type="Vector3Array" vrml97Hint="exposedField" vector3ArrayValue=""/>
       <field name="rotation" type="Rotations" vrml97Hint="exposedField" rotationArrayValue=""/>
       <field name="onTranslationChanged" type="Integer" vrml97Hint="eventOut"/>
       <field name="onRotationChanged" type="Integer" vrml97Hint="eventOut"/>
       <field name="onARProviderChanged" type="Boolean" vrml97Hint="eventOut"/>
       <field name="onError" type="Integer" vrml97Hint="eventOut"/>
     </ProtoDeclare>
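     A hedged usage sketch (X3D-style syntax; the region coordinates, their normalization and the provider URL are illustrative assumptions): the broadcaster reserves a rectangular region of the video in which an AR service provider may insert content, as in the diagram on slide 33.

     <ProtoInstance name="AugmentationRegion" DEF="AD_REGION">
       <fieldValue name="source" value="broadcast_video"/>                    <!-- stream carrying the region -->
       <fieldValue name="2DRegion" value="0.1 0.1 0.4 0.1 0.4 0.3 0.1 0.3"/>  <!-- four corners of the reserved region; normalized coordinates assumed -->
       <fieldValue name="arProvider" value="http://provider-a.example.com/"/> <!-- hypothetical AR service provider -->
       <fieldValue name="enabled" value="true"/>
     </ProtoInstance>
     <!-- onTranslationChanged / onRotationChanged follow the region as it moves in the video;
          onARProviderChanged signals a switch between providers (A and B on slide 33). -->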
 35. MPEG-A Part 13 ARAF Scene:: Map, MapMarkers and Overlay
 36. MPEG-A Part 13 ARAF Scene:: Map, MapMarkers and Overlay
     <ProtoDeclare name="Map" protoID="1" locations="org:mpeg:map">
       <field name="addChildren" type="Nodes" vrml97Hint="eventIn"/>
       <field name="removeChildren" type="Nodes" vrml97Hint="eventIn"/>
       <field name="addOverlays" type="Nodes" vrml97Hint="eventIn"/>
       <field name="removeOverlays" type="Nodes" vrml97Hint="eventIn"/>
       <field name="translate" type="Vector2" vrml97Hint="eventIn"/>
       <field name="zoom_in" type="Boolean" vrml97Hint="eventIn"/>
       <field name="zoom_out" type="Boolean" vrml97Hint="eventIn"/>
       <field name="gpscenter_changed" type="Vector2" vrml97Hint="eventOut"/>
       <field name="children" type="Nodes" vrml97Hint="exposedField">
         <nodes></nodes>
       </field>
       <field name="overlays" type="Nodes" vrml97Hint="exposedField">
         <nodes></nodes>
       </field>
       <field name="gpsCenter" type="Vector2" vrml97Hint="exposedField" vector2Value="0 0"/>
       <field name="mode" type="Strings" vrml97Hint="exposedField" stringArrayValue="ROADMAP"/>
       <field name="provider" type="Strings" vrml97Hint="exposedField" stringArrayValue="ANY"/>
       <field name="size" type="Vector2" vrml97Hint="exposedField" vector2Value="0 0"/>
       <field name="mapWidth" type="Float" vrml97Hint="exposedField" floatValue="0"/>
       <field name="zoomLevel" type="Integer" vrml97Hint="exposedField" integerValue="0"/>
     </ProtoDeclare>
 37. MPEG-A Part 13 ARAF Scene:: Map, MapMarkers and Overlay (continued)
     <ProtoDeclare name="MapOverlay" locations="org:mpeg:mapoverlay">
       <field name="addChildren" type="Nodes" vrml97Hint="eventIn"/>
       <field name="removeChildren" type="Nodes" vrml97Hint="eventIn"/>
       <field name="children" type="Nodes" vrml97Hint="exposedField">
         <nodes></nodes>
       </field>
       <field name="keywords" type="Strings" vrml97Hint="exposedField" stringArrayValue=""/>
     </ProtoDeclare>
     <ProtoDeclare name="MapMarker" locations="org:mpeg:mapmarker">
       <field name="addChildren" type="Nodes" vrml97Hint="eventIn"/>
       <field name="removeChildren" type="Nodes" vrml97Hint="eventIn"/>
       <field name="gpsPosition" type="Vector2" vrml97Hint="exposedField" vector2Value="0 0"/>
       <field name="children" type="Nodes" vrml97Hint="exposedField">
         <nodes></nodes>
       </field>
       <field name="keywords" type="Strings" vrml97Hint="exposedField" stringArrayValue=""/>
     </ProtoDeclare>
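     A hedged sketch combining the three protos (X3D-style syntax; the coordinates are illustrative, and the nesting of a MapMarker inside a MapOverlay is an assumption): a road map centred on a GPS position with one marker.

     <ProtoInstance name="Map" DEF="CITY_MAP">
       <fieldValue name="gpsCenter" value="48.624 2.443"/>   <!-- hypothetical latitude/longitude -->
       <fieldValue name="mode" value="ROADMAP"/>
       <fieldValue name="provider" value="ANY"/>
       <fieldValue name="zoomLevel" value="14"/>
       <fieldValue name="overlays">
         <ProtoInstance name="MapOverlay" DEF="POI_OVERLAY">
           <fieldValue name="keywords" value="quiz"/>
           <fieldValue name="children">
             <ProtoInstance name="MapMarker" DEF="START_POINT">
               <fieldValue name="gpsPosition" value="48.625 2.445"/>
               <!-- its children field would hold the Shape/Text rendered at the marker position -->
             </ProtoInstance>
           </fieldValue>
         </ProtoInstance>
       </fieldValue>
     </ProtoInstance>
     <!-- The translate, zoom_in and zoom_out eventIns let user interaction (e.g. a TouchSensor) drive the map. -->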
 38. MPEG-A Part 13 ARAF Sensors/Actuators (diagram)
     – MPEG-V sensors (acceleration, orientation, angular velocity, global position, altitude, …) map their captured data onto InputSensor nodes of the MPEG-4 scene; the MPEG-4 player composites the scene to the screen
     – The camera sensor (hw://camera/back) feeds a RAW decoder; the camera stream is mapped into the scene and composited to the screen
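     A hedged sketch of the camera path in the diagram above (one plausible node combination, assuming X3D-style syntax; only the hw://camera/back URL comes from the slide): the back-camera stream is used as a textured rectangle behind the synthetic objects, while sensed data reaches the scene through InputSensor nodes bound to the MPEG-V sensors.

     <Shape>
       <Appearance>
         <MovieTexture url="hw://camera/back"/>   <!-- live camera stream as background texture -->
       </Appearance>
       <Rectangle size="2 2"/>                    <!-- full-viewport quad; the size value is illustrative -->
     </Shape>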
 39. MPEG-A Part 13 ARAF Sensors/Actuators:: MPEG-V (architecture diagram)
     – R→V adaptation: converts sensed information from the real world (sensor devices) into virtual world object characteristics / sensed information applied to the virtual world
     – V→R adaptation: converts sensory effects from the virtual world into device commands applied to the real world (sensory devices)
     – Exchanged data: sensed information, sensor device capabilities, sensor adaptation preferences, sensory effects, sensory device capabilities, user sensory preferences, device commands
 40. MPEG-A Part 13 ARAF Sensors/Actuators:: MPEG-V
 41. MPEG-A Part 13 ARAF Sensors/Actuators:: MPEG-V types
     – Sensors: light, ambient noise, temperature, humidity, distance, atmospheric pressure, position, velocity, acceleration, orientation, angular velocity, angular acceleration, force, torque, pressure, motion, intelligent camera, multi interaction point, gaze tracking, wind, global position, altitude, bend, gas, dust, body height, body weight, body temperature, body fat, blood type, blood pressure, blood sugar, blood oxygen, heart rate, electrograph (EEG, ECG, EMG, EOG, GSR), weather, facial expression, facial morphology, facial expression characteristics, geomagnetic
     – Actuators: light, flash, heating, cooling, wind, vibration, sprayer, scent, fog, color correction, initialize color correction parameter, rigid body motion, tactile, kinesthetic, global position command
 42. MPEG-A Part 13 ARAF Compression (media / compression tool / reference standard)
     – Image: JPEG (ISO/IEC 10918), JPEG 2000 (ISO/IEC 15444)
     – Video: Visual (ISO/IEC 14496-2), Advanced Video Coding (ISO/IEC 14496-10)
     – Audio: MP3 (ISO/IEC 11172-3), Advanced Audio Coding (ISO/IEC 14496-3)
     – 3D Graphics: Scalable Complexity Mesh Coding (ISO/IEC 14496-16), Bone-based Animation (ISO/IEC 14496-16)
     – Scenes: BIFS (ISO/IEC 14496-11)
 43. MPEG-A Part 13 ARAF Exercises: AR Quiz and Augmented Book
 44. MPEG-A Part 13 ARAF Exercises: AR Quiz (http://youtu.be/la-Oez0aaHE) and Augmented Book (http://youtu.be/LXZUbAFPP-Y)
 45. MPEG-A Part 13 ARAF: AR Quiz setting, preparing the media (images, videos, audio, 2D/3D assets) and the GPS locations
 46. MPEG-A Part 13 ARAF: AR Quiz XML inspection: http://tiny.cc/MPEGARQuiz
 47. MPEG-A Part 13 ARAF: AR Quiz Authoring Tool: www.MyMultimediaWorld.com, go to Create / Augmented Reality
 48. MPEG-A Part 13 ARAF: Augmented Book setting (images, audio)
 49. MPEG-A Part 13 ARAF: Augmented Book XML inspection: http://tiny.cc/MPEGAugBook
 50. MPEG-A Part 13 ARAF: Augmented Book Authoring Tool: www.MyMultimediaWorld.com, go to Create / Augmented Books
 51. MPEG-A Part 13 ARAF: Next steps
     – Support for metadata at scene and object level
     – Support for usage rights at scene and object level
     – Collisions between real and virtual objects, partial rendering
 52. ARAF distance to X3D
     – On the scene graph: 32 elements, including 2D graphics, humanoid animation, generic input, media control, and pure AR protos
     – On sensors/actuators: 6 elements
     – On compression: MPEG-4 Part 25 already compresses X3D
 53. Conclusions
     – Joint development of the AR Reference Model
       – The community at large is invited to react and contribute so that the model becomes a reference
       – http://wg11.sc29.org/trac/augmentedreality
     – MPEG promoted a first version of an integrated and consistent solution for representing content in AR applications and services
       – Continue synchronized/harmonized development of technical specifications with the X3D, COLLADA and OGC content models
