Lecture 3: AR Developer Tools

Mark Billinghurst
mark.billinghurst@hitlabnz.org

July 2011

COSC 426: Augmented Reality
Building Compelling AR Experiences

            experiences
            applications
            tools
            components     Display, Tracking

                                      Sony CSL © 2004
Low Level AR Libraries
ARToolKit
  Marker based tracking
FLARToolKit
  Flash version of ARToolKit
SSTT
  Simple Spatial Template Tracking
Opira
  Robust Natural Feature Tracking
What is ARToolKit?
Marker Tracking Library for AR applications
   Open Source, Multi-platform (Linux, Windows, MacOS)
Overlays 3D virtual objects on real markers
   Uses single tracking marker
   Determines camera pose information (6 DOF)
ARToolKit Website
http://www.hitl.washington.edu/artoolkit/
http://artoolkit.sourceforge.net/
ARToolKit Software
ARToolKit version: 2.65 or later
Currently two license models
  Open Source (GPL): ARToolKit 2.72
  Commercial (ARToolWorks): ARToolKit 4.0
OS: Linux, Windows, MacOS X, iPhone/Android
Programming language: C
Related software
  ARToolKit Professional: Commercial version
  ARToolKitPlus: Advanced version
  NyARToolkit: Java and C# version
  FLARToolKit: Flash version
ARToolKit Family

ARToolKit
   ARToolKit NFT
   ARToolKit Plus
   JARToolKit (Java)
   ARToolKit (Symbian)
   NyToolKit (Java, C#, Android, WM)
   FLARToolKit (Flash)
   FLARManager (Flash)
ARToolKit contents
Libraries
  libAR – tracking
  libARvideo – video capturing
  libARgsub – image/graphics drawing
  libARmulti – multi-marker tracking
Utilities
  Camera calibration
  Marker training
ARToolKit Structure


Three key libraries:
  AR32.lib – ARToolKit image processing functions
  ARgsub32.lib – ARToolKit graphics functions
  ARvideo.lib – DirectShow video capture class
Additional Software
To build an AR application you may need
additional software
High level rendering library
  Open VRML, Open Inventor, osgART, etc
Audio Library
  Fmod, etc
Peripheral support
What does ARToolKit Calculate?
Position of markers in the camera coordinates
Pose of markers in the camera coordinates
Output format
  3x4 matrix format to represent the
  transformation matrix from the marker
  coordinates to the camera coordinates
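As a concrete illustration, a minimal sketch (hypothetical helper, not part of the ARToolKit API) of applying such a 3x4 matrix [R|t] to a point given in marker coordinates:

/* A minimal sketch (hypothetical helper): apply a 3x4 marker-to-camera
   transform [R|t] to a point expressed in marker coordinates, giving the
   point in camera coordinates (mm). */
void transformPoint( double trans[3][4], double marker_pt[3], double cam_pt[3] )
{
    int i;
    for( i = 0; i < 3; i++ ) {
        cam_pt[i] = trans[i][0] * marker_pt[0]
                  + trans[i][1] * marker_pt[1]
                  + trans[i][2] * marker_pt[2]
                  + trans[i][3];               /* translation column */
    }
}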
Coordinate Systems
Tracking Range with Pattern Size




Rule of thumb – range = 10 x pattern width (e.g. an 80 mm marker can typically be tracked out to roughly 800 mm)
Tracking Error with Range
Tracking Error with Angle
AR Application Development
An ARToolKit Application
 Initialization
    Load camera and pattern parameters
 Main Loop
    Step1. Image capture and display
    Step2. Marker detection
    Step3. Marker identification
    Step4. Getting pose information
    Step5. Object Interactions/Simulation
    Step6. Display virtual objects
 End Application
    Camera shut down
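Put together, a skeleton of this structure might look like the sketch below, loosely based on the ARToolKit simpleTest sample; init() and mainLoop() correspond to the code shown on the following slides.

/* A minimal sketch of the application structure above, loosely following
   the ARToolKit simpleTest sample. init() and mainLoop() are the routines
   shown on the following slides. */
#include <stdlib.h>
#include <GL/glut.h>
#include <AR/ar.h>
#include <AR/video.h>
#include <AR/gsub.h>

static void init(void);       /* Initialization: camera + pattern parameters */
static void mainLoop(void);   /* Steps 1-6, called once per frame            */
static void cleanup(void);    /* End Application: camera shut down           */

static void keyEvent( unsigned char key, int x, int y )
{
    if( key == 0x1b ) {       /* ESC quits */
        cleanup();
        exit(0);
    }
}

int main( int argc, char **argv )
{
    glutInit( &argc, argv );
    init();                                   /* Initialization  */
    arVideoCapStart();                        /* start capturing */
    argMainLoop( NULL, keyEvent, mainLoop );  /* Main Loop       */
    return 0;
}

static void cleanup( void )
{
    arVideoCapStop();
    arVideoClose();
    argCleanup();
}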
Image capture: libARvideo
Return the pointer for captured image
  ARUint8 *arVideoGetImage( void );
Pixel format and byte size are defined in config.h
  #define     AR_PIX_FORMAT_BGR
  #define     AR_PIX_SIZE       3
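For example, with the default BGR format above, the colour of one pixel can be read as in the sketch below (x, y and the image width xsize are assumed from context):

/* A minimal sketch (x, y and xsize assumed from context): read the colour
   components of pixel (x, y) from a captured BGR frame. */
ARUint8 *dataPtr = arVideoGetImage();
if( dataPtr != NULL ) {
    int     idx = ( y * xsize + x ) * AR_PIX_SIZE;
    ARUint8 b   = dataPtr[idx + 0];
    ARUint8 g   = dataPtr[idx + 1];
    ARUint8 r   = dataPtr[idx + 2];
}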
Graphics handling: libARgsub
Set up and clean up the graphics window
void argInit( ARParam *cparam, double zoom,
              int fullFlag, int xwin, int ywin,
              int hmd_flag );
void argCleanup( void );

cparam:              camera parameter
zoom:                zoom ratio
fullFlag:            0: normal, 1: full screen mode
xwin, ywin:          create small window for debug
hmd_flag:            0: normal, 1: optical see-through mode
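In practice cparam is obtained by loading a calibration file and scaling it to the capture size, roughly as in the sketch below (file name assumed from the standard samples; error handling kept minimal):

/* A minimal sketch: load the camera calibration, rescale it to the capture
   size, and open the graphics window. */
ARParam wparam, cparam;
int     xsize, ysize;

arVideoInqSize( &xsize, &ysize );                           /* capture size   */
if( arParamLoad( "Data/camera_para.dat", 1, &wparam ) < 0 ) exit(0);
arParamChangeSize( &wparam, xsize, ysize, &cparam );        /* rescale        */
arInitCparam( &cparam );                                    /* set as current */
argInit( &cparam, 1.0, 0, 0, 0, 0 );                        /* open window    */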
Graphics handling: libARgsub
Go into the iterative cycle
void argMainLoop(
   void (*mouseFunc)(int btn, int state, int x, int y),
   void (*keyFunc)(unsigned char key, int x, int y),
   void (*mainFunc)(void)
);


Swap buffers
void argSwapBuffers( void );
Graphics handling: libARgsub
Set the window for 2D drawing
void argDrawMode2D( void );

Set the window for 3D drawing
void argDrawMode3D( void );
void argDraw3dCamera( int xwin, int ywin );

Display image
void argDispImage( ARUint8 *image,
                   int xwin, int ywin );
Sample ARToolKit Applications
Ex. 1: Simple video display
Ex. 2: Detecting a marker
Ex. 3: Using a pattern
Ex. 4: Getting 3D information
Ex. 5: Virtual object overlay
Ex 1: Simple Video Display
Program : sample1.c
Key points
 Loop structure
 Video image handling
 Camera parameter handling
 Window setup
 Mouse and keyboard handling
Sample1.c Main Function
main()
{
   init();
   argMainLoop( mouseEvent, keyEvent, mainLoop );
}
Sample1.c - mainLoop Function
 if( (dataPtr = (ARUint8 *)arVideoGetImage()) == NULL ) {
     arUtilSleep(2);
     return;
 }
 argDrawMode2D();
 argDispImage(dataPtr, 0, 0 );
 arVideoCapNext();
 argSwapBuffers();
Sample1.c – video initialization
Configure the video input
vconf = <video configuration string>
Start video capture
arVideoCapStart();
In init(), open the video
arVideoOpen( vconf );
arVideoInqSize(&xsize, &ysize);
When finished, close the video path
  arVideoCapStop();
  arVideoClose();
Changing Image Size
  For input capture
  vconf = “videoWidth=320,videoHeight=240";
  Note – the camera must support this image size

  For display
  argInit( &cparam, 1.5, 0, 0, 0, 0 );


The second parameter is the zoom ratio of the display
image size relative to the input image.
Ex. 2: Detecting a Marker
Program : sample2.c
Key points
  Threshold value
  Important external variables
  arDebug – keep thresholded image
  arImage – pointer for thresholded image
  arImageProcMode – use 50% image for image
  processing
   - AR_IMAGE_PROC_IN_FULL
   - AR_IMAGE_PROC_IN_HALF
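A minimal sketch of typical settings for these variables (values assumed; 100 is the threshold used in the samples):

/* A minimal sketch (typical values assumed): configure the detection
   globals before calling arDetectMarker(). */
int thresh = 100;                           /* binarization threshold (0-255)        */
arDebug         = 1;                        /* keep the thresholded image in arImage */
arImageProcMode = AR_IMAGE_PROC_IN_HALF;    /* detect on a half-resolution image     */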
Sample2.c – marker detection
/* detect the markers in the video frame */
if( arDetectMarker(dataPtr, thresh,
    &marker_info, &marker_num) < 0 ) {
  cleanup();
  exit(0);
}
for( i = 0; i < marker_num; i++ ) {
  argDrawSquare(marker_info[i].vertex,0,0);
}
Sample2.c – marker_info structure
  typedef struct {
    int    area;          /* number of pixels in the marker region */
    int    id;            /* matched pattern id                    */
    int    dir;           /* marker orientation (0-3)              */
    double cf;            /* confidence factor (0.0 - 1.0)         */
    double pos[2];        /* centre of marker in screen coords     */
    double line[4][3];    /* line equations of the four edges      */
    double vertex[4][2];  /* screen coords of the four corners     */
  } ARMarkerInfo;
Ex. 3: Using a Pattern
Program : sample3.c
Key points
 Pattern files loading
 Structure of marker
 information
  - Region features
  - Pattern Id, direction
  - Certainty factor
 Marker identification
Making a pattern template
Use of utility program:
     mk_patt.exe
Show the pattern
Put the corner of red line
segments on the left-top
vertex of the marker
Pattern stored as a
template in a file
1:2:1 ratio determines the
pattern region used
Sample3.c – Pattern File Loading
int patt_id;
char *patt_name = "Data/kanjiPatt";

/* load pattern file */
if( (patt_id = arLoadPatt(patt_name)) < 0 )
{
   printf("Pattern file load error !!\n");
   exit(0);
}
Checking for known patterns
/* check for known patterns */
k = -1;
for( i = 0; i < marker_num; i++ ) {
    if( marker_info[i].id == patt_id ) {
        /* you've found a pattern */
        printf("Found pattern: %d\n", patt_id);
        if( k == -1 ) k = i;
        else
        /* make sure you have the best pattern
           (highest confidence factor) */
        if( marker_info[k].cf < marker_info[i].cf )
            k = i;
    }
}
Ex. 4 – Getting 3D information
Program : sample4.c
Key points
 Definition of a real marker
 Transformation matrix
  - Rotation component
  - Translation component
Sample4.c – Transformation matrix
double marker_center[2] = {0.0, 0.0};
double marker_width = 80.0;
double marker_trans[3][4];

arGetTransMat( &marker_info[i],
    marker_center, marker_width,
    marker_trans );
Finding the Camera Position
This function sets the transformation matrix from the marker
  to the camera into marker_trans[3][4].
  arGetTransMat(&marker_info[k],     marker_center,
      marker_width, marker_trans);


You can see the position information in the values of
  marker_trans[3][4].
     Xpos = marker_trans[0][3];
     Ypos = marker_trans[1][3];
     Zpos = marker_trans[2][3];
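Since these values are in camera coordinates (mm), the camera-to-marker distance follows directly, as in the sketch below:

/* A minimal sketch: the translation column of marker_trans gives the marker
   origin in camera coordinates (mm), so the camera-to-marker distance is: */
#include <math.h>

double dist = sqrt( Xpos*Xpos + Ypos*Ypos + Zpos*Zpos );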
ARToolKit Coordinate Frame
Ex. 5- Virtual Object Display
Program : sample5.c
Key points
  OpenGL parameter setting
  Setup of projection matrix
  Setup of modelview matrix
Appending your own OpenGL code
Set the camera parameters to the OpenGL Projection matrix.
   argDrawMode3D();
   argDraw3dCamera( 0, 0 );
Set the transformation matrix from the marker to the camera to
   the OpenGL ModelView matrix.
   argConvGlpara(marker_trans, gl_para);
   glMatrixMode(GL_MODELVIEW);
   glLoadMatrixd( gl_para );

After calling these functions, your OpenGL objects are
drawn in the real marker coordinates.
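A minimal sketch of a complete draw routine, following the pattern of the simpleTest sample (marker_trans is assumed to come from arGetTransMat() as shown earlier):

/* A minimal sketch following the simpleTest pattern: set up the projection
   and ModelView matrices, then draw a cube sitting on the marker. */
#include <GL/glut.h>
#include <AR/gsub.h>

static void draw( double marker_trans[3][4] )
{
    double gl_para[16];

    argDrawMode3D();
    argDraw3dCamera( 0, 0 );
    glClear( GL_DEPTH_BUFFER_BIT );
    glEnable( GL_DEPTH_TEST );

    /* marker-to-camera transform -> OpenGL ModelView matrix */
    argConvGlpara( marker_trans, gl_para );
    glMatrixMode( GL_MODELVIEW );
    glLoadMatrixd( gl_para );

    /* from here on, drawing happens in marker coordinates (mm) */
    glTranslatef( 0.0f, 0.0f, 25.0f );    /* lift cube onto the marker plane */
    glutSolidCube( 50.0 );

    glDisable( GL_DEPTH_TEST );
}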
3D CG Model Rendering
ARToolKit does not have a function to handle
3D CG models.
3rd party CG rendering software should be
employed.
  OpenVRML
  OpenSceneGraph
  etc
Loading Multiple Patterns
Sample File: LoadMulti.c
  Uses object.c to load
Object Structure
typedef struct {
  char    name[256];          /* marker name / pattern file        */
  int     id;                 /* pattern id returned by arLoadPatt */
  int     visible;            /* was the marker found this frame?  */
  double  marker_coord[4][2]; /* corner coordinates                */
  double  trans[3][4];        /* marker-to-camera transform        */
  double  marker_width;       /* physical marker width (mm)        */
  double  marker_center[2];   /* marker centre offset              */
} ObjectData_T;
Finding Multiple Transforms
 Create object list
ObjectData_T      *object;

 Read in objects - in init( )
read_ObjData( char *name, int *objectnum );

 Find Transform – in mainLoop( )
for( i = 0; i < objectnum; i++ ) {
    /* ..Check patterns */
    /* ..Find transforms for each marker */
}
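A sketch of that loop body, roughly following the LoadMulti sample (marker_info and marker_num come from arDetectMarker() as in the earlier examples):

/* A minimal sketch, roughly following the LoadMulti sample: for each object,
   pick the best-matching detected marker and compute its transform. */
for( i = 0; i < objectnum; i++ ) {
    int j, k = -1;

    /* check patterns: find the detection with this object's id */
    for( j = 0; j < marker_num; j++ ) {
        if( object[i].id == marker_info[j].id ) {
            if( k == -1 || marker_info[k].cf < marker_info[j].cf ) k = j;
        }
    }
    if( k == -1 ) { object[i].visible = 0; continue; }

    /* find the transform for this marker */
    arGetTransMat( &marker_info[k], object[i].marker_center,
                   object[i].marker_width, object[i].trans );
    object[i].visible = 1;
}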
Drawing Multiple Objects
 Send the object list to the draw function
draw( object, objectnum );

 Draw each object individually
for( i = 0; i < objectnum; i++ ) {
   if( object[i].visible == 0 ) continue;
   argConvGlpara(object[i].trans, gl_para);
   draw_object( object[i].id, gl_para );
}
Limitations of ARToolKit
 Partial occlusions cause tracking failure
 Affected by lighting and shadows
 Tracking range depends on marker size
 Performance depends on number of markers
    cf. ARTag, ARToolKitPlus
 Pose accuracy depends on distance to marker
 Pose accuracy depends on angle to marker
ARToolKit in the World




Hundreds of projects
Large research community
FLARToolKit
Flash AS3 Version of the ARToolKit
(ported from NyARToolkit, the Java version of ARToolKit)


Enables augmented reality in the browser
Uses Papervision3D as its 3D engine
available at http://saqoosha.net/
dual license, GPL and commercial license
AR Application Components
                   Adobe Flash


                   Papervision 3D


                   FLARToolkit
Source Packages
"Original" FLARToolkit (Libspark, Saqoosha)
(http://www.libspark.org/svn/as3/FLARToolKit/trunk/ )


Start-up-guides
   Saqoosha (http://saqoosha.net/en/flartoolkit/start-up-guide/ )
   Mikko Haapoja (http://www.mikkoh.com/blog/?p=182 )
"Frameworks"
   Squidder MultipleMarker – Example
   (http://www.squidder.com/2009/03/06/flar-how-to-multiple-instances-of-multiple-markers/ )
   FLARManager (http://words.transmote.com/wp/flarmanager/ )
Papervision 3D
http://www.papervision3d.org/
Flash-based 3D Engine
Supports
  import of 3D Models
  texturing
  animation
  scene graph
alternatives: Away3d, Sandy,…
Papervision Examples
Red Bull|Flugtage Flight Lab
http://www.redbull.com/flightlab/#/Game/TEMP_0



Barcinski JeanJean | 3D Portfolio
http://www.barcinski-jeanjean.com/



more cool papervision websites
http://blog.papervision3d.org/2009/04/24/2009-webby-awards-nominees/
AR Tools
Building Compelling AR Experiences

          experiences

          applications

             tools       Authoring


          components     Tracking, Display



                                     Sony CSL © 2004
AR Authoring
Software Libraries
   osgART, Studierstube, MXRToolKit
     g
Plug-ins to existing software
   DART (Macromedia Director), mARx
Stand Alone
   AMIRE, ComposAR, etc
Next Generation
   iaTAR (Tangible AR)
mARx Plug-in




3D Studio Max Plug-in
Can model and view AR content at the same time
BuildAR




http://www.buildar.co.nz/
Stand alone application
Visual interface for AR model viewing application
Enables non-programmers to build AR scenes
ImageTclAR
Adds AR components to ImageTcl
  http://metlab.cse.msu.edu/imagetclar/
  Modular Library (Scripting, Tcl)
  Supports several tracking systems (vision, magnetic, inertial)
  Easy to learn but little support, small community
DART
Designers AR Toolkit
  http://www.cc.gatech.edu/dart/
  AR plug-in for Macromedia Director
  Developed for designers
  Visual programming
  Scripting interface
Studierstube
Complete authoring tool
  http://studierstube.icg.tu-graz.ac.at/
  Framework (Low Level Programming, C++)
  Modularity, Extensibility, Scalability, Heterogeneity
  Support for wide range of trackers, displays, input
Metaio UnifEye SDK
Complete commercial authoring platform
  http://www.metaio.com/products/
  Offers viewer and editor tools
  Visual interface and low level SDK
  Delivery on desktop or mobile platforms
OSGART Programming Library
Integration of ARToolKit with a High-Level
Rendering Engine (OpenSceneGraph)
OSGART= OpenSceneGraph + ARToolKit




Supporting Geometric + Photometric Registration
osgART: Features



C++ (but also Python, Lua, etc.)
Multiple Video Input supports:
  Direct (Firewire/USB Camera), Files, Network by
  ARvideo, PtGrey, CVCam, VideoWrapper, etc.
Benefits of Open Scene Graph
  Rendering Engine, Plug-ins, etc.
More Libraries
JARToolKit
MRToolKit, MXRToolKit, ARLib, OpenVIDIA
DWARF, Goblin XNA
AMIRE
D’Fusion
Advanced Authoring: iaTAR (Lee 2004)




 Immersive AR Authoring
 Using real objects to create AR applications
osgART
Developing Augmented Reality
  Applications with osgART
What is a Scene Graph?
Tree-like structure for organising a virtual world
  e.g. VRML
Hierarchy of nodes that define:
  Groups (and Switches, Sequences etc…)
  Transformations
  Projections
  Geometry
  …
And states and attributes that define:
  Materials and textures
  Lighting and blending
  …
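As a concrete illustration, the sketch below builds the kind of hierarchy described above with OpenSceneGraph: a unit box under a transform under the root, displayed with osgViewer.

// A minimal sketch of the hierarchy described above, using OpenSceneGraph.
#include <osg/Group>
#include <osg/MatrixTransform>
#include <osg/Geode>
#include <osg/Shape>
#include <osg/ShapeDrawable>
#include <osgViewer/Viewer>

int main()
{
    // Group node at the root of the scene graph
    osg::ref_ptr<osg::Group> root = new osg::Group;

    // Transformation node: move its children 10 units along X
    osg::ref_ptr<osg::MatrixTransform> xform = new osg::MatrixTransform;
    xform->setMatrix(osg::Matrix::translate(10.0, 0.0, 0.0));

    // Leaf (Geode) holding the actual geometry: a unit box
    osg::ref_ptr<osg::Geode> geode = new osg::Geode;
    geode->addDrawable(new osg::ShapeDrawable(
        new osg::Box(osg::Vec3(0.0f, 0.0f, 0.0f), 1.0f)));

    // Assemble the hierarchy: Root -> Transform -> Geode
    xform->addChild(geode.get());
    root->addChild(xform.get());

    // Hand the scene to a viewer
    osgViewer::Viewer viewer;
    viewer.setSceneData(root.get());
    return viewer.run();
}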
Scene Graph Example
Benefits of a Scene Graph
Performance
  Structuring data facilitates
  optimization
   - Culling, state management, etc…
Abstraction
  Underlying graphics pipeline is
  hidden
  Low-level programming (“how do I display this?”)
  replaced with high-level concepts (“what do I want to display?”)

  Image: sgi
About Open Scene Graph
http://www.openscenegraph.org/
Open-source scene graph implementation
Based on OpenGL
Object-oriented C++ following design pattern principles
Used for simulation, games, research, and industrial projects
Active development community
   Maintained by Robert Osfield
   ~2000 mailing list subscribers
   Documentation project: www.osgbooks.com
Uses the OSG Public License (similar to LGPL)
About Open Scene Graph (2)



Pirates of the XXI Century         Flightgear               SCANeR




         3DVRII              Research Institute EOR   VRlab Umeå University
Open Scene Graph Features
Plugins for loading and saving
  3D: 3D Studio (.3ds), OpenFlight (.flt), Wavefront (.obj)…
  2D: .png, .jpg, .bmp, QuickTime movies
NodeKits to extend functionality
  e.g. osgShadow
Cross-platform support for:
  Window management (osgViewer)
  Threading (OpenThreads)
Open Scene Graph Architecture
Inter-operability with other environments, e.g. Python
Plugins read and write 2D image and 3D model files
Scene graph and rendering functionality
NodeKits extend core functionality, exposing higher-level node types
Some Open Scene Graph Demos



  osgviewer                               osgmotionblur                        osgparticle




   osgreflect                              osgdistortion                      osgfxbrowser

You may want to get the OSG data package:
   Via SVN: http://www.openscenegraph.org/svn/osg/OpenSceneGraph-Data/trunk
Learning OSG
Check out the Quick Start Guide
   Free PDF download at http://osgbooks.com/, Physical copy $13US
Join the mailing list:
http://www.openscenegraph.org/projects/osg/wiki/MailingLists

Browse the website: http://www.openscenegraph.org/projects/osg
Use the forum: http://forum.openscenegraph.org


Study the examples
Read the source?
osgART
What is osgART?
osgART adds AR to Open Scene Graph
Further developed and enhanced by:
  Julian Looser
  Hartmut Seichter
  Raphael Grasset
Current version 2.0, Open Source
  http://www.osgart.org
osgART Approach: Basic Scene Graph
   Root




 Transform
   [  0.988  -0.031   -0.145  0
     -0.048   0.857   -0.512  0
      0.141   0.513    0.846  0
     10.939  29.859 -226.733  1 ]




 3D Object

To add Video see-through AR:
   Integrate live video
   Apply correct projection matrix
   Update tracked transformations in realtime
osgART Approach: AR Scene Graph
               Root




             Transform




             3D Object
osgART Approach: AR Scene Graph
                Root


               Virtual
               Camera
       Video
                         Transform
       Layer


       Video
                          3D Object
       Geode
osgART Approach: AR Scene Graph
                             Root
                                      Projection matrix from
                                      tracker calibration

    Orthographic                                        Transformation matrix
    projection
                            Virtual
                                                        updated from marker
                            Camera                      tracking in realtime
                    Video
                                      Transform
                    Layer
Full-screen quad
with live texture
updated from
Video source        Video
                                       3D Object
                    Geode
osgART Architecture
Like any video see-through AR library, osgART requires video
input and tracking capabilities.


[Diagram: Video Source (e.g. DirectShow) and Tracking Module (libAR.lib) feed the AR Library, on top of which the Application sits]
osgART Architecture
osgART uses a plugin architecture so that video sources and tracking
technologies can be plugged in as necessary
Video Plugins:
   OpenCVVideo, VidCapture, CMU1394, PointGrey SDK, VidereDesign,
   VideoWrapper, VideoInput, VideoSource, DSVL, Intranel RTSP
Tracker Plugins:
   ARToolKit4, ARToolKitPlus, MXRToolKit, ARLib,
   bazAR (work in progress), ARTag (work in progress)

[Diagram: video and tracker plugins feed into osgART, which the application is built on]
Basic osgART Tutorial
Develop a working osgART application from scratch.
Use ARToolKit 2.72 library for tracking and video
capture
osgART Tutorial 1: Basic OSG Viewer
 Start with the standard Open Scene Graph Viewer
 We will modify this to do AR!
osgART Tutorial 1: Basic OSG Viewer
    The basic osgViewer…
#include <osgViewer/Viewer>
#include <osgViewer/ViewerEventHandlers>

int main(int argc, char* argv[])   {

    // Create a viewer
    osgViewer::Viewer viewer;

    // Create a root node
    osg::ref_ptr<osg::Group> root = new osg::Group;

    // Attach root node to the viewer
    viewer.setSceneData(root.get());

    // Add relevant event handlers to the viewer
    viewer.addEventHandler(new osgViewer::StatsHandler);
    viewer.addEventHandler(new osgViewer::WindowSizeHandler);
    viewer.addEventHandler(new osgViewer::ThreadingHandler);
    viewer.addEventHandler(new osgViewer::HelpHandler);

    // Run the viewer and exit the program when the viewer is closed
    return viewer.run();

}
osgART Tutorial 2: Adding Video
Add a video plugin
  Load, configure, start video capture…
Add a video background
  Create, link to video, add to scene-graph
osgART Tutorial 2: Adding Video
    New code to load and configure a Video Plugin:

// Preload the video and tracker
int _video_id = osgART::PluginManager::getInstance()->load("osgart_video_artoolkit2");

// Load a video plugin.
osg::ref_ptr<osgART::Video> video =
   dynamic_cast<osgART::Video*>(osgART::PluginManager::getInstance()->get(_video_id));

// Check if an instance of the video stream could be created
if (!video.valid()) {
   // Without video an AR application can not work. Quit if none found.
   osg::notify(osg::FATAL) << "Could not initialize video plugin!" << std::endl;
   exit(-1);
}

// Open the video. This will not yet start the video stream but will
// get information about the format of the video which is essential
// for the connected tracker.
video->open();
osgART Tutorial 2: Adding Video
    New code to add a live video background
osg::Group* createImageBackground(osg::Image* video) {
   osgART::VideoLayer* _layer = new osgART::VideoLayer();
   _layer->setSize(*video);
   osgART::VideoGeode* _geode = new osgART::VideoGeode(osgART::VideoGeode::USE_TEXTURE_2D, video);
   addTexturedQuad(*_geode, video->s(), video->t());
   _layer->addChild(_geode);
   return _layer;
}


    In the main function…
osg::ref_ptr<osg::Group> videoBackground = createImageBackground(video.get());
videoBackground->getOrCreateStateSet()->setRenderBinDetails(0, "RenderBin");

root->addChild(videoBackground.get());
video->start();
osgART Tutorial 3: Tracking
Add a Tracker plugin
  Load, configure, link to video
Add a Marker to track
  Load, activate
Tracked node
  Create, link with marker via tracking callbacks
Print out the tracking data
osgART Tutorial 3: Tracking
     Load a tracking plugin and associate it with the video plugin
int _tracker_id = osgART::PluginManager::getInstance()->load("osgart_tracker_artoolkit2");

osg::ref_ptr<osgART::Tracker> tracker =
      dynamic_cast<osgART::Tracker*>(osgART::PluginManager::getInstance()->get(_tracker_id));

if (!tracker.valid()) {
   // Without tracker an AR application can not work. Quit if none found.
   osg::notify(osg::FATAL) << "Could not initialize tracker plugin!" << std::endl;
   exit(-1);
}

// get the tracker calibration object
osg::ref_ptr<osgART::Calibration> calibration = tracker->getOrCreateCalibration();

// load a calibration file
if (!calibration->load("data/camera_para.dat"))
{
   // the calibration file was non-existing or couldn't be loaded
   osg::notify(osg::FATAL) << "Non existing or incompatible calibration file" << std::endl;
   exit(-1);
}

// set the image source for the tracker
tracker->setImage(video.get());

osgART::TrackerCallback::addOrSet(root.get(), tracker.get());

// create the virtual camera and add it to the scene
osg::ref_ptr<osg::Camera> cam = calibration->createCamera();
root->addChild(cam.get());
osgART Tutorial 3: Tracking
     Load a marker and activate it
     Associate it with a transformation node (via event callbacks)
     Add the transformation node to the virtual camera node
osg::ref_ptr<osgART::Marker> marker = tracker->addMarker("single;data/patt.hiro;80;0;0");
if (!marker.valid())
{
   // Without marker an AR application can not work. Quit if none found.
   osg::notify(osg::FATAL) << "Could not add marker!" << std::endl;
   exit(-1);
}

marker->setActive(true);

osg::ref_ptr<osg::MatrixTransform> arTransform = new osg::MatrixTransform();
osgART::attachDefaultEventCallbacks(arTransform.get(), marker.get());
cam->addChild(arTransform.get());


     Add a debug callback to print out information about the tracked marker
osgART::addEventCallback(arTransform.get(), new osgART::MarkerDebugCallback(marker.get()));
osgART Tutorial 3: Tracking

Tracking information is output to console
osgART Tutorial 4: Adding Content
Now put the tracking data to use!
Add content to the tracked transform
Basic cube code
   arTransform->addChild(osgART::testCube());
   arTransform->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");
osgART Tutorial 5: Adding 3D Model
Open Scene Graph can load some 3D formats directly:
   e.g. Wavefront (.obj), OpenFlight (.flt), 3D Studio (.3ds), COLLADA
   Others need to be converted
Support for some formats is much better than others
   e.g. OpenFlight good, 3ds hit and miss.
Recommend native .osg and .ive formats
   .osg – ASCII representation of scene graph
   .ive – Binary OSG file. Can contain textures.
osgExp : Exporter for 3DS Max is a good choice
   http://sourceforge.net/projects/osgmaxexp
Otherwise .3ds files from TurboSquid can work
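A minimal sketch of loading a model defensively (file name taken from the next slide; readNodeFile() returns NULL when the file is missing or no plugin handles the format):

// A minimal sketch: load a model and check the result before adding it
// to the tracked transform (file name as used on the next slide).
#include <osgDB/ReadFile>
#include <osg/Notify>

osg::ref_ptr<osg::Node> model = osgDB::readNodeFile("media/hollow_cube.osg");
if (!model.valid()) {
    osg::notify(osg::FATAL) << "Could not load model file!" << std::endl;
} else {
    arTransform->addChild(model.get());
}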
osgART Tutorial 5: Adding 3D Model
     Replace the simple cube with a 3D model
     Models are loaded using the osgDB::readNodeFile() function
std::string filename = "media/hollow_cube.osg";
arTransform->addChild(osgDB::readNodeFile(filename));




                                             Export to .osg



           3D Studio Max


                                                              osgART
     Note: Scale is important. Units are in mm.
osgART Tutorial 6: Multiple Markers
Repeat the process so far to track more than
one marker simultaneously
osgART Tutorial 6: Multiple Markers
    Repeat the process so far to track more than one marker

    Load and activate two markers
osg::ref_ptr<osgART::Marker> markerA = tracker->addMarker("single;data/patt.hiro;80;0;0");
markerA->setActive(true);

osg::ref_ptr<osgART::Marker> markerB = tracker->addMarker("single;data/patt.kanji;80;0;0");
markerB->setActive(true);


    Create two transformations, attach callbacks, and add models
osg::ref_ptr<osg::MatrixTransform> arTransformA = new osg::MatrixTransform();
osgART::attachDefaultEventCallbacks(arTransformA.get(), markerA.get());
arTransformA->addChild(osgDB::readNodeFile("media/hitl_logo.osg"));
arTransformA->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");
cam->addChild(arTransformA.get());

osg::ref_ptr<osg::MatrixTransform> arTransformB = new osg::MatrixTransform();
osgART::attachDefaultEventCallbacks(arTransformB.get(), markerB.get());
arTransformB->addChild(osgDB::readNodeFile("media/gist_logo.osg"));
arTransformB->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");
cam->addChild(arTransformB.get());
osgART Tutorial 6: Multiple Markers
Basic osgART Tutorial: Summary



Standard OSG Viewer      Addition of Video    Addition of Tracking




Addition of basic 3D   Addition of 3D Model     Multiple Markers
      graphics
FLARManager
http://transmote.com/flar
FLARManager:
Makes building FLARToolkit apps easier
Is open-source, with a free and commercial license
Is designed to allow exploration of both augmented
reality and alternative controllers
Was initiated by Eric Socolofsky, developed with
contributions from FLARToolkit community
Decouples FLARToolkit from Papervision3D
Configuration without recompilation, via xml config
FLARManager: features

Gives more control over application environment
Provides multiple input options
Robust multiple marker management
Supports multiple 3D frameworks
Offers features for optimization
Allows for customization
Resources
Websites
Software Download
  http://artoolkit.sourceforge.net/
ARToolKit Documentation
  http://www.hitl.washington.edu/artoolkit/
ARToolKit Forum
  http://www.hitlabnz.org/wiki/Forum
ARToolworks Inc
  http://www.artoolworks.com/
ARToolKit Plus
  http://studierstube.icg.tu-graz.ac.at/handheld_ar/artoolkitplus.php
osgART
  http://www.osgart.org/
FLARToolKit
  http://www.libspark.org/wiki/saqoosha/FLARToolKit/
FLARManager
  http://words.transmote.com/wp/flarmanager/
Books
Interactive Environments with Open-Source
Software: 3D Walkthroughs and Augmented
Reality for Architects with Blender 2.43,
DART 3.0 and ARToolKit 2.72 by Wolfgang Höhl

A Hitchhiker's Guide to Virtual Reality by Karen
McMenemy and Stuart Ferguson
More Information
• Mark Billinghurst
  – mark.billinghurst@hitlabnz.org
• Websites
  – www.hitlabnz.org

 
Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI Solutions
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path Mount
 
Top 5 Benefits OF Using Muvi Live Paywall For Live Streams
Top 5 Benefits OF Using Muvi Live Paywall For Live StreamsTop 5 Benefits OF Using Muvi Live Paywall For Live Streams
Top 5 Benefits OF Using Muvi Live Paywall For Live Streams
 
WhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure service
WhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure serviceWhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure service
WhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure service
 
Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected Worker
 
Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...
Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...
Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...
 
Scaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organizationScaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organization
 
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
 
Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreter
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen Frames
 
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationFrom Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
 

Developer Tools for Building AR Experiences

• 1. Lecture 3: AR Developer Tools. Mark Billinghurst, mark.billinghurst@hitlabnz.org, July 2011. COSC 426: Augmented Reality
• 2. Building Compelling AR Experiences – layered stack: experiences, applications, tools, components (Display, Tracking). Sony CSL © 2004
• 3. Low Level AR Libraries: ARToolKit – marker-based tracking; FLARToolKit – Flash version of ARToolKit; SSTT – Simple Spatial Template Tracking; Opira – robust natural feature tracking
• 4. What is ARToolKit? Marker tracking library for AR applications. Open source, multi-platform (Linux, Windows, MacOS). Overlays 3D virtual objects on real markers. Uses a single tracking marker. Determines camera pose information (6 DOF). ARToolKit website: http://www.hitl.washington.edu/artoolkit/ and http://artoolkit.sourceforge.net/
• 5. ARToolKit Software. ARToolKit version: 2.65 or later. Currently two license models: Open Source (GPL) – ARToolKit 2.72; Commercial (ARToolworks) – ARToolKit 4.0. OS: Linux, Windows, MacOS X, iPhone/Android. Programming language: C. Related software: ARToolKit Professional – commercial version; ARToolKitPlus – advanced version; NyARToolkit – Java and C# version; FLARToolKit – Flash version
• 6. ARToolKit Family: ARToolKit, ARToolKit NFT, ARToolKit Plus, JARToolKit (Java), ARToolKit (Symbian), NyARToolkit (Java, C#, Android, Windows Mobile), FLARToolKit (Flash), FLARManager (Flash)
• 7. ARToolKit Contents. Libraries: libAR – tracking; libARvideo – video capturing; libARgsub – image/graphics drawing; libARmulti – multi-marker tracking. Utilities: camera calibration, marker training
• 8. ARToolKit Structure. Three key libraries: AR32.lib – ARToolKit image processing functions; ARgsub32.lib – ARToolKit graphics functions; ARvideo.lib – DirectShow video capture class
• 9. Additional Software. To build an AR application you may need additional software: a high-level rendering library (OpenVRML, Open Inventor, osgART, etc.), an audio library (FMOD, etc.), and peripheral support
• 10. What does ARToolKit Calculate? Position of markers in the camera coordinates. Pose of markers in the camera coordinates. Output format: a 3x4 matrix representing the transformation from the marker coordinates to the camera coordinates
• 12. Tracking Range with Pattern Size. Rule of thumb – range = 10 x pattern width
• 16. An ARToolKit Application. Initialization: load camera and pattern parameters. Main loop: Step 1 – image capture and display; Step 2 – marker detection; Step 3 – marker identification; Step 4 – getting pose information; Step 5 – object interactions/simulation; Step 6 – display virtual objects. End application: camera shut down. (A minimal sketch of this structure follows below.)
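  As a rough, hedged sketch of how these steps fit together (not one of the course samples; the threshold value, pattern id and 80 mm marker width below are illustrative placeholders), the main loop described above looks roughly like this:

    /* Minimal ARToolKit-style main loop sketch: capture, detect, get pose, draw. */
    #include <stdlib.h>
    #include <AR/ar.h>
    #include <AR/video.h>
    #include <AR/gsub.h>

    static int patt_id;        /* assumed to be loaded in init() with arLoadPatt() */
    static int thresh = 100;   /* binarisation threshold (placeholder)             */

    static void mainLoop(void)
    {
        ARUint8      *dataPtr;
        ARMarkerInfo *marker_info;
        int           marker_num, i;

        /* Step 1: capture and display the current video frame */
        if ((dataPtr = (ARUint8 *)arVideoGetImage()) == NULL) { arUtilSleep(2); return; }
        argDrawMode2D();
        argDispImage(dataPtr, 0, 0);

        /* Steps 2-3: detect and identify markers in the frame */
        if (arDetectMarker(dataPtr, thresh, &marker_info, &marker_num) < 0) exit(0);
        arVideoCapNext();

        /* Steps 4-6: get the pose of the known pattern and draw virtual content */
        for (i = 0; i < marker_num; i++) {
            if (marker_info[i].id == patt_id) {
                double center[2] = {0.0, 0.0}, trans[3][4];
                arGetTransMat(&marker_info[i], center, 80.0, trans);
                /* ... switch to 3D mode and render objects in marker coordinates ... */
            }
        }
        argSwapBuffers();
    }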
• 17. Image capture: libARvideo. Return the pointer to the captured image:
    ARUint8 *arVideoGetImage( void );
  Pixel format and byte size are defined in config.h:
    #define AR_PIX_FORMAT_BGR
    #define AR_PIX_SIZE 3
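  A tiny added illustration (not on the slide): with the BGR pixel format and 3-byte pixel size defined above, an individual pixel of the captured frame could be read like this; xsize is assumed to come from arVideoInqSize(), shown on the video-initialization slide below:

    /* Sketch: read the blue/green/red components of one pixel of the captured frame. */
    int x = 10, y = 10;                               /* pixel of interest (placeholder) */
    ARUint8 *dataPtr = (ARUint8 *)arVideoGetImage();
    if (dataPtr != NULL) {
        int index = (y * xsize + x) * AR_PIX_SIZE;    /* AR_PIX_SIZE == 3, BGR order */
        ARUint8 b = dataPtr[index + 0];
        ARUint8 g = dataPtr[index + 1];
        ARUint8 r = dataPtr[index + 2];
    }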
• 18. Graphics handling: libARgsub. Set up and clean up the graphics window:
    void argInit( ARParam *cparam, double zoom, int fullFlag, int xwin, int ywin, int hmd_flag );
    void argCleanup( void );
  cparam: camera parameter; zoom: zoom ratio; fullFlag: 0 = normal, 1 = full-screen mode; xwin, ywin: create a small window for debugging; hmd_flag: 0 = normal, 1 = optical see-through mode
• 19. Graphics handling: libARgsub. Go into the iterative cycle:
    void argMainLoop( void (*mouseFunc)(int btn, int state, int x, int y),
                      void (*keyFunc)(unsigned char key, int x, int y),
                      void (*mainFunc)(void) );
  Swap buffers:
    void argSwapBuffers( void );
• 20. Graphics handling: libARgsub. Set the window for 2D drawing:
    void argDrawMode2D( void );
  Set the window for 3D drawing:
    void argDrawMode3D( void );
    void argDraw3dCamera( int xwin, int ywin );
  Display image:
    void argDispImage( ARUint8 *image, int xwin, int ywin );
• 21. Sample ARToolKit Applications. Ex. 1: Simple video display; Ex. 2: Detecting a marker; Ex. 3: Using a pattern; Ex. 4: Getting 3D information; Ex. 5: Virtual object overlay
• 22. Ex. 1: Simple Video Display. Program: sample1.c. Key points: loop structure, video image handling, camera parameter handling, window setup, mouse and keyboard handling
• 23. Sample1.c main function:
    main() {
        init();
        argMainLoop( mouseEvent, keyEvent, mainLoop );
    }
• 24. Sample1.c – mainLoop function:
    if( (dataPtr = (ARUint8 *)arVideoGetImage()) == NULL ) {
        arUtilSleep(2);
        return;
    }
    argDrawMode2D();
    argDispImage( dataPtr, 0, 0 );
    arVideoCapNext();
    argSwapBuffers();
• 25. Sample1.c – video initialization. Configure the video input:
    vconf = <video configuration string>
  Start video capture:
    arVideoCapStart();
  In init(), open the video:
    arVideoOpen( vconf );
    arVideoInqSize( &xsize, &ysize );
  When finished, close the video path:
    arVideoCapStop();
    arVideoClose();
• 26. Changing Image Size. For input capture:
    vconf = "videoWidth=320,videoHeight=240";
  Note – the camera must support this image size. For display:
    argInit( &cparam, 1.5, 0, 0, 0, 0 );
  The second parameter is the zoom ratio of the displayed image size relative to the input image.
• 27. Ex. 2: Detecting a Marker. Program: sample2.c. Key points: threshold value; important external variables: arDebug – keep the thresholded image; arImage – pointer to the thresholded image; arImageProcMode – use 50% image for image processing (AR_IMAGE_PROC_IN_FULL / AR_IMAGE_PROC_IN_HALF)
• 28. Sample2.c – marker detection:
    /* detect the markers in the video frame */
    if( arDetectMarker(dataPtr, thresh, &marker_info, &marker_num) < 0 ) {
        cleanup();
        exit(0);
    }
    for( i = 0; i < marker_num; i++ ) {
        argDrawSquare(marker_info[i].vertex, 0, 0);
    }
• 29. Sample2.c – marker_info structure:
    typedef struct {
        int    area;
        int    id;
        int    dir;
        double cf;
        double pos[2];
        double line[4][3];
        double vertex[4][2];
    } ARMarkerInfo;
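  A short usage sketch (added here, not on the slide; assumes printf from <stdio.h>): once arDetectMarker() has filled the marker_info array, the fields above can be read directly, for example to log every detected candidate:

    /* Sketch: print id, confidence factor and centre of each detected marker. */
    int i;
    for (i = 0; i < marker_num; i++) {
        printf("marker %d: id=%d cf=%.2f centre=(%.1f, %.1f)\n",
               i, marker_info[i].id, marker_info[i].cf,
               marker_info[i].pos[0], marker_info[i].pos[1]);
    }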
• 30. Ex. 3: Using a Pattern. Program: sample3.c. Key points: pattern file loading; structure of marker information (region features, pattern id, direction, certainty factor); marker identification
• 31. Making a pattern template. Use the utility program mk_patt.exe: show the pattern, put the corner of the red line segments on the top-left vertex of the marker, and the pattern is stored as a template in a file. A 1:2:1 ratio determines the pattern region used.
• 32. Sample3.c – pattern file loading:
    int patt_id;
    char *patt_name = "Data/kanjiPatt";
    /* load pattern file */
    if( (patt_id = arLoadPatt(patt_name)) < 0 ) {
        printf("Pattern file load error !!\n");
        exit(0);
    }
• 33. Checking for known patterns:
    /* check for known patterns */
    k = -1;
    for( i = 0; i < marker_num; i++ ) {
        if( marker_info[i].id == patt_id ) {
            /* you've found a pattern */
            printf("Found pattern: %d\n", patt_id);
            if( k == -1 ) k = i;
            else /* make sure you have the best pattern (highest confidence factor) */
                if( marker_info[k].cf < marker_info[i].cf ) k = i;
        }
    }
• 34. Ex. 4 – Getting 3D Information. Program: sample4.c. Key points: definition of a real marker; transformation matrix (rotation component, translation component)
• 35. Sample4.c – transformation matrix:
    double marker_center[2] = {0.0, 0.0};
    double marker_width     = 80.0;
    double marker_trans[3][4];
    arGetTransMat( &marker_info[i], marker_center, marker_width, marker_trans );
• 36. Finding the Camera Position. This function sets the transformation matrix from marker to camera into marker_trans[3][4]:
    arGetTransMat( &marker_info[k], marker_center, marker_width, marker_trans );
  You can read the position information from marker_trans[3][4]:
    Xpos = marker_trans[0][3];
    Ypos = marker_trans[1][3];
    Zpos = marker_trans[2][3];
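  As an added example (not on the slide), because this translation column is expressed in the same units as the marker width (millimetres here), the straight-line camera-to-marker distance follows directly:

    /* Sketch: Euclidean camera-to-marker distance from the translation column (needs <math.h>). */
    double Xpos = marker_trans[0][3];
    double Ypos = marker_trans[1][3];
    double Zpos = marker_trans[2][3];
    double dist = sqrt(Xpos * Xpos + Ypos * Ypos + Zpos * Zpos);   /* in mm */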
• 38. Ex. 5 – Virtual Object Display. Program: sample5.c. Key points: OpenGL parameter setting; setup of the projection matrix; setup of the modelview matrix
• 39. Appending your own OpenGL code. Set the camera parameters to the OpenGL projection matrix:
    argDrawMode3D();
    argDraw3dCamera( 0, 0 );
  Set the transformation matrix from the marker to the camera as the OpenGL ModelView matrix:
    argConvGlpara( marker_trans, gl_para );
    glMatrixMode( GL_MODELVIEW );
    glLoadMatrixd( gl_para );
  After calling these functions, your OpenGL objects are drawn in the real marker coordinates.
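  As a hedged follow-on (the cube size, lift and GLUT usage are illustrative, not from the slides), a simple object could then be drawn in marker coordinates with plain OpenGL/GLUT calls:

    /* Sketch: draw a 40 mm cube resting on the marker plane (marker coordinates). */
    glEnable(GL_DEPTH_TEST);
    glPushMatrix();
    glTranslatef(0.0f, 0.0f, 20.0f);   /* raise by half the cube height so it sits on the marker */
    glutSolidCube(40.0);               /* units are millimetres, matching the 80 mm marker width */
    glPopMatrix();
    glDisable(GL_DEPTH_TEST);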
• 40. 3D CG Model Rendering. ARToolKit does not have a function to handle 3D CG models; third-party CG rendering software should be employed: OpenVRML, OpenSceneGraph, etc.
• 41.
• 42. Loading Multiple Patterns. Sample file: LoadMulti.c. Uses object.c to load the objects. Object structure:
    typedef struct {
        char   name[256];
        int    id;
        int    visible;
        double marker_coord[4][2];
        double trans[3][4];
        double marker_width;
        double marker_center[2];
    } ObjectData_T;
• 43. Finding Multiple Transforms. Create the object list:
    ObjectData_T *object;
  Read in the objects – in init():
    read_ObjData( char *name, int *objectnum );
  Find the transform for each – in mainLoop():
    for( i = 0; i < objectnum; i++ ) {
        ..Check patterns
        ..Find transforms for each marker
    }
• 44. Drawing Multiple Objects. Send the object list to the draw function:
    draw( object, objectnum );
  Draw each object individually:
    for( i = 0; i < objectnum; i++ ) {
        if( object[i].visible == 0 ) continue;
        argConvGlpara( object[i].trans, gl_para );
        draw_object( object[i].id, gl_para );
    }
• 45. Limitations of ARToolKit. Partial occlusions cause tracking failure. Affected by lighting and shadows. Tracking range depends on marker size. Performance depends on the number of markers (cf. ARTag, ARToolKitPlus). Pose accuracy depends on the distance to the marker. Pose accuracy depends on the angle to the marker.
• 46. ARToolKit in the World. Hundreds of projects. Large research community.
• 47. FLARToolKit. Flash AS3 version of ARToolKit (ported from NyARToolkit, the Java version of ARToolKit). Enables augmented reality in the browser. Uses Papervision3D as its 3D engine. Available at http://saqoosha.net/. Dual license: GPL and commercial.
• 48. AR Application Components: Adobe Flash, Papervision3D, FLARToolkit
• 49. Source Packages. "Original" FLARToolkit (Libspark, Saqoosha): http://www.libspark.org/svn/as3/FLARToolKit/trunk/. Start-up guides: Saqoosha (http://saqoosha.net/en/flartoolkit/start-up-guide/), Miko Haapoja (http://www.mikkoh.com/blog/?p=182). Frameworks: Squidder multiple-marker example (http://www.squidder.com/2009/03/06/flar-how-to-multiple-instances-of-multiple-markers/), FLARManager (http://words.transmote.com/wp/flarmanager/)
• 50. Papervision3D. http://www.papervision3d.org/. Flash-based 3D engine. Supports import of 3D models, texturing, animation, and a scene graph. Alternatives: Away3D, Sandy, …
• 51. Papervision Examples. Red Bull Flugtage Flight Lab: http://www.redbull.com/flightlab/#/Game/TEMP_0. Barcinski JeanJean | 3D Portfolio: http://www.barcinski-jeanjean.com/. More Papervision websites: http://blog.papervision3d.org/2009/04/24/2009-webby-awards-nominees/
• 53. Building Compelling AR Experiences – layered stack: experiences, applications, tools (Authoring), components (Tracking, Display). Sony CSL © 2004
• 54. AR Authoring Software. Libraries: osgART, Studierstube, MXRToolKit. Plug-ins to existing software: DART (Macromedia Director), mARx. Stand-alone: AMIRE, ComposAR, etc. Next generation: iaTAR (Tangible AR)
• 55. mARx Plug-in. A 3D Studio Max plug-in; can model and view AR content at the same time.
• 56. BuildAR. http://www.buildar.co.nz/. Stand-alone application. Visual interface for an AR model-viewing application. Enables non-programmers to build AR scenes.
• 57. ImageTclAR. Adds AR components to ImageTcl. http://metlab.cse.msu.edu/imagetclar/. Modular library (scripting, Tcl). Supports several tracking systems (vision, magnetic, inertial). Easy to learn, but little support and a small community.
• 58. DART – Designers AR Toolkit. http://www.cc.gatech.edu/dart/. AR plug-in for Macromedia Director. Developed for designers. Visual programming. Scripting interface.
• 59. Studierstube. Complete authoring tool. http://studierstube.icg.tu-graz.ac.at/. Framework (low-level programming, C++). Modularity, extensibility, scalability, heterogeneity. Support for a wide range of trackers, displays and input.
• 60. Metaio Unifeye SDK. Complete commercial authoring platform. http://www.metaio.com/products/. Offers viewer and editor tools. Visual interface and low-level SDK. Delivery on desktop or mobile platforms.
• 61. osgART Programming Library. Integration of ARToolKit with a high-level rendering engine (OpenSceneGraph). osgART = OpenSceneGraph + ARToolKit. Supports geometric + photometric registration.
• 62. osgART: Features. C++ (but also Python, Lua, etc.). Multiple video input support: direct (FireWire/USB camera), files, network – via ARvideo, PtGrey, CVCam, VideoWrapper, etc. Benefits of the OpenSceneGraph rendering engine, plug-ins, etc.
• 63. More Libraries: JARToolKit, MRToolKit, MXRToolKit, ARLib, OpenVIDIA, DWARF, Goblin XNA, AMIRE, D'Fusion
• 64. Advanced Authoring: iaTAR (Lee 2004). Immersive AR authoring – using real objects to create AR applications.
• 65. osgART – Developing Augmented Reality Applications with osgART
• 66. What is a Scene Graph? A tree-like structure for organising a virtual world, e.g. VRML. A hierarchy of nodes that define: groups (and switches, sequences, etc.), transformations, projections, geometry, … and states and attributes that define: materials and textures, lighting and blending, …
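  To make this concrete, here is a small, hedged OpenSceneGraph sketch (added here, not part of the lecture) that builds exactly such a hierarchy – a group, a transformation node, and a geometry leaf – and hands it to a viewer:

    // Sketch: a tiny scene graph – Group -> MatrixTransform -> Geode (box drawable).
    #include <osg/Group>
    #include <osg/MatrixTransform>
    #include <osg/Geode>
    #include <osg/Shape>
    #include <osg/ShapeDrawable>
    #include <osgViewer/Viewer>

    int main()
    {
        osg::ref_ptr<osg::Group> root = new osg::Group;

        // Transformation node: move its children 50 units along X.
        osg::ref_ptr<osg::MatrixTransform> xform = new osg::MatrixTransform;
        xform->setMatrix(osg::Matrix::translate(50.0, 0.0, 0.0));

        // Geometry leaf: a 10-unit box drawable attached to a Geode.
        osg::ref_ptr<osg::Geode> geode = new osg::Geode;
        geode->addDrawable(new osg::ShapeDrawable(new osg::Box(osg::Vec3(0.0f, 0.0f, 0.0f), 10.0f)));

        xform->addChild(geode.get());
        root->addChild(xform.get());

        // Hand the graph to a viewer and render it.
        osgViewer::Viewer viewer;
        viewer.setSceneData(root.get());
        return viewer.run();
    }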
• 68. Benefits of a Scene Graph. Performance: structuring the data facilitates optimization (culling, state management, etc.). Abstraction: the underlying graphics pipeline is hidden; low-level programming ("how do I display this?") is replaced with high-level concepts ("what do I want to display?"). Image: SGI
• 69. About OpenSceneGraph. http://www.openscenegraph.org/. Open-source scene graph implementation. Based on OpenGL. Object-oriented C++ following design-pattern principles. Used for simulation, games, research and industrial projects. Active development community, maintained by Robert Osfield, ~2000 mailing list subscribers. Documentation project: www.osgbooks.com. Uses the OSG Public License (similar to LGPL).
• 70. About OpenSceneGraph (2). Example projects: Pirates of the XXI Century, Flightgear, SCANeR, 3DVRII Research Institute, EOR, VRlab Umeå University
• 71. OpenSceneGraph Features. Plugins for loading and saving – 3D: 3D Studio (.3ds), OpenFlight (.flt), Wavefront (.obj), …; 2D: .png, .jpg, .bmp, QuickTime movies. NodeKits to extend functionality, e.g. osgShadow. Cross-platform support for window management (osgViewer) and threading (OpenThreads).
• 72. OpenSceneGraph Architecture. Plugins read and write 2D image and 3D model files; the core provides the scene graph and rendering functionality; NodeKits extend the core, exposing higher-level node types; inter-operability with other environments, e.g. Python.
• 73. Some OpenSceneGraph Demos: osgviewer, osgmotionblur, osgparticle, osgreflect, osgdistortion, osgfxbrowser. You may want to get the OSG data package, via SVN: http://www.openscenegraph.org/svn/osg/OpenSceneGraph-Data/trunk
• 74. Learning OSG. Check out the Quick Start Guide (free PDF download at http://osgbooks.com/, physical copy US$13). Join the mailing list: http://www.openscenegraph.org/projects/osg/wiki/MailingLists. Browse the website: http://www.openscenegraph.org/projects/osg. Use the forum: http://forum.openscenegraph.org. Study the examples. Read the source?
• 75. osgART
• 76. What is osgART? osgART adds AR to OpenSceneGraph. Further developed and enhanced by Julian Looser, Hartmut Seichter and Raphael Grasset. Current version 2.0, open source. http://www.osgart.org
• 77. osgART Approach: Basic Scene Graph. Root → Transform → 3D Object. The transform holds a tracked matrix (example values from the slide: 0.988 -0.031 -0.048 0.857 0.141 0.513 -0.145 -0.512 0.846 10.939 29.859 -226.733, bottom row 0 0 0 1). To add video see-through AR: integrate live video, apply the correct projection matrix, and update the tracked transformations in real time.
• 78. osgART Approach: AR Scene Graph. Root → Transform → 3D Object
• 79. osgART Approach: AR Scene Graph. Root → Virtual Camera → Transform → 3D Object (Geode); Root → Video Layer → Video Geode
• 80. osgART Approach: AR Scene Graph (annotated). Virtual Camera – projection matrix from the tracker calibration; Video Layer – orthographic projection; Transform – transformation matrix updated from marker tracking in real time; Video Geode – full-screen quad with a live texture updated from the video source.
• 81. osgART Approach: AR Scene Graph – same annotated diagram as the previous slide.
• 82. osgART Architecture. Like any video see-through AR library, osgART requires video input and tracking capabilities: a video source (e.g. DirectShow) and a tracking module (libAR.lib) feed the AR library and the application.
• 83. osgART Architecture. osgART uses a plugin architecture so that video sources and tracking technologies can be plugged in as necessary. Video plugins: OpenCVVideo, VidCapture, CMU1394, PointGrey SDK, VidereDesign, VideoWrapper, VideoInput, VideoSource, DSVL, Intranel RTSP. Tracker plugins: ARToolKit4, ARToolKitPlus, MXRToolKit, ARLib, bazAR (work in progress), ARTag (work in progress).
• 84. Basic osgART Tutorial. Develop a working osgART application from scratch, using the ARToolKit 2.72 library for tracking and video capture.
• 85. osgART Tutorial 1: Basic OSG Viewer. Start with the standard OpenSceneGraph viewer; we will modify this to do AR!
• 86. osgART Tutorial 1: Basic OSG Viewer. The basic osgViewer…
    #include <osgViewer/Viewer>
    #include <osgViewer/ViewerEventHandlers>

    int main(int argc, char* argv[]) {
        // Create a viewer
        osgViewer::Viewer viewer;
        // Create a root node
        osg::ref_ptr<osg::Group> root = new osg::Group;
        // Attach root node to the viewer
        viewer.setSceneData(root.get());
        // Add relevant event handlers to the viewer
        viewer.addEventHandler(new osgViewer::StatsHandler);
        viewer.addEventHandler(new osgViewer::WindowSizeHandler);
        viewer.addEventHandler(new osgViewer::ThreadingHandler);
        viewer.addEventHandler(new osgViewer::HelpHandler);
        // Run the viewer and exit the program when the viewer is closed
        return viewer.run();
    }
• 87. osgART Tutorial 2: Adding Video. Add a video plugin: load, configure, start video capture… Add a video background: create, link to the video, add to the scene graph.
• 88. osgART Tutorial 2: Adding Video. New code to load and configure a video plugin:
    // Preload the video plugin
    int _video_id = osgART::PluginManager::getInstance()->load("osgart_video_artoolkit2");
    // Load a video plugin
    osg::ref_ptr<osgART::Video> video =
        dynamic_cast<osgART::Video*>(osgART::PluginManager::getInstance()->get(_video_id));
    // Check if an instance of the video stream could be created
    if (!video.valid()) {
        // Without video an AR application can not work. Quit if none found.
        osg::notify(osg::FATAL) << "Could not initialize video plugin!" << std::endl;
        exit(-1);
    }
    // Open the video. This will not yet start the video stream but will
    // get information about the format of the video which is essential
    // for the connected tracker.
    video->open();
• 89. osgART Tutorial 2: Adding Video. New code to add a live video background:
    osg::Group* createImageBackground(osg::Image* video) {
        osgART::VideoLayer* _layer = new osgART::VideoLayer();
        _layer->setSize(*video);
        osgART::VideoGeode* _geode = new osgART::VideoGeode(osgART::VideoGeode::USE_TEXTURE_2D, video);
        addTexturedQuad(*_geode, video->s(), video->t());
        _layer->addChild(_geode);
        return _layer;
    }
  In the main function…
    osg::ref_ptr<osg::Group> videoBackground = createImageBackground(video.get());
    videoBackground->getOrCreateStateSet()->setRenderBinDetails(0, "RenderBin");
    root->addChild(videoBackground.get());
    video->start();
• 90. osgART Tutorial 3: Tracking. Add a tracker plugin: load, configure, link to the video. Add a marker to track: load, activate. Tracked node: create, link with the marker via tracking callbacks. Print out the tracking data.
• 91. osgART Tutorial 3: Tracking. Load a tracking plugin and associate it with the video plugin:
    int _tracker_id = osgART::PluginManager::getInstance()->load("osgart_tracker_artoolkit2");
    osg::ref_ptr<osgART::Tracker> tracker =
        dynamic_cast<osgART::Tracker*>(osgART::PluginManager::getInstance()->get(_tracker_id));
    if (!tracker.valid()) {
        // Without tracker an AR application can not work. Quit if none found.
        osg::notify(osg::FATAL) << "Could not initialize tracker plugin!" << std::endl;
        exit(-1);
    }
    // get the tracker calibration object
    osg::ref_ptr<osgART::Calibration> calibration = tracker->getOrCreateCalibration();
    // load a calibration file
    if (!calibration->load("data/camera_para.dat")) {
        // the calibration file was non-existing or couldn't be loaded
        osg::notify(osg::FATAL) << "Non existing or incompatible calibration file" << std::endl;
        exit(-1);
    }
    // set the image source for the tracker
    tracker->setImage(video.get());
    osgART::TrackerCallback::addOrSet(root.get(), tracker.get());
    // create the virtual camera and add it to the scene
    osg::ref_ptr<osg::Camera> cam = calibration->createCamera();
    root->addChild(cam.get());
• 92. osgART Tutorial 3: Tracking. Load a marker and activate it; associate it with a transformation node (via event callbacks); add the transformation node to the virtual camera node:
    osg::ref_ptr<osgART::Marker> marker = tracker->addMarker("single;data/patt.hiro;80;0;0");
    if (!marker.valid()) {
        // Without marker an AR application can not work. Quit if none found.
        osg::notify(osg::FATAL) << "Could not add marker!" << std::endl;
        exit(-1);
    }
    marker->setActive(true);
    osg::ref_ptr<osg::MatrixTransform> arTransform = new osg::MatrixTransform();
    osgART::attachDefaultEventCallbacks(arTransform.get(), marker.get());
    cam->addChild(arTransform.get());
  Add a debug callback to print out information about the tracked marker:
    osgART::addEventCallback(arTransform.get(), new osgART::MarkerDebugCallback(marker.get()));
• 93. osgART Tutorial 3: Tracking. Tracking information is output to the console.
• 94. osgART Tutorial 4: Adding Content. Now put the tracking data to use! Add content to the tracked transform. Basic cube code:
    arTransform->addChild(osgART::testCube());
    arTransform->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");
• 95. osgART Tutorial 5: Adding a 3D Model. OpenSceneGraph can load some 3D formats directly, e.g. Wavefront (.obj), OpenFlight (.flt), 3D Studio (.3ds), COLLADA; others need to be converted. Support for some formats is much better than others (e.g. OpenFlight good, 3ds hit and miss). Recommend the native .osg and .ive formats: .osg – ASCII representation of the scene graph; .ive – binary OSG file, can hold textures. osgExp, an exporter for 3DS Max, is a good choice: http://sourceforge.net/projects/osgmaxexp. Otherwise .3ds files from TurboSquid can work.
• 96. osgART Tutorial 5: Adding a 3D Model. Replace the simple cube with a 3D model. Models are loaded using the osgDB::readNodeFile() function:
    std::string filename = "media/hollow_cube.osg";
    arTransform->addChild(osgDB::readNodeFile(filename));
  Workflow: 3D Studio Max → export to .osg → osgART. Note: scale is important; units are in mm.
• 97. osgART Tutorial 6: Multiple Markers. Repeat the process so far to track more than one marker simultaneously.
• 98. osgART Tutorial 6: Multiple Markers. Load and activate two markers:
    osg::ref_ptr<osgART::Marker> markerA = tracker->addMarker("single;data/patt.hiro;80;0;0");
    markerA->setActive(true);
    osg::ref_ptr<osgART::Marker> markerB = tracker->addMarker("single;data/patt.kanji;80;0;0");
    markerB->setActive(true);
  Create two transformations, attach callbacks, and add models:
    osg::ref_ptr<osg::MatrixTransform> arTransformA = new osg::MatrixTransform();
    osgART::attachDefaultEventCallbacks(arTransformA.get(), markerA.get());
    arTransformA->addChild(osgDB::readNodeFile("media/hitl_logo.osg"));
    arTransformA->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");
    cam->addChild(arTransformA.get());
    osg::ref_ptr<osg::MatrixTransform> arTransformB = new osg::MatrixTransform();
    osgART::attachDefaultEventCallbacks(arTransformB.get(), markerB.get());
    arTransformB->addChild(osgDB::readNodeFile("media/gist_logo.osg"));
    arTransformB->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");
    cam->addChild(arTransformB.get());
• 99. osgART Tutorial 6: Multiple Markers
• 100. Basic osgART Tutorial: Summary. Standard OSG viewer → addition of video → addition of tracking → addition of basic 3D graphics → addition of a 3D model → multiple markers.
• 102. FLARManager: makes building FLARToolkit apps easier. Open source, with free and commercial licenses. Designed to allow exploration of both augmented reality and alternative controllers. Initiated by Eric Socolofsky, developed with contributions from the FLARToolkit community. Decouples FLARToolkit from Papervision3D. Configuration without recompilation, via XML config.
• 103. FLARManager: features. Gives more control over the application environment. Provides multiple input options. Robust multiple-marker management. Supports multiple 3D frameworks. Offers features for optimization. Allows for customization.
• 105. Websites. Software download: http://artoolkit.sourceforge.net/. ARToolKit documentation: http://www.hitl.washington.edu/artoolkit/. ARToolKit forum: http://www.hitlabnz.org/wiki/Forum. ARToolworks Inc: http://www.artoolworks.com/
• 106. ARToolKit Plus: http://studierstube.icg.tu-graz.ac.at/handheld_ar/artoolkitplus.php. osgART: http://www.osgart.org/. FLARToolKit: http://www.libspark.org/wiki/saqoosha/FLARToolKit/. FLARManager: http://words.transmote.com/wp/flarmanager/
• 107. Books. "Interactive Environments with Open-Source Software: 3D Walkthroughs and Augmented Reality for Architects with Blender 2.43, DART 3.0 and ARToolKit 2.72" by Wolfgang Höhl. "A Hitchhiker's Guide to Virtual Reality" by Karen McMenemy and Stuart Ferguson.
• 108. More Information. Mark Billinghurst – mark.billinghurst@hitlabnz.org. Websites – www.hitlabnz.org