Rahul Budhiraja has extensive experience in augmented reality research and development. He is currently the technical architect for Project Beyond at Samsung Research America, which is developing the world's first true 3D 360 degree omniview camera. He has held research positions at MIT Media Lab, University of Canterbury, National University of Singapore, and Indian Institute of Information Technology. He co-founded Tesseract Imaging, a startup focused on mobile imaging applications.
Contact Information

3200 Zanker Road, San Jose, CA 95134
Mobile: +1-408-901-9468
www: rahulbudhiraja.com
Professional Experience

Research Engineer, February 2014 to present
Think Tank Team
Samsung Research America
- Technical architect, researcher and prototyper for Project Beyond: World’s first
true 3D 360 omniview camera
Research Intern, October 2013 to February 2014
Think Tank Team
Samsung Research America
- Research and prototyping of novel wearable and mobile imaging applications
for Samsung devices.
Chief Developer, Director of User Experience, June 2013 to January 2014
Tesseract Imaging
- Co-founded a mobile imaging spinoff company based on my research at MIT.
Designed the user interfaces and prototyped three mobile apps.
Visiting Researcher, November 2012 to June 2013
Fluid Interfaces Group
MIT Media Lab
- Vehicles 2.0: Bringing life to Braitenberg’s thought experiments through Augmented Reality
- Nostalgia Room: Immersive emotional experiences using information on social
networks.
Research Assistant, February 2012 to November 2012
Human Interface Technology Lab NZ
University of Canterbury
- Exploring interaction techniques for outdoor Augmented Reality applications (in
collaboration with BlackBay)
Research Intern, September 2011 to February 2012
Human Interface Technology Lab NZ
University of Canterbury
- mARtSuB: A Mobile Augmented Reality tracking system using BRISK features
Research Intern, February 2010 to February 2011
Keio-NUS CUTE Center, Interactive and Digital Media Institute
National University of Singapore
- Augmented Reality for Military Applications (in collaboration with Ministry
of Defence, Singapore)
Research Assistant, December 2009 to February 2010
Signal, Image and Language Processing Lab
Indian Institute of Information Technology
- P-SCAR: A Presentation System for Classrooms using Augmented Reality
Product Demonstrations Designer, May 2008 to July 2008
Tata Consultancy Services
- Developed product demonstration videos for the IP Multimedia Subsystems (IMS)
department that were shown at international telecom expos.
Education

Human Interface Technology Lab, Christchurch, New Zealand
MHIT, Human Interface Technology, August 2013
- Thesis Topic: Interaction Techniques for Outdoor Augmented Reality
Applications
- BlackBay-HITLabNZ Fellowship
Advisor: Prof. Mark Billinghurst
Area of Study: Augmented Reality, Computer Vision, Human Computer Interaction
Indian Institute of Information Technology, Allahabad, India
B.Tech, Information Technology, July 2010
- Thesis Topic: Tracking the Tracker: Investigating Tracking Methods in Augmented Reality from Fiducial Markers to Markerless Tracking Techniques
Advisor: Prof. Shekhar Verma
Area of Study: Augmented Reality, Computer Vision, Computer Graphics
Research Interests

Augmented Reality, Computer Vision, Virtual Reality, Human Computer Interaction, User Interface Design, Rapid Prototyping
Patents

“Construction of Camera Systems for Capture of 360 Degree Stereoscopic 3D Video”. U.S. Patent Application 62/053,726. Filed on September 22, 2014. Patent Pending.
“Interaction Techniques for 360 Degree Stereoscopic 3D Video”. U.S. Patent
Application 62/053,729. Filed on September 22, 2014. Patent Pending.
“Reconstruction Techniques for 360 Degree Stereoscopic 3D Video Presentation”. U.S. Patent Application 62/053,737. Filed on September 22, 2014. Patent Pending.
“Stitching Techniques for Real-Time 360 Degree Stereoscopic 3D Video Capture”. U.S. Patent Application 62/053,743. Filed on September 22, 2014. Patent Pending.
“Techniques for Real-Time Network Transmission of High-Resolution 360
Degree Stereoscopic 3D Video”. U.S. Patent Application 62/053,750. Filed on
September 22, 2014. Patent Pending.
“System for Display of Visual Information”. New Zealand Patent Application
702966. Filed on December 12, 2014. Patent Pending.
Selected Press

PCMag. “Eyes On With Samsung’s Project Beyond”. November 2014
The Verge. “Samsung announces Project Beyond, a 360-degree 3D camera that creates virtual reality worlds”. November 2014
TechCrunch. “Samsung Announces ’Project Beyond’, A Camera For Capturing Footage
For Virtual Reality”
DailyMail. “The 360 degree camera that can put you anywhere on Earth: Samsung reveals revolutionary virtual reality kit”. November 2014
Engadget. “Samsung unveils Project Beyond, a 3D-capturing camera for Gear VR”.
November 2014
FirstPost. “Samsung Project Beyond: A true 3D camera with 360 degree field of
view”. November 2014
Gizmodo. “Samsung Project Beyond: A 360 Camera For Streaming Virtual Reality”. November 2014
PCWorld. “Virtual reality goes 3D with Samsung’s Project Beyond camera”. November 2014
RoadToVR. “First Impressions of Project Beyond, Samsung’s 360 3D Camera for VR”. November 2014
SamsungMobile. “Project Beyond 3D camera captures 360-degree footage for Gear
VR”. November 2014
SlashGear. “Samsung’s Project Beyond camera explained in a video”. November 2014
Yahoo!. “Samsung Announces ‘Project Beyond’, A Camera For Capturing Footage
For Virtual Reality”. November 2014
TechCrunch. “MIT Startup Tesseract Aims To Commercialize Mobile Digital Photo
Refocusing And 3D Effects”. February 2014
PetaPixel. “MIT Project would like to bring Light Field Photography to every Smartphone”. June 2014
Visual Arts Daily. “MIT Startup Tesseract aims to Commercialize Mobile Digital Photo Refocusing and 3D Effects”. February 2014
Reviewed.com Cameras. “This MIT Project could change Photography forever”.
May 2014
Postal Technology International. “Blackbay and HIT Lab NZ employ augmented
reality for courier market”. May 2014
UC News. “Augmented Reality to help get the mail through”. July 2012
Kinect Hacks. “Kinect gesture-based navigation for Photosynth”. March 2011
Conference Publications
1. Budhiraja, Rahul, Lee, Gun A., Billinghurst, Mark. Using a HHD with a HMD for mobile Augmented Reality interaction. IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 1-6, 1-4 Oct. 2013.
2. Budhiraja, Rahul, Lee, Gun A., Billinghurst, Mark. Interaction techniques for HMD-HHD hybrid AR systems. IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 243-244, 1-4 Oct. 2013.
3. Budhiraja, Rahul, Agarwal, Harshit, Maes, Patricia. Vehicles 2.0: Bringing life to Braitenberg’s thought experiments. [Accepted Demo] IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 1-4 Oct. 2013.
4. Rahul Budhiraja, Shekhar Verma and Arunanshu Pandey. Designing interactive presentation systems for classrooms. In Proceedings of the 28th Annual International Conference on Design of Communication (ACM SIGDOC 2010), São Paulo, Brazil, September 26-29, 2010, pp. 259-260.
5. Budhiraja, R., Verma, S., Pandey, A. P-SCAR: A Presentation System for Classrooms using Augmented Reality. In Proceedings of NICOGRAPH International, May 2010, Singapore.
6. Jia Wang, Rahul Budhiraja, Owen Leach, Rory Clifford, Daiki Matsuda. Escape from Meadwyn 4: A cross-platform environment for collaborative navigation tasks. IEEE 3DUI 2012, pp. 179-180, Costa Mesa, CA.
7. Command Centre: An Authoring Tool to Supervise Augmented Reality Sessions [Additional Author]. IEEE Virtual Reality 2012, Orange County, CA.
Honors and Awards

- Spot Award 2013, Samsung Research America
- BlackBay-HITLabNZ Fellowship, a Master’s thesis scholarship amounting to $40,000
- Runner Up, IEEE 3DUI Contest 2012
- Runner-Up (Phase I), Topcoder-Alcatel Lucent 100 Apps in 100 days
- Sixth Position, Gulf Chemistry Olympiad 2005
Software
Copyrights
RoboCAM 1.0: A multi-client video conferencing tool (SW-4753/2011)
Selected Projects

Project Beyond: World’s first true 3D 360 omniview camera
Building on prior research in Stereoscopic Imaging and Computer Vision, Project
Beyond captures high-resolution 3D 360 video that can be stored offline or streamed
in real time to provide highly immersive viewing experiences for users with VR
headsets. Project Beyond uses patent-pending stereoscopic interleaved capture and
3D-aware stitching technology to capture the scene just as the human eye does, but
in an extremely compact form factor. Announced in November 2014 at the Samsung
Developer Conference, our capture system, sample footage and immersive viewing
experience were demonstrated at the venue and covered extensively by online media.
In addition to bringing high-resolution 3D content to the consumer VR industry
through Samsung’s Gear VR, Project Beyond has also drawn strong interest from
Hollywood studios exploring how to create next-generation 360 3D cinematic viewing
experiences.
My responsibilities included both research (3D capture solutions, Computer Vision
and Graphics) and development (technical architect of our reconstruction system).
[March’14 to present]
Interaction Techniques for Outdoor Augmented Reality Applications
For my Master’s thesis, I explored how mobile Augmented Reality interfaces could
be used to improve courier delivery services. Under this project, we prototyped a
system that used phones and head-mounted displays together to create rich
interactions spanning the two platforms. Our system was shown at the PostExpo 2012
conference in Brussels, Belgium, and was demonstrated and published at the IEEE
International Symposium on Mixed and Augmented Reality, the leading AR research
conference in the world. A patent covering aspects of the user interface and
interactions has also been filed, and I am listed as a primary inventor.
Advisor: Prof. Mark Billinghurst, Mr. Ben Reid [Jun’12 to April’13]
Vehicles 2.0: Bringing Life to Braitenberg’s Thought Experiments
Vehicles 2.0 combines AR and digital fabrication techniques to create simple,
interactive learning experiences for difficult concepts like Artificial Intelligence.
Our research demonstrated how digital fabrication can make AR experiences more
effective, and how AR can be incorporated into a highly engaging learning experience
in which readers not only use elements within the book but also explore their
surroundings with the knowledge gained from it. We explored a concept in Artificial
Intelligence called Braitenberg vehicles, which explains how complex systems can be
broken down into the simple individual behaviors of entities within the system. The
concept was explained through a customized interactive book with wooden pages, a
projector smartphone embedded inside the book, and other objects such as crayons and
skittles that provided a holistic learning experience.
Advisor: Prof. Patricia Maes [March’13 to July’13]
An AR System for Training of Emergency Providers in a Large-Scale Environment
This AR system supports the training of emergency providers in a large-scale
environment. It consists of a wearable unit for each provider and a command center
that functions both as an AR authoring system and as a real-time tool through which
training commanders provide dynamic AR directional guidance. The pose of the
wearable unit is obtained from a combination of modalities: a set of sensors
(GPS/DRM/Compass) and a vision-based tracking system that uses natural features.
The sensors provide localization for the augmented guidance arrows, while the vision
system provides localization for static augmentations. Our novel natural-feature
vision system handles a large number of small 3D feature maps built by
Bundler/PhotoSynth, as opposed to traditional SfM- and SLAM-based approaches, and
introduces a new method to move between pose determination and pose tracking. The
overall system is useful for training military, firefighting, and disaster
management personnel.
This research was sponsored by the Ministry of Defence, Singapore, and was a
collaborative effort between four universities around the world.
Advisor: Prof. Gerhard Roth, Prof. Adrian David Cheok [Feb’10 to Feb’11]
Further details of my projects can be found at rahulbudhiraja.com.
Freelance Activities

Augmented Reality Related
• Developed an Augmented Reality application using FLARToolkit (Advertising and
Multimedia Department at the University of Kairouan, Tunisia)
• Built a framework to dynamically link Augmented Reality markers with COLLADA
models using a SQL backend (North Sumatra University, Indonesia)
• Collaborated with a Uruguayan video producer and web artist to create a special
Flash application that allows artists to showcase their video portfolio on AR markers.
• Developed a custom AR framework that allowed users to rapidly prototype web-
based AR applications. (Universidade Potiguar, Brazil)
• Created a FLARToolkit application for Smash Magazine, U.K., to demonstrate the
use of Augmented Reality in giving interactive presentations.
Visual FX
Edited a promotional video for Aphelion (IV), a track from The Palatin Project,
working with an electronic music composer to turn raw footage of a temperate
rainforest into the video using visual FX software.
Technical Skills

Languages: C, C++, Java, SQL, HTML, VB, ASP, ActionScript, XML, Matlab,
OpenGL, LaTeX, Bash Script, PHP, Neon SIMD
Software Experience
IDEs: Adobe Flash, Oracle 9i, Visual Studio, Eclipse, GNU Emacs
3D Rendering: Poser Pro 7, Blender
Video Editing: Final Cut Pro, Cyberlink Power Director, Adobe After Effects
Audio Editing: Adobe Soundbooth, Adobe Audition
Image Editing: Adobe Photoshop, Corel Paintshop Pro
Game Engines: Unity
Positions of Responsibility

- Associate Member: IEEE
- Alumni Member: ACM IIITA Chapter and North India SIGCHI Chapter
- Lead Designer, Effervescence Promotional Video Team (Jan 08-May 08).
- Member: Organizing Committee of the 2nd Science Conclave 2009, IIITA
- Captain: College (2007-2009) & School Tennis Team (2002-2006).
References

Dr. Pattie Maes (e-mail: pattie@media.mit.edu)
• Professor
Massachusetts Institute of Technology
Dr. Maes was my project supervisor while I was a Visiting Researcher at the MIT
Media Lab.
Dr. Mark Billinghurst (e-mail: mark.billinghurst@hitlabnz.org)
• Professor
University of Canterbury
Dr. Billinghurst was my Master’s thesis supervisor for the MobileAR project and
the Director of the HITLabNZ.
Dr. Gerhard Roth (e-mail: gerhardroth@rogers.com)
• Adjunct Research Professor
School of Computer Science, Carleton University
Dr. Roth is a Co-PI who designed and supervised my work during the development
of our Tracking and Rendering System, and gave me valuable education and advice on
Augmented Reality and Computer Vision.
Dr. Adrian Cheok (e-mail: adriancheok@mixedrealitylab.org)
• Professor
Graduate School of Media Design, Keio University
• Associate Professor,
Interactive and Digital Media Institute, National University of Singapore
Dr. Cheok was the PI on our project with the Ministry of Defence and is the
Director of the CUTE Center.
Mr. Pranav Mistry (e-mail: p.mistry@samsung.com)
• Vice President of Research
Samsung Electronics
Mr. Mistry supervised my work on Project Beyond and is the lab director of the
Samsung Think Tank Team.
Dr. Shekhar Verma (e-mail: sverma@iiita.ac.in)
• Associate Professor, Signal, Image and Language Processing Lab
Indian Institute of Information Technology
Dr. Verma was my bachelor’s thesis advisor and supervised my work at the SILP Lab.
Dr. Hideaki Nii (e-mail: nii@mixedrealitylab.org)
• Research fellow, Keio-NUS CUTE Center
Interactive and Digital Media Institute, National University of Singapore
Dr. Nii is a co-PI for the MINDEF project and supervised my work in developing
an application for the CBRE group of the Singapore Military.