Trevor Draeseke's GIS MSc Project, delivering an Augmented Reality viewer that shows the geology of Arthur's Seat, Edinburgh, overlaid on the camera view of a mobile device. Much of the data for the project came from Digimap.
2. Augmented Reality and GIS
Augmented Reality (AR) is a view of the physical world upon which
information, graphics and/or sound are augmented
1994 – A Formal Taxonomy: The Virtuality Continuum
Early 2000s - AR has been used in a number of geovisualisation applications
Visualise the past - archaeological sites
Visualise the future – planned alterations to the landscape
Assess building damage after disasters
Show underground/hidden infrastructure
Promote tourism
Education
3. So…What sets this project apart?
Past AR implementations were ambitious, but often limited by burdensome
hardware
An obvious platform—the smartphone
An obvious programmatic structure—an app
Built with a very general goal…
4. Project ambitions
Build an app, the ‘Arthur’s Seat Augmented Reality Visualiser’ (ArSARV)
The app should follow the lead of traditional GIS
Dynamically overlay geographical information layers
Let the user choose those layers (i.e. using WMS)
Perform user testing of the app in the field
10. The app in action…
https://www.youtube.com/watch?v=QHR3CDpYC8g
11. User testing
Driven by research questions:
How well can geographical information be translated onto the landscape in
Augmented Reality versus with a traditional map?
Which types of geographical information are most usefully visualised in Augmented
Reality?
This was tested by way of a translation exercise and a questionnaire asked of
members of the public in Holyrood Park
12.
13. Future Work
Allow users to interact with their preferred geographical information through
WMS servers
Improved interactivity with the underlying dataset
True object recognition and tracking
The term, in varying forms, has been around for decades, and is most strongly associated with the realm of computer science.
Received a formal taxonomy in 1994 from computer scientists Milgram & Kishino
Has had a variety of applications within the realm of GIS since the early 2000s
Head-mounted displays (HMDs) were expensive, not widely owned, and burdensome to wear and operate
Smartphones are widely owned and equipped with much of the technology required for AR, including:
A camera and screen for displaying both an image of the real world and graphical information.
An interface for user input, both the touchscreen and any buttons.
A range of positional sensors: critically GPS, but also orientation sensors (accelerometer, magnetometer and gyroscope).
Finally, the ability to connect to the internet and therefore access server information.
The other major difference between this project and other AR implementations within GIS is that the intended application was far more general.
So, as stated, the project was mainly built around the construction of an app, specifically an Android app, as that's the type of smartphone I own. I gave the app the incredibly dry and unoriginal name of the 'Arthur's Seat Augmented Reality Visualiser', or ArSARV. Arthur's Seat was accessible to me and made an obvious testing ground for questions of landscape geovisualisation in AR.
And, as stated on my previous slide, the ambition of the project was very general. When I say general, I mean that the app should function in the same open-ended way as a traditional GIS software package.
Namely, that the app should be able to dynamically overlay geographical information layers
And that those layers should be chosen by the user, possibly as widely as any information available in a shapefile or similar format through, for example, WMS servers.
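A WMS request of this kind is just a parameterised URL. As a rough sketch (the server address and layer name here are hypothetical, not ones used by the project), a GetMap request for a transparent PNG layer could be built like this:

```python
from urllib.parse import urlencode

def build_getmap_url(base_url, layer, bbox, width, height):
    """Build a WMS 1.1.1 GetMap request for a transparent PNG overlay.

    bbox is (minx, miny, maxx, maxy) in the requested spatial reference system.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "SRS": "EPSG:27700",  # British National Grid
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
        "TRANSPARENT": "TRUE",  # keep the camera view visible underneath
    }
    return base_url + "?" + urlencode(params)

# Hypothetical server and layer name, for illustration only.
url = build_getmap_url("https://example.org/wms", "surface_geology",
                       (325000, 672000, 327000, 674000), 1024, 1024)
```

Any layer published by any WMS server could then be requested this way, which is what makes the approach as open-ended as a desktop GIS.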
Finally, the project had to have a user testing component, to determine not only flaws in the app’s design itself, but also to test some of the project’s main research questions.
Namely: how well do users translate geographical information onto the landscape in AR versus with a traditional map? And which types of geographical information are most usefully visualised in AR, and why?
I’ll come back to a brief summary of the user testing results at the end, but for now, I’d like to explain how the app functioned programmatically and to show how geographical information acquired through EDINA’s Digimap became a major part of this project.
So, initially, the app is activated (shown in the top middle of the slide in orange). The app performs a series of initialising functions: setting the screen orientation, accessing the phone's live camera feed and, critically, acquiring the user's current GPS location.
Based on that location, a call is made to a webserver, which returns a set of transparent geographical information layers as png files.
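The shape of that location-based request can be sketched as follows. The endpoint name is hypothetical (the project's actual server interface is not documented here), and I'm assuming coordinates in British National Grid metres (EPSG:27700):

```python
def bbox_around(easting, northing, half_size_m=2000):
    """Square bounding box centred on the user's GPS fix (EPSG:27700 metres)."""
    return (easting - half_size_m, northing - half_size_m,
            easting + half_size_m, northing + half_size_m)

def layer_request_url(server, easting, northing):
    """Hypothetical endpoint returning a set of transparent PNG layers
    rendered for the bounding box around the user's position."""
    bbox = bbox_around(easting, northing)
    return f"{server}/layers?bbox=" + ",".join(str(v) for v in bbox)
```

The server would then respond with one PNG per layer, each rendered for the same bounding box so they stack cleanly on the device.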
This is an example of one of these layer sets, rendered for a location within Holyrood Park just next to the Commonwealth Pool, showing:
A horizon line, the importance of which will be explained momentarily
Contour lines
Paths
And Surface Geology.
These layers were rendered in ArcScene from a number of EDINA Digimap sources. Crucially, all information was draped over OS Terrain 5 DEM data, from which the contour lines were also derived. Path information was derived from OS MasterMap data, and the surface geology from British Geological Survey data, also acquired through EDINA's Digimap.
So now the PNG files have been downloaded (shown here in yellow) and are stored locally on the device.
The user is then prompted, through tutorial information, to line up the horizon line with the horizon of Arthur's Seat visible in the camera feed.
Once they have lined up the horizon line, the tracking button is pushed, which starts reading the phone's sensors (which up to this point have been on, but not feeding into any of the program's functions).
With tracking now activated, the phone attempts to keep the geographical information layers in place relative to the camera feed.
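The core of that tracking step is a mapping from sensor readings to pixel offsets. A minimal sketch of the horizontal case, assuming the camera's horizontal field of view is known (the 60° default here is an assumption for illustration, not a measured value from the project):

```python
def layer_offset_px(current_azimuth, anchor_azimuth, screen_width_px, h_fov_deg=60.0):
    """Horizontal pixel shift needed to keep an overlay anchored to the landscape.

    anchor_azimuth is the compass heading recorded when the user aligned the
    horizon line. When the phone then rotates right by d degrees, the overlay
    must move left by the equivalent number of pixels so it appears fixed to
    the real world.
    """
    # Wrap the heading difference into [-180, 180) so crossing north behaves.
    delta = (current_azimuth - anchor_azimuth + 180.0) % 360.0 - 180.0
    px_per_degree = screen_width_px / h_fov_deg
    return -delta * px_per_degree
```

The same idea applies vertically with the pitch reading and the vertical field of view.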
Finally, the last piece of user input is the layer controls which, like a traditional GIS, overlay layers dynamically as the user requests.
As discussed earlier, the project aimed to answer two main research questions through the construction and testing of the ArSARV app.
How well geographical information can be translated onto the landscape in Augmented Reality vs. with a traditional map.
And which types of geographical information are most usefully visualised in Augmented Reality?
The results showed that some users could fairly accurately translate some types of geographical information onto the landscape, usually depending on their background. For example, an experienced hill walker was quite good at translating contour lines onto the landscape, while most users really struggled. Similarly, a trained geologist made fairly accurate translations of information from the BGS geology map onto the landscape.
But overwhelmingly, most struggled with translation of information onto the landscape.
Furthermore, the type of geographical information they wished to see visualised varied widely, supporting the idea that a flexible and general app is required for visualisation of geographical information in augmented reality.
So, as stated before, this project was very much a proof of concept, and there is ample opportunity for improvement.
The first would be to allow user-driven datasets within the app, for example by allowing the app to pull in information through a WMS server link.
The second would be to improve interactivity with the underlying dataset, allowing the user to tap somewhere on the screen to pull up additional information about the underlying data.
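One way to support that would be to invert the layer's georeferencing: convert the tapped pixel back into map coordinates, then look up the dataset (or issue a WMS GetFeatureInfo request) at that point. A minimal sketch, assuming the PNG covers a known bounding box:

```python
def tap_to_map(x_px, y_px, bbox, width_px, height_px):
    """Convert a tapped screen pixel into map coordinates within the layer's bbox."""
    minx, miny, maxx, maxy = bbox
    easting = minx + (x_px / width_px) * (maxx - minx)
    northing = maxy - (y_px / height_px) * (maxy - miny)  # pixel y grows downward
    return easting, northing
```

The resulting coordinate could then be used to query the attribute table of the underlying geology or path dataset.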
Finally, object tracking could be improved by making greater use of work from the field of computer vision.
So that’s the Arthur’s Seat Augmented Reality visualiser.
While it was an exploration of questions of geovisualisation in augmented reality, and drew heavily on computer science and programming, the project would not have been possible without the easy and flexible access to the underlying geographical information layers afforded by EDINA's Digimap service.
Thank you!