By Thomas Puha and Jan Benes.
Umbra's automatically created spatial database can be used to not only perform occlusion culling but also to solve other problems such as 3D path finding, shadow caster culling and audio propagation.
In this talk, Remedy's engineers will expand on how the Umbra middleware is used in the upcoming Xbox One exclusive Quantum Break from Remedy Entertainment. Quantum Break presents a major challenge for a visibility system, with dynamic content that runs the gamut from indoor spaces to large, dynamically changing outdoor environments. Remedy also utilizes Umbra for shadow rendering optimization and makes use of the Umbra spatial database for audio occlusion. The talk will contain several real-world examples of how Umbra is used in Remedy's engine, its performance, toolchain integration, and the pros and cons of using an out-of-house solution.
Umbra is a visibility middleware used in games such as Destiny, Call of Duty: Advanced Warfare, The Witcher 3: Wild Hunt and Rime.
Using Umbra Spatial Data for Visibility and Audio Propagation in Quantum Break
1. Using Umbra Spatial Data for Visibility and Audio Propagation in Quantum Break
2. Jan Beneš Remedy, Audio Programmer
Thomas Puha Umbra, Man of all hours
13. REMEDY HISTORY
1996 Death Rally
1997 Final Reality (Futuremark)
2001 Max Payne 1
2003 Max Payne 2
2008 Max Payne Movie
2010 Alan Wake
2011 Death Rally Mobile
2012 Alan Wake’s American Nightmare
2014 Agents of Storm
2015 Quantum Break
14. NORTHLIGHT
• In-house
• Based on AW tech
• Open world
• Streaming – 2D grid of ’cells’
• Relies on key middleware
• Xbox One, C++ & D
• Renderer
• DX11, multithreaded, deferred lighting
15. QUANTUM BREAK REQUIREMENTS
• Varying environments (indoors, urban environment, outdoors…)
• Tens of thousands of meshes per level
• Dynamic elements
• Level permutations
• Fast & automatic incremental builds
• Xbox One
16. UMBRA IN QUANTUM BREAK
• Occlusion culling
• Main camera
• Directional lights (shadow culling)
• Audio
• Environments & reverb
• Audio occlusion
• Audio propagation
17. WHY UMBRA?
• Long history with Umbra
• Good experience using ’previous-gen’ Umbra in AW
• New features
• Scene hierarchy/voxelization
• Raycasts
• Bi-dir visibility
• Supports streaming (TomeCollection in Umbra 3)
18. OCCLUSION CULLING
• Main camera and directional light shadows
• Asynchronous queries
• Bi-directional visibility
• Collaborated on a new culling feature with Umbra
• Store local depth cube maps for each object
• ~10% increase in the number of occluded objects
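The asynchronous-query pattern on this slide can be sketched outside any engine: kick the visibility query off on a worker thread early in the frame, then collect the result only when the renderer actually needs it. The sketch below is illustrative only; the query itself is a toy distance test, and none of these names come from the Umbra SDK.

```python
# Sketch of the asynchronous-query pattern: submit the visibility
# query to a worker thread, block only where the result is consumed.
# The query itself is a toy distance test, not real occlusion culling.
from concurrent.futures import ThreadPoolExecutor

def run_visibility_query(camera_pos, objects, max_distance):
    # Toy "visibility": an object passes if it lies within
    # max_distance of the camera. Real occlusion culling would test
    # objects against a depth buffer built from occluders instead.
    visible = []
    for name, pos in objects.items():
        dist = sum((a - b) ** 2 for a, b in zip(camera_pos, pos)) ** 0.5
        if dist <= max_distance:
            visible.append(name)
    return sorted(visible)

_executor = ThreadPoolExecutor(max_workers=1)

def begin_frame(camera_pos, objects, max_distance=50.0):
    # Launch the query; the main thread stays free for animation,
    # physics, audio, etc. while it runs.
    return _executor.submit(run_visibility_query, camera_pos, objects, max_distance)

def gather_visible(query):
    # Block only at the point the renderer consumes the result.
    return query.result()

objects = {"crate": (10.0, 0.0, 0.0), "tower": (200.0, 0.0, 0.0)}
query = begin_frame((0.0, 0.0, 0.0), objects)
print(gather_visible(query))  # ['crate']
```

The same split (submit early, resolve late) is what lets the query run on a job thread without ever stalling the renderer.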
19. STATISTICS (GAMESCOM DEMO LEVEL)
• Clean export times (Core-i7, 8 HW threads)
• ~ 14 min (3 different permutations)
• Iteration times < 1 min
• Runtime stats
• Data size: around 20 MB
• Camera visibility queries 1-2 ms (on a worker thread)
21. SOUND ENVIRONMENTS
• Sphere or box areas with effects (reverb), switching weapon assets
• Quick way to query for local environments in game (clusters)
• Smooth transitions, overlaps & blending
• Raycasts to prevent the effects from bleeding through walls
23. SOUND OBSTRUCTION – RAYCASTS
• Raycasts between sound sources & camera
• Umbra raycasts are fast (operating on voxelized geometry)
• Multiple (3x3) raycasts to distinguish occlusion/obstruction
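The 3x3 idea above can be sketched with toy geometry: cast a small grid of rays toward jittered points around the source and classify by how many are blocked (all blocked means occluded, some blocked means obstructed). Everything here is a made-up stand-in for a fast voxel raycast; the wall-with-a-hole test below is invented purely for illustration.

```python
# Sketch: a grid of raycasts distinguishes full occlusion from
# partial obstruction. ray_blocked() is a hypothetical stand-in for
# a fast raycast against voxelized geometry; here a ray is "blocked"
# if it crosses the wall plane x = 0 outside a circular window hole.
def ray_blocked(origin, target, hole_center, hole_radius):
    ox, oy, oz = origin
    tx, ty, tz = target
    if (ox < 0) == (tx < 0):
        return False  # both endpoints on the same side of the wall
    t = (0 - ox) / (tx - ox)          # parametric hit point on x = 0
    hy = oy + t * (ty - oy)
    hz = oz + t * (tz - oz)
    inside_hole = ((hy - hole_center[0]) ** 2 +
                   (hz - hole_center[1]) ** 2) ** 0.5 <= hole_radius
    return not inside_hole

def classify(camera, source, spread=0.5, hole_center=(0.0, 0.0), hole_radius=1.0):
    # Cast 3x3 rays to points jittered around the source position.
    blocked = 0
    offsets = [-spread, 0.0, spread]
    for dy in offsets:
        for dz in offsets:
            target = (source[0], source[1] + dy, source[2] + dz)
            if ray_blocked(camera, target, hole_center, hole_radius):
                blocked += 1
    if blocked == 9:
        return "occluded"      # fully blocked: candidate for propagation
    if blocked > 0:
        return "obstructed"    # partially blocked: filter/attenuate
    return "clear"
```

The blocked-ray count could equally drive a continuous filter amount instead of three discrete states; the grid is what makes the result stable when the source sits near an edge.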
24. SOUND OBSTRUCTION – STATISTICS
• ~2 ms budget for all 3D audio computations
• Only single raycasts for distant objects
• Usual raycast count (after): 100-200 per frame
• Updating only a part of the active objects every frame
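The two budget tricks on this slide (fewer rays for distant sources, updating only a batch of sources per frame) can be sketched as plain scheduling logic. The threshold and batch size below are made-up illustration values, not numbers from the talk.

```python
# Sketch of staying inside a per-frame raycast budget: distant
# sources get a single ray, near sources get the full 3x3 grid,
# and only a round-robin batch of sources is refreshed each frame.
def rays_for(distance, near_threshold=30.0):
    # Hypothetical threshold; the real cutoff would be tuned.
    return 1 if distance > near_threshold else 9

def pick_batch(num_sources, cursor, batch_size):
    # Round-robin selection: each frame updates the next batch_size
    # sources, wrapping around so every source is refreshed
    # eventually even though none is updated every frame.
    count = min(batch_size, num_sources)
    indices = [(cursor + i) % num_sources for i in range(count)]
    return indices, (cursor + count) % num_sources

distances = [5.0, 12.0, 80.0, 200.0, 25.0, 150.0]
cursor = 0
for frame in range(3):
    batch, cursor = pick_batch(len(distances), cursor, 2)
    rays = sum(rays_for(distances[i]) for i in batch)
    print(f"frame {frame}: update sources {batch} with {rays} rays")
```

Stale results between updates are usually acceptable for audio, since obstruction changes are smoothed on the Wwise side anyway.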
25. UMBRA GATES
• Objects with controlled occlusion behavior (on/off)
• Specific use case - audio occluders
• Marking windows and glass walls that should obstruct only audio
• Generic dynamic gates
• Same system & data used for camera culling & audio obstruction
• Opening doors also opens an Umbra gate
26. SOUND PROPAGATION
• Propagating obstructed sound through doors and portals (gates!)
• Affects positioning (sound coming from the door)
• Effects – more reverb, filtering, lower volume
• Setup in editor – gate with environments on each side
27. SOUND PROPAGATION – RUNTIME
• Find environments of the sound and camera (query cluster)
• Environments know the gates connecting them (cached as a graph)
• Gates determine virtual positions of the sounds
• Keep distance to preserve attenuation
[Diagram: camera, source, gate, and the virtual source placed beyond the gate]
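The "keep distance to preserve attenuation" point can be written out: place the virtual source on the ray from the camera through the gate, pushed back so the camera-to-virtual-source distance equals the full path length source to gate to camera. That way positioning says "the sound comes from the gate" while distance attenuation matches how far the sound actually travelled. A minimal sketch in plain vector math (not engine code):

```python
# Sketch: place a virtual source behind the gate so the direction
# comes from the gate while distance attenuation matches the real
# propagation path (source -> gate -> camera).
import math

def _dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def virtual_source_position(camera, source, gate):
    path_len = _dist(source, gate) + _dist(gate, camera)
    gate_dist = _dist(gate, camera)  # assumed nonzero here
    # Scale the camera->gate direction to the full propagation
    # distance: same bearing as the gate, same attenuation as the
    # whole travelled path.
    scale = path_len / gate_dist
    return tuple(c + (g - c) * scale for c, g in zip(camera, gate))

camera = (0.0, 0.0, 0.0)
gate = (2.0, 0.0, 0.0)
source = (2.0, 3.0, 0.0)   # 3 m behind the gate, fully obstructed
print(virtual_source_position(camera, source, gate))  # (5.0, 0.0, 0.0)
```

Preserving the distance matters because, as the notes later mention, attenuation typically drives several parameters at once (volume, filters, reverb sends).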
Hi,
Thank you for coming to our session. I know there are a lot of sessions out there, you are busy, and that expensive All Access Pass is better served by going to all the lectures around, so thank you for attending. We are going to present a packed 30 minutes for you.
With me on stage today is Mr. Jan Beneš, the audio programmer from Remedy in Finland, and he is going to provide the interesting part of this presentation.
The point of our talk, the takeaway if you will, is that Umbra and the spatial data it generates are useful for many things in games and even outside of games. Specifically, we are going to concentrate on how Remedy is leveraging our tech in Quantum Break, not just for visibility but also for audio propagation.
What we are not going to do is go into detail about the inner workings of Umbra; half an hour is not enough for that.
The core of what our tech does is occlusion culling; that's why you are going to be licensing it. Our tech does that really well. I dare say that.
The footage is captured from the Umbra Debugger, which is our brand new tool that ships with the Umbra SDK. It's something you can use to evaluate Umbra, as you can import your own content into it, but its main task is to help visualize how Umbra works with your data.
Umbra Cloud
The Umbra Cloud performs the geometry processing in the cloud instead of locally, and thus offers near-unlimited computation and storage capabilities. Cloud integration improves usability, content iteration times and productivity.
Here you see the data computation working and you can do that in the cloud now.
Umbra Debugger
The Umbra Debugger is part of the SDK and is a tool for inspecting, visualizing and debugging Umbra’s behavior in the user’s worlds. The Debugger visualizes the effect of various parameters. It’s also a valuable tool for getting started with the evaluation and integration process. The Debugger can be used to iterate the scene geometry export and to find the right set of computation parameters without having to implement a full runtime integration into the user’s actual renderer.
Umbra has extremely accurate information about space; we have spatial connectivity data, which is very useful for many things, and many of our licensees use our tech in various ways...
The game's massive world is split into multiple blocks which are dynamically streamed in the game.
Umbra supports this seamlessly, as Umbra’s data is also streamable.
They also use Umbra’s built-in LOD culling support to automatically determine the visibility of different LOD levels of their objects
As well as accelerate their shadow mapping by using Umbra to cull shadow casters that would cast occluded shadows.
Visibility query is cheap to begin with, you can split into multiple jobs running in parallel.
Destiny uses Umbra’s spatial data to split the world into various logical sections.
They then also use this spatial data for various purposes, like streaming, AI and audio.
Destiny also uses Umbra’s unique “predicted camera” feature, which allows them to launch the visibility query very early on in the frame - even before the camera location is accurately known - and still get correct results.
Today we are going to talk about Quantum Break from Remedy
and how Umbra is used in this upcoming Xbox One exclusive, which is the most expensive Finnish cultural production in our nation's history.
So Jan, take it away...
Making games for almost two decades, building in-house tech for almost two decades.
The current version of the Remedy game engine is an evolution of the Alan Wake tech, which means it’s still a streaming-heavy open world engine.
For the purpose of the talk let’s mention game cells, a 128x128 m grid for streaming static data.
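The game-cell mapping mentioned above comes down to floor division: a world position lands in cell (floor(x/128), floor(z/128)). A minimal sketch, where the 128 m cell size is the only number taken from the talk:

```python
# Sketch: mapping a world-space position to a streaming cell in a
# 2D grid of 128 x 128 m cells. math.floor (not int()) so negative
# coordinates land in the correct cell instead of rounding to zero.
import math

CELL_SIZE = 128.0

def cell_of(x, z, cell_size=CELL_SIZE):
    return (math.floor(x / cell_size), math.floor(z / cell_size))

print(cell_of(130.0, -5.0))   # (1, -1)
```

Middleware data built per cell (Umbra tomes included) can then stream in and out with the cell itself.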
Key middleware for the different subsystems - the middleware data need to support streaming as well.
Renderer: deferred since AW.
The engine needs to cope with big levels, varying environments and a lot of detail.
In many levels you can experience big dynamic changes during the playthrough, so static data need to be built for all the versions (permutations).
We're trying to support fast iterations in all pipelines and make the generation of middleware data as transparent to the users as possible.
Umbra helps a lot with occlusion culling, which is obviously its main purpose.
We also found that Umbra provides a useful toolset for our ‘slightly more advanced’ 3D audio systems.
Scene hierarchy and raycasts are pretty useful for audio, bi-directional visibility improves occlusion culling.
Streaming support is a clear requirement given by the engine architecture.
Mostly ‘standard’ approach for occlusion culling (see also The Witcher 3 and Destiny talks).
Queries run on a job thread so they don’t block the renderer.
Bi-directional visibility is a nice optimization that Umbra implemented for us, it combines dynamic occlusion buffer for the camera with occlusion buffers baked during export for individual objects.
Building from scratch can take a while (but bear in mind that in this example it means building three versions of the level). Usually this will be handled by a build server anyway.
Iterations are more important and quite quick, we only rebuild data for modified geometry (at the scope of game cells, see slide 14).
The big challenge for sound design in Quantum Break is building two completely different and contrasting worlds (normal world – realism meets art, players should feel the locations; stutter world – out of time, surreal, broken).
On the engine side, we’re exposing as much state information as possible to Wwise to give sound designers control and independence.
Using runtime effects helps with iterations (live editing) and simplifies pipelines as there’s no need to bake anything to the data. Probably wouldn’t be possible on previous generation hardware.
Grain synth plugin allows us to dynamically play sounds forwards or backwards and freeze them when needed.
Signal analysis plugin sends a set of values to the renderer so we can, for example, emit particles or shake geometry by playing specific sounds.
Sound environments mark areas with different effects, on the Wwise side this is implemented using auxiliary sends – splitting the signal between a main dry ‘route’ and possibly several effect buses.
We’re using Umbra data as scene subdivision, to have a quick access to the effects affecting given position in space. The scope is Umbra clusters, convex areas of ‘reasonable size’.
Also there’s blend time in Wwise to smooth the transitions.
A lot of raycasts, a lot of sound sources. Note that bullet hits and collision sounds are also actual 3D sounds that need to be processed by the system.
Slowly getting to sound propagation, so let’s introduce this feature.
Fun fact: audio occluders were the first use case/implementation we did.
Now we also have the generic gates in place. Most common example for these is doors, which block both audio and visibility as long as they are closed.
Placing and scripting the gates means some amount of manual editing, but it often comes for free, for example via gates in door prefabs.
Sound propagation system kicks in when a sound source is completely obstructed and when there is at least one gate between the source and listener (camera).
The expected result is: muffled sound using effects and filters, hearing it coming through the gate.
A set of parameters is sent to Wwise for each sound source to give information about occlusion, propagation and some specific details.
What's happening under the hood - one more important step worth mentioning: when the environment volumes load, they run a 'query connected region' search from their center point, which marks all reachable clusters in their volume (it won't bleed through walls).
This step also discovers the connecting gates.
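The 'query connected region' step behaves like a flood fill: start from the cluster containing the volume's center, walk cluster-to-cluster connectivity while staying inside the volume, stop at gates, and record the gates you touch as the environment's connections. A toy sketch on a hand-made cluster graph; the data layout here is invented for illustration and is not Umbra's.

```python
# Sketch: flood-filling reachable clusters from an environment's
# center and discovering connecting gates on the way. The cluster
# graph is hand-made; in practice it comes from Umbra's data.
from collections import deque

def query_connected_region(start, portals, in_volume):
    """portals: {cluster: [(neighbor, gate_name_or_None), ...]}.
    Returns (reachable clusters, gates touched). The fill never
    enters clusters outside the volume and never walks through a
    gate, so it cannot bleed through walls or closed doors."""
    reached = {start}
    gates = set()
    queue = deque([start])
    while queue:
        cluster = queue.popleft()
        for neighbor, gate in portals.get(cluster, []):
            if gate is not None:
                gates.add(gate)       # remember the connecting gate
                continue              # but do not cross it
            if neighbor in in_volume and neighbor not in reached:
                reached.add(neighbor)
                queue.append(neighbor)
    return reached, gates

# Two rooms joined by a door gate; room A owns clusters 0-2.
portals = {
    0: [(1, None)],
    1: [(0, None), (2, None), (3, "door_gate")],
    2: [(1, None)],
    3: [(1, "door_gate")],
}
reached, gates = query_connected_region(0, portals, in_volume={0, 1, 2})
print(sorted(reached), sorted(gates))  # [0, 1, 2] ['door_gate']
```

Caching the result as an environment-to-environment graph keyed by those gates is what makes the runtime lookup on the previous slide cheap.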
Virtual positions and attenuations – keeping distance is quite important as we usually drive quite a few parameters based on the attenuation (volume, low and high pass filters, reverb sends, …).
The following video shows the occlusion raycasts and a few different cases of the sound propagation (with commentary).