Recently, Unity Technologies released a tech demo, “Book of the Dead”, for their latest engine update, Unity 2018. Needless to say, this demo was beyond impressive and an exciting tease of what lies in store for video games, particularly in terms of graphical fidelity.
Brimming with questions, I had the chance to dive a bit deeper into what exactly is going on under the hood of Unity’s latest engine. Below is my conversation with some of Unity Technologies’ most brilliant minds: Silvia Rasheva – Producer, Torbjorn Laedre – Tech Lead, and Arisa Scott – Graphics Product Manager.
n3rdabl3: The [Book of the Dead] tech demo is beyond impressive, there is no denying that. However, what kind of file sizes did the demo use, and how much can you get away with in this revamped engine?
Torbjorn Laedre: “The asset sizes used in the demo are really quite industry standard: there’s a mix of differently sized texture sets used depending on the type of asset, many of the assets rely on dynamic GPU tessellation to enhance relatively low-density geometries, and other asset types like audio and animation use standard compression schemes.
“It’s true that when photoscanning objects, you can quickly end up with a large amount of unique data in your project, and you have to be a bit disciplined so it doesn’t run away into impractical amounts. We always strive to increase the mileage of our assets by using tiling texture data whenever possible or creating asset variation through procedural shading.
“One of the cool things about the Unity engine is that your project is more or less WYSIWYG (What You See is What You Get) during your everyday work. If you can work smoothly while building your project, it’s always going to run at least that well in an optimized build – and usually much much faster.”
n3: I’m assuming most of the maps were at least 4K, and we know that textures of that size can really make the file sizes huge. How did you work around this issue in Unity to allow streamlined performance, and even improve the engine’s performance?
TL: “Although there are some 4K textures used in the demo, the majority of the textures are actually 2K or even lower in some cases. More than any technical wizardry, it primarily comes down to managing your data in a sensible way. For texture data, this often means spending texels where they can actually be seen; e.g. on assets that can get really close to the camera and thus fill a large portion of the user’s screen, or on assets that are spatially very large and would quickly get blurry without high-resolution texture data. Some objects also mix in a bit of extra detail from tiling textures to ensure that the quality holds up even in extreme closeups.”
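Laedre’s point about spending texels where they can actually be seen can be framed as a quick back-of-the-envelope calculation: how big does a texture need to be before one texel maps to roughly one screen pixel at the asset’s closest viewing distance? The sketch below is purely illustrative, not Unity code; the function name, default screen height, and field of view are all assumptions on my part.

```python
import math

def required_texture_resolution(asset_size_m, closest_distance_m,
                                screen_height_px=2160, vertical_fov_deg=60.0):
    """Estimate the texture resolution at which one texel covers roughly
    one screen pixel when the asset is viewed from its closest distance.

    asset_size_m: world-space size covered by the texture (metres)
    closest_distance_m: nearest distance the camera can get to the asset
    """
    # Height of the view frustum (in metres) at the given distance.
    frustum_height = 2.0 * closest_distance_m * math.tan(
        math.radians(vertical_fov_deg) / 2.0)
    # Pixels the asset covers vertically on screen at that distance.
    pixels_covered = screen_height_px * (asset_size_m / frustum_height)
    # Round up to the next power of two, the usual texture-size convention.
    return 2 ** math.ceil(math.log2(max(pixels_covered, 1.0)))

# A 1 m prop the camera can approach to 0.5 m needs far more texels
# than the same-sized prop that never gets closer than 10 m.
hero = required_texture_resolution(1.0, 0.5)        # close-up hero asset
background = required_texture_resolution(1.0, 10.0)  # distant background asset
```

Run with these assumed numbers, the close-up asset justifies a 4K map while the distant one only warrants 256 pixels, which matches the spirit of the answer above: resolution follows visibility, not uniform quality settings.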
n3: Will “Book of the Dead” be developed further and/or are there plans to release the demo on the PlayStation 4 console?
Silvia Rasheva: “We will continue working on Book of the Dead as an internal production which informs and supports the development of new features in Unity. And yes, the target is an interactive experience running on console, although at this point I can’t be specific about where and when we will release it.
“Right now we’re focusing on preparing a project we can release to users for free, as we’ve seen in the past that the demo projects have been very appreciated by our community, which uses them for everything from research and development to demonstrating their own features, learning about our process, or just lifting some assets or code for use in their own projects. We’re happy to keep supporting the community by sharing our own work with them in this way, so we’ll do that first, before we resume the development of the main Book of the Dead.
“I’d like to add that in general we keep the scope of our productions relatively low, because as a small team we can’t take on the amount of work that a full-fledged game would require, especially when it comes to gameplay and game design. So we’re not yet looking at making an entire game, but a short interactive piece which could potentially turn into a pilot or a first part of a game, or a first episode if the game was episodic. What’s important for us with this project is to achieve a high level of visual quality, while staying relevant to techniques and workflows typical for game production.
“We’ll make sure to keep the community informed as we progress through the Book of the Dead production, and will be updating the homepage of the project (http://www.unity3d.com/book-of-the-dead) with new materials as we release them.”
n3: In terms of production time, how does Unity 2018 stack up against other engines and video game pipelines? Have you seen shorter time frames when dealing with deliverable results?
SR: “I think this comparison is best left to the users, because every project and team is unique and has its own set of challenges and its own production process.
“As an internal team working within Unity, we always base our productions on unfinished technology, using features from their very early pre-alpha stage, and following along with them in the course of their development, until they improve and mature enough that they can be shipped publicly.
“As we’re the first user of these features, we take the first hit – it’s an aspect of our team that is unique, because at such an early stage, the tech is not necessarily there yet for what we’re trying to achieve. When we start a new project, we set ourselves goals that are considered either extremely difficult, or impossible. Once we’re done with a demo and the relevant features ship with the next installments of Unity, every user should have a better experience than we had: shorter and more efficient production time, with an engine which can do more.”
n3: How adaptable is the Scriptable Render Pipeline for developers? It was stated that it’s meant for “ease of use” and to allow developers to skip the “make your own engine” step. What kind of customization options does Unity 2018 give developers to mould Unity into an engine they can use to make a game in the style they want?
Arisa Scott: “With 2018.1 we’re introducing the Scriptable Render Pipeline, which is a new way of rendering in Unity. We’re going from an all-purpose, monolithic rendering model, where things are accessible only via Unity C++ source code, to one where most things are exposed via C# scripts and APIs. Users can write their own pipelines in C# or take the ones we made and customize them for their needs. We’re releasing two initial pipelines (High-Definition and Lightweight) with 2018.1 to start from and make your own modifications. Current work in progress on the Scriptable Render Pipeline is accessible here.
“Most users might consider either adding features to the Lightweight pipeline or removing features from the High-Definition pipeline. For example, a user might find Lightweight mostly a good fit for their game but consider adding a velocity pass to support temporal effects such as motion blur and TAA. On the other hand, a user might want to achieve the visual fidelity of the High-Definition pipeline but might not require some of its features for their game. In this case they might remove passes, such as distortion or decals, or even remove a whole renderer, such as its deferred renderer, to use only the forward component. This kind of flexibility means users only pay the computation and rendering cost of what their game requires, as they can tailor-make their render pipeline for their needs. More info on our blog.”
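Scott’s “only pay for what you use” point can be illustrated with a toy model: a render pipeline as an ordered, editable list of passes. To be clear, this is a conceptual Python sketch, not Unity’s actual C# SRP API; every class name, pass name, and cost figure here is hypothetical.

```python
class RenderPass:
    """A named pass with a nominal cost, standing in for real GPU work."""
    def __init__(self, name, cost_ms):
        self.name = name
        self.cost_ms = cost_ms

    def execute(self, frame):
        frame.append(self.name)  # record that this pass ran

class RenderPipeline:
    """A pipeline is just an ordered, editable list of passes."""
    def __init__(self, passes):
        self.passes = list(passes)

    def remove(self, name):
        # Dropping a pass removes both its work and its cost entirely.
        self.passes = [p for p in self.passes if p.name != name]

    def add_after(self, name, new_pass):
        # Insert a new pass right after an existing one.
        i = next(i for i, p in enumerate(self.passes) if p.name == name)
        self.passes.insert(i + 1, new_pass)

    def render(self):
        frame = []
        for p in self.passes:
            p.execute(frame)
        return frame, sum(p.cost_ms for p in self.passes)

# Start from a "high-definition"-style pipeline and strip unneeded passes...
hd = RenderPipeline([
    RenderPass("depth_prepass", 0.4),
    RenderPass("deferred_gbuffer", 1.2),
    RenderPass("decals", 0.3),
    RenderPass("lighting", 1.0),
    RenderPass("distortion", 0.2),
    RenderPass("post_processing", 0.8),
])
hd.remove("decals")
hd.remove("distortion")

# ...or start "lightweight" and add a velocity pass for TAA/motion blur.
lw = RenderPipeline([
    RenderPass("forward_opaque", 0.9),
    RenderPass("post_processing", 0.8),
])
lw.add_after("forward_opaque", RenderPass("velocity", 0.25))
```

In this toy model, removing the decal and distortion passes makes the frame cheaper by exactly their cost, which is the economic argument the quote makes: you never pay for passes your project doesn’t run.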
n3: In terms of poly count, how much more detailed can static meshes afford to be when running in Unity 2018? Or is it more about the level of detail in the texture maps, and how much more information they can hold and show?
TL: “More than anything, I would say that the secret ingredient to making Book of the Dead look awesome is all about Unity raising the quality in a number of areas and empowering developers to do more with the engine. It’s not so much about increasing the quantity of things that are already pretty high fidelity. Of course, there are some highly skilled content creators working on building the project, but the key isn’t just more of everything.
“The all-new Scriptable Render Pipelines give developers the freedom to customize their rendering in a way that yields optimal results for their specific project; the High-Definition Render Pipeline implementation offers much-improved physically based shading and a host of state-of-the-art shading features out of the box; the Progressive Lightmapper is a massive improvement to lighting workflow and baked lighting quality; and Post Processing v2 once again raises the quality and performance of the post-processing framework.
“Armed with all these new engine improvements, we were able to raise each part of the project to a higher quality level than in the past. So our static meshes didn’t see any significant increase in authored poly count; instead we relied on HDRP’s built-in dynamic GPU tessellation and displacement for that extra bit of detail in the first LOD. Our textures for the most part didn’t see any significant increase in detail either, we just got better mileage out of them from the improved HDRP shaders coupled with some procedural shader blending tricks. The Progressive Lightmapper gave us a much more accurate bake than we’d been able to achieve in the past, and also allowed us to bake out occlusion volumes that greatly increased the sense of depth in the scene. In addition, since both the High-Definition Render Pipeline and Post Processing v2 are fully customizable, we were able to integrate our custom rendering effects as first-class citizens into the render pipe. This meant they could render without any artificially imposed overhead that could be incurred in the past by having to shoehorn an effect into a fixed callback point.”
Basically, the new Unity engine is a powerhouse of improved and new capabilities that allow developers to take an asset and make it look better than ever within the engine.
The new ways Unity 2018 harnesses tessellation and displacement maps allow assets to achieve a new level of detail without drastically increasing the poly count of low-poly models, while also making the most of the texture maps it’s given without increasing file sizes.
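The core idea behind displacement is simple: each (tessellated) vertex is pushed along its surface normal by a height sampled from a map, so flat, low-poly geometry gains real silhouette detail. Here is a minimal CPU-side sketch of that idea; real engines do this in GPU shader stages, and all names and values below are illustrative only.

```python
def displace(vertices, normals, heights, scale=0.1):
    """Offset each vertex along its (unit) normal by a sampled height.

    vertices: list of (x, y, z) positions
    normals:  list of matching unit normals
    heights:  list of height-map samples in [0, 1], one per vertex
    scale:    maximum world-space displacement
    """
    out = []
    for (x, y, z), (nx, ny, nz), h in zip(vertices, normals, heights):
        d = h * scale  # height sample scaled into world units
        out.append((x + nx * d, y + ny * d, z + nz * d))
    return out

# A flat quad facing +z: a white (1.0) height sample pushes its vertex
# out by the full scale, while a black (0.0) sample leaves it untouched.
quad = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
norms = [(0, 0, 1)] * 4
bumped = displace(quad, norms, heights=[1.0, 0.0, 0.5, 0.0], scale=0.2)
```

Because the detail comes from the height map rather than from authored vertices, the source mesh stays low-poly on disk, which is exactly the trade-off described above.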
All of this wouldn’t be possible without the new Scriptable Render Pipeline, which allows users to customize, or completely replace, the code that drives the rendering engine. Basically, everything is going to look REALLY good without sacrificing performance, and THAT is an achievement worth bragging about!