New Tech Analysis of BOTW graphics engine (PBR, Volumetric Lighting, and more)
Since there seemed to be a moderate level of interest in my analysis of Super Mario Odyssey's graphics engine here, I figured that you guys might also appreciate my analysis of Breath of the Wild's graphics engine (which is by far the more impressive of the two, imo).
I've actually wanted to do a proper tech analysis of BOTW's engine for quite some time, but never really got around to it. However, with the new video capture feature on the Switch, I thought it would be the perfect opportunity to revisit the title and share my findings through videos that I've uploaded to Twitter.
I'll start off with a summary of my findings, but I'll also do a breakdown of each technical feature later in this post in order to keep things accessible. Whenever possible, I'm going to try to avoid redundancies. For example, if someone else like Digital Foundry already covered a feature of the engine, I'm not going to bother covering it here. The purpose of this post (as with my SMO post), is to bring more exposure to the technical accomplishments in games where no one else bothered to even investigate.
Anyway, here's a summary of the engine's features:
- Global Illumination (more specifically, Radiosity)
- Local Reflections (calculated along Fresnel)
- Physically-based Rendering
- Emissive materials/area lights
- Screen Space Ambient Occlusion
- Dynamic Wind Simulation system
- Real-time cloud formation (influenced by wind)
- Rayleigh scattering/Mie Scattering
- Full Volumetric Lighting
- Bokeh DOF and approx. of Circle of Confusion
- Sky Occlusion and Dynamic Shadow Volumes
- Aperture Based Lens Flares
- Sub-surface Scattering
- Dynamically Localized Lightning Illumination
- Per-Pixel Sky Irradiance
- Fog inscatter
- Particle Lights
- Puddle formation and evaporation
So, next I'll provide a breakdown of each feature, along with a video as an example of the feature.
Radiosity
First off, I just want to say that all real time global illumination solutions are faked in one way or another, with varying degrees of accuracy. So anyone trying to dismiss the global illumination solution in BOTW simply because it doesn't use path tracing or something similar should really think about what they're saying. The important part to take away from this is that it is being rendered in real time; it's not just lighting that has been baked into the textures, which is pretty impressive for an open world game (especially on Wii U).
Now, what exactly is Radiosity? Well, in 3D graphics rendering, it is a global illumination approximation of light bouncing from different surfaces and transferring color information from one surface to another in the process. The more accurate the Radiosity, the more light bounces need to be calculated in order to transfer the proper amount of color.
In Breath of the Wild, the engine uses light probes throughout the environment to collect color information about the different surfaces located near each probe. There is no simulation of light bounces, just some approximations of what general colors should be coming from a given area. The exact algorithm BOTW uses to calculate this information is unclear, but my best guess is spherical harmonics or something similar, based on the color averages and localization of the Radiosity. Unlike Super Mario Odyssey, Radiosity in Breath of the Wild is pretty granular instead of being binary. The lighting information that's being calculated from the light probes appears to be streamed and tied to the LOD system at the pipeline level, which makes it pretty efficient.
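The exact probe format is anyone's guess, but here's a minimal Python sketch of how an L0+L1 spherical-harmonic probe could store and return directional color the way I'm describing. The coefficient layout and the sample values are my own invention, not pulled from the game:

```python
def sh_evaluate(coeffs, direction):
    """Evaluate a 4-coefficient (L0 + L1) spherical-harmonic color probe
    in a given unit direction. coeffs is a list of four (r, g, b) tuples:
    one constant band and three linear bands (y, z, x ordering)."""
    x, y, z = direction
    # Standard real SH basis constants for bands 0 and 1.
    basis = (0.282095, 0.488603 * y, 0.488603 * z, 0.488603 * x)
    return tuple(sum(c[ch] * b for c, b in zip(coeffs, basis))
                 for ch in range(3))

# A hypothetical probe: mostly neutral ambient, with extra green
# arriving from below (as if bounced off grass).
probe = [(0.8, 0.8, 0.8),    # L0: average color from all directions
         (0.0, -0.3, 0.0),   # L1 y-band: green biased toward -y
         (0.0, 0.0, 0.0),    # L1 z-band
         (0.0, 0.0, 0.0)]    # L1 x-band

down = sh_evaluate(probe, (0.0, -1.0, 0.0))  # surface facing the grass
up = sh_evaluate(probe, (0.0, 1.0, 0.0))     # surface facing the sky
```

A cliff face leaning toward the grass would sample the probe with a downward-facing normal and pick up the green tint, which matches what the video shows.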
Here's a video of it in action:
* Observational Tip: Notice how the rock cliffs receive a green hue from the grass as the camera moves closer to that region.
Real-time Local Reflections
It appears that surface reflections of immediate surroundings use a two-pronged approach in BOTW. Anything emitting light, including area lights that glow on objects, will reflect on almost any material at pretty much any angle if they're close enough. Initially, I did not account for this (I assumed that the reflections I saw were of the objects themselves instead of the light sources coming from the objects), which is why I originally dismissed Screen Space Reflections as the rendering solution for reflections. However, when taking this into consideration, it appears that any reflected objects not directly illuminating a surface with a light source will be rendered as a screen space reflection. It's hard to tell at first because you can only view these kinds of objects during Fresnel (grazing angles), but I got the game to glitch out a few times in order to prove it. To put it more succinctly, the game uses a combination of Specular Lighting and Screen Space Reflections to produce local reflections, depending on whether the potential reflection provides its own light source or not.
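To make the two-pronged logic concrete, here's a toy Python dispatch of how I imagine the engine decides between the two paths. The function and field names are hypothetical, obviously not Nintendo's actual code:

```python
def reflection_method(reflected_object, on_screen):
    """Sketch of the hybrid dispatch described above: emissive objects
    reflect as specular highlights through the lighting pipeline, while
    non-emissive objects fall back to screen-space reflections, which
    only work while the object's pixels are inside the frame."""
    if reflected_object["emissive"]:
        return "specular"  # area light: reflects even when off-screen
    return "ssr" if on_screen else None  # SSR needs on-screen pixels

lantern = {"name": "blue lantern", "emissive": True}
link = {"name": "Link", "emissive": False}
```

This is why the lantern keeps reflecting with the camera turned away, but Link's reflection vanishes the moment he leaves the frame.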
Anyway, here's a video showing the hybrid system in action. Link and the Cryonis cube are produced through SSR (though the cube's light source is not), while the blue lantern reflections on the wall are produced by specular lighting through the PBR pipeline. If you see that these lights look like mirror images sometimes, it's because area lights can be shaped like objects, but they're still just specular highlights when reflected:
* Observational Tip: Look at Link's reflection compared to the Blue Lantern's reflection. Link must be on screen in order for his reflection to appear, whereas the blue lantern does not need to be on screen in order for its reflection to appear.
Physically-based Rendering
Before anyone asks, no, this does not mean 'physically correct looking materials'. It is simply a methodology applied to a 3D graphics rendering pipeline where all materials (textured surfaces) uniquely influence the way that light behaves when interacting with them. That is what happens in the real world, which is why it's called Physically based rendering (a concept based on real world light physics). Different materials cause light to behave differently, which is why we can visually differentiate between different surfaces in the first place.
Traditionally, rendering pipelines relied on an artist's understanding of how light interacted with different real world materials and would define the look of texture maps based on that understanding. As a result, there was a lot of inconsistency between different textured surfaces and how they compared to their counterparts in the real world (which is understandable, as we can't expect artists to have encyclopedic knowledge of all the properties of all matter in the real world). With PBR, the fundamentals of light physics are part of the pipeline itself, and all textured surfaces are classified as materials with unique properties that cause light to behave accordingly. This allows surfaces to be placed under different lighting conditions and viewed from dynamic camera angles, with the light's interaction with those surfaces adjusting dynamically. Artists do not have to predefine this interaction like they did with the traditional workflow; it happens automatically. Because of the efficiency of PBR, developers feel more inclined to make games where all materials have unique properties that affect light differently.
In Breath of the Wild, PBR is used with a bit of artistic flair, so you might not notice that the engine even relies on such a pipeline since the textures don't necessarily look realistic. However, the BRDFs (Bi-directional Reflectance Distribution Function) used on the materials make it pretty clear that the engine uses PBR. You see, with every dynamic light source, its specular highlights (the parts of a surface where the light source itself shows as a reflection) and the reflectivity/reflectance of those highlights are dynamically generated depending on the angle of incidence (angle of incoming light rays with respect to a surface normal) and index of refraction (how much a material 'bends' light as the rays touch its surface) of whatever material the lights are interacting with. If the game was using a traditional pipeline, the distribution of those specular highlights would not be much different between wood and metal. But in this game, the production of specular highlights are completely dependent on the material that the light is interacting with.
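For the curious, the roughness dependence of specular highlights I'm describing is usually modeled with a microfacet normal distribution function inside the BRDF. I have no idea which one BOTW actually uses, but GGX is the textbook example, so here's a quick Python sketch:

```python
import math

def ggx_ndf(n_dot_h, roughness):
    """GGX/Trowbridge-Reitz normal distribution function: how tightly a
    material's microfacets concentrate a specular highlight. Low
    roughness (polished metal) gives a tight, bright peak; high
    roughness (wood) gives a broad, dim one."""
    a2 = roughness ** 4  # common remapping: alpha = roughness^2
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

metal_peak = ggx_ndf(1.0, 0.1)  # shiny metal, center of the highlight
wood_peak = ggx_ndf(1.0, 0.8)   # rough wood, center of the highlight
```

The same light source produces a highlight orders of magnitude more intense on the smooth material, which is exactly the wood-versus-metal difference visible in the barrel video.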
Another key element that shows that BOTW uses PBR is the Fresnel (pronounced fruh-NELL) reflections of all the materials. First of all, most games using a traditional pipeline don't even bother with Fresnel because at that point you might as well just use PBR. As I explained earlier when discussing local reflections, Fresnel reflections become visible at grazing angles (angles where incoming light is nearly parallel to the surface it's interacting with from the perspective of the observer/camera).
According to the Fresnel reflection coefficient, all materials achieve 100% reflectivity at grazing angles, but the effectiveness of that reflectivity will depend on the roughness of the materials. As a result, programmers differentiate between 'reflectivity' and 'reflectance'. Some materials reflect light in all directions (diffuse materials). Even at 100% reflectivity, 100% of the light may be reflected from the total surface area, but it's not all reflected in the same direction, so the light is spread out uniformly and you don't see any specular reflections (mirror images of the surface's surroundings). Other materials reflect incident light only in the mirror direction of the incoming light (specular materials), so you will only see reflections at the appropriate angle, where close to 90% of the light is reflected. The reflectance (the effectiveness of a material's ability to reflect incident light) of both diffuse and specular materials is not always 100%, even at grazing angles, which is why you don't see perfectly specular reflections at grazing angles on all materials, even in the real world. The clarity of Fresnel reflections will vary with the materials producing them.
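The grazing-angle behavior above is what engines almost universally capture with Schlick's approximation of the Fresnel coefficient. Whether BOTW uses this exact formula is unknown, but it shows the 100%-at-grazing property nicely:

```python
def schlick_fresnel(f0, cos_theta):
    """Schlick's approximation of the Fresnel reflection coefficient.
    f0 is the material's reflectivity at normal incidence; cos_theta is
    the cosine of the angle between the view direction and the surface
    normal. At grazing angles (cos_theta -> 0) every material trends
    toward full reflectivity."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

# Water-like dielectric: only ~2% reflective when viewed head on.
head_on = schlick_fresnel(0.02, 1.0)  # looking straight at the surface
grazing = schlick_fresnel(0.02, 0.0)  # grazing angle
```

Even a barely reflective material hits 100% reflectivity at the grazing angle; roughness then decides how clear that reflection actually looks.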
As evidence of PBR, this dynamic behavior of Fresnel reflections on different materials with the same dynamic light source can be seen here:
* Observational Tip: Notice how the green light source on the wood of the barrel appears to be the same at all angles, while the same green light source appears to change its reflection on the metallic barrel hoops (the metal circles on the barrel) with respect to the camera angle.
Emissive materials and area lights
This one is pretty straightforward. The materials of glowing objects provide unique light sources that light the environment in the same shape as the materials themselves. These are not point light sources that radiate in all directions, or even simple directional light sources that light in one direction. They're basically 'custom shaped' light sources. It's important to mention that only the global (sun/moon/lightning) light sources cast shadows. However, BRDF still applies to all light sources in the game.
Take a look:
* Observational Tip: Look at the shape of the light being cast from the fire sword. It matches the shape of the sword itself, though the intensity of the light will depend on how close the sword is to the surface it's illuminating.
Screen Space Ambient Occlusion
In the real world, there is a certain amount of 'ambient light' that colors the environment after light has bounced around the environment so much that it has become completely diffused. If shadows are the result of objects occluding direct sunlight, then ambient occlusion can be thought of as the result of cracks and crevices in the environment occluding ambient light.
The method used in BOTW is called SSAO (screen space ambient occlusion) as it calculates the AO in screen space and is view dependent. The environment will only receive AO when it is perpendicular with respect to the camera.
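As a rough illustration of the screen-space part, here's a toy Python version of the idea: darken a pixel based on how many of its screen neighbours sit in front of it in the depth buffer. Real SSAO samples a hemisphere with a noise kernel (hence the noise patterns in the video), but the principle is the same:

```python
def ssao(depth, x, y, radius=1, bias=0.05):
    """Toy screen-space AO: compare a pixel's depth against its screen
    neighbours; the more neighbours that sit in front of it (closer to
    the camera), the more ambient light we assume is occluded.
    Returns 1.0 for fully lit, 0.0 for fully occluded."""
    h, w = len(depth), len(depth[0])
    occluded = total = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue
            sx, sy = x + dx, y + dy
            if 0 <= sx < w and 0 <= sy < h:
                total += 1
                if depth[sy][sx] < depth[y][x] - bias:
                    occluded += 1
    return 1.0 - occluded / total

# A crevice pixel (depth 1.0) surrounded by closer geometry (depth 0.5).
crevice = [[0.5, 0.5, 0.5],
           [0.5, 1.0, 0.5],
           [0.5, 0.5, 0.5]]
flat = [[1.0] * 3 for _ in range(3)]
```

The crevice pixel gets fully darkened while the flat wall stays lit, and since everything is computed from the depth buffer alone, the effect is inherently view dependent.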
Here's an example:
* Observational Tip: Look for the dark, shadowy noise patterns in the cracks and crevices of the walls when viewed from head on. This same noise pattern outlines Link's silhouette from this angle as well.
Dynamic Wind Simulation System
So this one surprised me a bit because I was not expecting it to be so robust. Basically, the physics system is tied to a wind simulation system. It's completely dynamic and affects different objects according to their respective weight values. The most prominent objects affected are the blades of grass and the procedurally generated clouds.
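A weight-driven wind response could be as simple as scaling a shared wind vector per object. This is purely my own sketch of the idea, not the game's system:

```python
def wind_displacement(wind_vector, weight, stiffness=1.0):
    """Hypothetical per-object wind response: the same wind vector
    displaces light objects (grass blades) far more than heavy ones,
    scaled by an inverse-weight factor."""
    scale = stiffness / weight
    return tuple(component * scale for component in wind_vector)

gust = (2.0, 0.0)  # shared wind direction/strength for the whole scene
grass = wind_displacement(gust, weight=0.1)
boulder = wind_displacement(gust, weight=500.0)
```

One global wind state driving everything is what makes the grass and the clouds visibly change direction together in the video.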
Here's an example:
* Observational Tip: If you watch closely, you can see here how the directional flow of both the grass and clouds match the direction in which the wind changes.
Real-time cloud formation
This game does not use a traditional skybox in any sense of the word. Clouds are procedurally generated based on parameters set by the engine. They cast real-time shadows. They receive lighting information based on the sun's position in the sky. As far as I can tell, clouds are treated as an actual material in the game. They're not volumetric, so you won't be getting any crepuscular rays or anything like that, but they're not 'skybox' clouds either. Their formation is also influenced by the wind system.
Take a look:
* Observational Tip: Notice how the cloud particles in the sky sporadically gather together.
Rayleigh Scattering/Mie Scattering
In the real world, when light reaches Earth's atmosphere, it is scattered by air molecules, which results in Earth's blue sky, since the shorter wavelengths of blue light are scattered more easily than other colors of light. However, as the sun approaches the horizon, it has to pass through more of the atmosphere, resulting in most of the blue light being scattered away by the time the sunlight reaches the eye of the observer, leaving the longer wavelengths of orange and red light to reach the eye. BOTW approximates this phenomenon mathematically (I actually found this out through a text dump of the game's code earlier this year!). Apparently the algorithm accounts for Mie Scattering as well, which gives fog its appearance in the sky.
Honestly, had I not looked at the code from that text dump, I would have never assumed that this phenomenon was being simulated in the game. It's just so easy to fake. However, after looking at the reflections of the sky in the water, it all made sense. This scattered light is being reflected onto the entire environment in real time. A simple sky box would make that impossible.
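The core of the Rayleigh model is just a 1/wavelength^4 dependence, which is why blue dominates the daytime sky. A two-line Python check of the relative scattering strengths (sample wavelengths are my own choices):

```python
def rayleigh_ratio(lambda_a_nm, lambda_b_nm):
    """Rayleigh scattering strength goes as 1/wavelength^4; returns how
    much more strongly light at wavelength a scatters than light at
    wavelength b. Wavelengths in nanometres."""
    return (lambda_b_nm / lambda_a_nm) ** 4

# ~450 nm blue vs ~650 nm red.
blue_vs_red = rayleigh_ratio(450.0, 650.0)
```

Blue light scatters roughly four times as strongly as red, which is exactly why the sky is blue at noon and why sunsets, with their long atmospheric path, turn red and orange.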
Here's an example. You can see how the granularity of the lighting has an effect on the environment:
* Observational Tip: Notice how the different hues of orange and red in the sky reflect on the environment with the same colors. Although not shown in the video, the light scattering of the sky illuminates the environment and water with other colors as well, depending on how the light is scattered.
* Observational Tip: Note the snow's change in color as the sun sets.
* Observational Tip: There are at least 5 sources of distinct reflections in the water in the beginning of this video. The Shrine (blue), the hills (green), the flag (dark silhouette), the sky (orange) and the sun (pink). The hills and the flag are reflected through SSR, the shrine and the sun are reflected through specular lighting (and they're both specular highlights), and the sky is reflected through specular lighting as well, but it's not a specular highlight. As a rainstorm precipitates, the changes in reflections are completely dynamic. Sky Occlusion from the dark clouds changes the illumination from the Rayleigh scattering in the sky in real time. Eventually, the orange skylight can no longer reach the water's surface, so it fades out, but the sun persists as it hasn't been completely blocked out. However, with so much Mie scattering in the sky, the sun's color has changed from pink to white! Even still, the clouds eventually prove to be too much for the sun, blocking it out completely, leaving the light from the shrine and partial reflections of the hills.
Full Volumetric Lighting
Aside from clouds in the sky, every part of the environment and every object in it has the potential to create light shafts in real time, given the right lighting conditions. The game uses SSAO to aid the effect, but the volumetric lighting is actually not view dependent. You can find out more about how the Volumetric Lighting works in the shadow volumes section of this post.
Here's an example:
* Observational Tip: Notice how the light shafts are created as they peer through the shadows cast by the large building structure
Bokeh DOF and approx. of Circle of Confusion
Another surprising feature for an engine that I assume uses deferred lighting/shading. So I'm going to simplify things a bit because it can get really technical trying to explain why the Bokeh effect even happens in the first place in the real world. Suffice to say that as light enters the aperture (opening) of an eye/camera, the incoming rays of light begin to converge into a single point on a focal plane. As light becomes more focused on this plane, its appearance becomes sharper and smaller. As light becomes more defocused away from this plane, it becomes larger and blurrier.
The Bokeh effect as it is commonly known is when the points of light that enter the camera lens take on the shape of the aperture that they entered through (like a hexagonal shape, for example). The circle of confusion is the region of focus where a human cannot distinguish between a point of light that is perfectly in focus and one that is slightly out of focus. Depth of field is usually determined by the circle of confusion. What's interesting is that BOTW emulates both of these concepts when using the sheikah scope or camera rune. My guess is that it's all calculated in screen space based on the texel (texture element) data, and then applied as a post-process effect.
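The standard thin-lens formula for the circle of confusion captures the grow-as-you-defocus behavior. Whether BOTW evaluates anything like this or just approximates it in screen space is my speculation, but the model itself is textbook optics:

```python
def coc_diameter(aperture, focal_length, focus_dist, subject_dist):
    """Circle-of-confusion diameter on the sensor (thin-lens model) for
    a point at subject_dist when the lens is focused at focus_dist.
    All units in millimetres. Larger CoC = blurrier, larger bokeh."""
    return (aperture * abs(subject_dist - focus_dist) / subject_dist
            * focal_length / (focus_dist - focal_length))

# Hypothetical lens: 10 mm aperture, 50 mm focal length, focused at 2 m.
in_focus = coc_diameter(10.0, 50.0, 2000.0, 2000.0)   # point at 2 m
defocused = coc_diameter(10.0, 50.0, 2000.0, 8000.0)  # point at 8 m
```

A point on the focal plane has zero blur, and the further a light source sits from that plane, the larger its aperture-shaped disc becomes, which is exactly what the blue lights do in the video.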
Regardless of the method used, it's pretty impressive. You can check it out here:
* Observational Tip: Pay attention to the reticle of the camera and the shiny blue lights on the metal boxes. When the camera focuses on distances far away from the light sources, the light sources become blurrier and also appear larger. The opposite happens when the camera focuses directly on the light sources. The circular shapes that the blue lights transform into are known as the Bokeh effect.
Sky Occlusion and Dynamic Shadow Volumes
Aside from the physics in the game, these shading features are without a doubt the most computationally taxing elements in BOTW. Here's how it all works:
Even though the clouds themselves don't have any volume, they still cast (soft) shadows onto the environment. However, the sun and the scattered light from the sky illuminate the environment dynamically, and the environment and all of the objects in it cast their own shadows according to that illumination. It wouldn't look very believable for the lighting in the environment to remain unchanged even when the sky is completely overcast with cloud cover. Nintendo has implemented Sky Occlusion to solve this problem.
Using a Mie scattering algorithm (Mie coefficients that simulate the effect of atmospheric fog), the engine calculates how much skylight to remove from the environment based on how much fog or cloud cover is in the atmosphere. The more skylight that gets occluded from the environment, the more overcast the environment will appear. Since there is less direct illumination in occluded areas, the ambient light (diffuse, non-directional light) will play a greater role in the illumination of those areas, and all of the shadows in those areas will become softer and start to match the colors of their immediate surroundings.
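The "remove skylight based on how much stuff is in the atmosphere" step is essentially Beer-Lambert extinction through an optical depth. Here's a minimal Python sketch under that assumption (the constants are illustrative, not taken from the game):

```python
import math

def sky_light(base_irradiance, mie_optical_depth):
    """Beer-Lambert style extinction: the more fog/cloud (Mie optical
    depth) sits between the sky and the ground, the less direct
    skylight survives to illuminate the environment."""
    return base_irradiance * math.exp(-mie_optical_depth)

clear = sky_light(1.0, 0.1)    # thin haze: almost full skylight
overcast = sky_light(1.0, 3.0) # heavy cloud cover: mostly occluded
```

As the surviving direct skylight drops toward zero, the ambient term takes over, which is why shadows soften and pick up surrounding color during the storm.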
Here's an example:
* Observational Tip: Watch how the stark, hard shadow of the Flag becomes softer and starts to receive more color from the ambient light term as the storm precipitates
The engine also uses shadow volumes instead of simple shadow maps, and this is done for every shadow caster in the game. Shadow volumes are cast within a specified 3D space instead of just the surfaces and objects in an environment. Aside from the Sky Occlusion looking more believable when shadow volumes are implemented, dynamically generating shadow volumes within a 3D space also provides the benefit of full real time volumetric lighting when it's combined with atmospheric fog that can receive shadows, which is exactly what happens in BOTW.
Here's an example:
* Observational Tip: Pay attention to how the movement of the Flag correlates to the light shafts that are produced. The dark regions in between the light shafts come from the shadow volume of the flag. The dynamic contortion of the flag tells us that these shadow volumes are generated on the fly
Aperture-based Lens Flares
This feature will go unnoticed by probably 99% of the people who play this game, so I'm not sure that it was worth implementing, tbh.
Basically, when rays from a bright light source enter a camera lens at some oblique angles, they can produce optical artifacts known as Lens Flares due to the rays internally reflecting inside of the camera elements. Most games just emulate this phenomenon by applying the flare as a post effect that appears when the camera is slightly off-center from the camera frustum; the concept of light internally reflecting within the camera itself is not even factored into the equation.
In Breath of the Wild, since the engine already emulates a camera aperture for DOF, it tracks the aperture's relative position to the sun and calculates how much lens flare should be produced, even if the sun isn't on screen. But that's not all! Cameras with lots of zooming elements are even more prone to Lens Flares and the flares will change shape and size depending on the shape/size of the aperture and level of zoom. Surprisingly, BOTW approximates these effects as well!
Check it out:
* Observational Tip: You can see that even though the sun is off-screen, the lens flares (circular light artifacts) are still present. More importantly, the shape, size, and clarity of the lens flares scale with the level of camera zoom.
Sub-surface Scattering
Some surfaces are translucent (not to be confused with transparent) in the real world, meaning that light can both pass through the surface and scatter inside of it. Some examples of real world translucent surfaces would be human skin, grapes, wax, and milk. Modeling this unique behavior of light in 3D graphics is called Sub-surface Scattering or SSS. As with most real time 3D rendering solutions, programmers have come up with several methods to approximate the effect without having to simulate light bounces at the molecular level. The method used in BOTW is relatively simplistic but effective.
Any surface that should have some level of translucency will have multiple layers of materials in order to produce SSS. The first layer is the internal material. This material is usually baked with lighting information that gives it a translucent look. Light travels through the material but does not actually light the material itself in real time. On top of this material is the surface material. This material is the more dominant of the two, and is what you will see in most lighting conditions.
The relationship between these materials works in such a way that the dominant appearance of either material is always determined by the ratio between incident light and transmitted light. If the surface material is reflecting more light than the internal material is transmitting, then the surface material will increase in opacity in proportion to the light it's receiving. If the internal material is transmitting more light than the surface material is reflecting, then the surface material will decrease in opacity in proportion to the light it's not receiving. Balancing the opacity of the surface material according to the Incidence/Transmittance ratio is a very smart and efficient way to give materials an SSS effect.
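The opacity balancing described above boils down to a single ratio. Here's a tiny Python sketch of that idea (my own formulation of the ratio, not the engine's actual math):

```python
def surface_opacity(incident, transmitted):
    """Opacity of the outer material driven by the ratio of light it
    reflects (incident) to light the inner layer transmits. More
    incident light -> the outer layer dominates; more transmitted
    light -> the baked 'glow' layer shows through."""
    total = incident + transmitted
    return incident / total if total > 0 else 1.0

daylit = surface_opacity(incident=0.9, transmitted=0.1)   # sunlit roof
backlit = surface_opacity(incident=0.1, transmitted=0.9)  # lantern-lit
```

In daylight the surface layer is nearly opaque and hides the internal layer; at night, with the stable lit from inside, the ratio flips and the translucent inner layer shines through.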
Here are some examples:
* Observational Tip: Note how light from inside the stable can be seen diffusely illuminating the outer surface. Link's shadow on the roof is also illuminated by the light from inside, but not when he's on the ground.
* Observational Tip: Note how the surface material becomes more opaque as it receives more light, obscuring the internal material.
* Observational Tip: Note how the surface material becomes less opaque as it receives less light, revealing the internal material.
Dynamically Localized Lightning Illumination
Lots of games implement the illumination of an environment by lightning as a global light source, where it flashes over the entire environment and all shadow casters cast shadows in predetermined sizes and directions.
In BOTW, lightning strikes are basically big ass camera flashes, each with their own radius and intensity, and they have the ability to strike anywhere on the map, regardless of the player's location. What's interesting about BOTW's lightning system is that shadows dynamically correspond to the intensity and location of the shadow caster's nearest lightning strike. This system is probably the coolest lightning system I've ever seen in a game.
* Observational Tip: Note the change in size, direction, and contrast of the shadows with each lightning strike.
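A localized flash with its own radius and intensity could be modeled like this. This is purely an illustrative sketch with a made-up linear falloff, not the game's actual attenuation:

```python
def lightning_intensity(strike_pos, surface_pos, intensity, radius):
    """Treat a lightning strike as a local flash rather than a global
    light: surfaces beyond its radius are unaffected, and closer
    strikes illuminate harder (so shadows shift with each strike)."""
    dx = strike_pos[0] - surface_pos[0]
    dz = strike_pos[1] - surface_pos[1]
    dist = (dx * dx + dz * dz) ** 0.5
    if dist >= radius:
        return 0.0
    return intensity * (1.0 - dist / radius)  # simple linear falloff

near = lightning_intensity((0, 0), (10, 0), 1.0, 100.0)   # close strike
far = lightning_intensity((0, 0), (200, 0), 1.0, 100.0)   # out of range
```

Because each strike has its own position, the shadow direction and contrast get recomputed per strike instead of flashing the whole map from one fixed global light.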
Per-Pixel Sky Irradiance
If Radiance can be thought of as the amount of radiation coming from the sun, Irradiance can be thought of as the amount of that radiation that a given surface actually receives. This is a pretty important variable for scattering skylight because its absence is the main reason we can see into space at night! BOTW calculates Irradiance using an algorithm that tracks the sun's position relative to zenith, and during sunsets it starts to remove skylight, pixel by pixel, until there is no Irradiance left. Provided the sky is free of cloud cover and Mie scattering, stars will start to appear in the sky, even if the sky isn't dark yet. The color gradient transitions between night and day are really impressive.
* Observational Tip: Well, you're supposed to be able to see the stars come out as the sun sets, but Twitter compression made sure that didn't happen, lol
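The simplest model of skylight versus sun position is a cosine-of-zenith falloff, and something in that spirit would produce the fade I'm describing. A sketch (the falloff shape is my assumption):

```python
import math

def sky_irradiance(sun_zenith_deg):
    """Cosine-of-zenith falloff: full skylight with the sun directly
    overhead (zenith angle 0), fading to zero as the sun reaches and
    drops below the horizon (zenith angle >= 90), at which point the
    stars are allowed to show through."""
    cos_z = math.cos(math.radians(sun_zenith_deg))
    return max(0.0, cos_z)

noon = sky_irradiance(0.0)     # sun at zenith: full irradiance
sunset = sky_irradiance(85.0)  # sun near the horizon: almost none
night = sky_irradiance(120.0)  # sun below the horizon: zero
```

Once the per-pixel irradiance drops low enough, stars can start appearing even before the sky has gone fully dark, which matches what the game does.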
Fog Inscatter
In the real world, fog receives both light and shade, like a physical object. This is computationally expensive to do with computer graphics if the fog is Volumetric. BOTW gets around this by creating a fog noise pattern (similar to their ambient occlusion noise pattern, but not restricted to screen space) and applying radiance values from the sun and skylight to produce 'inscatter'. When you combine this with shadow volumes, not only do you get Volumetric Lighting, you also get fog that looks like it has volume even when it doesn't.
* Observational Tip: Note how the fog on the mountain has taken on the color of the available light in the environment and also appears to have volume.
Particle Lights
Almost every particle in the game is emissive (glowing). Many of them illuminate the environment as well. Instead of rendering particles as objects, many particles are simply point light sources that radiate in all directions in 3D space.
Let's take a look:
* Observational Tip: Note how the glowing embers move independently in 3D space, irrespective of the camera.
* Observational Tip: Snow particles are rendered as particle lights in BOTW, an approach that gives them the illusion that they're reflecting sunlight. It could also simply be an artistic choice.
* Observational Tip: Note how the fireflies illuminate the surfaces they are close to.
Puddle formation and evaporation
Probably the most bizarre but also the most clever rendering solution in the game. Underneath the entire terrain of the game world, there exists a plane of water materials that rises to fill water basins when it's raining and lowers to 'evaporate' the water when the sun comes back out. There is a foam material layer that is applied depending on the water surface's relative distance from the ground. The process is pretty straightforward while also serving as yet another impressive dynamic in the game.
Examples:
* Observational Tip: Watch how the water basin 'fills up' with water when it starts to rain.
* Observational Tip: Watch how the water 'evaporates' when it stops raining and the sun comes out.
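The rise-and-lower mechanic can be sketched as a single interpolated height driven by a wetness value. The parameter names and the linear mapping are my own invention, just to illustrate the idea:

```python
def water_plane_height(ground_height, basin_depth, wetness):
    """Hidden water plane under the terrain: 'wetness' (0..1) climbs
    during rain and falls as the sun evaporates it, lifting the plane
    from below the basin floor up to ground level."""
    wetness = min(max(wetness, 0.0), 1.0)
    return ground_height - basin_depth * (1.0 - wetness)

dry = water_plane_height(100.0, 5.0, 0.0)      # plane hidden underground
raining = water_plane_height(100.0, 5.0, 1.0)  # basin filled to the brim
```

While the plane sits below the terrain it's simply invisible, so 'filling' and 'evaporating' a basin is just sliding one surface up and down, which is why the effect is so cheap.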
UPDATE:
I actually went back to look at some of the game's code from the text dump and it pretty much confirms everything that I posted, lol. I suppose I should have done that first instead of investigating the engine during gameplay. Would've made my life a lot easier!
Here's the pastebin:
https://pastebin.com/Jc9b0BCp
And some screenshots of the code:
UPDATE 2:
Added more features of the engine to this post.
And with that, this analysis has been concluded. As always, if you have any questions about the information provided in this post, feel free to let me know.
Enjoy!
I've actually wanted to do a proper tech analysis of BOTW's engine for quite some time, but never really got around to it. However, with the new video capture feature on the Switch, I thought it would be the perfect opportunity to revisit the title and share my findings through videos that I've uploaded to Twitter.
I'll start off with a summary of my findings, but I'll also do a breakdown of each technical feature later in this post in order to keep things accessible. Whenever possible, I'm going to try to avoid redundancies. For example, if someone else like Digital Foundry already covered a feature of the engine, I'm not going to bother covering it here. The purpose of this post (as with my SMO post), is to bring more exposure to the technical accomplishments in games where no one else bothered to even investigate.
Anyway, here's a summary of the engine's features:
- Global Illumination (more specifically, Radiosity)
- Local Reflections (calculated along Fresnel)
- Physically-based Rendering
- Emissive materials/area lights
- Screen Space Ambient Occlusion
- Dynamic Wind Simulation system
- Real-time cloud formation (influenced by wind)
- Rayleigh scattering/Mie Scattering
- Full Volumetric Lighting
- Bokeh DOF and approx. of Circle of Confusion
- Sky Occlusion and Dynamic Shadow Volumes
- Aperture Based Lens Flares
- Sub-surface Scattering
- Dynamically Localized Lightning Illumination
- Per-Pixel Sky Irradiance
- Fog inscatter
- Particle Lights
- Puddle formation and evaporation
So, next I'll provide a breakdown of each feature, along with a video as an example of the feature.
Radiosity
First off, I just want to say that all real time global Illumination solutions are faked in one way or another, with varying degrees of accuracy. So anyone trying to dismiss the global illumination solution in BOTW simply because it doesn't use path tracing or something similar should really think about what they're saying. The important part to take away from this is that it is being rendered in real time; it's not just lighting that has been baked into the textures, which is pretty impressive for an open world game (especially on Wii U).
Now, what exactly is Radiosity? Well, in 3D graphics rendering, it is a global illumination approximation of light bouncing from different surfaces, and transferring the color information from one surface to another, along the process. The more accurate the Radiosity, the more light bounces will need to be calculated in order to transfer the proper amount of color.
In Breath of the Wild, the engine uses light probes throughout the environment to collect color information about the different surfaces located near the light probe. There is no simulation of light bounces, just some approximations of what general colors should be coming from a given area. The exact algorithm BOTW uses to calculate this information is unclear, but my best guess is spherical harmonics or something similar, based on the color averages and localization of the Radiosity. Unlike Super Mario Odyssey, Radiosity in Breath of the Wild is pretty granular instead of being binary. The lighting information that's being calculated from the light probes appear to be streamed and tied with the LOD system at the pipeline level, which makes it pretty efficient.
Here's a video of it in action:
* Observational Tip: Notice how the rock cliffs receive a green hue from the grass as the camera moves closer to that region.
Real-time Local Reflections
It appears that surface reflections of immediate surroundings use a two-pronged approach in BOTW. Anything emitting light, including area lights that glow on objects, will reflect on almost any material at pretty much any angle if they're close enough. Initially, I did not account for this (I assumed that the reflections I saw were of the objects themselves instead of the light sources coming from the objects), which is why I originally dismissed Screen Space Reflections as the rendering solution for reflections. However, when taking this into consideration, it appears that any reflected objects not directly illuminating a surface with a light source will be rendered as a screen space reflection. It's hard to tell at first because you can only view these kinds of objects during Fresnel (grazing angles), but I got the game to glitch out a few times in order to prove it. To put it more succinctly, the game uses a combination of Specular Lighting and Screen Space Reflections to produce local reflections, depending on whether the potential reflection provides its own light source.
Anyway, here's a video showing the hybrid system in action. Link and the Cryonis cube are produced through SSR (though the cube's light source is not), while the blue lantern reflections on the wall are produced by specular lighting through the PBR pipeline. If you see that these lights look like mirror images sometimes, it's because area lights can be shaped like objects, but they're still just specular highlights when reflected:
* Observational Tip: Look at Link's reflection compared to the Blue Lantern's reflection. Link must be on screen in order for his reflection to appear, whereas the blue lantern does not need to be on screen in order for its reflection to appear.
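The screen-space half of this hybrid explains the tip above: SSR can only reflect what's already on screen. A toy ray-march sketch of the general SSR technique (not BOTW's actual implementation — the buffer layout and thresholds here are invented):

```python
def ssr_trace(depth_buffer, origin, direction, steps=32, step_size=1.0, thickness=0.5):
    """March a reflected ray through a depth buffer indexed as depth_buffer[y][x].

    Returns the (x, y) pixel to sample the color buffer at, or None if the
    ray leaves the screen or never intersects the depth buffer.
    """
    h, w = len(depth_buffer), len(depth_buffer[0])
    x, y, z = origin
    dx, dy, dz = direction
    for _ in range(steps):
        x += dx * step_size
        y += dy * step_size
        z += dz * step_size
        xi, yi = int(x), int(y)
        if not (0 <= xi < w and 0 <= yi < h):
            return None  # off screen: this is why Link's reflection vanishes
        scene_z = depth_buffer[yi][xi]
        if scene_z <= z <= scene_z + thickness:
            return (xi, yi)  # hit: sample the color buffer at this pixel
    return None
```

A specular-highlight reflection (like the blue lantern) needs no such trace, since the light source is part of the lighting equation itself, which is why it survives off-screen.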
Physically-based Rendering
Before anyone asks, no, this does not mean 'physically correct looking materials'. It is simply a methodology applied to a 3D graphics rendering pipeline where all materials (textured surfaces) uniquely influence the way that light behaves when interacting with them. That is what happens in the real world, which is why it's called Physically based rendering (a concept based on real world light physics). Different materials cause light to behave differently, which is why we can visually differentiate between different surfaces in the first place.
Traditionally, rendering pipelines relied on an artist's understanding of how light interacted with different real world materials and would define the look of texture maps based on that understanding. As a result, there was a lot of inconsistency between different textured surfaces and how they compared to their counterparts in the real world (which is understandable, as we can't expect artists to have encyclopedic knowledge of the properties of all matter in the real world). With PBR, the fundamentals of light physics are part of the pipeline itself, and all textured surfaces are classified as materials with unique properties that cause light to behave accordingly. This allows surfaces to be placed in different lighting conditions and viewed from dynamic camera angles, with the light-surface interaction updating dynamically. Artists do not have to predefine this interaction like they did in the traditional workflow; it happens automatically. Because of the efficiency of PBR, developers feel more inclined to make games where all materials have unique properties that affect light differently.
In Breath of the Wild, PBR is used with a bit of artistic flair, so you might not notice that the engine even relies on such a pipeline since the textures don't necessarily look realistic. However, the BRDFs (Bi-directional Reflectance Distribution Function) used on the materials make it pretty clear that the engine uses PBR. You see, with every dynamic light source, its specular highlights (the parts of a surface where the light source itself shows as a reflection) and the reflectivity/reflectance of those highlights are dynamically generated depending on the angle of incidence (angle of incoming light rays with respect to a surface normal) and index of refraction (how much a material 'bends' light as the rays touch its surface) of whatever material the lights are interacting with. If the game were using a traditional pipeline, the distribution of those specular highlights would not be much different between wood and metal. But in this game, the production of specular highlights is completely dependent on the material that the light is interacting with.
Another key element that shows that BOTW uses PBR is the Fresnel (pronounced fruh-NELL) reflections of all the materials. First of all, most games using a traditional pipeline don't even bother with Fresnel because at that point you might as well just use PBR. As I explained earlier when discussing local reflections, Fresnel reflections become visible at grazing angles (angles where incoming light is nearly parallel to the surface it's interacting with from the perspective of the observer/camera).
According to the Fresnel reflection coefficient, all materials approach 100% reflectivity at grazing angles, but the effectiveness of that reflectivity will depend on the roughness of the materials. As a result, programmers differentiate between 'reflectivity' and 'reflectance'. Some materials reflect light in all directions (diffuse materials). Even at 100% reflectivity, 100% of the light may be reflected from the total surface area, but it's not all reflected in the same direction, so the light is spread out uniformly and you don't see any specular reflections (mirror images of the surface's surroundings). Other materials reflect incident light only in the mirror direction about the surface normal (specular materials), so you will only see reflections at the appropriate angle, where close to 90% of the light is reflected. The Reflectance (the effectiveness of a material's ability to reflect incident light) of both diffuse and specular materials is not always 100%, even at grazing angles, which is why you don't see perfectly specular reflections at grazing angles on all materials, even in the real world. The clarity of Fresnel reflections will vary with the materials producing them.
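The grazing-angle behavior described above is usually modeled in PBR pipelines with Schlick's approximation of the Fresnel curve. Whether BOTW uses exactly this is unknown; treat it as an illustration of the standard technique:

```python
def fresnel_schlick(cos_theta, f0):
    """Schlick's approximation of Fresnel reflectance.

    cos_theta: cosine of the angle between the view ray and surface normal
               (1.0 = head-on, 0.0 = grazing angle).
    f0:        base reflectivity at normal incidence (~0.04 for dielectrics
               like wood; much higher, and tinted, for metals).
    """
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5
```

Note how at a grazing angle (`cos_theta` near 0) the result approaches 1.0 for any `f0`: every material becomes highly reflective edge-on, exactly the Fresnel behavior used to spot PBR in the game.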
As evidence of PBR, this dynamic behavior of Fresnel reflections on different materials with the same dynamic light source can be seen here:
* Observational Tip: Notice how the green light source on the wood of the barrel appears to be the same at all angles, while the same green light source appears to change its reflection on the metallic barrel hoops (the metal circles on the barrel) with respect to the camera angle.
Emissive materials and area lights
This one is pretty straightforward. The materials of glowing objects provide unique light sources that light the environment in the same shape as the materials themselves. These are not point light sources that radiate in all directions, or even simple directional light sources that light in one direction. They're basically 'custom shaped' light sources. It's important to mention that only the global (sun/moon/lightning) light sources cast shadows. However, BRDF still applies to all light sources in the game.
Take a look:
* Observational Tip: Look at the shape of the light being cast from the fire sword. It matches the shape of the sword itself, though the intensity of the light will depend on how close the sword is to the surface it's illuminating.
Screen Space Ambient Occlusion
In the real world, there is a certain amount of 'ambient light' that colors the environment after light has bounced around the environment so much that it has become completely diffused. If shadows are the result of objects occluding direct sunlight, then ambient occlusion can be thought of as the result of cracks and crevices in the environment occluding ambient light.
The method used in BOTW is called SSAO (screen space ambient occlusion), as it calculates the AO in screen space and is therefore view dependent: only geometry visible on screen can act as an occluder, so surfaces only receive AO when they (and their occluders) face the camera.
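A toy version of the screen-space idea: sample pixels around the shading point and count how many sit closer to the camera. The kernel shape, sample count, and bias here are invented for illustration, not BOTW's actual (noise-patterned) kernel:

```python
import random

def ssao_occlusion(depth_buffer, px, py, pz, radius=2.0, samples=16, bias=0.05, seed=1):
    """Estimate ambient occlusion at pixel (px, py) with view-space depth pz.

    Returns 0.0 (fully lit) .. 1.0 (fully occluded).
    """
    rng = random.Random(seed)
    h, w = len(depth_buffer), len(depth_buffer[0])
    occluded = 0
    for _ in range(samples):
        xi = int(px + rng.uniform(-radius, radius))
        yi = int(py + rng.uniform(-radius, radius))
        if not (0 <= xi < w and 0 <= yi < h):
            continue  # sample fell off screen: no occlusion information
        if depth_buffer[yi][xi] < pz - bias:
            occluded += 1  # a neighbour sits in front of us and blocks ambient light
    return occluded / samples
```

The random sampling is also where the visible "noise pattern" in cracks and crevices comes from.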
Here's an example:
* Observational Tip: Look for the dark, shadowy noise patterns in the cracks and crevices of the walls when viewed from head on. This same noise pattern outlines Link's silhouette from this angle as well.
Dynamic Wind Simulation System
So this one surprised me a bit because I was not expecting it to be so robust. Basically, the physics system is tied to a wind simulation system. It's completely dynamic and affects different objects according to their respective weight values. The most prominent objects affected are the blades of grass and the procedurally generated clouds.
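A sketch of how a global wind vector plus per-object weight values might drive this kind of system. The gust model and every constant here are invented for illustration; the game's actual simulation is unknown:

```python
import math

def wind_at(t, base_dir=(1.0, 0.0), base_speed=2.0, gust_amp=1.5, gust_freq=0.3):
    """Global wind vector at time t, with a sinusoidal gust on top of a base speed."""
    speed = base_speed + gust_amp * math.sin(2.0 * math.pi * gust_freq * t)
    return (base_dir[0] * speed, base_dir[1] * speed)

def sway(wind, weight):
    """Per-object response: lighter objects (grass blades) deflect more than heavy ones."""
    return (wind[0] / weight, wind[1] / weight)
```

Because every object samples the same global wind vector, grass and clouds end up flowing in the same direction, which is what the video shows.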
Here's an example:
* Observational Tip: If you watch closely, you can see here how the directional flow of both the grass and clouds match the direction in which the wind changes.
Real-time cloud formation
This game does not use a traditional skybox in any sense of the word. Clouds are procedurally generated based on parameters set by the engine. They cast real-time shadows. They receive lighting information based on the sun's position in the sky. As far as I can tell, clouds are treated as an actual material in the game. They're not volumetric, so you won't be getting any crepuscular rays or anything like that, but they're not 'skybox' clouds either. Their formation is also influenced by the wind system.
Take a look:
* Observational Tip: Notice how the cloud particles in the sky sporadically gather together.
Rayleigh Scattering/Mie Scattering
In the real world, when light reaches Earth's atmosphere, it is scattered by air molecules, which results in Earth's blue sky, since the shorter wavelengths of blue light are scattered more easily than other colors of light. However, as the sun approaches the horizon, it has to pass through more of the atmosphere, resulting in most of the blue light being scattered away by the time the sunlight reaches the eye of the observer, leaving the longer wavelengths of orange and red light to reach the eye. BOTW approximates this phenomenon mathematically (I actually found this out through a text dump of the game's code earlier this year!) Apparently the algorithm accounts for Mie Scattering as well, which gives fog its appearance in the sky.
Honestly, had I not looked at the code from that text dump, I would have never assumed that this phenomenon was being simulated in the game. It's just so easy to fake. However, after looking at the reflections of the sky in the water, it all made sense. This scattered light is being reflected onto the entire environment in real time. A simple sky box would make that impossible.
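The underlying math is compact: Rayleigh scattering strength goes as 1/λ⁴, and Beer-Lambert extinction over the light's path length strips blue out first on long sunset paths. A sketch (the 0.05 scale factor is an arbitrary illustration value, not from the game):

```python
import math

def rayleigh_beta(wavelength_nm):
    """Relative Rayleigh scattering coefficient, proportional to 1/lambda^4."""
    return 0.05 * (550.0 / wavelength_nm) ** 4

def transmitted(wavelengths_nm, path_length):
    """Beer-Lambert extinction: fraction of each wavelength surviving the path."""
    return [math.exp(-rayleigh_beta(w) * path_length) for w in wavelengths_nm]
```

At noon (short path) the blue channel survives; at sunset (long path) blue collapses relative to red, which is exactly the orange-horizon behavior described above.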
Here's an example. You can see how the granularity of the lighting has an effect on the environment:
* Observational Tip: Notice how the different hues of orange and red in the sky reflect on the environment with the same colors. Although not shown in the video, the light scattering of the sky illuminates the environment and water with other colors as well, depending on how the light is scattered.
* Observational Tip: Note the snow's change in color as the sun sets.
* Observational Tip: There are at least 5 sources of distinct reflections in the water in the beginning of this video. The Shrine (blue), the hills (green), the flag (dark silhouette), the sky (orange) and the sun (pink). The hills and the flag are reflected through SSR, the shrine and the sun are reflected through specular lighting (and they're both specular highlights), and the sky is reflected through specular lighting as well, but it's not a specular highlight. As a rainstorm precipitates, the changes in reflections are completely dynamic. Sky Occlusion from the dark clouds changes the illumination from the Rayleigh scattering in the sky in real time. Eventually, the orange skylight can no longer reach the water's surface, so it fades out, but the sun persists as it hasn't been completely blocked out. However, with so much Mie scattering in the sky, the sun's color has changed from pink to white! Even still, the clouds eventually prove to be too much for the sun, blocking it out completely, leaving the light from the shrine and partial reflections of the hills.
Full Volumetric Lighting
Aside from clouds in the sky, every part of the environment and every object in it has the potential to create light shafts in real time, given the right lighting conditions. The game uses SSAO to aid the effect, but the volumetric lighting is actually not view dependent. You can find out more about how the Volumetric Lighting works in the shadow volumes section of this post.
Here's an example:
* Observational Tip: Notice how the light shafts are created as they peer through the shadows cast by the large building structure
Bokeh DOF and approx. of Circle of Confusion
Another surprising feature for an engine that I assume uses deferred lighting/shading. So I'm going to simplify things a bit because it can get really technical trying to explain why the Bokeh effect even happens in the first place in the real world. Suffice it to say that as light enters the aperture (opening) of an eye/camera, the incoming rays of light begin to converge into a single point on a focal plane. As light becomes more focused on this plane, its appearance becomes sharper and smaller. As light becomes more defocused away from this plane, it becomes larger and blurrier.
The Bokeh effect as it is commonly known is when the points of light that enter the camera lens take on the shape of the aperture that they entered through (like a hexagonal shape, for example). The circle of confusion is the region of focus where a human cannot distinguish between a point of light that is perfectly in focus and one that is slightly out of focus. Depth of field is usually determined by the circle of confusion. What's interesting is that BOTW emulates both of these concepts when using the sheikah scope or camera rune. My guess is that it's all calculated in screen space based on the texel (texture element) data, and then applied as a post-process effect.
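For reference, the circle of confusion has a standard thin-lens formula, and the in-focus region is just wherever the CoC stays under a perceptual limit. Whether BOTW evaluates exactly this is a guess; this is the textbook model of the effect (units are arbitrary but consistent, e.g. millimetres):

```python
def circle_of_confusion(aperture, focal_length, focus_dist, object_dist):
    """Thin-lens CoC diameter for an object at object_dist when the lens
    is focused at focus_dist, with the given aperture diameter."""
    return (aperture * abs(object_dist - focus_dist) / object_dist
            * focal_length / (focus_dist - focal_length))

def in_focus(coc, coc_limit=0.03):
    # Inside the CoC limit the eye can't tell the blur spot from a sharp point.
    return coc <= coc_limit
```

The further an object sits from the focus plane, the larger its CoC, which is why the blue lights in the video grow into large circles as focus pulls away from them.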
Regardless of the method used, it's pretty impressive. You can check it out here:
* Observational Tip: Pay attention to the reticle of the camera and the shiny blue lights on the metal boxes. When the camera focuses on distances far away from the light sources, the light sources become blurrier and also appear larger. The opposite happens when the camera focuses directly on the light sources. The circular shapes that the blue lights transform into are the Bokeh effect.
Sky Occlusion and Dynamic Shadow Volumes
Aside from the physics in the game, these shading features are without a doubt the most computationally taxing elements in BOTW. Here's how it all works:
Even though the clouds themselves don't have any volume, they still cast (soft) shadows onto the environment. However, the sun and the scattered light from the sky illuminate the environment dynamically, and the environment and all of the objects in it cast their own shadows according to that illumination. It wouldn't look very believable for the lighting in the environment to remain unchanged even when the sky is completely overcast with cloud cover. Nintendo has implemented Sky Occlusion to solve this problem.
Using a Mie scattering algorithm (Mie coefficients that simulate the effect of atmospheric fog), the engine calculates how much skylight to remove from the environment based on how much fog or cloud cover is in the atmosphere. The more skylight that gets occluded from the environment, the more overcast the environment will appear. Since there is less direct illumination in occluded areas, the ambient light (diffuse, non-directional light) will play a greater role in the illumination of those areas, and all of the shadows in those areas will become softer and start to match the colors of their immediate surroundings.
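One way to sketch that behavior: attenuate the direct-light fraction with a Mie-style extinction term, then blend shadowed areas toward the ambient color as the direct fraction disappears. The coefficient is an invented constant, not a value from the game:

```python
import math

def direct_fraction(cloud_coverage, mie_coeff=2.0):
    """Fraction of direct skylight surviving the cloud/fog layer (Beer-Lambert style)."""
    return math.exp(-mie_coeff * cloud_coverage)

def shadow_color(hard_shadow_rgb, ambient_rgb, cloud_coverage):
    """Blend a hard, dark shadow toward the ambient color as cover increases."""
    k = direct_fraction(cloud_coverage)
    return [h * k + a * (1.0 - k) for h, a in zip(hard_shadow_rgb, ambient_rgb)]
```

Under a clear sky the shadow stays stark and dark; as coverage rises it brightens and takes on the ambient hue, which is the softening seen in the flag video.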
Here's an example:
* Observational Tip: Watch how the stark, hard shadow of the Flag becomes softer and starts to receive more color from the ambient light term as the storm precipitates
The engine also uses shadow volumes instead of simple shadow maps, and this is done for every shadow caster in the game. Shadow volumes are cast throughout a specified 3D space instead of just onto the surfaces and objects in an environment. Aside from the Sky Occlusion looking more believable when shadow volumes are implemented, dynamically generating shadow volumes within a 3D space also provides the benefit of full real-time volumetric lighting when combined with atmospheric fog that can receive shadows, which is exactly what happens in BOTW.
Here's an example:
* Observational Tip: Pay attention to how the movement of the Flag correlates to the light shafts that are produced. The dark regions in between the light shafts come from the shadow volume of the flag. The dynamic contortion of the flag tells us that these shadow volumes are generated on the fly
Aperture-based Lens Flares
This feature will go unnoticed by probably 99% of the people who play this game, so I'm not sure that it was worth implementing, tbh.
Basically, when rays from a bright light source enter a camera lens at certain oblique angles, they can produce optical artifacts known as Lens Flares, due to the rays internally reflecting between the lens elements. Most games just emulate this phenomenon by applying the flare as a post effect that appears when a bright light source is near the edge of the camera frustum; the concept of light internally reflecting within the camera itself isn't factored into the equation at all.
In Breath of the Wild, since the engine already emulates a camera aperture for DOF, it tracks the aperture's relative position to the sun and calculates how much lens flare should be produced, even if the sun isn't on screen. But that's not all! Cameras with lots of zooming elements are even more prone to Lens Flares and the flares will change shape and size depending on the shape/size of the aperture and level of zoom. Surprisingly, BOTW approximates these effects as well!
Check it out:
* Observational Tip: You can see that even though the sun is off-screen, the lens flares (circular light artifacts) are still present. More importantly, the shape, size, and clarity of the lens flares scale with the level of camera zoom.
Sub-surface Scattering
Some surfaces are translucent (not to be confused with transparent) in the real world, meaning that light can both pass through the surface and scatter inside of it. Some examples of real world translucent surfaces would be human skin, grapes, wax, and milk. Modeling this unique behavior of light in 3D graphics is called Sub-surface Scattering or SSS. As with most real time 3D rendering solutions, programmers have come up with several methods to approximate the effect without having to simulate light bounces at the molecular level. The method used in BOTW is relatively simplistic but effective.
Any surface that should have some level of translucency will have multiple layers of materials in order to produce SSS. The first layer is the internal material. This material is usually baked with lighting information that gives it a translucent look. Light travels through the material but does not actually light the material itself in real time. On top of this material is the surface material. This material is the more dominant of the two, and is what you will see in most lighting conditions.
The relationship between these materials works in such a way that the dominant appearance of either material is always determined by the ratio between incident light and transmitted light. If the surface material is reflecting more light than the internal material is transmitting, then the surface material will increase in opacity in proportion to the light it's receiving. If the internal material is transmitting more light than the surface material is reflecting, then the surface material will decrease in opacity in proportion to the light it's not receiving. Balancing the opacity of the surface material according to the Incidence/Transmittance ratio is a very smart and efficient way to give materials an SSS effect.
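The opacity-balancing idea above can be sketched as a simple ratio plus a linear blend between the two material layers. The function names and the linear blend are my own framing of the described technique:

```python
def surface_opacity(incident, transmitted):
    """Surface-layer opacity from the incident/transmitted light ratio."""
    total = incident + transmitted
    return incident / total if total > 0.0 else 1.0

def sss_color(surface_rgb, internal_rgb, incident, transmitted):
    """Blend the surface layer over the pre-baked internal layer by that opacity."""
    a = surface_opacity(incident, transmitted)
    return [s * a + i * (1.0 - a) for s, i in zip(surface_rgb, internal_rgb)]
```

When the surface receives all the light, the internal layer is hidden; when light mostly transmits through (a backlit tent roof, say), the glowing internal layer shows.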
Here are some examples:
* Observational Tip: Note how light from inside the stable can be seen diffusely illuminating the outer surface. Link's shadow on the roof is also illuminated by the light from inside, but not when he's on the ground.
* Observational Tip: Note how the surface material becomes more opaque as it receives more light, obscuring the internal material.
* Observational Tip: Note how the surface material becomes less opaque as it receives less light, revealing the internal material.
Dynamically Localized Lightning Illumination
Lots of games implement the illumination of an environment by lightning as a global light source, where it flashes over the entire environment and all shadow casters cast shadows in predetermined sizes and directions.
In BOTW, lightning strikes are basically big ass camera flashes, each with its own radius and intensity, and they can strike anywhere on the map, regardless of the player's location. What's interesting about BOTW's lightning system is that shadows dynamically correspond to the intensity and location of the shadow caster's nearest lightning strike. This is probably the coolest lightning system I've ever seen in a game.
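A toy model of that localized behavior: each strike is a point flash with its own power, and a surface is lit (and shadowed) according to its nearest strike. The inverse-square falloff is an assumption, not something confirmed from the game:

```python
def flash_intensity(strike_pos, power, point):
    """Light received at a point from one strike, with inverse-square falloff."""
    d2 = sum((a - b) ** 2 for a, b in zip(strike_pos, point))
    return power / (1.0 + d2)

def nearest_strike(strikes, point):
    """Pick the strike that dominates shadowing at this point.

    strikes: list of (position, power) tuples."""
    return min(strikes, key=lambda s: sum((a - b) ** 2 for a, b in zip(s[0], point)))
</span>```

Because the dominant strike changes between flashes, shadow direction, size, and contrast all jump per strike, as the tip below points out.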
* Observational Tip: Note the change in size, direction, and contrast of the shadows with each lightning strike.
Per-Pixel Sky Irradiance
If Radiance can be thought of as the amount of radiation coming from the sun, Irradiance can be thought of as the amount of that radiation that a given surface actually receives. This is a pretty important variable for scattering skylight, because its absence is the main reason we can see into space at night! BOTW calculates Irradiance using an algorithm that tracks the sun's position relative to zenith, and during sunsets it starts to remove skylight, pixel by pixel, until there is no Irradiance left. Provided the sky is free of cloud cover and Mie scattering, stars will start to appear in the sky, even if the sky isn't dark yet. The color gradient transitions between night and day are really impressive.
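A minimal sketch of the sun-elevation idea, with an invented visibility threshold for when stars start showing through the fading skylight:

```python
import math

def skylight(sun_elevation_deg):
    """Relative skylight irradiance from the sun's elevation above the horizon."""
    return max(0.0, math.sin(math.radians(sun_elevation_deg)))

def stars_visible(sun_elevation_deg, cloud_coverage, threshold=0.1):
    """Stars appear once skylight drops below a threshold and the sky is clear.

    Both the 0.1 threshold and the 0.5 coverage cutoff are illustration values."""
    return skylight(sun_elevation_deg) < threshold and cloud_coverage < 0.5
```

Note that the threshold is crossed while the sun is still slightly above the horizon, matching the observation that stars come out before the sky is fully dark.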
* Observational Tip: Well, you're supposed to be able to see the stars come out as the sun sets, but Twitter compression made sure that didn't happen, lol
Fog Inscatter
In the real world, fog receives both light and shade, like a physical object. This is computationally expensive to do with computer graphics if the fog is Volumetric. BOTW gets around this by creating a fog noise pattern (similar to their ambient occlusion noise pattern, but not restricted to screen space) and applying radiance values from the sun and skylight to produce 'inscatter'. When you combine this with shadow volumes, not only do you get Volumetric Lighting, you also get fog that looks like it has volume even when it doesn't.
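The inscatter blend itself is a classic technique: extinction attenuates the surface color over distance, and the 'lost' fraction is replaced with light scattered into the view ray. A sketch of the general technique, not BOTW's exact shader:

```python
import math

def fog_inscatter(surface_rgb, inscatter_rgb, density, distance):
    """Blend a surface color toward the fog's inscattered light over distance."""
    t = math.exp(-density * distance)  # Beer-Lambert transmittance
    return [s * t + f * (1.0 - t) for s, f in zip(surface_rgb, inscatter_rgb)]
```

Because `inscatter_rgb` is driven by the sun and skylight radiance, the fog automatically takes on whatever light is available in the environment, as the tip below describes.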
* Observational Tip: Note how the fog on the mountain has taken on the color of the available light in the environment and also appears to have volume.
Particle Lights
Almost every particle in the game is emissive (glowing). Many of them illuminate the environment as well. Instead of rendering particles as objects, many particles are simply point light sources that radiate in all directions in 3D space.
Let's take a look:
* Observational Tip: Note how the glowing embers move independently in 3D space, irrespective of the camera.
* Observational Tip: Snow particles are rendered as particle lights in BOTW, an approach that gives them the illusion that they're reflecting sunlight. It could also simply be an artistic choice.
* Observational Tip: Note how the fireflies illuminate the surfaces they are close to.
Puddle formation and evaporation
Probably the most bizarre but also the most clever rendering solution in the game. Underneath the entire terrain of the game world, there exists a plane of water materials that rises and lowers to fill water basins when it's raining and lets the water 'evaporate' when the sun comes back out. A foam material layer is applied depending on the water surface's relative distance from the ground. The process is pretty straightforward while also serving as yet another impressive dynamic in the game.
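The rising/falling plane and the near-ground foam band might look something like this sketch; all the rates and the band width are invented constants:

```python
def update_water_level(level, base_level, max_level, dt, raining,
                       rain_rate=0.2, evap_rate=0.05):
    """Raise the hidden water plane while it rains; lower it back afterward."""
    if raining:
        return min(max_level, level + rain_rate * dt)
    return max(base_level, level - evap_rate * dt)

def foam_alpha(water_y, ground_y, band=0.1):
    """Foam layer opacity: strongest where the water plane sits near the ground."""
    d = water_y - ground_y
    if d < 0.0:
        return 0.0  # plane is below the terrain here: no visible puddle
    return max(0.0, 1.0 - d / band)
```

Basins "fill" simply because the global plane rises above their floors, and the foam band sells the shallow edges of each puddle.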
Examples:
* Observational Tip: Watch how the water basin 'fills up' with water when it starts to rain.
* Observational Tip: Watch how the water 'evaporates' when it stops raining and the sun comes out.
UPDATE:
I actually went back to look at some of the game's code from the text dump and it pretty much confirms everything that I posted, lol. I suppose I should have done that first instead of investigating the engine during gameplay. Would've made my life a lot easier!
Here's the pastebin:
https://pastebin.com/Jc9b0BCp
And some screenshots of the code:
UPDATE 2:
Added more features of the engine to this post.
And with that, this analysis has been concluded. As always, if you have any questions about the information provided in this post, feel free to let me know.
Enjoy!
Subscribed to this thread, amazing post, haven't digested it all yet.
Do you take requests? (I've games that we want tech analysis done for)
Fantastic as usual and very informative. I don't have much to add but do please keep posting these, they're great.
By Kidjr Go To PostSubscribed to this thread, amazing post, haven't digested it all yet.
Do you take requests? (I've games that we want tech analysis done for)
I do, but I actually posted this on Reddit originally, so when I take requests for the next game, it will be on there. However, let me know what games you're interested in anyway, and I'm sure I can take a look into them later down the road.
By charsace Go To Postfollowing this. Amazing work you've done here man.
Thanks. I'm glad you could appreciate it.
By Laboured Go To PostFantastic as usual and very informative. I don't have much to add but do please keep posting these, they're great.
Thank you. I actually wasn't sure that I wanted to do these anymore because there didn't seem to be a lot of interest in the subject here. However, I started posting these on Reddit and they're pretty popular there. In the future, whenever I post another analysis on Reddit, I'll make sure to post one specifically formatted for SLAENT as well.
By Smokey Go To Postwoah, this is dope
Thanks, Smokey!
By livefromkyoto Go To PostDude.
Now I can't wait to get into that second round of DLC.
Same. I'll also be investigating any potential technical changes with the DLC. There probably aren't any, but it's worth a look.
Another great analysis. Can't wait to see what's next. There is still so much I don't understand, but they're really insightful. Do you have any recommendations for learning this stuff (preferably something hands on or video)? Any course I find just shows how to make something look nice, but never explains the mechanics behind it.
By chrmilou Go To PostAnother great analysis. Can't wait to see what's next. There is still so much I don't understand, but they're really insightful. Do you have any recommendations for learning this stuff (preferably something hands on or video)? Any course I find just shows how to make something look nice, but never explains the mechanics behind it.
Thanks!
I'd start with this course playlist from Udacity:
https://www.youtube.com/playlist?list=PLAwxTw4SYaPlaHwnoGxJE7NFhEWRCIyet
There's a lot more to computer science than just video game graphics, but that course will help you eventually understand most of what is covered in the OP.
Good luck!
By Kabro Go To PostThis is awesomesauce. keep up the good work.
Subscribed!
Thanks! Not sure what I'll do next. Whatever it is, it'll be a game that wouldn't get this kind of analysis otherwise. Most AAA games get covered by Digital Foundry (though I do wish they would go a little more in depth beyond performance issues and visual comparisons), but indie and Nintendo games don't usually get that kind of treatment, so that's where my focus will be.