Physically Based Rendering – Part Two

Developer’s point of view – Basics

In the previous post, I explained what you need to know about Physically Based Rendering if you are an artist. If you're a developer reading this article, you may have tried, or may be planning, to implement your own PBR system. If you have started to read some of the available literature, you've probably been struck by its mathematical complexity, and by the lack of explanation of the big picture. You usually see articles that focus on specific parts of the process and don't say much about the other parts, as they are assumed to be easier. At some point you have to assemble all these parts, and I had a hard time figuring out how to do that from my readings. I guess it's considered basic stuff by other authors, but I think it deserves a proper explanation.

I don't pretend these articles will enlighten you to the point where you are ready to implement your own system, but I hope they will give you a solid basis and enough understanding to start reading the literature without saying "WTF??" on every line, as I did.

You can find a lexicon at the end, with all the strange words you'll come across and their explanations.

So here is what I figured out about using PBR and lighting in general in a 3D rendering pipeline.

 

Lighting

So first, let's talk about lighting in games. It all boils down to two things:

  • Computing diffuse reflection: this represents the light that reflects off a surface in all directions.
  • Computing specular reflection: this represents the light that reflects off a surface directly toward your eye.

This image from Wikipedia is the simplest, and yet the most helpful, way to understand this:

Diffuse reflection

By GianniG46 (Own work) [CC-BY-SA-3.0 (http://creativecommons.org/licenses/by-sa/3.0) or GFDL (http://www.gnu.org/copyleft/fdl.html)], via Wikimedia Commons

To compute each of these factors, we're going to use a function. This function goes by the delicate name of Bidirectional Reflectance Distribution Function, or BRDF.

Don't be scared by this; it's a big name for what is really just a function. Usually, it will be a shader function.
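To make that concrete, here is a minimal sketch of what a BRDF boils down to, written as plain Java rather than actual shader code (the interface and parameter names are made up for illustration): a function that takes the light direction, the view direction, the surface normal and the material parameters, and returns how much light gets reflected.

```java
// Sketch only: a BRDF is just a function. Given the light direction, the view
// direction, the surface normal and the material parameters, it returns how
// much of the incoming light is reflected. In practice this lives in a shader;
// the interface and parameter names here are made up for illustration.
public interface BRDF {
    // Directions are normalized (x, y, z) vectors; the return value is the
    // reflected intensity for one color channel.
    float evaluate(float[] lightDir, float[] viewDir, float[] normal,
                   float baseColor, float metalness, float roughness);
}
```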

 

Of course there are different BRDFs depending on what you want to compute and on the lighting model you use. BRDFs are usually named after the people who discovered or elaborated them.

Also, most of the time in real-time rendering implementations, those BRDFs are approximated for the sake of performance. And incidentally, those approximations also have names, which can be people's names or technique names…

 

Lighting in PBR

Computing lighting for PBR is exactly the same as with current rendering (the system we use today with ambient, diffuse, and specular, sometimes called the ad-hoc system in the literature):

For each light source, compute the diffuse factor and the specular factor. The main difference is that the BRDFs used are different: more physically accurate, and they work predictably under different light sources with only a few input parameters.

 

So what is a light source?

Direct light source

Something that emits light. In games, the most common light sources are directional lights (think of the sun), spot lights (think of a torch), and point lights (think of a light bulb).

That's what is commonly used in the ad-hoc system, and PBR also handles those types of lights.

 

Indirect light source

Something that reflects light and indirectly lights its surroundings. Think, for example, of a red wall next to a car in daylight: the sunlight hits the wall, and the wall reflects red light that, in turn, lights up the car.

This is not handled by the ad-hoc system, or it is very poorly faked with ambient lighting.

This part is optional for PBR, but it's actually the part you really want, because that's what makes things pretty!

In games, indirect lighting is done by using an environment map as a light source. This technique is called Image Based Lighting (IBL).

 

So let's say we're looking for the full package: we need to compute the diffuse and specular contributions for each light source, be it direct or indirect.

To do so, we need a BRDF for diffuse and a BRDF for specular, and we stick to them for every light source, for consistency. Also, those BRDFs should accept as inputs the parameters we want to expose to artists (base color, metalness, roughness), or parameters derived from them with minimal transformation.
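To illustrate what "parameters derived with minimal transformation" can mean, here is a small Java sketch of a convention commonly used in PBR implementations (the 0.04 dielectric reflectance is a widespread approximation, not a value taken from this article): the diffuse color and the specular reflectance at normal incidence (often called F0) are both derived from base color and metalness.

```java
// Sketch of a common convention for deriving shading inputs from the
// artist-facing parameters. The 0.04 dielectric reflectance is a widely
// used approximation, not a universal constant.
public class MaterialInputs {

    public static float[] diffuseColor(float[] baseColor, float metalness) {
        // Metals have no diffuse contribution: the more metallic, the darker the diffuse.
        return new float[] {
            baseColor[0] * (1f - metalness),
            baseColor[1] * (1f - metalness),
            baseColor[2] * (1f - metalness)
        };
    }

    public static float[] specularF0(float[] baseColor, float metalness) {
        // Dielectrics reflect roughly 4% of light at normal incidence, regardless
        // of base color; metals use the base color itself as their reflectance.
        return new float[] {
            0.04f * (1f - metalness) + baseColor[0] * metalness,
            0.04f * (1f - metalness) + baseColor[1] * metalness,
            0.04f * (1f - metalness) + baseColor[2] * metalness
        };
    }
}
```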

 

So the pseudocode for complete lighting looks like this:
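Here is a Java-flavored sketch of that loop. The diffuseBRDF, specularBRDF, indirectDiffuse and indirectSpecular helpers are hypothetical placeholders for the BRDFs discussed below.

```java
// Java-flavored pseudocode: the complete lighting of one surface point.
// diffuseBRDF, specularBRDF, indirectDiffuse and indirectSpecular are
// hypothetical helpers standing in for the BRDFs discussed below.
color = black;

for (each direct light) {                       // directional, spot, point...
    color += diffuseBRDF(light, surface);       // e.g. Lambert
    color += specularBRDF(light, surface);      // e.g. Cook-Torrance
}

// Indirect lighting (IBL): the environment map acts as the light source
// and contributes the same two terms.
color += indirectDiffuse(environmentMap, surface);
color += indirectSpecular(environmentMap, surface);
```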

I'll go into more detail, in this series of posts, on how to compute each factor, but that's pretty much it.

 

Choosing your BRDFs

There is a vast choice of BRDFs, and I'm not going to talk about all of them, but rather focus on the ones I use in my implementation. I'll just point you to alternatives and provide links to relevant articles for more details.

I chose to use the same BRDFs as the ones used in Unreal Engine 4, from this article by Brian Karis, as I completely trust his judgement. The provided code helped a great deal, but it was far from straightforward to integrate. In the end, I had to fully research and understand the ins and outs of BRDFs.

 

Diffuse BRDF : Lambert

The most widely used diffuse BRDF in games. It's very popular because it's very cheap to compute and gives good results. This is the simplest way of computing diffuse. Here are the details:
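Here is a small Java sketch of the Lambert diffuse term (CPU-side code for illustration, not actual shader code; the 1/π normalization shown here is sometimes folded into the light intensity instead).

```java
// Lambert diffuse: the surface color divided by PI, scaled by how directly
// the light hits the surface (N.L). Sketch only; in practice this is shader code.
public class Lambert {

    public static float[] diffuse(float[] albedo, float nDotL, float[] lightColor) {
        float factor = Math.max(nDotL, 0f) / (float) Math.PI;
        return new float[] {
            albedo[0] * lightColor[0] * factor,
            albedo[1] * lightColor[1] * factor,
            albedo[2] * lightColor[2] * factor
        };
    }
}
```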

Diffuse Lambert factor for a direct light source (directional light) with a yellow surface color.

Some alternatives:

Oren-Nayar : gives better visual results than classic Lambert, and has the advantage of using an input parameter called roughness… rings a bell? Unfortunately, the additional computation cost is not really worth it, IMO. Details here.

Hanrahan-Krueger : takes sub-surface scattering into consideration for diffuse lighting (every material surface has layers, and light scatters into those different layers before leaving the material in a random direction). A lot more computation than Lambert, but it may be important if you want a good sub-surface scattering look, for skin for example. More details in this paper.

 

 

Specular BRDF : Cook-Torrance

This is a bit more complicated for specular. We need a physically plausible BRDF. We use what is called a Microfacet BRDF. So what is it?

It states that, at a micro level, a surface is not planar but formed of a multitude of little, randomly aligned surfaces: the microfacets. Those surfaces act as small mirrors that reflect incoming light. The idea behind this BRDF is that only some of those facets may be oriented so that the incoming light reflects toward your eye. The smoother the surface, the more aligned the facets are, and the sharper the light reflection. On the contrary, if a surface is rough, the facets are more randomly oriented, so the reflected light is scattered across the surface and the reflection looks more blurry.

 

Microfacet specular factor for a direct light source. On the left a smooth surface, on the right a rough one. Note how the reflection is scattered on the surface when it’s rough.

The microfacet BRDF we use is called Cook-Torrance. From my readings, I couldn't find any implementation that uses another specular BRDF. It seems this is the general form of any microfacet BRDF.
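For reference, that general form is usually written as:

specular = (D * F * G) / (4 * N.L * N.V)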

N.L is the dot product between the normal of the shaded surface and the light direction.

N.V is the dot product between the normal of the shaded surface and the view direction.

The other terms are :

  • The Normal Distribution Function, called D (for distribution). You may also find references to it as NDF. It computes the distribution of the microfacets for the shaded surface.
  • The Fresnel factor, called F. Discovered by Augustin Fresnel (Frenchies are sooo clever), it describes how light reflects and refracts at the interface between two different media (most often, in computer graphics, air and the shaded surface).
  • The geometry shadowing term, G. It defines the shadowing coming from the microfacets.

That's where it gets complicated. For each of these terms, there are several models or approximations to compute them.

I've settled on the following models and approximations (a sketch combining them follows the list):

  • D : Trowbridge-Reitz/GGX normal distribution function.
  • F : Schlick's approximation of the Fresnel term.
  • G : Schlick-GGX approximation.
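To give a concrete idea of what these terms look like, here is a CPU-side Java sketch of the three approximations and of how they combine into the Cook-Torrance specular factor. It roughly follows the forms given in Karis's course notes, but the exact roughness remappings vary between implementations, so treat it as an illustration rather than a reference implementation.

```java
// Sketch (not production code) of the three Cook-Torrance terms, roughly
// following the forms in Brian Karis's UE4 course notes. f0 is the specular
// reflectance at normal incidence (derived from base color and metalness).
public class CookTorrance {

    // D: Trowbridge-Reitz / GGX normal distribution function.
    static float distributionGGX(float nDotH, float roughness) {
        float a = roughness * roughness;          // remap roughness to alpha = roughness^2
        float a2 = a * a;
        float denom = nDotH * nDotH * (a2 - 1f) + 1f;
        return a2 / (float) (Math.PI * denom * denom);
    }

    // F: Schlick's approximation of the Fresnel term (one color channel).
    static float fresnelSchlick(float vDotH, float f0) {
        return f0 + (1f - f0) * (float) Math.pow(1f - vDotH, 5.0);
    }

    // G: Schlick-GGX geometry term for one direction (view or light).
    static float geometrySchlickGGX(float nDotX, float k) {
        return nDotX / (nDotX * (1f - k) + k);
    }

    // Smith's method: combine the geometry term for the view and light directions.
    static float geometrySmith(float nDotV, float nDotL, float roughness) {
        float k = (roughness + 1f) * (roughness + 1f) / 8f;  // remapping used for direct lights
        return geometrySchlickGGX(nDotV, k) * geometrySchlickGGX(nDotL, k);
    }

    // Full Cook-Torrance specular factor for one light and one color channel.
    static float specular(float nDotL, float nDotV, float nDotH, float vDotH,
                          float roughness, float f0) {
        float d = distributionGGX(nDotH, roughness);
        float f = fresnelSchlick(vDotH, f0);
        float g = geometrySmith(nDotV, nDotL, roughness);
        return (d * f * g) / Math.max(4f * nDotL * nDotV, 1e-4f);
    }
}
```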

I won't go into the details of all the alternatives; I just want to give an overview of the whole process first. But I'll dive into more technical detail on the terms I use in the following posts. For a neat overview of all the alternatives, you can see this post on Brian Karis's blog.

  

That sums up the whole process, but there is still much to explain. In the next post I'll focus on indirect lighting, as it's the part that was the hardest for me to wrap my head around. I'll explain the Image Based Lighting technique used, and how you can compute diffuse and specular contributions from an environment map.

 

Lexicon :

Diffuse reflection : light that reflects from a surface in every direction.

Specular reflection : light that reflects from a surface toward the viewer.

Bidirectional Reflectance Distribution Function or BRDF : a function to compute Diffuse or Specular reflection.

Image Based Lighting or IBL : a technique that uses an image (environment map) as a light source.

Microfacet Specular BRDF : a specular BRDF that assumes a surface is made of a multitude of very small, randomly aligned surfaces: the microfacets. It depends on 3 factors called D, F and G.

Normal Distribution Function called D (for distribution). You may also find some references to it as NDF. It computes the distribution of the microfacets for the shaded surface

Fresnel factor called F. Discovered by Augustin Fresnel (frenchies are sooo clever), it describes how light reflects and refracts at the intersection of two different media (most often in computer graphics : Air and the shaded surface)

Geometry shadowing term G. Defines the shadowing from the microfacets.

 

We will be moving soon

As was briefly outlined a month ago when we addressed some serious flaws in our forum setup, we will soon be moving from bbPress to Discourse. Along with this comes a new dedicated server and a beautiful Dockerised setup.

What’s Discourse?

Discourse is a new type of forum software. It’s spearheaded by Jeff Atwood of StackOverflow fame, joined by former teammates and other veteran developers. It is meticulously optimised for civilised discussion.

Here are some WIP pics from the migration:

jme-discourse

jme-discourse2

Are you just chasing the next shiny object?

Certainly not. There's been a change of strategy in how we approach web development for the jMonkeyEngine project. See my personal blog post about it for more detail. Discourse is the new cornerstone of our website. Every new web component we use will tap into the Discourse API, but the two will never be inseparably dependent on one another, as quickly became the case with bbPress.

So what’s next?

Jamie @jayfella (now doing other super secret things) handed the torch on to Hannes @kwando, who is currently wrangling the bbPress-to-Discourse script into shape. We hope to convert all of the following:

  • Users
  • Threads
  • Posts
  • Categories
  • Ratings (only upvotes)
  • Badges

We will NOT convert the following:

  • Private Messages
  • Favorites
  • Subscriptions

Meaning, if you have any PMs (soon to be enabled again), favorites, or subscriptions you'd like to make note of, you had better do so now. It'll be at least a few weeks still, but the last warning will come mere days before the migration.

Lurkers beware (again): we will probably do a user purge based on Last Visited, so if you don't have any posts to your name and it's been a while since you last logged in, now would be a good time to log in.

It feels so nice to be getting a completely fresh start on our website. We feel very good about it; hopefully it’s infectious.

Physically Based Rendering – Part one

I've been looking at Physically Based Rendering (PBR from now on) for a few weeks, because that's what all the cool kids are talking about these days. I read almost everything the interweb has on it and finally, somehow, wrapped my head around the mechanics behind the concept.

None of the papers I read gave me the epiphany, though; the understanding came little by little, after literally reading some of the papers 10 times.

The intent of this series of posts is first to brush up on the concept of PBR from the artist's point of view (the easy part :D), and then to explain the physical concepts behind it and what you have to understand as a developer.

This paper aims to present PBR as I would explain it to my mother. You shouldn't need a degree in image rendering theory, nor should you need to be a genius, to understand what's coming.

There are a lot of papers out there, with a lot of complicated words and equations, that assume solid background knowledge of image rendering, lighting, shading, etc.

I won’t assume this here. 

Of course, I’d like this to be as accurate as possible, so if you have more information, or if explanations are not clear, feel free to chime in. 

I’m an artist, I want to know what PBR is :

So you're an artist, and you have some experience in making assets for games. The most commonly used model for describing a material is the Phong reflection model (from Bui Tuong Phong, a clever guy who died very young).

This model describes how light reflects off a material by splitting it into three factors: Ambient color, Diffuse color, and Specular color. This should sound familiar to 3D game artists.

This model is a very rough approximation of what's really going on when light hits the surface of a real-life material, but until now it has been pretty much enough for a video game. Of course, there are dozens of other models, and even modifications of the Phong model, but this one is the most used, and it's the one we use in jMonkeyEngine.

The issue with this model is that it's complicated to make a material that looks consistent under different lighting environments.

  • Ambient is supposed to represent ambient lighting: some sort of omnipresent dim light that tries to fake the indirect lighting coming from light reflecting off surrounding objects.
  • Diffuse is more straightforward: it's the actual color of the object when it's under a white light.
  • Specular represents the color of the reflected highlights, and its intensity is driven by a "shininess" parameter (at least in jME, but that's pretty common). The issue is that the specular color also drives the intensity: the brighter the color, the more intense the specular will be. (A rough sketch of this classic computation follows the list.)
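For readers who like to see it spelled out, here is a rough Java sketch of that classic computation for one light and one color channel. The parameter names are made up, and the exact formula varies between engines (including jME); this is illustrative only.

```java
// Rough sketch of a classic Phong-style shading computation for one light and
// one color channel. Exact formulas and parameter names vary between engines;
// this is illustrative only.
public class PhongSketch {
    static float shade(float ambientColor, float ambientLight,
                       float diffuseColor, float specularColor, float lightColor,
                       float nDotL, float rDotV, float shininess) {
        float ambient  = ambientColor * ambientLight;                      // constant fake "indirect" light
        float diffuse  = diffuseColor * lightColor * Math.max(nDotL, 0f);  // base color scaled by incidence
        float specular = specularColor * lightColor
                * (float) Math.pow(Math.max(rDotV, 0f), shininess);        // highlight, sharpened by shininess
        return ambient + diffuse + specular;
    }
}
```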

All of this leads to a lot of tweaking to get things looking correct, and the result may not look as good as it should under a different lighting environment. It also relies heavily on an artist's best guesses about the material.

So here comes Physically Based Rendering. Not that the previous one was not physically based…but whatever, that sounds pretty cool.

Everybody has their own way of implementing PBR, but every implementation shares common goals and concepts.

Goals :

  • Ease the artist’s material workflow.
  • More “photo realistic” rendering. 

Concepts :

  • Every surface has a (specular) reflection, even the roughest ones, especially at grazing angles.
  • Energy conservation: a surface cannot reflect more light than it has received.

This wraps up the entire concept, but how does it translate into practice?

A material can now be described with 3 parameters :

Base color : Base color is the raw color of the material; it's also often referred to as the Albedo. It's similar to the Diffuse color you know from the Phong model, but with some crucial differences:

  • It should not contain any shading information. Very often with the Phong model, Ambient Occlusion (AO) is baked into the diffuse map. Here, the base color must be the raw color of the material.
  • It does not only influence the diffuse color, but also the specular color of the material.

Metalness : The degree of metallicity of the material. What does that mean? Is your material rather metallic, or rather not (non-metallic materials are called dielectric materials in the literature)? Some implementations call that parameter "specular", but I find that pretty misleading, as it's completely different from the specular we know today. In practice, just start out with extreme values to get a feel for it: 1 for metallic, 0 for dielectric.

metalness

Here is the same material with metalness of 0 (dielectric) on the left and 1 (metallic) on the right.

Of course there are intermediate values but, from my reading, most dielectric materials should vary from 0.04 to 0.1, and metals are usually 1. Those values are based on real-life measurements, and you can find some references about them here and here. Note that those values are not subject to interpretation; they are "known" factors, and artists should follow them if they want to keep a realistic look.

Roughness : The degree of roughness of the material: is your material smooth or rough? 0 means smooth, 1 means rough. Some implementations refer to this as Smoothness or Glossiness. That's essentially the same thing, except it's the other way around: 1 is smooth and 0 is rough. I find the term "Roughness" pretty much self-explanatory; it doesn't leave room for misinterpretation.

Roughness

Here is the same material with different levels of roughness, from 0 (left) to 1 (right). As opposed to metalness, this parameter is very artist-driven. The roughness of a material does not really depend on physics; it's more related to micro-scratches, wear, etc. So that's where artists should be creative!

These parameters are the basics of PBR. Of course, each of them can be stored in a texture, and the more common additional parameters can still be used.

For example :

  • Normal map : the same as with the Phong model.
  • AO map : since we can’t bake AO in diffuse anymore, it’s now an extra channel. 

The nice thing is that metalness, roughness, and AO are grayscale textures, so basically each of them only needs one channel of a texture. That means you can pack those three maps into a single texture.
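As an illustration, here is a Java sketch of packing three grayscale maps into the channels of a single texture using the standard javax.imageio classes. The file names and the R/G/B channel order are arbitrary choices for this example, not a convention your engine necessarily expects.

```java
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

// Sketch: pack three grayscale maps (metalness, roughness, AO) into the
// R, G and B channels of a single texture. File names and channel order
// are arbitrary for this example; match whatever your material expects.
public class PackMaps {
    public static void main(String[] args) throws Exception {
        BufferedImage metal = ImageIO.read(new File("metalness.png"));
        BufferedImage rough = ImageIO.read(new File("roughness.png"));
        BufferedImage ao    = ImageIO.read(new File("ao.png"));

        int w = metal.getWidth(), h = metal.getHeight();
        BufferedImage packed = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);

        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int r = metal.getRGB(x, y) & 0xFF;  // grayscale: any channel will do
                int g = rough.getRGB(x, y) & 0xFF;
                int b = ao.getRGB(x, y) & 0xFF;
                packed.setRGB(x, y, (r << 16) | (g << 8) | b);
            }
        }
        ImageIO.write(packed, "png", new File("packed_mra.png"));
    }
}
```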

You can find an example asset that should work in a typical PBR implementation here. This page showcases pretty well what the textures should look like. 

That's it for PBR from the artist's point of view. Next week I'll explain what's under the hood for you fellow developers ;)

Next post is available here.