Adding lighting to the shader

Hello, until we get some deferred rendering I need to add lighting to my shader.

I currently don’t actually care whether point lights and spot lights are supported, but I would like to have ambient and directional lights.

What is the common way of adding the functionality? Copy and paste from Lighting?

@zzuegg said: What is the common way of adding the functionality? Copy and paste from Lighting?

Yes, I believe that is the easiest way. (Not that it’s easy, mind you, just easier than writing it from scratch.)

I would say no! Lighting.j3md is a catch-all for every possible option. If this is specific to your project, there are countless articles on the web with line-by-line shader code for specific methods of applying lighting. Decide what you would like (do some research, look at the rendered output and decide what you like), then write a simplified shader for your lighting.

There is a lot of unnecessary overhead in the JME lighting shader if you are only going to apply light as, say:

Phong per-pixel shading… just write a 30-line shader that does that.

EDIT: This came across sounding like JME’s lighting shader sucks… far from it. But it’s like going to an all-you-can-eat buffet to get a piece of gum.
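For a rough idea of what such a stripped-down shader can look like, here is a sketch of one ambient plus one directional light, per pixel. The g_*/in* names follow JME’s usual binding conventions; the m_* names are assumed material parameters you would pass in yourself, not anything taken from Lighting.j3md.

[java]
// Vertex shader (sketch)
uniform mat4 g_WorldViewProjectionMatrix;
uniform mat3 g_NormalMatrix;   // rotates normals into view space

attribute vec3 inPosition;
attribute vec3 inNormal;

varying vec3 wvNormal;         // view-space normal

void main() {
    wvNormal    = normalize(g_NormalMatrix * inNormal);
    gl_Position = g_WorldViewProjectionMatrix * vec4(inPosition, 1.0);
}
[/java]

[java]
// Fragment shader (sketch): ambient + Lambert diffuse for one directional light
uniform vec4 m_Diffuse;       // material diffuse colour (assumed parameter)
uniform vec4 m_Ambient;       // material ambient colour (assumed parameter)
uniform vec4 m_LightColor;    // light colour (assumed parameter)
uniform vec4 m_AmbientLight;  // ambient light colour (assumed parameter)
uniform vec3 m_LightDir;      // view-space direction the light shines in (assumed parameter)

varying vec3 wvNormal;

void main() {
    vec3 n = normalize(wvNormal);
    float diffuse = max(dot(n, -normalize(m_LightDir)), 0.0);
    gl_FragColor  = m_Ambient * m_AmbientLight
                  + m_Diffuse * m_LightColor * diffuse;
}
[/java]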

Yeah, if you don’t want the option of per vertex lighting, don’t care about normal or parallax mapping, etc… then it’s easier to start from scratch.

If you do want some or all of those things then it might be easier to start from Lighting and rip out the stuff you don’t need. For example, in most of my shaders I’ve ripped out all of the light types except directional and also ripped out specular (the most expensive part of non-bump lighting).

…but it’s nice that normal and parallax mapping are compatible with JME’s way of doing things.

Edit: Note, rolling your own also presumes you only want one directional and one ambient light. Multiple lights get trickier and might be worth just ripping stuff out of JME’s shader.

OK, I’ll probably roll my own single-pass version, since I cannot go through the whole shader pipeline for each light.

I was hoping there was some kind of glsllib with all the lighting math in it.

@zzuegg said: OK, I’ll probably roll my own single-pass version, since I cannot go through the whole shader pipeline for each light.

I was hoping there was some kind of glsllib with all the lighting math in it.

The non-bump, non-normal-mapped lighting math is like two lines of code, really… you could probably do it in one line if you don’t want specular. Everything else is varying setup and so on.
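For reference, a sketch of just that core math, assuming n, lightDir (pointing toward the light) and viewDir (pointing toward the camera) are unit vectors in the same space, and m_Shininess is a placeholder parameter:

[java]
float diffuse  = max(dot(n, lightDir), 0.0);                                        // Lambert term
float specular = pow(max(dot(reflect(-lightDir, n), viewDir), 0.0), m_Shininess);   // Phong term
[/java]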

Yeah, adding lighting is really not a problem. At least not if I pass the light values manually.

Using the JME way makes it complicated for me, since I would have to pass the values through the whole tessellation pipeline.

But I am a greedy guy; since there is the technical option of normal + parallax mapping, I want that too :slight_smile:

@zzuegg said: Yeah, adding lighting is really not a problem. At least not if I pass the light values manually.

Using the JME way makes it complicated for me, since I would have to pass the values through the whole tessellation pipeline.

But I am a greedy guy; since there is the technical option of normal + parallax mapping, I want that too :slight_smile:

Assuming you are going to do this in the same space as JME: (I think all is done in model space currently??)

  • Establishing scene normals… you can grab this from JME’s shaders
  • Adding normal mapping… a line or two of code (you can find it in JME’s shaders)
  • Parallax/Steep Parallax is a glsllib file you can leverage.

If you are working in a different space (world for instance):

  • Minor changes for normals + normal maps
  • Parallax/Steep are literally about 15 lines of code, which again is readily available all over the web (a minimal version is sketched below).
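For reference, a minimal sketch of the plain (non-steep) parallax offset, assuming a normalized tangent-space view direction is already available; the names are placeholders for illustration, not JME’s glsllib API:

[java]
uniform sampler2D m_HeightMap;   // assumed parameter names
uniform float m_HeightScale;

// Shift the texture coordinate along the tangent-space view direction
// according to the sampled height.
vec2 parallaxOffset(vec2 texCoord, vec3 viewDirTS) {
    float h = texture2D(m_HeightMap, texCoord).r;   // height in [0,1]
    float offset = (h - 0.5) * m_HeightScale;       // recentre and scale
    return texCoord + viewDirTS.xy * offset;
}
[/java]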

Yes, the entire complexity of regular lighting is “how you set up your space” for the calculations. Rotate normals to view space as JME does, or rotate lights to world space… or whatever. Basically all those things you’d have to pass through, I guess.

Hm, I’ve tried for a few hours now to get the same normal output as the terrain shader uses for light input.

However, the terrain shader uses vertex normals and tangents, while I want to use only the information from the normal map.

It seems I’m not able to find the right way of doing it.

  1. In what space is the plain normal-map data? I assume it’s model space?

Or how do I transform the normal data without tangents?

Normal maps are in tangent space. In order to properly interpret a normal map you will need to know the regular normal and tangent (and binormal which can be calculated from those) for that point.

You can’t transform the normal data without tangents. Though for terrain in a regular grid it’s probably possible to calculate the tangent.

Hm, I really don’t get why I should need tangents to transform the normals. I know the exact position of each fragment as well as the normal vector of each fragment.
normalMap.r is the x axis, normalMap.g the y axis and normalMap.b the z axis of the normal vector.

Hm, that sounds like I could use them as they are in world space, if rotation of the mesh is not allowed?

But since that is not working, I am surely missing something.

@zzuegg said: Hm, I really don’t get why I should need tangents to transform the normals. I know the exact position of each fragment as well as the normal vector of each fragment. normalMap.r is the x axis, normalMap.g the y axis and normalMap.b the z axis of the normal vector.

Hm, that sounds like I could use them as they are in world space, if rotation of the mesh is not allowed?

But since that is not working, I am surely missing something.

Yes, normalMap r is the x axis. Which direction does the x axis point? (hint: tangent vector)

Yes, normalMap.g is the y axis. Which direction does the y axis point? (hint: binormal vector, normal cross tangent)

Yes, normalMap.b is the z axis. This one points in the normal direction… but you need the other two to get an actual local normal vector.

Note: if your terrain is a regular grid then you can calculate a tangent vector which would be something like vec3(0,0,1) cross normal.

…but you still need a tangent vector.
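As a hedged sketch of that idea for a regular-grid terrain (m_NormalMap and the function name are placeholders; depending on how the normal map was baked, the tangent or binormal may need flipping):

[java]
uniform sampler2D m_NormalMap;   // assumed parameter name

// Rebuild a tangent basis for a regular grid and rotate the tangent-space
// normal-map sample into whatever space 'geoNormal' lives in.
vec3 shadingNormal(vec3 geoNormal, vec2 texCoord) {
    vec3 n = normalize(geoNormal);
    vec3 t = normalize(cross(vec3(0.0, 0.0, 1.0), n));   // tangent, as suggested above
    vec3 b = cross(n, t);                                 // binormal = normal x tangent

    vec3 mapN = texture2D(m_NormalMap, texCoord).xyz * 2.0 - 1.0;   // unpack [0,1] to [-1,1]
    return normalize(t * mapN.x + b * mapN.y + n * mapN.z);         // TBN * mapN
}
[/java]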

So I guess the simplest working solution would be generating a model/world-space normal map out of the height map?

@zzuegg said: So I guess the simplest working solution would be generating a model/world-space normal map out of the height map?

Yes…

Are you not setting the normals for your generated terrain already?

EDIT: “Yes” was a bit generic. You can generate your model from this and then your normals from the vertex positions. It could be done in a single step… but you’ll have to recalculate the vertex position each time if you’re not doing that and then calc’ing your normal info from that. Sheesh… hope this made sense.

EDIT 2: Just don’t try to generate the normals directly from the heightmap data… generate the vertex positions first, then generate your normals using the vertex positions. There… that’s clearer.

Hey,

Yeah, that would be the usual workflow, but it won’t work in this case. Look at the picture here:

http://imgur.com/DWmaDNg

On the left is the mesh I generate on the CPU; well, actually I generate a plain quad with some subdivisions, and the height is set by the vertex shader. All in all, at the generation level I have no idea what normal the mesh is going to have after the shader stage.

Right side is the same mesh after the tessellation stage.

Also, simple interpolation between the vertices won’t work, because I would lose all the additional detail.

@zzuegg said: Hey,

Yeah, that would be the usual workflow, but it won’t work in this case. Look at the picture here:

http://imgur.com/DWmaDNg

On the left is the mesh I generate on the CPU; well, actually I generate a plain quad with some subdivisions, and the height is set by the vertex shader. All in all, at the generation level I have no idea what normal the mesh is going to have after the shader stage.

Right side is the same mesh after the tessellation stage.

Also, simple interpolation between the vertices won’t work, because I would lose all the additional detail.

I wrote code that is posted somewhere around here that will calculate the normal in the shader. If you think it could potentially solve the issue, I’ll see if I can find it.

Sure; even if it doesn’t solve the issue, I can get some insight into normal calculations.
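For reference, one common way to do this, as a sketch only (displacedPosition and the m_* names are placeholders, not the code referred to above): since the height is sampled in the shader anyway, the displaced positions of the neighbouring texels can be rebuilt there too, and the normal taken from the cross product of the resulting edge vectors.

[java]
uniform sampler2D m_HeightMap;   // assumed parameter names throughout
uniform float m_HeightScale;
uniform vec2 m_TexelSize;        // 1.0 / heightmap resolution
uniform vec2 m_QuadSize;         // world-space size of the terrain quad

// Rebuild the displaced position for a texel the same way the displacement
// stage builds it (here: flat grid in x/z with the height taken from the map).
vec3 displacedPosition(vec2 uv) {
    float h = texture2D(m_HeightMap, uv).r * m_HeightScale;
    return vec3(uv.x * m_QuadSize.x, h, uv.y * m_QuadSize.y);
}

// Normal from two surface edge vectors around the current texel.
vec3 computeNormal(vec2 uv) {
    vec3 p  = displacedPosition(uv);
    vec3 px = displacedPosition(uv + vec2(m_TexelSize.x, 0.0));
    vec3 pz = displacedPosition(uv + vec2(0.0, m_TexelSize.y));
    return normalize(cross(pz - p, px - p));
}
[/java]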

OK, I managed to get the normals to output the same as the TerrainLighting normals when using a normal map.

The computeLighting function’s signature is:

[java] vec2 computeLighting(in vec3 wvPos, in vec3 wvNorm, in vec3 wvViewDir, in vec3 wvLightDir) [/java]

So I am guessing it requires worldView normals.

The current question is: why would worldViewLightDirection vary from fragment to fragment (for a directional light)?

Goddamn, how I hate this space-conversion stuff :frowning:
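For what it’s worth, a directional light’s direction is a single constant per draw call, so the worldView light direction should not actually vary from fragment to fragment; only point and spot lights depend on the fragment position. A minimal sketch of producing it once, assuming JME’s g_ViewMatrix global and a placeholder m_LightDirection parameter:

[java]
uniform mat4 g_ViewMatrix;       // world to view space
uniform vec3 m_LightDirection;   // world-space light direction (placeholder name)

// One value per draw call; the same vector is valid for every fragment.
vec3 viewSpaceLightDir() {
    // w = 0.0: apply only the rotation part of the matrix, a direction has no position
    return normalize((g_ViewMatrix * vec4(m_LightDirection, 0.0)).xyz);
}
[/java]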