I would say no!… Lighting.j3md is a catch-all for every possible option. If this is specific to your project, there are countless articles on the web with line-by-line shader code for specific methods of applying lighting. Decide what you would like (do some research, look at the rendered output and decide what you like), then write a simplified shader for your lighting.
There is a lot of unnecessary overhead in the JME lighting shader if you are only going to apply light in, say:
Phong per-pixel shading… just write a 30-line shader that does that.
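For reference, a minimal per-pixel shader for a single directional light really is about this size. This is only a sketch: the `g_`/`m_` uniform and `in` attribute names follow jME's usual conventions, but check them against your own .j3md, and the light direction here is assumed to be passed in already in view space.

```glsl
// ---- vertex shader ----
uniform mat4 g_WorldViewProjectionMatrix;
uniform mat4 g_WorldViewMatrix;
uniform mat3 g_NormalMatrix;
attribute vec3 inPosition;
attribute vec3 inNormal;
varying vec3 vNormal;   // view-space normal
varying vec3 vViewPos;  // view-space position

void main() {
    vNormal  = g_NormalMatrix * inNormal;
    vViewPos = (g_WorldViewMatrix * vec4(inPosition, 1.0)).xyz;
    gl_Position = g_WorldViewProjectionMatrix * vec4(inPosition, 1.0);
}

// ---- fragment shader ----
uniform vec4 m_Ambient;
uniform vec4 m_Diffuse;
uniform vec4 m_Specular;
uniform float m_Shininess;
uniform vec3 m_LightDir;   // assumed: directional light dir, view space
varying vec3 vNormal;
varying vec3 vViewPos;

void main() {
    vec3 N = normalize(vNormal);
    vec3 L = normalize(-m_LightDir);
    vec3 V = normalize(-vViewPos);          // eye is at origin in view space
    float diff = max(dot(N, L), 0.0);
    float spec = pow(max(dot(reflect(-L, N), V), 0.0), m_Shininess);
    gl_FragColor = m_Ambient + m_Diffuse * diff + m_Specular * spec;
}
```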
EDIT: This came across sounding like JME’s lighting shader sucks… far from it. But it’s like going to an all-you-can-eat buffet to get a piece of gum.
Yeah, if you don’t want the option of per vertex lighting, don’t care about normal or parallax mapping, etc… then it’s easier to start from scratch.
If you do want some or all of those things, then it might be easier to start from Lighting and rip out the stuff that you don’t need. For example, in most of my shaders I’ve ripped out all of the light types except directional, and also ripped out specular (the most expensive part of non-bump lighting).
…but it’s nice that normal and parallax mapping are compatible with JME’s way of doing things.
Edit: note, rolling your own also presumes you only want one directional light and one ambient light. Multiple lights get trickier and might be worth just ripping stuff out of JME’s shader.
@zzuegg said:
Ok, probably pulling off my own single pass since I cannot go through the whole shader pipeline for each light.
I was hoping there is some kind of glsllib with all the lighting math in it.
The non-bump, non-normal-mapped lighting math is like two lines of code, really… probably could do it in one line if you don’t want specular. Everything else is varying setup and so on.
Yes, the entire complexity of regular lighting is “how you set up your space” for the calculations. Rotate normals to view space as JME does, or rotate lights to world space… or whatever. Basically all those things you’d have to pass through, I guess.
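To make the “two lines” claim concrete: once N (normal), L (to-light) and V (to-eye) are normalized and expressed in the same space, the actual lighting math is just this. Material names here are placeholders, not any particular .j3md:

```glsl
// N, L, V assumed normalized and in a common space (e.g. view space)
float diff = max(dot(N, L), 0.0);                                // Lambert diffuse
float spec = pow(max(dot(reflect(-L, N), V), 0.0), m_Shininess); // Phong specular, optional
vec4 color = m_Ambient + m_Diffuse * diff + m_Specular * spec;
```

Drop the `spec` line and it really is one line of math plus a combine.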
Normal maps are in tangent space. In order to properly interpret a normal map you will need to know the regular normal and tangent (and binormal which can be calculated from those) for that point.
You can’t transform the normal-map data without tangents. Though for terrain on a regular grid it’s probably possible to calculate the tangent.
Hm, I really don’t get why I should need tangents to transform the normals. I know the exact position of each fragment as well as the normal vector of each fragment.
normalMap.r is the x axis, normalMap.g the y axis and normalMap.b is the z axis of the normal vector.
Hm, that sounds like I could use them as they are in world space if rotation of the mesh is not allowed?
But since that is not working, I am surely missing something.
@zzuegg said:
Hm, I really don't get why I should need tangents to transform the normals. I know the exact position of each fragment as well as the normal vector of each fragment.
normalMap.r is the x axis, normalMap.g the y axis and normalMap.b is the z axis of the normal vector.
Hm, that sounds like I could use them as they are in world space if rotation of the mesh is not allowed?
But since that is not working, I am surely missing something.
Yes, normalMap r is the x axis. Which direction does the x axis point? (hint: tangent vector)
Yes, normalMap.g is the y axis. Which direction does the y axis point? (hint: binormal vector, normal cross tangent)
Yes, normalMap.b is the z axis. This one points in the normal direction… but you need the other two to get an actual local normal vector.
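In shader terms, those three axes form the TBN basis, and turning a sampled texel into a usable normal looks roughly like this. A sketch only: `wvNormal`/`wvTangent` are assumed view-space varyings and `m_NormalMap` an assumed material texture, so adapt the names to your setup:

```glsl
vec3 n = normalize(wvNormal);
vec3 t = normalize(wvTangent);
vec3 b = cross(n, t);          // binormal = normal cross tangent
mat3 tbn = mat3(t, b, n);      // columns: x, y, z axes of tangent space

// unpack from [0,1] texture range to [-1,1] vector range
vec3 mapN = texture2D(m_NormalMap, texCoord).rgb * 2.0 - 1.0;
vec3 N = normalize(tbn * mapN); // tangent space -> view space
```

Without `t` and `b`, the r and g channels have no direction to point in; that is exactly why the tangent is needed.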
@zzuegg said:
So I guess the simplest working solution would be generating a model/world-space normal map out of the height map?
Yes…
Are you not setting the normals for your generated terrain already?
EDIT: “Yes” was a bit generic. You can generate your model from this and then your normals from the vertex positions. It could be done in a single step… but you’ll have to recalculate the vertex position each time if you’re not doing that, and then calc your normal info from that. Sheesh… hope this made sense.
EDIT 2: Just don’t try to generate the normals directly from the heightmap data… generate the vertex positions first, then generate your normals using the vertex positions. There… that’s clearer.
Left is the mesh I generate on the CPU; well, actually I generate a plain quad with some subdivisions, and the height is set by the vertex shader. All in all, at the generation level I have no idea what normal the mesh is going to have after the shader stage.
Right side is the same mesh after the tessellation stage.
Also, simple interpolation between the vertices won’t work because I would lose all the additional details.
I wrote code that is posted somewhere around here that will calculate the normal in the shader. If you think it could potentially solve the issue, I’ll see if I can find it.
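I can't speak for that exact code, but the usual way to get a normal in the shader when the height is applied there is central differences on the heightmap: sample the four neighbors and build the normal from the height gradient. A sketch under assumed names (`m_HeightMap`, `m_HeightScale`, `m_TexelSize` as one-texel UV offset, `m_GridScale` as world units per texel; none of these come from the thread):

```glsl
// sample heights of the left/right and down/up neighbors
float hL = texture2D(m_HeightMap, texCoord - vec2(m_TexelSize.x, 0.0)).r;
float hR = texture2D(m_HeightMap, texCoord + vec2(m_TexelSize.x, 0.0)).r;
float hD = texture2D(m_HeightMap, texCoord - vec2(0.0, m_TexelSize.y)).r;
float hU = texture2D(m_HeightMap, texCoord + vec2(0.0, m_TexelSize.y)).r;

// normal of the displaced surface, y-up; the y term is the world-space
// distance spanned by the two-texel central difference
vec3 normal = normalize(vec3(
    (hL - hR) * m_HeightScale,
    2.0 * m_GridScale,
    (hD - hU) * m_HeightScale));
```

Since this works per fragment, it also picks up the detail added by tessellation that plain per-vertex interpolation would lose.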