Lighting and Flickering objects

Hello,

I recently attempted to add lighting effects and ended up with some strange results.

Normal state:

http://i.imgur.com/Fw0jc.jpg

Lighting attempt:

http://img35.imageshack.us/img35/9047/lightings.png

The ground is invisible, and objects that overlap other geometry, such as trees and walls, flicker even when the camera isn’t moving.

The only difference from the normal game state is that the material applied to all game objects was changed from Unshaded.j3md to Lighting.j3md.

What could be causing this? Are there additional steps required for lighting to work properly in-game?

My first guess is that some meshes have no normal data, which makes it impossible to call TangentBinormalGenerator.generate(mesh) on them.

If that is the problem, what would be the best way to generate normal data for those meshes?

It may be a silly question… but did you also add a light to the scene? You did not mention it, and at some point in jME’s recent past, objects with the lighting material and no light specified would not be rendered at all.

Yes light was added (the NPCs and statues are shadowed). The same material was applied to every object in the world, so the light affects all of them.

Then yeah, it must be the normals. The lighting for vertices without normals is probably coming out as 0, so the objects aren’t rendered, or something along those lines.



…you definitely need at least vertex normals for lighting to work properly.



Where does your mesh come from that it does not have normals?

Some objects, such as the ground, are custom-built quads where the buffers were filled manually. Others are .j3o files which were converted from .mesh.xml files or old .jme files, from which we copied all buffers, if they were defined. Some were also converted to .j3o based on .obj files (the .obj files themselves were exported from Blender).



For example, the lighthouse is based off an old .jme file:

http://upload.gaiatools.com/files/Lighting4_0.PNG

Interestingly, if I run the same program on Linux:



The trees do not flicker and do appear, as does the ground. However, they appear completely black, probably because of the missing normals.

Another interesting thing:



As you can see, the shadow on the character models (which do seem to have normals, since most of them show up) is very coarse, as if it were an all-or-nothing shadow: each polygon is either at full brightness or completely black.

This is also visible on the lighthouse, which appears to have a sharply defined black line where the shadow is, rather than a smooth gradient from full brightness to black.



This is on an NVIDIA 9600M GT on Kubuntu 64-bit with the NVIDIA 270.41.19 drivers.

Well, without normals I think the behavior will vary by driver since you are just getting whatever garbage the driver happens to plug in for those attributes. I’m not exactly sure how that works.



What version of jME are you running? And have you set “shininess” on the lighting material? Non-nightly versions of jME had a bug with regard to shininess and NVIDIA cards.



For the meshes you generate, making normals should be pretty easy. Since your ground is flat you can just set them all to 0,1,0.
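For instance, for a manually built ground quad with four vertices, it could look like this (a minimal sketch; “groundMesh” is a placeholder name, not something from your code):

[java]
// Every vertex of the flat ground quad gets an up-pointing normal (0, 1, 0)
groundMesh.setBuffer(VertexBuffer.Type.Normal, 3, new float[] {
    0, 1, 0,   0, 1, 0,   0, 1, 0,   0, 1, 0
});
[/java]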

Also, since I haven’t seen your lighting setup I’m only guessing… but if you don’t have an ambient light, or haven’t set up ambient + “userMaterialColors” on the material, then anything not facing the light will be completely black.

I’ll answer in order.

I’m running a nightly build from five days ago.

No, shininess is untouched.

I’m a bit of a newbie when it comes to graphics in general. Where should I look to learn how to make normals? (I basically know nothing about them.)

It might be easy for flat surfaces, but other meshes are not.

That’s another strange thing: I do have an ambient light, but I’m not sure what you mean by “userMaterialColors”:

[java]
AmbientLight al = new AmbientLight();
rootNode.addLight(al);

DirectionalLight sun = new DirectionalLight();
sun.setDirection(new Vector3f(-0.1f, -0.7f, -1.0f));
rootNode.addLight(sun);
[/java]

That’s fine. There are two ways to give your objects ambient light… one is with an ambient light as you’ve done (though I think you’ll need to set the color) and the other is by setting “userMaterialColors” to true on the Material… and then giving the material an ambient color. AmbientLight is easier.
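For example, a rough sketch of both options (the color values and the “mat” variable are just illustrative; note that the actual material parameter in Lighting.j3md is spelled “UseMaterialColors”):

[java]
// Option 1: an AmbientLight with an explicit (dim) color
AmbientLight al = new AmbientLight();
al.setColor(ColorRGBA.White.mult(0.3f));
rootNode.addLight(al);

// Option 2: per-material ambient via the Lighting.j3md parameters
mat.setBoolean("UseMaterialColors", true);
mat.setColor("Ambient", ColorRGBA.Gray);
mat.setColor("Diffuse", ColorRGBA.White);
[/java]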



If the meshes you are trying to make normals for are not generated by you then others may be able to answer better. But for the meshes that you make yourself (as I think you said you’ve done), setting the normals is no harder than setting the positions. The value of a normal is just a unit vector that is perpendicular to the surface.



For flat things like a ground plane, this is easy, as it just points to the sky: 0,1,0.



For arbitrary meshes, you take the cross product of two edges of each triangle and then normalize it. This gives you a normal perpendicular to that triangle, as if the triangle were flat. Unfortunately, adding normals to arbitrary curved surfaces is a much harder problem… and as I hinted, you’re better off getting the program that made the meshes to do that.
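As a rough sketch of that idea for meshes you build yourself (assuming a triangle mesh with a 16-bit index buffer; the class and method names here are made up for illustration):

[java]
import com.jme3.math.Vector3f;
import com.jme3.scene.Mesh;
import com.jme3.scene.VertexBuffer.Type;
import com.jme3.util.BufferUtils;
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;

public class SimpleNormalGenerator {

    // Fills the Normal buffer of a triangle mesh by accumulating face normals
    // at shared vertices and normalizing, which approximates a smooth surface.
    public static void generateNormals(Mesh mesh) {
        FloatBuffer pos = mesh.getFloatBuffer(Type.Position);
        ShortBuffer idx = (ShortBuffer) mesh.getBuffer(Type.Index).getData();
        Vector3f[] normals = new Vector3f[mesh.getVertexCount()];
        for (int i = 0; i < normals.length; i++) {
            normals[i] = new Vector3f();
        }
        idx.rewind();
        while (idx.remaining() >= 3) {
            int a = idx.get() & 0xFFFF, b = idx.get() & 0xFFFF, c = idx.get() & 0xFFFF;
            Vector3f va = vertex(pos, a), vb = vertex(pos, b), vc = vertex(pos, c);
            // Face normal = cross product of two edges of the triangle
            Vector3f faceNormal = vb.subtract(va).crossLocal(vc.subtract(va));
            normals[a].addLocal(faceNormal);
            normals[b].addLocal(faceNormal);
            normals[c].addLocal(faceNormal);
        }
        for (Vector3f n : normals) {
            n.normalizeLocal(); // average direction of all adjoining faces
        }
        mesh.setBuffer(Type.Normal, 3, BufferUtils.createFloatBuffer(normals));
    }

    private static Vector3f vertex(FloatBuffer pos, int i) {
        return new Vector3f(pos.get(i * 3), pos.get(i * 3 + 1), pos.get(i * 3 + 2));
    }
}
[/java]

Whether the result looks right depends on the winding order of your triangles and on whether the surface is actually meant to be smooth.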



Though for things like spheres and cylinders a normal can be trivially calculated during mesh generation… since it’s the same vector that was used to create the positions.
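For a sphere centered at the origin, for instance, the normal is just the normalized vertex position (a one-line sketch; “position” is illustrative):

[java]
// Sphere centered at the origin: each vertex position already points
// away from the center, so its unit-length version is the normal.
Vector3f normal = position.normalize();
[/java]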

Thanks! I’ll take a look at that :smiley:



But, by any chance, does jMonkey already have some kind of “NormalGenerator” which would generate normals based on the Vertex and Index buffers?

Also, say a vertex has three incident edges: which cross product would be considered the normal? Would it be the average of the three cross products?

That last part depends entirely on the curved surface that you are trying to approximate. If triangles share the exact same vertex and are not flat, then they are simulating a curved surface. Otherwise, each triangle will need its own vertex instead of sharing.

pspeed said:
That last part depends entirely on the curved surface that you are trying to approximate. If triangles share the exact same vertex and are not flat, then they are simulating a curved surface. Otherwise, each triangle will need its own vertex instead of sharing.

Not sure I quite understand this part. Would it give a reasonable result to simply treat each triangle on its own and compute a normal vector for it?
For which use cases would this not work?
For generation, I have found the following, which could prove useful:
http://hub.jmonkeyengine.org/groups/free-announcements/forum/topic/normal-generator-solids-of-revolution-for-jme/
http://hub.jmonkeyengine.org/javadoc/com/jme3/util/TangentBinormalGenerator.html

If you imagine something like a D&D 20-sided die… all sides are flat, and all sides look flat. In 3D graphics, each triangle would have a single normal shared by its vertices… thus each triangle would have its own set of vertices, not shared by other triangles.



Now imagine a smooth sphere. The only difference between these two meshes is that in the sphere’s case, the normals are vertex-specific and not triangle-specific. Thus the triangles can share the vertices and normals where they adjoin.
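To make that concrete, here is what the normal buffer might look like in the two cases (purely illustrative numbers):

[java]
// Flat-shaded (d20 style): every triangle owns its three vertices,
// so a position used by two triangles is stored twice, each copy
// carrying that triangle's face normal.
float[] flatNormals = {
    0, 1, 0,   0, 1, 0,   0, 1, 0,   // triangle A: all three vertices use A's face normal
    1, 0, 0,   1, 0, 0,   1, 0, 0    // triangle B: a different face normal
};

// Smooth-shaded (sphere style): adjoining triangles share vertices, and each
// shared vertex stores one normal (e.g. the normalized sum of the adjoining
// face normals), so the lighting blends smoothly across the edge.
[/java]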

It appears the “roughness” of the shadows is GPU-dependent.

Here, for example, I’m running under the same conditions as in the previous screenshot, but the shadow on the NPCs is very coarse.



On a model, using 2 directional lights.

http://i.imgur.com/erao0.png



On the same model, using an ambient light:

http://i.imgur.com/E9fca.png



The NPC model has normal data and I’m using the TangentBinormalGenerator.
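For reference, the tangents are generated with a call along these lines (a sketch; the model path is a placeholder):

[java]
Spatial npc = assetManager.loadModel("Models/NPC/npc.j3o"); // placeholder path
TangentBinormalGenerator.generate(npc);
[/java]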



What could be causing this?

Make sure shininess is non-zero.
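For example (a one-line sketch; “mat” stands for your Lighting.j3md material):

[java]
mat.setFloat("Shininess", 64f); // any non-zero value; "Shininess" is the Lighting.j3md parameter
[/java]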

Wow, that did help! Thanks!

I still have some flickering, but I’ll tweak things and see what happens.



Since jME3 doesn’t seem to have a “NormalGenerator” to create normal data for meshes which don’t have any, do other tools such as Blender have some kind of automated process for it?

Yes, Blender supports normal generation.

The first thing I do when preparing to export a model to jME3 is check the normals to make sure they are correct.

Momoko_Fan said:
Yes, Blender supports normal generation.
The first thing I do when preparing to export a model to jME3 is check the normals to make sure they are correct.

That's good news~
Would there be a way, preferably automated, to export those .j3o files back into a Blender-readable format in order to generate the normals, and then export them back into .j3o or some other jME3-readable format (.mesh.xml)?

Afraid not. It is a very good idea to keep backups of the models in their original formats (both .blend and .mesh.xml), as you cannot get them back from .j3o.