My friends and I are working on a game (our first) and having a bit of difficulty achieving a desired effect. We are generating Meshes (Domes, to be exact) and want the faces to be flat-shaded rather than smooth-shaded on these types of objects. From what I gather, we can effectively force flat shading on specific objects by calculating normals per face instead of per vertex. Sort of the opposite of this person’s issue: http://stackoverflow.com/questions/4703432/why-does-my-opengl-phong-shader-behave-like-a-flat-shader
My question is, how do we achieve this through jME? Is there a method out there already that I’m just not seeing in the docs?
I take it jME must be generating the vertex normals automatically, since we’re presently seeing smooth shading. So I guess my question really is: given a Dome (or any similar Mesh subclass), what is the best way to recalculate the normals to enforce flat shading?
A smooth shaded mesh will generally share vertexes because they share normals. To flat shade you’d need to recreate each triangle with its own three vertexes (unshared) so that all triangle vertexes could have the triangle’s normal. So the Dome mesh is probably unsuitable since you’d effectively be regenerating the whole thing anyway.
I suppose I must be misunderstanding something about how jME generates these meshes, or how they are stored internally. Are you saying that triangles that border one another are sharing the same vertex instances? And that in order to enforce flat shading, I need to make sure that any “overlapping” vertexes are in fact copies rather than instances? If so, is it possible for Mesh objects to describe such a scenario? I ask because I’d rather not create a Geometry instance for each triangle.
Yes, smooth-shaded geometry (and the Dome in particular) shares vertexes between adjacent triangles. It’s a nice memory savings, but one you can only take advantage of when the triangles also share normals and texture coordinates.
And yeah, a mesh can easily describe the vertexes-per-triangle layout, too… that’s how a cube works, after all: 24 vertexes, 12 triangles… some triangles share vertexes and some don’t.
Edit: that’s what the index buffer does… it associates vertexes in the vertex buffer with the primitive shapes.
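To make that concrete, here’s a minimal sketch (the quad and its values are mine, not from the Dome) of how an index buffer shares vertexes between triangles in jME3: two triangles reuse two of the four vertexes, so those vertexes, and whatever normals they carry, are shared across both faces.

```java
import com.jme3.scene.Mesh;
import com.jme3.scene.VertexBuffer;

// A quad as two triangles: 4 vertexes in the vertex buffer, 6 indices.
// Indices 0 and 2 appear in both triangles, so v0 and v2 (and their
// normals) are shared across the two faces.
float[] positions = {
    0, 0, 0,   // v0
    1, 0, 0,   // v1
    1, 1, 0,   // v2
    0, 1, 0    // v3
};
short[] indices = { 0, 1, 2,   0, 2, 3 };

Mesh quad = new Mesh();
quad.setBuffer(VertexBuffer.Type.Position, 3, positions);
quad.setBuffer(VertexBuffer.Type.Index, 3, indices);
quad.updateBound();
```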
rherriman said:
I ask because I'd rather not create a Geometry instance for each triangle. :)
Thanks for your input, pspeed.
- ryan
I'm doing both flat and smooth shading in the same app (using two different meshes). You don't need to create different Geometry instances for each triangle*; you can put a copy of each triangle's vertices in the vertex buffer instead, so each vertex gets its own index. The overall length of your vertex buffers is going to be much larger than it would be with smooth shading. It can be hard to get a handle on at first, yes, particularly if you need to be able to edit/interact with the mesh data later on.
*And it might kill your performance if you do create so many geometries rather than a single large mesh; I'm not sure about jME, but most (all?) of the engines I've used before are fussy about this.
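For anyone landing here later, here’s a minimal sketch of that un-sharing step, assuming a typical indexed jME3 mesh. The helper name toFlatShaded is mine, not an engine method: it duplicates every triangle’s vertexes and assigns the face normal to all three, which is exactly what forces flat shading.

```java
import com.jme3.math.Vector3f;
import com.jme3.scene.Mesh;
import com.jme3.scene.VertexBuffer.Type;
import com.jme3.scene.mesh.IndexBuffer;
import com.jme3.util.BufferUtils;
import java.nio.FloatBuffer;

/** Returns a copy of 'in' with unshared vertexes and one normal per
 *  face, so lighting comes out flat-shaded. Assumes 'in' is indexed. */
public static Mesh toFlatShaded(Mesh in) {
    FloatBuffer pos = in.getFloatBuffer(Type.Position);
    IndexBuffer idx = in.getIndexBuffer();
    int triCount = idx.size() / 3;

    float[] newPos  = new float[triCount * 9]; // 3 verts * 3 floats per tri
    float[] newNorm = new float[triCount * 9];

    Vector3f a = new Vector3f(), b = new Vector3f(), c = new Vector3f();
    for (int t = 0; t < triCount; t++) {
        BufferUtils.populateFromBuffer(a, pos, idx.get(t * 3));
        BufferUtils.populateFromBuffer(b, pos, idx.get(t * 3 + 1));
        BufferUtils.populateFromBuffer(c, pos, idx.get(t * 3 + 2));

        // Face normal: normalize((b - a) x (c - a))
        Vector3f n = b.subtract(a).crossLocal(c.subtract(a)).normalizeLocal();

        for (int v = 0; v < 3; v++) {
            Vector3f p = (v == 0) ? a : (v == 1) ? b : c;
            int o = (t * 3 + v) * 3;
            newPos[o]  = p.x;  newPos[o + 1]  = p.y;  newPos[o + 2]  = p.z;
            newNorm[o] = n.x;  newNorm[o + 1] = n.y;  newNorm[o + 2] = n.z;
        }
    }

    // No index buffer on the result: every triangle owns its 3 vertexes.
    Mesh out = new Mesh();
    out.setBuffer(Type.Position, 3, newPos);
    out.setBuffer(Type.Normal, 3, newNorm);
    out.updateBound();
    return out;
}
```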
Obviously your version is working fine too, but is there a reason you went to the trouble of calculating texture coordinates instead of using a color buffer (VertexBuffer.Type.Color)? I mainly ask because I would imagine a simple color buffer would perform a bit better, since the shader won’t need to sample a 2D “flat color” texture (which I guess is what you’re doing). (I’m assuming a shader using texture2D() does a bit more work than just passing a color buffer through. Could be mistaken about what’s going on under the hood.)
You’re right, we are using flat colors, not textures. I added the texture coordinate buffer for the simple reason that it wasn’t running without it! The console error didn’t mention the color buffer as an alternative. I think I’d rather the color come from the material, though, not the vertices; that’s how we’re handling other world geometry.
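In case it helps anyone else, here’s roughly what the two approaches look like with jME3’s stock Lighting.j3md material (the parameter names below come from that material definition; assetManager, geom, and mesh are assumed to already be in scope):

```java
import com.jme3.material.Material;
import com.jme3.math.ColorRGBA;
import com.jme3.scene.VertexBuffer;

// Option A: a flat color from the material itself, no per-vertex data.
Material mat = new Material(assetManager, "Common/MatDefs/Light/Lighting.j3md");
mat.setBoolean("UseMaterialColors", true);
mat.setColor("Diffuse", ColorRGBA.Gray);
mat.setColor("Ambient", ColorRGBA.Gray);
geom.setMaterial(mat);

// Option B: per-vertex colors instead of a texture + texcoords.
// float[] colors = ...; // 4 floats (RGBA) per vertex
// mesh.setBuffer(VertexBuffer.Type.Color, 4, colors);
// mat.setBoolean("UseVertexColor", true);
```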