Hi guys,
I saw on the forum that there is a decals implementation in JME called projective texture mapping.
Am I right that this is the feature that can be used for:
leaving footprints after a character walks on a surface
leaving bullet holes on a wall (normal/bump texture updates)
Are there any tutorials/examples on how it works?
I see it as a very powerful feature for development and would like to use it in my project.
Yes, it can be used for what you described. The JME implementation was a user contribution and I'm not sure if it still works. Worth taking a look at it though.
It probably still works, though it's really not a reliable implementation in my experience.
From what I remember of using it, it cloned the entire mesh, replaced the material with a mostly transparent one that had the overlay texture at the right texcoords, and rendered it on top, which isn't exactly great for performance either.
The final straw that made me axe it from my project was that it did not always remove said meshes, leaving ghost objects everywhere.
I think those two uses you mentioned can be done in much simpler and lighter ways, for example by positioning transparent quads at the spots. Also, neither of these approaches will modify your normal or bump textures; you can only do that yourself by modifying the actual geometry and its material.
Get the collision result @grizeldi suggested and call getNormalVector() on it, or something like that, I think. It also gives you the world position of the hit, so you can position the billboard there and have it face the normal.
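A minimal sketch of that in jME3 (the `shootables` node, quad size and texture path are placeholders; the actual CollisionResult methods are getContactPoint() and getContactNormal()):

```java
// Hypothetical shoot() method inside a SimpleApplication.
private void shoot() {
    CollisionResults results = new CollisionResults();
    // cast a ray from the camera through the crosshair
    Ray ray = new Ray(cam.getLocation(), cam.getDirection());
    shootables.collideWith(ray, results);
    if (results.size() == 0) {
        return;
    }

    CollisionResult hit = results.getClosestCollision();
    Vector3f point = hit.getContactPoint();    // world position of the hit
    Vector3f normal = hit.getContactNormal();  // surface normal at the hit

    Geometry decal = new Geometry("bulletHole", new Quad(0.2f, 0.2f));
    Material mat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
    mat.setTexture("ColorMap", assetManager.loadTexture("Textures/bulletHole.png")); // placeholder texture
    mat.getAdditionalRenderState().setBlendMode(RenderState.BlendMode.Alpha);
    decal.setMaterial(mat);
    decal.setQueueBucket(RenderQueue.Bucket.Transparent);

    // point the quad's +Z (its face normal) along the surface normal;
    // pick a different up vector if the normal is (anti)parallel to UNIT_Y
    Quaternion rot = new Quaternion();
    rot.lookAt(normal, Vector3f.UNIT_Y);
    decal.setLocalRotation(rot);
    // nudge it slightly off the surface to reduce z-fighting;
    // note: Quad is anchored at its lower-left corner, so offset by half its
    // size if you want it centered on the hit
    decal.setLocalTranslation(point.add(normal.mult(0.01f)));
    rootNode.attachChild(decal);
}
```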
If you know the triangle of the hit… and can figure out the ones that would maybe intersect your bullet shmoo… then you should make a mesh that has just those triangles in it instead of a quad.
The problem with a quad is that you will invariably get Z-fighting. If you steal the exact triangles then you can set the depth test function for that geometry to "Equal".
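For reference, roughly what that could look like, assuming a jME version where RenderState.setDepthFunc() is available, with `decalGeometry` being the mesh built from the stolen triangles and `bulletHoleTexture` your overlay:

```java
// decalGeometry reuses the exact triangles of the wall, so its fragments land
// on identical depth values; with depthFunc Equal only those already-visible
// pixels pass the depth test.
Material decalMat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
decalMat.setTexture("ColorMap", bulletHoleTexture);
decalMat.getAdditionalRenderState().setBlendMode(RenderState.BlendMode.Alpha);
decalMat.getAdditionalRenderState().setDepthWrite(false); // leave the depth buffer alone
decalMat.getAdditionalRenderState().setDepthFunc(RenderState.TestFunction.Equal);
decalGeometry.setMaterial(decalMat);
decalGeometry.setQueueBucket(RenderQueue.Bucket.Transparent); // draw after the opaque pass
rootNode.attachChild(decalGeometry);
```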
I have to solve this problem myself at some point so we'll see what happens.
Hey, that's interesting, I've never come across this before. The javadoc lists these enum values:
Always - The test always passes.
Equal - The test succeeds if the input value is equal to the reference value.
Greater - The test succeeds if the input value is greater than the reference value.
GreaterOrEqual - The test succeeds if the input value is greater than or equal to the reference value.
Less - The test succeeds if the input value is less than the reference value.
LessOrEqual - The test succeeds if the input value is less than or equal to the reference value.
Never - The test always fails.
NotEqual - The test succeeds if the input value does not equal the reference value.
It doesn't really say what exactly their end effect is, however. What's the general idea behind it?
The general idea is that when the GPU is comparing your fragment depth to the already existing Z-buffer depth… this is the test it will do.
Do you mean, like, when you'd want to use one over another? …because some of these are bound to have some really strange use cases.
I think generally, the default is GreaterOrEqual (or LessOrEqual… or whatever… I always get them backwards). Equal is nice for when you've already rendered one thing and you only want to render the other exactly on what was actually rendered, i.e. no overdraw at all because you are only rendering the pixels that are still alive in the Z-buffer.
(Back in the before time, the long long ago, when the shader/fixed pipeline was more expensive than an extra draw call, it was common to render a flat-shaded mesh first and then render the more GPU-intensive version with depthTest=Equal… even multiple times for layering of textures, etc.)
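A rough sketch of that layering trick translated to jME terms, where both geometries share the same mesh and the material names are placeholders:

```java
// Base pass: cheap flat material, writes depth as normal.
Geometry basePass = new Geometry("wall", wallMesh);
basePass.setMaterial(cheapFlatMaterial);
rootNode.attachChild(basePass);

// Detail pass: same mesh, expensive material, drawn later and only where the
// base pass already owns the depth buffer (no overdraw on hidden pixels).
Geometry detailPass = new Geometry("wallDetail", wallMesh);
Material detailMat = expensivePbrMaterial.clone();
detailMat.getAdditionalRenderState().setDepthFunc(RenderState.TestFunction.Equal);
detailMat.getAdditionalRenderState().setDepthWrite(false);
detailPass.setMaterial(detailMat);
detailPass.setQueueBucket(RenderQueue.Bucket.Transparent); // render after the opaque bucket
rootNode.attachChild(detailPass);
```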
I'm working on a decal & texture effect shader that should be able to make things like bullet holes and footsteps. I took a different approach that sends uniforms to the existing materials instead of spawning a new mesh over the affected areas.
Here's a quick gif example using just one texture, but you could also add other types of maps like normal or roughness to make a better effect.
I'm planning on making a GitHub project and uploading everything within the next week, so anyone will be welcome to use the shaders if they find them useful.
Yeah, this definitely has its limitations; it wouldn't be the best way if you need to project anything with a potentially indefinite duration.
As of now I can only have ~80 decals or effects registered at a time on a higher-end device (although that could still decrease, since it's still a work in progress), and I'm still doing testing with a friend to see how well the shader runs on a few different devices. I'm hoping I can also increase that number once I do some optimizations, and I plan to add a priority system so the user can make sure the most important effects persist over others if the limit is reached at any given time.
Another option would be to have a second texture on the objects that can be "shot" and splat little UV splotches into them, i.e. you aren't painting the splat texture itself but the UVs for the splat. That gives you potentially a little more detail (UV interpolation for the win) and the ability to blend it with the bumps/normals, etc…
Can be kind of memory hungry but the texture can be surprisingly low-res and still work because of the UV interpolation.
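I haven't tried this myself, but a very rough sketch of the CPU side of that idea could look like the following. It assumes `splatUvMap` is a low-res RGBA Texture2D (created with a backing data buffer) already bound to the wall material, and `hitU`/`hitV` are the wall's texcoords at the impact, which you have to derive from the collision triangle yourself; a custom fragment shader would then read this map and sample a splat atlas at the interpolated UVs:

```java
// Stamp a small block of texels around the hit; the RG channels store the
// atlas UVs for the splat (0 = no splat).
ImageRaster raster = ImageRaster.create(splatUvMap.getImage());
int w = splatUvMap.getImage().getWidth();
int h = splatUvMap.getImage().getHeight();
int cx = (int) (hitU * w);
int cy = (int) (hitV * h);
int r = 2; // stamp radius in texels

for (int y = Math.max(0, cy - r); y <= Math.min(h - 1, cy + r); y++) {
    for (int x = Math.max(0, cx - r); x <= Math.min(w - 1, cx + r); x++) {
        // assumed atlas layout: spread the splat across the center of the atlas;
        // the shader interpolates these UVs between texels, hence the extra detail
        float atlasU = 0.5f + (x - cx) / (2f * r + 1f);
        float atlasV = 0.5f + (y - cy) / (2f * r + 1f);
        raster.setPixel(x, y, new ColorRGBA(atlasU, atlasV, 0f, 1f));
    }
}
// flag the image for re-upload to the GPU (ImageRaster may already do this)
splatUvMap.getImage().setUpdateNeeded();
```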
@MoffKalast Nope, a desktop computer lol. It's still not optimized very well; there's no differentiation between a simple decal and a full-blown texture effect that uses all of the PBR variables and maps. The main limitation right now is that my arrays exceed the space available on the graphics card when I have more than 80 effects at a time, so I'm thinking I'll have to cut down some functionality as well.
But I'm still fairly new to shaders, so this is also a learning process for me. For now I'm happy it works for projecting that many detailed PBR effects at a time, since that's what I initially needed for the spell system and effects in my game, and it could also be used for anything that fades quickly.
@pspeed I'm passing parallel arrays of data and looping through them in the frag shader to mix the original geometry's values for Albedo, Roughness, Metallic, AO, etc. with the values from the effect/decal. It has a slight framerate cost, but the major issue is that I'm sending so much data over for each effect.
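Not sure exactly how yours is wired up, but the Java side of that kind of approach typically looks something like this. The parameter names here are made up and would need matching declarations in the .j3md; the frag shader then loops over the arrays:

```java
// Parallel per-effect arrays packed into uniform arrays; GPUs cap the total
// uniform space, which is where the ~80 effect limit comes from.
int maxEffects = 80;
float[] positions = new float[maxEffects * 3];  // xyz per effect, packed
float[] radii     = new float[maxEffects];
float[] strengths = new float[maxEffects];

// ... fill the arrays from the currently active effects ...

material.setParam("EffectPositions", VarType.FloatArray, positions);
material.setParam("EffectRadii",     VarType.FloatArray, radii);
material.setParam("EffectStrengths", VarType.FloatArray, strengths);
material.setInt("NumEffects", activeEffectCount);
```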
I also ran into an issue using IntArrays, so I'm using FloatArrays for everything until later on, when I focus more on the optimization.