It probably still works, though it’s really not a reliable implementation in my experience.
From what I remember of using it, it cloned the entire mesh, replaced the material with a mostly transparent one that had the overlay texture at the right texcoords, and rendered that on top, which isn't exactly great for quality or performance either.
The final straw that made me axe it from my project was that it didn't always remove those cloned meshes, leaving ghost objects everywhere.
I think those two uses you mentioned can be done in much simpler and lighter ways, like positioning transparent quads at the spots, for example. Also, neither of these approaches will modify your normal or bump textures; you can only do that yourself by modifying the actual geometry and its material.
Get the collision result @grizeldi suggested and call something like getContactNormal() on it, I think. It also gives you the world position of the hit, so you can position the billboard there and have it face along the normal.
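For what it's worth, the placement math itself is tiny. Here's a plain-Java sketch, with arrays standing in for jME's Vector3f and made-up method names (in jME you'd pull the two inputs from the collision result, something like getContactPoint()/getContactNormal(), but check the javadoc for your version): nudge the quad a hair along the normal to dodge z-fighting, then build a basis whose local Z axis is the normal so the quad faces out of the surface.

```java
public class DecalPlacement {

    // Returns {x, y, z} of the quad center, pushed slightly off the surface
    // along the normal to avoid z-fighting with the geometry underneath.
    public static double[] position(double[] hit, double[] normal, double offset) {
        return new double[] {
            hit[0] + normal[0] * offset,
            hit[1] + normal[1] * offset,
            hit[2] + normal[2] * offset
        };
    }

    // Builds an orthonormal basis whose third axis is the normal, so a quad
    // defined in its local XY plane ends up facing along the surface normal.
    public static double[][] basisFromNormal(double[] n) {
        // Pick a helper axis that is not parallel to the normal.
        double[] up = Math.abs(n[1]) < 0.99 ? new double[] {0, 1, 0}
                                            : new double[] {1, 0, 0};
        double[] tangent = normalize(cross(up, n));
        double[] bitangent = cross(n, tangent);
        return new double[][] {tangent, bitangent, n};
    }

    public static double[] cross(double[] a, double[] b) {
        return new double[] {
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]
        };
    }

    public static double[] normalize(double[] v) {
        double len = Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
        return new double[] {v[0] / len, v[1] / len, v[2] / len};
    }

    public static void main(String[] args) {
        double[] hit = {5, 0, 0};
        double[] normal = {1, 0, 0};              // a wall facing +X
        double[] pos = position(hit, normal, 0.01);
        double[][] basis = basisFromNormal(normal);
        System.out.println(pos[0]);               // 5.01
        System.out.println(basis[2][0]);          // 1.0 (quad faces +X)
    }
}
```

In jME you'd feed that basis into the quad's rotation instead of using a billboard control, since a decal should face the surface, not the camera.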
Hey, that’s interesting, I’ve never come across this before. The javadoc lists these enum values:
Always - The test always passes.
Equal - The test succeeds if the input value is equal to the reference value.
Greater - The test succeeds if the input value is greater than the reference value.
GreaterOrEqual - The test succeeds if the input value is greater than or equal to the reference value.
Less - The test succeeds if the input value is less than the reference value.
LessOrEqual - The test succeeds if the input value is less than or equal to the reference value.
Never - The test always fails.
NotEqual - The test succeeds if the input value does not equal the reference value.
It doesn’t really say what exactly their end effect is however. What’s the general idea behind it?
The general idea is that when the GPU is comparing your fragment depth to the already existing Z buffer depth… this is the test it will do.
Do you mean, like, when you’d want to use one over another? …because some of these are bound to have some really strange use-cases.
I think generally, the default is GreaterOrEqual (or LessOrEqual… or whatever… I always get them backwards). Equal is nice for when you’ve already rendered one thing and you only want to render the other exactly on what was actually rendered, ie: no overdraw at all because you are only rendering the pixels that are still alive in the zbuffer.
(Back in the before time, the long long ago, when the shader/fixed pipeline was more expensive than an extra draw call, it was common to render a flat shaded mesh first and then render the more GPU intensive version with depthTest=Equals… even multiple times for layering of textures, etc…)
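To make the semantics concrete, here's a toy simulation of the comparison itself. The names mirror the enum values from the javadoc above; the real test obviously happens per pixel in hardware, this just spells out what "passes" means for each value:

```java
import java.util.function.BiPredicate;

public class DepthFuncDemo {

    // Each value compares the incoming fragment's depth against what is
    // already in the z-buffer; the fragment is kept only if the test passes.
    public enum DepthFunc {
        Always((in, buf) -> true),
        Never((in, buf) -> false),
        Equal((in, buf) -> in.equals(buf)),
        NotEqual((in, buf) -> !in.equals(buf)),
        Less((in, buf) -> in < buf),
        LessOrEqual((in, buf) -> in <= buf),
        Greater((in, buf) -> in > buf),
        GreaterOrEqual((in, buf) -> in >= buf);

        private final BiPredicate<Float, Float> test;

        DepthFunc(BiPredicate<Float, Float> test) {
            this.test = test;
        }

        // True means the fragment survives (and usually writes its depth).
        public boolean passes(float incoming, float inBuffer) {
            return test.test(incoming, inBuffer);
        }
    }

    public static void main(String[] args) {
        // LessOrEqual-style behavior: closer fragments win.
        System.out.println(DepthFunc.LessOrEqual.passes(0.3f, 0.5f)); // true
        // Equal: only pixels that exactly match the existing depth pass,
        // which is how a second pass redraws only what's already visible.
        System.out.println(DepthFunc.Equal.passes(0.5f, 0.5f));       // true
        System.out.println(DepthFunc.Equal.passes(0.4f, 0.5f));       // false
    }
}
```

The Equal case is the no-overdraw trick described above: pass one lays down depth, pass two only touches pixels whose depth it reproduces exactly.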
I’m working on a decal & texture effect shader that should be able to make things like bullet holes and footsteps. I took a different approach that sends uniforms to the existing materials instead of spawning a new mesh over the affected areas.
Here’s a quick gif example using just one texture, but you could also add other types of maps, like normal or roughness, to make a better effect.
I’m planning on making a github project and uploading everything within the next week, so anyone will be welcome to use the shaders if they find them useful.
Yeah, this definitely has its limitations; it wouldn’t be the best way if you need to project anything with a potentially indefinite duration.
As of now I can only have ~80 decals or effects registered at a time on a higher-end device (and that could still decrease, since it’s still a work in progress). I’m still testing with a friend to see how well the shader runs on a few different devices. I’m hoping I can increase that number once I do some optimizations, and I also plan to add a priority system so the user can make sure the most important effects persist over others if the limit is reached at any given time.
Another option would be to have a second texture on the objects that can be ‘shot’ and splat little UV splotches into them. ie: you aren’t painting the splat texture itself but the UVs for the splat. That gives you potentially a little more detail (UV interpolation for the win) and the ability to blend it with the bumps/normals, etc…
Can be kind of memory hungry but the texture can be surprisingly low-res and still work because of the UV interpolation.
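A quick toy to show why a low-res texture still works here: what's stored is UV coordinates into the splat texture, and the hardware's bilinear filter interpolates those UVs between texels, so the actual splat lookup stays smooth at full resolution. This simulates the filtering on the CPU; the names and layout are made up for illustration:

```java
public class UvSplatSample {

    // Bilinearly sample one channel of a w-by-h float "texture" at
    // normalized coordinates (u, v) in [0, 1] - what the GPU sampler
    // does for free with linear filtering enabled.
    public static float bilinear(float[][] tex, float u, float v) {
        int w = tex[0].length, h = tex.length;
        float x = u * (w - 1), y = v * (h - 1);
        int x0 = (int) x, y0 = (int) y;
        int x1 = Math.min(x0 + 1, w - 1), y1 = Math.min(y0 + 1, h - 1);
        float fx = x - x0, fy = y - y0;
        float top = tex[y0][x0] * (1 - fx) + tex[y0][x1] * fx;
        float bot = tex[y1][x0] * (1 - fx) + tex[y1][x1] * fx;
        return top * (1 - fy) + bot * fy;
    }

    public static void main(String[] args) {
        // A tiny 2x2 "texture" holding the U coordinate of a splat:
        // 0 on the left edge, 1 on the right. Halfway across, filtering
        // hands back 0.5 even though no texel stores that value.
        float[][] splatU = { {0f, 1f}, {0f, 1f} };
        System.out.println(bilinear(splatU, 0.5f, 0.0f)); // 0.5
    }
}
```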
@MoffKalast Nope, a desktop computer lol. It’s still not optimized very well; there’s no differentiation between a simple decal and a full-blown texture effect that uses all of the PBR variables and maps. The main limitation right now is that my arrays exceed the uniform space available on the graphics card when I have more than 80 effects at a time, so I’m thinking I’ll have to cut down some functionality as well.
But I’m fairly new to shaders still, so this is also a learning process for me. For now I’m happy it works for projecting that many detailed PBR effects at a time, since that’s what I initially needed for the spell system and effects in my game, and it could also be used for anything that fades quickly.
@pspeed I’m passing parallel arrays of data and looping through them in the frag shader to mix the original geometry’s values for Albedo, Roughness, Metallic, AO, etc. with the values from the effect/decal. It has a slight framerate cost, but the major issue is that I’m sending so much data over for each effect.
I also ran into an issue using IntArrays, so I’m using FloatArrays for everything until later on, when I focus more on optimization.
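In case it helps anyone picture the parallel-array approach, here's a CPU-side toy of the fragment loop: one entry per effect in each array (the kind of data you'd upload as uniform FloatArrays), and each fragment folds every effect into its original value with a mix(). The layout and names are my own guesses for illustration, not the actual shader:

```java
public class DecalMix {

    // Parallel arrays: one entry per active effect. Here just a red
    // channel and a blend strength; the real thing would carry albedo,
    // roughness, metallic, AO, etc. in more arrays side by side.
    public static float[] effectAlbedoR = {1.0f, 0.0f};   // a red and a black decal
    public static float[] effectStrength = {0.5f, 0.25f};

    // Equivalent of GLSL's mix(a, b, t): linear blend from a to b.
    public static float mix(float a, float b, float t) {
        return a * (1 - t) + b * t;
    }

    // Fold every registered effect into the fragment's original value,
    // in order - what the frag shader loop does per channel per pixel.
    public static float shadeRed(float originalR) {
        float r = originalR;
        for (int i = 0; i < effectStrength.length; i++) {
            r = mix(r, effectAlbedoR[i], effectStrength[i]);
        }
        return r;
    }

    public static void main(String[] args) {
        System.out.println(shadeRed(0.2f));
    }
}
```

The cost structure is visible even in the toy: every fragment pays for every registered effect, and every per-effect property is another uniform array, which is presumably where the ~80-effect uniform-space ceiling comes from.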