I am making a game with jME2 + physics, and I need to implement an exploding paint bucket. When a bucket explodes, paint should be "rendered" onto the surrounding objects.
I thought this could be done with particles and the render-to-texture feature, but how do I determine which objects (textures) the particles hit?
I think you may have to cast some rays and do the detection by hand…
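Something along these lines, maybe. A plain-Java sketch of the by-hand detection (Möller–Trumbore ray/triangle test); in jME2 you would instead build a `com.jme.math.Ray` and use the picking API, so the class and method names below are purely illustrative:

```java
// Sketch of the manual detection idea: cast a ray from the explosion center
// toward each particle and intersect it with scene triangles by hand.
// Plain Java (Moller-Trumbore); names (RayTriangle, EPS) are illustrative,
// not jME API.
public final class RayTriangle {
    static final float EPS = 1e-7f;

    /** Returns distance t along the ray to triangle (a,b,c), or -1 on a miss. */
    public static float rayTriangle(float[] o, float[] d,
                                    float[] a, float[] b, float[] c) {
        float[] e1 = sub(b, a), e2 = sub(c, a);
        float[] p = cross(d, e2);
        float det = dot(e1, p);
        if (det > -EPS && det < EPS) return -1f;      // ray parallel to triangle
        float inv = 1f / det;
        float[] t = sub(o, a);
        float u = dot(t, p) * inv;
        if (u < 0f || u > 1f) return -1f;
        float[] q = cross(t, e1);
        float v = dot(d, q) * inv;
        if (v < 0f || u + v > 1f) return -1f;
        float dist = dot(e2, q) * inv;
        return dist >= 0f ? dist : -1f;
    }

    static float[] sub(float[] x, float[] y) { return new float[]{x[0]-y[0], x[1]-y[1], x[2]-y[2]}; }
    static float dot(float[] x, float[] y) { return x[0]*y[0] + x[1]*y[1] + x[2]*y[2]; }
    static float[] cross(float[] x, float[] y) {
        return new float[]{x[1]*y[2]-x[2]*y[1], x[2]*y[0]-x[0]*y[2], x[0]*y[1]-x[1]*y[0]};
    }

    public static void main(String[] args) {
        // Unit-ish triangle in the z=1 plane, ray from origin along +z: hits at t=1.
        float t = rayTriangle(new float[]{0,0,0}, new float[]{0,0,1},
                              new float[]{-1,-1,1}, new float[]{1,-1,1}, new float[]{0,1,1});
        System.out.println(t);  // 1.0
    }
}
```

The returned distance gives you the hit point (`origin + t * direction`), and whichever mesh owns the closest hit triangle is the object the particle painted.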
You might be able to write a custom ParticleInfluence and do a bounding check using a BoundingSphere constrained to the size of the particle.
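Roughly this kind of check. A plain-Java sketch of testing a particle-sized sphere against an axis-aligned box (Arvo's method); jME2's bounding classes would do the real work, so the names here are just illustrative:

```java
// Sketch of the bounding check: treat each particle as a small sphere and test
// it against an axis-aligned box. Plain Java; in jME2 the equivalent would be
// a BoundingSphere-vs-BoundingBox intersection. Names are illustrative.
public final class SphereBoxCheck {
    /** Arvo's test: does the sphere (center c, radius r) touch the AABB [min, max]? */
    public static boolean intersects(float[] c, float r, float[] min, float[] max) {
        float d2 = 0f;
        for (int i = 0; i < 3; i++) {
            if (c[i] < min[i])      { float d = min[i] - c[i]; d2 += d * d; }
            else if (c[i] > max[i]) { float d = c[i] - max[i]; d2 += d * d; }
        }
        return d2 <= r * r;   // squared distance from center to box vs radius
    }

    public static void main(String[] args) {
        float[] min = {0, 0, 0}, max = {1, 1, 1};
        System.out.println(intersects(new float[]{1.4f, 0.5f, 0.5f}, 0.5f, min, max)); // grazes the +X face
        System.out.println(intersects(new float[]{3f, 3f, 3f}, 0.5f, min, max));       // well clear of the box
    }
}
```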
jME PointLight does not support texture projection, does it?
It would be an easy way to project paint onto surrounding textures with a PointLight…
There is a test for projected textures in jmetest.
Another idea would be texture splatting.
I implemented a custom influence that simply freezes particles inside a bounding box. There can be many boxes.
Now I need a routine to render those frozen particles to textures. Has anybody done anything similar?
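For reference, a plain-Java sketch of what my influence does (the real one extends jME2's ParticleInfluence and overrides its apply method; `FreezeInfluence` and `Box` here are simplified stand-ins, not jME classes):

```java
// Minimal sketch of the "freeze inside a box" influence. A particle is just a
// position + velocity pair here; inside any registered box its velocity is
// zeroed and the hit position is remembered for later painting.
import java.util.ArrayList;
import java.util.List;

public final class FreezeInfluence {
    static final class Box {
        final float[] min, max;
        final List<float[]> frozen = new ArrayList<>();  // collected paint spots
        Box(float[] min, float[] max) { this.min = min; this.max = max; }
        boolean contains(float[] p) {
            return p[0] >= min[0] && p[0] <= max[0]
                && p[1] >= min[1] && p[1] <= max[1]
                && p[2] >= min[2] && p[2] <= max[2];
        }
    }

    final List<Box> boxes = new ArrayList<>();

    /** Called once per particle per frame; zeroes velocity inside any box. */
    void apply(float[] position, float[] velocity) {
        for (Box box : boxes) {
            if (box.contains(position)) {
                velocity[0] = velocity[1] = velocity[2] = 0f;  // freeze in place
                box.frozen.add(position.clone());              // remember the hit
                return;
            }
        }
    }

    public static void main(String[] args) {
        FreezeInfluence inf = new FreezeInfluence();
        inf.boxes.add(new Box(new float[]{0,0,0}, new float[]{1,1,1}));
        float[] vel = {1, 0, 0};
        inf.apply(new float[]{0.5f, 0.5f, 0.5f}, vel);
        System.out.println(vel[0]);  // velocity zeroed inside the box
    }
}
```

(In the real influence you would also mark the particle dead or frozen so it is not re-collected every frame.)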
I was thinking something like:
- Cause a particle explosion
- Check which bounding boxes have "collected" some particles; this can be optimized by checking only boxes within a certain range.
- Build a dummy scene for each of those boxes
- Render an image of each box from the side that is facing the explosion
- Apply the rendered images to the textures of the real scene objects.
Maybe I could find the facing sides of the boxes by casting rays from the explosion and then placing the dummy-scene camera according to the normal of that side.
Does this make any sense?
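The facing-side step might not even need rays: for an axis-aligned box you can take the direction from the box center to the explosion and pick the face whose outward normal aligns best with it (largest dot product), then aim the dummy-scene camera down that normal. A plain-Java sketch (names are illustrative, not jME API):

```java
// Pick the face of an axis-aligned box that faces an explosion point by
// comparing the center-to-explosion direction against the six face normals.
public final class FacingSide {
    static final float[][] NORMALS = {
        {1,0,0}, {-1,0,0}, {0,1,0}, {0,-1,0}, {0,0,1}, {0,0,-1}
    };
    static final String[] NAMES = {"+X", "-X", "+Y", "-Y", "+Z", "-Z"};

    /** Returns the name of the box face that faces the explosion. */
    public static String facingSide(float[] boxCenter, float[] explosion) {
        float[] dir = { explosion[0] - boxCenter[0],
                        explosion[1] - boxCenter[1],
                        explosion[2] - boxCenter[2] };
        int best = 0;
        float bestDot = -Float.MAX_VALUE;
        for (int i = 0; i < NORMALS.length; i++) {
            float dot = dir[0]*NORMALS[i][0] + dir[1]*NORMALS[i][1] + dir[2]*NORMALS[i][2];
            if (dot > bestDot) { bestDot = dot; best = i; }
        }
        return NAMES[best];
    }

    public static void main(String[] args) {
        // Explosion straight above the box center: the +Y face faces it.
        System.out.println(facingSide(new float[]{0,0,0}, new float[]{0,5,0}));  // +Y
    }
}
```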
I now have a routine to get a particle's collision point in the object's local coordinates. I am wondering whether there is a way to map local coordinates to texture pixels, so that I could render the paint color onto the pixels around the collision point.
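The mapping I have in mind would work roughly like this (plain-Java sketch; it assumes the hit triangle and its per-vertex UVs are already known from the collision routine): compute the barycentric coordinates of the collision point in the triangle, interpolate the triangle's UVs with the same weights, and scale by the texture size to get a pixel position.

```java
// Map a local-space collision point on a known triangle to texture pixels via
// barycentric interpolation of the triangle's UVs. Names are illustrative.
public final class LocalToTexel {
    /**
     * Maps point p on triangle (a,b,c) to texture pixels.
     * uvA/uvB/uvC are the per-vertex texture coordinates; returns {px, py}.
     */
    public static int[] toPixel(float[] p, float[] a, float[] b, float[] c,
                                float[] uvA, float[] uvB, float[] uvC,
                                int texW, int texH) {
        // Barycentric weights of p with respect to the triangle.
        float[] v0 = sub(b, a), v1 = sub(c, a), v2 = sub(p, a);
        float d00 = dot(v0, v0), d01 = dot(v0, v1), d11 = dot(v1, v1);
        float d20 = dot(v2, v0), d21 = dot(v2, v1);
        float denom = d00 * d11 - d01 * d01;
        float v = (d11 * d20 - d01 * d21) / denom;
        float w = (d00 * d21 - d01 * d20) / denom;
        float u = 1f - v - w;
        // Interpolate the UVs with the same weights, then scale to pixels.
        float tu = u * uvA[0] + v * uvB[0] + w * uvC[0];
        float tv = u * uvA[1] + v * uvB[1] + w * uvC[1];
        return new int[]{ Math.round(tu * (texW - 1)), Math.round(tv * (texH - 1)) };
    }

    static float[] sub(float[] x, float[] y) { return new float[]{x[0]-y[0], x[1]-y[1], x[2]-y[2]}; }
    static float dot(float[] x, float[] y) { return x[0]*y[0] + x[1]*y[1] + x[2]*y[2]; }

    public static void main(String[] args) {
        // Triangle spanning UV space; the hit point is halfway along the a-b edge.
        int[] px = toPixel(new float[]{0.5f, 0, 0},
                           new float[]{0, 0, 0}, new float[]{1, 0, 0}, new float[]{0, 1, 0},
                           new float[]{0, 0}, new float[]{1, 0}, new float[]{0, 1},
                           256, 256);
        System.out.println(px[0] + "," + px[1]);
    }
}
```

From there, painting would just be writing a colored disc of pixels around that point into the texture's image and updating the texture on the object.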
Check out what I have achieved with my very first jME project, called PaintMan:
The game is not ready at all, but all the needed techniques have been developed and are now in the testing stage.
Any comments and suggestions are welcome!
Hey that's pretty cool!