The idea is to bypass rendering geometries by handling particles in virtual space and then creating a grayscale image with visual cues the shader can use to render the particles. I have almost no clue what I am doing (seriously), and big thanks to @nehon, @normen and @survivor for answering my silly questions.
And here it is with 10,000 particles:
http://youtu.be/9baGXJNPz2w
And finally with 100,000 particles:
http://youtu.be/aMUhmCUQeyc
Would love to get some help from someone who knows what they are doing ;) Anyways, let me know what you think of the idea.... is it worth pursuing? etc...
Yep… thought it may be useful for weather at least. Oh… it may also be worth mentioning that the machine I am doing this on gets terrible frame rate to begin with (with nothing in the scene).
@nehon It’s intentional actually. I normalized the vectors before getting the screen location… reason is, the emitter is tied to the camera location atm and I wanted a quick view from outside of the emitter. The end product will be able to attach to the camera or not… you’ll be able to set the shape of the emitter (atm it is a square).
The reason I attached it to the camera is the original intent is for simulating weather.
I’m not planning on getting as extensive as the particle system in JME for sure… but it will allow for gravity and wind speed/direction to affect the particles.
Cool. From what I notice, it requires the particle emitter to be on a quad; I don’t know if this adds any complications to the way particles are used, e.g. what if the quad is bigger than the whole screen?
I will probably release my sprite library tomorrow (hope I fix the final bug), so we will be able to compare the two and see which is better. Your particle emitter may be faster and better.
@tralala It is actually not on a geometry at all. Nothing is rendered through the scene. I’m releasing the vectors in a square pattern atm… but it could be any shape you want it to be.
Here are two PPFs attached to the scene: one emitter above the camera releasing 10,000 particles (at all times) and another attached below the camera with reversed gravity, also releasing 10,000 particles (at all times).
I finally figured out the byte configuration for rendering correct colors in OpenGL. I totally get why it is done the way it is… it was just a little confusing when I first started looking into it.
Also figured out how to properly map distance to grayscale color value… this one was really weird, because a minor error in determining the proper value was creating this weird artifacting right at the horizon line. I thought it was a totally unrelated issue… thankfully I tried to resolve the distance mapping first.
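For anyone curious, a minimal sketch of what that distance-to-grayscale mapping might look like (not the actual code from this project; the method name and the near/far parameters are my own): normalize the particle's distance into [0, 1] and clamp before converting to a byte, since an unclamped value just past the far plane is exactly the kind of minor error that shows up as artifacting at the horizon.

```java
// Minimal sketch (hypothetical names): map a particle's camera distance
// into a 0-255 grayscale value, clamping so distances at or beyond the
// far plane can't overshoot the byte range and cause horizon artifacts.
public class DepthGray {

    // near/far are assumed clip distances for the virtual emitter volume
    static int mapDistanceToGray(float distance, float near, float far) {
        // normalize into [0, 1]; closer particles map to brighter values
        float t = (distance - near) / (far - near);
        // clamp: without this, t > 1 produces a negative/wrapped byte
        t = Math.max(0f, Math.min(1f, t));
        return (int) ((1f - t) * 255f);
    }
}
```

The clamp is the important part here; the rest is a plain linear remap.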
Did a bit of optimization which increased framerate significantly for larger particle counts.
Added a random factor for how much wind speed and direction affect each particle. This added a considerable amount of realism to the overall effect of rain/snow/etc.
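A rough sketch of what an update step with gravity plus a per-particle wind factor could look like (all names here are my own, not from the project): roll a random wind-influence multiplier once per particle at emission, then apply it every step so individual drops/flakes drift differently.

```java
import java.util.Random;

// Hypothetical sketch of the update described above: simple Euler
// integration with constant gravity plus a per-particle random wind
// influence so not every particle responds to the wind identically.
public class ParticleStep {
    static final float GRAVITY = -9.8f;

    // rolled once per particle at emission time; each particle then
    // feels 50-100% of the global wind speed
    static float randomWindInfluence(Random rng) {
        return 0.5f + rng.nextFloat() * 0.5f;
    }

    // advances one particle by dt seconds: x drifts with the wind,
    // y falls under gravity; returns {newX, newY}
    static float[] step(float x, float y, float windSpeed,
                        float windInfluence, float dt) {
        float newX = x + windSpeed * windInfluence * dt;
        float newY = y + GRAVITY * dt;
        return new float[] { newX, newY };
    }
}
```

Keeping the influence fixed per particle (rather than re-rolling it each frame) is what makes the motion read as varied but not jittery.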
Todo:
I’m thinking that since this is geared towards simulating weather specifically, I’m going to just modify the visual reference map to create the final effect. I would think between color streaking and Gaussian blur, it should look pretty realistic. I figure at some later date, I can use this as a basis for overlaying perspective rendered images using the original as placement/size reference.
@Momoko_Fan I haven’t yet… but I have a feeling this is going to turn into just that, the further I get with it. It’s been a fun exercise either way, but the next step (rendering over the visual cues) is problematic to say the least.
I think the idea (though fun to try) is sort of a bust.
In the end, you would have to simulate a quad with 4 vectors, rotate them along the Y axis to always face the camera, transform them according to view perspective, and then use LWJGL to transform the sprite image and add it to the buffer. Ultimately, I doubt you would gain any performance at all… and considering it is me working on this, it would more than likely lose some.
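For reference, the "quad of 4 vectors rotated along the Y axis" part can be sketched roughly like this (my own naming and layout, not the project's code): find the XZ-plane axis perpendicular to the camera-to-particle direction, then offset the four corners along it and along world Y.

```java
// Sketch of Y-axis billboarding for a quad-per-particle approach:
// build four corners centered on the particle, rotated about the Y
// axis so the quad's face points back at the camera in the XZ plane.
public class YBillboard {

    // returns {x, z} of the unit "right" vector perpendicular (in the
    // XZ plane) to the camera->particle direction
    static float[] rightAxis(float camX, float camZ, float px, float pz) {
        float dx = px - camX, dz = pz - camZ;
        float len = (float) Math.sqrt(dx * dx + dz * dz);
        // perpendicular in the XZ plane: (dz, -dx), normalized
        return new float[] { dz / len, -dx / len };
    }

    // corners: 4 x {x, y, z} in the order top-left, top-right,
    // bottom-right, bottom-left
    static float[][] corners(float camX, float camZ,
                             float px, float py, float pz,
                             float halfWidth, float halfHeight) {
        float[] r = rightAxis(camX, camZ, px, pz);
        float rx = r[0] * halfWidth, rz = r[1] * halfWidth;
        return new float[][] {
            { px - rx, py + halfHeight, pz - rz },
            { px + rx, py + halfHeight, pz + rz },
            { px + rx, py - halfHeight, pz + rz },
            { px - rx, py - halfHeight, pz - rz },
        };
    }
}
```

Note this only rotates about Y (the quad stays vertical), which is what you want for rain streaks and foliage-style sprites.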
Anyways… it was a ton of fun trying, but I think I’ll move along from this idea.
The placement vectors of the particles always need to be updated, but the quad doesn’t unless it’s visible on screen. Of course, this only makes a difference if this is being used to simulate weather and the particles are all around the camera.
/shrug… no idea if I should continue this or not… to be honest. Anyone?
Ooooo… I guess I was wrong and this may yet still work.
Here is the emitter rendering the sprite transform vectors (billboarded along the y axis), running 100,000 particles.
I’m guessing the final step is to look into LWJGL image transformations and sub-imaging… @normen sub-images (or sub-textures… or whatever they are called) are used in the new texture atlas, I take it?
Anyways… here it is: (probably need to watch full screen to see anything)
@pspeed To be totally honest… I’m not completely sure there will be any difference in the end product, aside from my want to add perspective transforms on the sprite images. This is more for my own learning benefit than anything else. If it makes something cool in the process… even better.
Yeah, if you want the images not to directly face the camera all the time the point sprites don’t really work.
I do something similar with my ropes to get them to always face the camera but only around their own axis. I just send quads down and let the vert shader project the corners out where they are supposed to be. I use the texture coordinate to figure out which corner I’m projecting.
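The corner-from-texcoord trick can be illustrated on the CPU side like this (a sketch under my own assumptions, not the actual rope shader — in the real thing this math lives in GLSL): every vertex carries the particle's center position plus its texture coordinate, and since (u, v) is unique per corner, it tells you which way to push the center along the camera's right/up axes.

```java
// Java sketch of the vertex-shader corner projection described above:
// (u, v) in {0, 1} identifies the corner, so the center point is
// offset along the camera's world-space right/up unit vectors to
// expand a single position into a camera-facing quad.
public class CornerProject {

    static float[] expand(float[] center, float[] camRight, float[] camUp,
                          float u, float v, float halfSize) {
        // remap u, v from {0, 1} to {-halfSize, +halfSize}
        float su = (u - 0.5f) * 2f * halfSize;
        float sv = (v - 0.5f) * 2f * halfSize;
        return new float[] {
            center[0] + camRight[0] * su + camUp[0] * sv,
            center[1] + camRight[1] * su + camUp[1] * sv,
            center[2] + camRight[2] * su + camUp[2] * sv,
        };
    }
}
```

The appeal of doing this in the vertex shader is that the CPU only uploads centers and texcoords; the GPU does the expansion for every particle in parallel.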
Y axis billboarded sprites are easier and were done in one of the GPU gems articles relating to foliage. Similar concept, though.