Post Process Particle Emitter

So, here is the basic concept and some videos…

The idea is to bypass rendering geometries by handling particles in virtual space and then creating a grayscale image with visual cues the shader can use to render the particles. I have almost no clue what I am doing (seriously), and big thanks to @nehon @normen and @survivor for answering my silly questions.
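To make the core idea concrete, here is a minimal sketch of the "splat particles into a grayscale image" step in plain Java (the class and method names are hypothetical, and a real version would write into a texture the post-process shader samples; this only shows the byte-image bookkeeping):

```java
import java.util.Random;

public class GrayscaleSplat {
    // Splat 'count' particles at random screen positions into a grayscale
    // byte image; returns the number of lit pixels (collisions overwrite).
    static int splat(byte[] image, int width, int height, int count, long seed) {
        Random rand = new Random(seed);
        for (int i = 0; i < count; i++) {
            // Pretend we already projected the particle to screen space.
            int x = rand.nextInt(width);
            int y = rand.nextInt(height);
            // Brightness would encode a cue such as distance; full white here.
            image[y * width + x] = (byte) 255;
        }
        int lit = 0;
        for (byte b : image) if (b != 0) lit++;
        return lit;
    }

    public static void main(String[] args) {
        byte[] image = new byte[64 * 64];
        int lit = splat(image, 64, 64, 100, 42L);
        System.out.println("lit pixels: " + lit);
    }
}
```

A shader would then read this image each frame and draw streaks or flakes wherever it finds non-zero pixels.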

Here it is running with 1,000 particles:

And here it is with 10,000 particles:

And finally with 100,000 particles:

Would love to get some help from someone who knows what they are doing ;) Anyways, let me know what you think of the idea.... is it worth pursuing? etc...

Cool. You mean for using this as a post-processing effect for snow, rain etc.? I think it’s definitely worth it.

Yep… thought it may be useful for weather at least. Oh… it may also be worth mentioning, the machine I am doing this on gets terrible frame rate to begin with (with nothing in the scene).

That’s nice!

I’m glad you got through it.

The first one really looks like it rains or snows, it definitely could be a nice addition.

What’s with the spherical shape, though? Is it intended or a side effect of the calculation?

@nehon It’s intentional actually. I normalized the vectors before getting the screen location… reason is, the emitter is tied to the camera location atm and I wanted a quick view from outside of the emitter. The end product will be able to attach to the camera or not… you’ll be able to set the shape of the emitter (atm it is a square).
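The spherical shell follows directly from that normalization: every offset vector ends up with length 1, so all the particles sit on a unit sphere around the emitter. A tiny sketch (plain arrays instead of jME’s Vector3f, purely for illustration):

```java
public class NormalizeDemo {
    // Normalize a 3-component vector (plain arrays instead of jME's Vector3f).
    static float[] normalize(float x, float y, float z) {
        float len = (float) Math.sqrt(x * x + y * y + z * z);
        return new float[] { x / len, y / len, z / len };
    }

    static float length(float[] v) {
        return (float) Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
    }

    public static void main(String[] args) {
        // Two particles at very different distances from the emitter...
        float[] a = normalize(10f, 0f, 0f);
        float[] b = normalize(1f, 2f, 2f);
        // ...both end up exactly one unit away: hence the spherical shell.
        System.out.println(length(a) + " " + length(b));
    }
}
```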

The reason I attached it to the camera is the original intent is for simulating weather.

I’m not planning on getting as extensive as the particle system in JME for sure… but it will allow for gravity and wind speed/direction to affect the particles.

Ooops… I meant that it already allows for wind speed and direction.
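For anyone curious how gravity and wind might feed into the per-frame update, here is a minimal sketch (plain Java arrays instead of jME’s Vector3f; the simple Euler step and all the names are my own assumptions, not the actual implementation):

```java
public class ParticleUpdate {
    // One particle: position and velocity as 3-component arrays.
    // Hypothetical update rule: Euler integration with gravity plus wind.
    static void update(float[] pos, float[] vel,
                       float[] gravity, float[] wind, float tpf) {
        for (int i = 0; i < 3; i++) {
            vel[i] += (gravity[i] + wind[i]) * tpf; // accelerate
            pos[i] += vel[i] * tpf;                 // move
        }
    }

    public static void main(String[] args) {
        float[] pos = {0f, 10f, 0f};
        float[] vel = {0f, 0f, 0f};
        float[] gravity = {0f, -9.8f, 0f};
        float[] wind = {2f, 0f, 0f};            // wind blowing along +X
        for (int step = 0; step < 60; step++) { // simulate one second
            update(pos, vel, gravity, wind, 1f / 60f);
        }
        System.out.println("x=" + pos[0] + " y=" + pos[1]);
    }
}
```

After a second the particle has fallen and drifted downwind, which is all a rain or snow flake really needs to do.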

Cool. From what I notice, it requires the particle emitter to be on a quad. I don’t know if this adds any complications to the way particles are used, e.g. what if the quad is bigger than the whole screen?

I will probably release my sprite library tomorrow (hope I fix the final bug). Then we will have to compare the two to see which is better. Your particle emitter may be faster and better.

@tralala It is actually not on a geometry at all. Nothing is rendered through the scene. I’m releasing the vectors in a square pattern atm… but it could be any shape you want it to be.

Here are two PPFs attached to the scene: one emitter above the camera releasing 10,000 particles (at all times) and another attached below the camera with reversed gravity, also releasing 10,000 particles (at all times).

In case anyone cares to know about progress…

I finally figured out the byte configuration for rendering correct colors in OpenGL. I totally get why it is done the way it is… it was just a little confusing when I first started looking into it.
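The usual stumbling block here is that Java bytes are signed while OpenGL treats pixel data as unsigned. A small sketch of the idea (hypothetical helper names; this only shows the byte packing, not an actual texture upload):

```java
import java.nio.ByteBuffer;

public class ColorBytes {
    // Pack one RGBA pixel (0-255 per channel) into a direct ByteBuffer.
    // Java bytes are signed, so 200 is stored as -56, but OpenGL reads the
    // same bits back as unsigned 200 - the usual source of confusion.
    static ByteBuffer pack(int r, int g, int b, int a) {
        ByteBuffer buf = ByteBuffer.allocateDirect(4);
        buf.put((byte) r).put((byte) g).put((byte) b).put((byte) a);
        buf.flip();
        return buf;
    }

    static int unsignedAt(ByteBuffer buf, int index) {
        return buf.get(index) & 0xFF;   // recover the unsigned channel value
    }

    public static void main(String[] args) {
        ByteBuffer px = pack(200, 128, 64, 255);
        System.out.println(px.get(0) + " as unsigned: " + unsignedAt(px, 0));
    }
}
```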

Also figured out how to properly map distance to a grayscale color value… this one was really weird, because a minor error in determining the proper value was creating weird artifacting right at the horizon line. I thought it was a totally unrelated issue… thankfully I tried to resolve the distance mapping first.
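One plausible shape for that mapping (my own sketch, not the actual code) is a clamped linear remap of distance into a gray byte; without the clamp, distances just past the far value wrap or overshoot, which is exactly the kind of thing that shows up as artifacts at the horizon:

```java
public class DepthToGray {
    // Map a distance in [near, far] to a gray byte: 255 (close) .. 0 (far).
    // Clamping before scaling is what prevents wrap-around artifacts for
    // distances that land slightly outside the [near, far] range.
    static int toGray(float dist, float near, float far) {
        float t = (dist - near) / (far - near);
        t = Math.max(0f, Math.min(1f, t));   // clamp before scaling
        return Math.round((1f - t) * 255f);
    }

    public static void main(String[] args) {
        System.out.println(toGray(0f, 0f, 100f));    // closest -> 255
        System.out.println(toGray(100f, 0f, 100f));  // farthest -> 0
        System.out.println(toGray(150f, 0f, 100f));  // beyond far -> still 0
    }
}
```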

Did a bit of optimization which increased framerate significantly for larger particle counts.

Added a random factor in how much wind speed and direction affect the particles. This added a considerable amount of realism to the overall effect of rain/snow/etc.
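One way such a random factor could work (a guess at the scheme, not the actual implementation) is to assign each particle a fixed wind-influence coefficient at spawn time, so some flakes drift more than others:

```java
import java.util.Random;

public class WindJitter {
    // Give each particle a fixed random wind-influence factor at spawn time,
    // so light flakes drift more than heavy ones (a hypothetical scheme).
    static float[] spawnInfluences(int count, long seed) {
        Random rand = new Random(seed);
        float[] influence = new float[count];
        for (int i = 0; i < count; i++) {
            // Between 50% and 100% of the full wind effect.
            influence[i] = 0.5f + rand.nextFloat() * 0.5f;
        }
        return influence;
    }

    public static void main(String[] args) {
        float[] inf = spawnInfluences(5, 1234L);
        for (float f : inf) System.out.println(f);
        // Applied per frame as something like:
        //   vel.x += windSpeed * inf[i] * tpf;
    }
}
```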


I’m thinking that since this is geared towards simulating weather specifically, I’m going to just modify the visual reference map to create the final effect. I would think between color streaking and Gaussian blur, it should look pretty realistic. I figure at some later date, I can use this as a basis for overlaying perspective rendered images using the original as placement/size reference.


Have you tried comparing the performance to point sprites? Those are supposed to be handled very efficiently by the GPU.

@Momoko_Fan I haven’t yet… but I have a feeling this is going to turn into just that, the further I get with it. It’s been a fun exercise either way, but the next step (rendering over the visual cues) is problematic, to say the least.

In case anyone is still interested in progress…

I think the idea (though fun to try) is sort of a bust.

In the end, you would have to simulate a quad with 4 vectors, rotate them along the Y axis to always face the camera, transform them according to view perspective, and then use LWJGL to transform the sprite image and add it to the buffer. In the end, I doubt you would gain any performance at all… and considering it is me working on this, it would more than likely lose some ;)
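The Y-axis ("cylindrical") billboarding step described above can be sketched like this, using plain arrays instead of jME’s Vector3f (names and layout are my own, for illustration only):

```java
public class YBillboard {
    // Build the 4 corners of a quad at 'center' rotated about the Y axis to
    // face a camera at 'cam' (cylindrical billboarding: upright sprites).
    static float[][] corners(float[] center, float[] cam,
                             float halfW, float halfH) {
        // Direction to the camera, flattened onto the XZ plane.
        float dx = cam[0] - center[0];
        float dz = cam[2] - center[2];
        float len = (float) Math.sqrt(dx * dx + dz * dz);
        dx /= len; dz /= len;
        // "Right" vector is perpendicular to that direction, still in XZ.
        float rx = -dz, rz = dx;
        return new float[][] {
            { center[0] - rx * halfW, center[1] - halfH, center[2] - rz * halfW },
            { center[0] + rx * halfW, center[1] - halfH, center[2] + rz * halfW },
            { center[0] + rx * halfW, center[1] + halfH, center[2] + rz * halfW },
            { center[0] - rx * halfW, center[1] + halfH, center[2] - rz * halfW },
        };
    }

    public static void main(String[] args) {
        float[][] c = corners(new float[]{0, 0, 0},
                              new float[]{0, 0, 10}, 0.5f, 0.5f);
        // Camera is along +Z, so the quad should span the X axis.
        System.out.println(c[0][0] + " " + c[1][0]);
    }
}
```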

Anyways… it was a ton of fun trying :) But I think I’ll move on from this idea.

Um… maybe I lied.

The placement vector of each particle needs to be updated always, but the quad doesn’t unless it’s visible on screen. Of course, this only makes a difference if this is being used to simulate weather and the particles are all around the camera.

/shrug… no idea if I should continue this or not… to be honest. Anyone?

Ooooo… I guess I was wrong and this may yet still work.

Here is the emitter rendering the sprite transform vectors (billboarded along the y axis), running 100,000 particles.

I’m guessing the final step is to look into LWJGL image transformations and sub-imaging… @normen, sub-images (or sub-textures… or whatever they are called) are used in the new texture atlas, I take it?

Anyways… here it is: (probably need to watch full screen to see anything)

Given the effort you’ve gone through to do what seems like emulating point sprites, it would surprise me a little if point sprites were slower.

But maybe there is something you are doing differently that I don’t understand.

@pspeed To be totally honest… I’m not completely sure there will be any difference in the end product, aside from my want to add perspective transforms on the sprite images. This is more for my own learning benefit than anything else. If it makes something cool in the process… even better.

Yeah, if you want the images not to directly face the camera all the time the point sprites don’t really work.

I do something similar with my ropes to get them to always face the camera but only around their own axis. I just send quads down and let the vert shader project the corners out where they are supposed to be. I use the texture coordinate to figure out which corner I’m projecting.
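The texture-coordinate trick described above would normally live in a vertex shader, but the math can be sketched on the CPU side in plain Java (hypothetical names; all four vertices arrive at the same center position and the texcoord picks which corner to push out to):

```java
public class CornerProject {
    // CPU sketch of the vertex-shader trick: every vertex of the quad is sent
    // at the same center position, and its texture coordinate (0/1, 0/1)
    // tells us which corner to project it out to along right/up axes.
    static float[] projectCorner(float[] center, float u, float v,
                                 float[] right, float[] up, float halfSize) {
        // Map texcoord 0..1 to a signed offset -1..+1 along each axis.
        float su = u * 2f - 1f;
        float sv = v * 2f - 1f;
        return new float[] {
            center[0] + (right[0] * su + up[0] * sv) * halfSize,
            center[1] + (right[1] * su + up[1] * sv) * halfSize,
            center[2] + (right[2] * su + up[2] * sv) * halfSize,
        };
    }

    public static void main(String[] args) {
        float[] center = {5f, 5f, 5f};
        float[] right = {1f, 0f, 0f};
        float[] up = {0f, 1f, 0f};
        // Texcoord (0,0) is the bottom-left corner, (1,1) the top-right.
        float[] bl = projectCorner(center, 0f, 0f, right, up, 0.5f);
        float[] tr = projectCorner(center, 1f, 1f, right, up, 0.5f);
        System.out.println(bl[0] + "," + bl[1] + " / " + tr[0] + "," + tr[1]);
    }
}
```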

Y axis billboarded sprites are easier and were done in one of the GPU gems articles relating to foliage. Similar concept, though.