Particle System Source Code if ya want it…

There is a bit to do with this still, but it’s in good enough working order… here is the source code:

https://code.google.com/p/tonegodemitter/source/browse/

List of TODOs:
Animation info is not being set in clone and cloneForSpatial
Billboarding does not apply in loaded asset as particle template
I haven’t tested point-based particles since I originally started working on this and it is likely in need of updates.

But, the rest is working as it should…

Here is a sample of how to use via code:

[java]
Emitter e1 = new Emitter();
e1.setName("e1"); // generated if not set… ignore if you are lazy
e1.addInfluencers(
new ColorInfluencer(),
new AlphaInfluencer(),
new SizeInfluencer(),
new SpriteInfluencer(),
new RotationInfluencer(),
new GravityInfluencer()
);
e1.setMaxParticles(15);

// Emitter shape options:
// Simple point/triangle emitter shape
e1.setShapeSimpleEmitter();

// Or use the following for a JME primitive:
// Sphere e1ES = new Sphere(16, 16, 0.5f);
// e1.setShape(e1ES);

// Or the following for an animated emitter shape:
// Node emitterNode = (Node)assetManager.loadModel("somePath");
// e1.setShape(emitterNode, false);

// For particles other than the default quad
// impostor based particle
// e1.setParticleType(ParticleDataImpostorMesh.class);

// Animated asset as particle template
// Node particleNode = (Node)assetManager.loadAsset("somePath");
// e1.setParticleType(ParticleDataTemplateMesh.class, particleNode);

e1.setEmissionsPerSecond(15);
e1.setParticlesPerEmission(1);
e1.setDirectionType(DirectionType.Normal);
e1.setBillboardMode(BillboardMode.Camera);
e1.setForceMinMax(2.25f,5.0f);
e1.setLifeMinMax(0.999f,0.999f);

// Some of the other options - there are more… but these are a few
// e1.setUseRandomEmissionPoint(true);
// e1.setParticlesFollowEmitter(true);
// e1.setUseSequentialEmissionFace(true);
// e1.setUseSequentialSkipPattern(true);
// e1.setUseVelocityStretching(true);
// e1.setVelocityStretchFactor(1.25f);

// Set the texture - note there are many other method variations for setting the texture… use what works for you
e1.setSprite("Textures/default.png", 1, 1);

// Using an alternate material
/*
// Lighting test
Material mat = new Material(assetManager, "Common/MatDefs/Light/Lighting.j3md");
mat.setBoolean("UseVertexColor", true);
mat.getAdditionalRenderState().setBlendMode(RenderState.BlendMode.Alpha);
mat.getAdditionalRenderState().setAlphaTest(true);
mat.getAdditionalRenderState().setAlphaFallOff(.15f);
e1.setMaterial(mat, "DiffuseMap", true);
*/

/*
// Unshaded test
Material mat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
mat.setBoolean("VertexColor", true);
mat.getAdditionalRenderState().setBlendMode(RenderState.BlendMode.Alpha);
mat.getAdditionalRenderState().setAlphaTest(true);
mat.getAdditionalRenderState().setAlphaFallOff(.15f);
e1.setMaterial(mat, "ColorMap", true);
*/

// Enabling the test mode shapes (emitter shape, particle mesh)
// e1.setEmitterTestMode(true, false);

// Examples of changing influencer values
// Color Influencer
e1.getInfluencer(ColorInfluencer.class).addColor(new ColorRGBA(1.0f,1.0f,1.0f,1.0f));
e1.getInfluencer(ColorInfluencer.class).addColor(new ColorRGBA(1.0f,1.0f,1.0f,0.0f));

// Alpha Influencer
e1.getInfluencer(AlphaInfluencer.class).addAlpha(1.0f, Interpolation.exp5In);
e1.getInfluencer(AlphaInfluencer.class).addAlpha(0.0f);

// Size Influencer
e1.getInfluencer(SizeInfluencer.class).addSize(new Vector3f(0.5f,0.5f,0.5f));
e1.getInfluencer(SizeInfluencer.class).addSize(new Vector3f(0.1f,0.1f,0.1f));

// Rotation Influencer
e1.getInfluencer(RotationInfluencer.class).addRotationSpeed(new Vector3f(0.0f,0.0f,0.0f));
e1.getInfluencer(RotationInfluencer.class).setUseRandomStartRotation(false,false,false);
e1.getInfluencer(RotationInfluencer.class).setUseRandomDirection(true);
e1.getInfluencer(RotationInfluencer.class).setUseRandomSpeed(true);

// Gravity Influencer
e1.getInfluencer(GravityInfluencer.class).setGravity(new Vector3f(0.0f,4.0f,0.0f));

// Set transforms
e1.setLocalTranslation(0.0f, 0.0f, 0.0f);
e1.setLocalScale(1.0f, 1.0f, 1.0f);
// e1.setLocalRotation(new Quaternion().fromAngles(0.0f,0.0f,0.0f));

// Initialize, Add to scene & Turn it on
e1.initialize(assetManager); // DON’T FORGET TO INITIALIZE THE EMITTER!!

rootNode.addControl(e1);
e1.setEnabled(true); // DON’T FORGET TO TURN IT ON!! (At some point)
[/java]

I’ll post notifications here if any updates available via the repo, but I really could use a break on this for a day or two.


Heh… forgot to tell you how to animate stuff :wink:

[java]
// Name, speed, blendtime and loop
e1.setEmitterAnimation("run", 1, 1, LoopMode.Loop);
e1.setParticleAnimation("kick", 1, 1, LoopMode.Loop);
[/java]

So I looked into it and played with the FX designer. You can make some crazy things; it’s pretty good. There are a lot of things you can do that you can’t with the current JME particle emitter system.

It made me think about something though… It’s not intended as an evolution request, it’s more me rambling about particles in general…
What if… instead of having one mesh for all the particles, we did it the brute-force way: 1 particle = 1 geometry. BUT we use geometry instancing (I know it’s not possible right now in JME, but that would be the perfect use case).
Geometry instancing’s limitations are: needs the same mesh, needs the same material… so it fits perfectly.
Influencers become controls, and the update loop just adjusts position, scale, rotation, and material parameters: no mesh buffer update. It should be way faster. It also makes physical particles trivial.
But it looks so obvious that I must be overlooking a major drawback somewhere…
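The "influencers become controls" idea above could be sketched roughly like this, in plain Java. Note this is a minimal sketch: the `Particle`, `Influencer`, `GravityInfluencer`, and `MovementInfluencer` names here are hypothetical stand-ins, not the library's actual classes; a real version would wrap jME’s Geometry/Control types and feed the per-instance transforms to the instancing buffer.

[java]
import java.util.ArrayList;
import java.util.List;

// Minimal stand-ins; a real version would wrap jME's Geometry/Control.
class Particle {
    float x, y, z;    // position
    float vx, vy, vz; // velocity
    float age;        // seconds since emission
}

interface Influencer {
    void update(Particle p, float tpf); // tpf = time per frame
}

class GravityInfluencer implements Influencer {
    private final float gy;
    GravityInfluencer(float gy) { this.gy = gy; }
    public void update(Particle p, float tpf) {
        p.vy -= gy * tpf; // accelerate downward
    }
}

class MovementInfluencer implements Influencer {
    public void update(Particle p, float tpf) {
        p.x += p.vx * tpf; // integrate position from velocity
        p.y += p.vy * tpf;
        p.z += p.vz * tpf;
        p.age += tpf;
    }
}

public class InstancedEmitterSketch {
    public static void main(String[] args) {
        Particle p = new Particle();
        p.vy = 4.0f;
        List<Influencer> influencers = new ArrayList<>();
        influencers.add(new GravityInfluencer(9.8f));
        influencers.add(new MovementInfluencer());
        // One update tick: no mesh buffer rewrite, just per-instance state.
        for (Influencer i : influencers) i.update(p, 0.1f);
        System.out.printf("y=%.2f vy=%.2f%n", p.y, p.vy);
    }
}
[/java]

The point of the design is that each tick only touches a handful of floats per particle instead of rewriting a shared mesh buffer.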

Which leads to my second thought: hardware updating of particles.
Same as what you did, one mesh for all the particles, BUT influencers are not classes; they are shader libs that update the particle’s transforms in the vertex shader. I don’t know how it would work for physics, but IMO it would be a lot faster.

What do you think?


Why are there 3 quads for each particle? Doesn’t it make particles considerably brighter near the y axis?

Regarding hardware particles and instancing:

For conventional instancing you have to create an additional buffer holding the world matrices for each instanced object. If your conventional buffer changes don’t exceed 16 floats per object, you get no savings on the GPU bus, and you add an extra mat4 * mat4 multiplication in the vertex shader.

There are some optimisations possible, especially for simple shapes such as Quad and BillboardedQuad. You could use just one vertex for each particle and then emit the shape in the geometry shader. AFAIK that would save a lot of bandwidth.

As for hardware updating, the main problem, I think, is the setup: for the vertex modifications jME3 would need a proper transform feedback loop. It would also double the number of buffers needed to store the transformations, since you need to ping-pong between them.

Better solutions would be using OpenCL or compute shaders to make the modifications, since it would then be possible to read/write the same buffer. OpenCL might be available on more devices, but a lot of new stuff would need to be created in the engine. Compute shaders are GL 4.3, I think, so it is very likely that not many people will be able to use them in the near term. In the long run I vote for compute shaders, because they use the GLSL syntax and all the other objects you pass to or use in other shaders work out of the box.
Speaking of which, a fallback would be required for each hardware method, and that probably makes the system really complicated…

@zzuegg said: Regarding hardware particels, instancing:

For conventional instancing you have to create an additional buffer holding the WorldMatrixes for each instance object.


ARB_instanced_arrays alleviates this issue; it’s OpenGL 3.3, though…

@zzuegg said: As for hardware updating, the main problem i think is the setup of those, for the vertex modifications jme3 would need a proper transform feedback loop. Also it will double the amount of buffers needed to store the transformations since you need to ping pong between them.
We don’t need that; the transformation can be computed from the "bind pose" and the time elapsed since emission.

About the fallback, it’s already done… for now, “all” the particle emitter implementations around here are CPU-based, so we’d just have to expose the same API for the hardware-accelerated version.
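The point about computing the transform from the bind pose and elapsed time can be sketched as a pure function, which is exactly why no transform-feedback ping-pong buffers are needed. This is plain Java standing in for what a vertex shader or compute kernel would evaluate; the class and method names are illustrative, not from the library.

[java]
public class StatelessParticleTransform {
    // Closed-form position: a pure function of the emission state and the
    // elapsed time, so nothing needs to be read back from a previous frame.
    // p(t) = p0 + v0*t + 0.5*g*t^2 (per axis)
    static float positionAt(float p0, float v0, float g, float t) {
        return p0 + v0 * t + 0.5f * g * t * t;
    }

    public static void main(String[] args) {
        // Particle emitted at y=0 with upward velocity 4, gravity -9.8,
        // sampled 0.5 s after emission: 4*0.5 - 0.5*9.8*0.25 = 0.775
        System.out.println(positionAt(0f, 4f, -9.8f, 0.5f));
    }
}
[/java]

Because every frame re-evaluates the same function with a bigger `t`, the GPU only needs the per-particle emission data (position, velocity, spawn time) uploaded once.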

Never thought we’d see the day ;D

@normen said:

Seriously tho, adding full editing to these in the SDK would be no problem at all, even without a full-blown editor. You’d just need to add the code to get a list of children for the Control that represent the influencers and animations (and to add/remove them). Most of the values would be editable right out of the box then. I can have a whack at that… if you make it a plugin ^^

@nehon
Geometry Instancing - I was just rambling last night (again) about how cool complex-geometry particles will be once GPU instancing is in place. I got the blank stare and then the final “can I finish watching this now?” BUT! I love the idea, no matter what he says :wink:
I think the idea of influencers as shader nodes at that point would be crazy cool.
So many possibilities once this is possible!

@abies said: Why there are 3 quads for each particle? Doesn't it make particles considerably brighter near the y axis?

That is the impostor-based particle mesh. Whether it appears brighter depends on the material you are using.

If you use this with Particle.j3md and AlphaAdditive blending, play with the alpha values if you want to tone down the brightness.

It’s useful with billboarding modes that follow velocity, so you don’t get the invisible particle band at certain views (or certain areas of your particle mesh).

It’s also useful with alternate materials and producing stuff like grass for your terrain.

@normen said: Seriously tho, adding full editing to these in the SDK would be no problem at all, even without a full-blown editor. You'd just need to add the code to get a list of children for the Control that represent the influencers and animations (and to add/remove them). Most of the values would be editable right out of the box then. I can have a whack at that.. if you make it a plugin ^^

This is set up (if I remembered everything needed) to be pushed out as a plugin now; I just need to copy the latest jar and zipped sources over, then have a place to push it to.

When you say a list of children for the influencers… what does this mean? I’m assuming they need to be returned in a specific way.

Oh… and you’ll see a material+shaders called Particle internal to the project. Ignore this… I put it there to test stuff. It’s not used and can go away.

@t0neg0d said: This is setup (if I remembered everything needed) to be pushed out as a plugin now, I just need to copy the latest jar and zipped sources over then have a place to push it to.

When you say list of children for influencers… what does this mean? I’m assuming this needs to be returning them in a specific way.

https://wiki.jmonkeyengine.org/legacy/doku.php/sdk:development:sceneexplorer#control_example

Since your control won’t be 100% picked up by the existing “bean wrapping” you’d have to make such a class in your plugin and do the necessary to return a list of children for that (SceneExplorer) node. It would not require changes in your library, just a special wrapper so everything appears in the SceneExplorer and Properties windows.

@normen said: https://wiki.jmonkeyengine.org/legacy/doku.php/sdk:development:sceneexplorer#control_example

Since your control won’t be 100% picked up by the existing “bean wrapping” you’d have to make such a class in your plugin and do the necessary to return a list of children for that (SceneExplorer) node. It would not require changes in your library, just a special wrapper so everything appears in the SceneExplorer and Properties windows.

Oh cool! I wouldn’t have minded a ton of changes, honestly… but that’s even more slick.

@t0neg0d said: Oh cool! I wouldn't have minded a ton of changes, honestly... but that's even more slick.

As said, I can do the base work for this; I guess not everything that needs to be done is intuitive without prior knowledge of the NetBeans node system and the SDK specialties around the SceneExplorer nodes (threading etc.). As soon as there’s a plugin for this I can just add the needed classes if you want. But it’s not exactly black magic either, and I only offer this because it should be pretty easy if you know how :wink:

@normen said: As said, I can do the base work for this, I guess not everything that needs to be done is intuitive without prior knowledge of the NetBeans node system and the SDK specialties around the SceneExplorer nodes (threading etc.). As soon as theres a plugin for this I can just add the needed classes if you want. But its not exactly black magic either and I only offer this because it should be pretty easy if you know how ;)

Sure! I’d love that… a plugin, as in, a non-SDK plugin, correct?

If so,
how do I push this out to the contributors plugin repo again? Or more specifically–
does it require someone to do something on that side?

I really can’t remember how this was set up last time, so forgive the stupid questions.

@t0neg0d said: Sure! I'd love that... plugin- as in- a non-sdk plugin, correct?

If so,
how do I push this out to the contributors plugin repo again? Or more specifically–
does it require someone to do something on that side?

I really can’t remember how this was set up last time, so forgive the stupid questions.

No, I mean a plugin like the one for tonegodgui, so an SDK plugin. It would serve to deliver the library jar and additionally extend the SceneExplorer to support the objects.

So to start the plugin you’d go along this recipe, maybe you remember it, it was improved based on your input :wink:
https://wiki.jmonkeyengine.org/legacy/doku.php/sdk:development:extension_library

To finally add the plugin (you can do that yourself if you want) you have to edit the project properties in the nbproject folder of the contributions repo.

This post is a bit of rambling about an issue, and about the fact that I forgot to properly credit the original author of the Interpolation class, because I had cut/pasted it and converted it to JME-friendly code. Which brings up a couple of points about how this is now and what I think it should be:

  1. Five of the influencers use some serious hackery to save array lists. It’s ugly and it needs to change.
  • There is no way of saving an array of strings, so I used a string savable map and appended an index to the key as a workaround for duplicate values. It’s really ugly.
  • The entire Interpolation class should be converted to an enum (I think); this would remove a bunch of hackish code.
  2. It needs to be properly credited to the original author, no matter what the changes are.

EDIT: Aside from the solution of converting the Interpolation class to an enum, is there a way of saving an array list of strings (or integers, floats, etc.)?
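As a rough, self-contained sketch of the enum idea: each constant carries its own curve function, and persistence reduces to saving the constant’s name as a string. The constant names and the ease-in/out formulas here are illustrative approximations, not the library’s actual curves.

[java]
// A minimal enum version of the Interpolation idea. Each constant
// implements its own curve, and saving/loading is just name() / valueOf().
public enum Interpolation {
    LINEAR {
        public float apply(float a) { return a; }
    },
    EXP5_IN {
        // roughly: slow start, fast finish (exponential ease-in)
        public float apply(float a) { return (float) Math.pow(2, 5 * (a - 1)); }
    },
    EXP5_OUT {
        // roughly: fast start, slow finish (exponential ease-out)
        public float apply(float a) { return 1f - (float) Math.pow(2, -5 * a); }
    };

    // a is the normalized particle life in [0, 1]
    public abstract float apply(float a);

    public static void main(String[] args) {
        System.out.println(LINEAR.apply(0.5f));
        // Saving: write interp.name() as a plain string;
        // Loading: Interpolation.valueOf(savedName);
    }
}
[/java]

This would remove the index-suffixed savable-map workaround entirely, since an enum round-trips through a single string per entry.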

@normen said: No, I mean a plugin like for tonegodgui, so a SDK plugin. It would serve to deliver the library jar and additionally extend the SceneExplorer to support the objects.

So to start the plugin you’d go along this recipe, maybe you remember it, it was improved based on your input :wink:
https://wiki.jmonkeyengine.org/legacy/doku.php/sdk:development:extension_library

To finally add the plugin (you can do that yourself if you want) you have to edit the project properties in the nbproject folder of the contributions repo.

I’ll try and get this done today… after resolving the post above this one. =)
