Influencer-based ParticleEmitter candidate (mesh-based animated particles)

Just about done with this.

Here are two low-count particle emitters.

The fire uses animated sprites and a rotation influencer along the y axis,
while the smoke uses random start images and a rotation influencer along the y and x axes.

Both use gravity, preferred direction and impulse with various settings.

[video]http://youtu.be/jB1mdwaakNI[/video]


Birch texture could use a bit of work :wink:

On a serious note - the fire looks great.


@pspeed
Sorry to bother! I’m having to rewrite the ParticleMesh class after all. Aside from lots of Vector creation each loop, it didn’t support rotation along individual axes. Anyway, it’s just about done, but there is something I can’t figure out, and I’m fairly sure you know how this works!

I have a velocity vector and a position. I need to determine the up and left vectors relative to these, and I’m apparently doing something wrong.


You cannot find up and left from just direction. You need one more point of reference - most probably it will be eye vector (either directly eye vector or offset from camera location to particle - there are slight differences for non-centered particles between these).
In my case, I’m using velocity vector for ‘y’ direction of particle (one which gets longer with motion blur) and eyeVector.cross(velVector) for x direction. VelVector is of course normalized at this point.
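The derivation above can be sketched with plain float arrays in place of jME’s Vector3f so it stands alone; the method and variable names here are illustrative, not the emitter’s actual fields:

```java
// Sketch: particle 'y' axis = normalized velocity, particle 'x' axis = eye x velocity.
public class ParticleAxes {

    // Standard right-handed cross product of two 3-component vectors.
    static float[] cross(float[] a, float[] b) {
        return new float[] {
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]
        };
    }

    static float[] normalize(float[] v) {
        float len = (float) Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
        return new float[] { v[0] / len, v[1] / len, v[2] / len };
    }

    public static void main(String[] args) {
        float[] velocity = normalize(new float[] { 0f, 2f, 0f }); // particle 'y'
        float[] eye = new float[] { 0f, 0f, -1f };                // camera-to-particle
        float[] xAxis = cross(eye, velocity);                     // particle 'x'
        System.out.printf("x = (%.1f, %.1f, %.1f)%n", xAxis[0], xAxis[1], xAxis[2]);
        // x = (1.0, 0.0, 0.0)
    }
}
```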

@abies said: You cannot find up and left from just direction. You need one more point of reference - most probably it will be eye vector (either directly eye vector or offset from camera location to particle - there are slight differences for non-centered particles between these). In my case, I'm using velocity vector for 'y' direction of particle (one which gets longer with motion blur) and eyeVector.cross(velVector) for x direction. VelVector is of course normalized at this point.

This seems to work; however, now I can’t get the left vector properly… /sigh

up.set(temp).crossLocal(Vector3f.UNIT_Y).normalizeLocal();


@abies
Can you verify this for me? I believe it is correct. I am using an expanding torus as a test and mapping to the face normal.

[java]
p.emitter.getShape().setNext(p.triangleIndex);
temp.set(p.emitter.getShape().getNextDirection());
up.set(temp).crossLocal(Vector3f.UNIT_Y).normalizeLocal();
left.set(temp).crossLocal(up).normalizeLocal();
dir.set(temp);
[/java]


Wow… yeah, that worked. And the rotation along axes is relative to each particle. It looks really, really cool.


Here are the billboarding options:

[java]
public static enum BillboardMode {
Velocity,
Normal,
Camera,
UNIT_X,
UNIT_Y,
UNIT_Z
}
[/java]

And here is how the rotation along individual relative axes looks (particles mapped to torus face normals):

Z-rotation:

[video]http://youtu.be/u6jdEVpplMs[/video]

X-rotation:

[video]http://youtu.be/MrlAMLkRl34[/video]

Y-rotation:

[video]http://youtu.be/vTzgoXmkmm0[/video]


Guess I should mention that rotations are a Vector3f and can be compounded.


That’s really nice! How expensive are all the calculations mapping the particles back to the mesh shape?

@zarch said: That's really nice, how expensive is all the calculations mapping the particles back to the mesh shape?

It’s much faster than the current ParticleEmitter. I did a ton of cleanup in the ParticleMesh (no more variable creation per particle/per loop), the entire update loop of the control has been immensely simplified, the recalculation of the Geometry bounds has been simplified, and the entire emitter + particles is a single Control now.

EDIT: More specifically… I don’t actually map them to the shape; the emitter is the shape, so it does a fetch against a face, gets the normal, and you have a position + initial velocity vector.


@zarch
Actually, instead of me just saying this… here is the new ParticleTriMesh update method. You can compare it to the original and see where it has improved (and mention anything you think is questionable!). Somewhere in this thread is the EmitterShape class that shows how the particles are mapped to a mesh.

[java]
@Override
public void updateParticleData(Particle[] particles, Camera cam, Matrix3f inverseRotation) {
    VertexBuffer pvb = getBuffer(VertexBuffer.Type.Position);
    FloatBuffer positions = (FloatBuffer) pvb.getData();

    VertexBuffer cvb = getBuffer(VertexBuffer.Type.Color);
    ByteBuffer colors = (ByteBuffer) cvb.getData();

    VertexBuffer tvb = getBuffer(VertexBuffer.Type.TexCoord);
    FloatBuffer texcoords = (FloatBuffer) tvb.getData();

    // update data in vertex buffers
    positions.clear();
    colors.clear();
    texcoords.clear();

    for (int i = 0; i < particles.length; i++) {
        Particle p = particles[i];
        boolean dead = p.life == 0;
        if (dead) {
            positions.put(0).put(0).put(0);
            positions.put(0).put(0).put(0);
            positions.put(0).put(0).put(0);
            positions.put(0).put(0).put(0);
            continue;
        }

        switch (p.emitter.getBillboardMode()) {
            case Velocity:
                up.set(p.velocity).crossLocal(Vector3f.UNIT_Y).normalizeLocal();
                left.set(p.velocity).crossLocal(up).normalizeLocal();
                dir.set(p.velocity);
                break;
            case Normal:
                emitter.getShape().setNext(p.triangleIndex);
                tempV3.set(emitter.getShape().getNextDirection());
                up.set(tempV3).crossLocal(Vector3f.UNIT_Y).normalizeLocal();
                left.set(tempV3).crossLocal(up).normalizeLocal();
                dir.set(tempV3);
                break;
            case Camera:
                up.set(cam.getUp());
                left.set(cam.getLeft());
                dir.set(cam.getDirection());
                break;
            case UNIT_X:
                up.set(Vector3f.UNIT_Y);
                left.set(Vector3f.UNIT_Z);
                dir.set(Vector3f.UNIT_X);
                break;
            case UNIT_Y:
                up.set(Vector3f.UNIT_Z);
                left.set(Vector3f.UNIT_X);
                dir.set(Vector3f.UNIT_Y);
                break;
            case UNIT_Z:
                up.set(Vector3f.UNIT_X);
                left.set(Vector3f.UNIT_Y);
                dir.set(Vector3f.UNIT_Z);
                break;
        }
        up.multLocal(p.size);
        left.multLocal(p.size);

        rotStore = tempQ.fromAngleAxis(p.angles.y, left);
        left = rotStore.mult(left);
        up = rotStore.mult(up);

        rotStore = tempQ.fromAngleAxis(p.angles.x, up);
        left = rotStore.mult(left);
        up = rotStore.mult(up);

        rotStore = tempQ.fromAngleAxis(p.angles.z, dir);
        left = rotStore.mult(left);
        up = rotStore.mult(up);

        positions.put(p.position.x + left.x + up.x)
                 .put(p.position.y + left.y + up.y)
                 .put(p.position.z + left.z + up.z);

        positions.put(p.position.x - left.x + up.x)
                 .put(p.position.y - left.y + up.y)
                 .put(p.position.z - left.z + up.z);

        positions.put(p.position.x + left.x - up.x)
                 .put(p.position.y + left.y - up.y)
                 .put(p.position.z + left.z - up.z);

        positions.put(p.position.x - left.x - up.x)
                 .put(p.position.y - left.y - up.y)
                 .put(p.position.z - left.z - up.z);

        if (uniqueTexCoords) {
            imgX = p.spriteCol;
            imgY = p.spriteRow;

            startX = 1f / imagesX * imgX;
            startY = 1f / imagesY * imgY;
            endX = startX + 1f / imagesX;
            endY = startY + 1f / imagesY;

            texcoords.put(startX).put(endY);
            texcoords.put(endX).put(endY);
            texcoords.put(startX).put(startY);
            texcoords.put(endX).put(startY);
        }

        int abgr = p.color.asIntABGR();
        colors.putInt(abgr);
        colors.putInt(abgr);
        colors.putInt(abgr);
        colors.putInt(abgr);
    }

    positions.clear();
    colors.clear();
    texcoords.clear();
    if (uniqueTexCoords) {
        tvb.updateData(texcoords);
    }

    // force renderer to re-send data to GPU
    pvb.updateData(positions);
    cvb.updateData(colors);

    updateBound();
}
[/java]


Glad I posted this. I just caught something in the old code that is a mistake. I hadn’t looked at this yet.

The whole part using:

[java]
boolean dead = p.life == 0;
[/java]

Should actually be:

[java]
if (p.life <= 0) {

}
[/java]


Hey… @nehon
Is this the potential issue with Android?

[java]
int abgr = p.color.asIntABGR();
colors.putInt(abgr);
colors.putInt(abgr);
colors.putInt(abgr);
colors.putInt(abgr);
[/java]


Your code is so much cleaner than mine… Just a question - how does it behave if you put the emitter in the hand of an animated model? Will the billboarding still work properly? Half of the complexity in my implementation is about differentiating between the ‘follow local transform’ versus ‘after emission stay in world’ distinction, and conditional multiplication by local-to-world and world-to-local transforms.

Start rotating and moving your emitter node slowly and see what happens.

@abies said: Your code is so much cleaner than mine... Just a question - how does it behave if you put emitter in a hand of animated model? Will all billboard still work properly? Half of complexity in my implementation is about differentiating between 'follow local transform' versus 'after emission stay in world' distinction and conditional multiplication by local-to-world and world-to-local transforms.

Start rotating and moving your emitter node slowly and see what happens.

The billboarding stays relative to the choice you make… so Normal would change with the animated model, whereas Camera would stay in the right position but billboard to the camera.

Now, on the subject of local-to-world: I have not implemented this yet, because I want to make sure it can be disabled, or the emitter won’t work in the GUI node. If I’m not mistaken, this is just multiplying the vector by the world transform matrix, correct?
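For what it’s worth, here is a minimal, self-contained sketch of that local-to-world step, written with plain arrays rather than jME’s Matrix4f and Vector3f (the matrix layout and names are illustrative assumptions):

```java
// Sketch: transform a particle's local position by a 4x4 world matrix.
public class WorldTransform {

    // Multiply a 4x4 row-major matrix by a point (implicit w = 1).
    static float[] transformPoint(float[][] m, float[] p) {
        float[] out = new float[3];
        for (int row = 0; row < 3; row++) {
            out[row] = m[row][0] * p[0] + m[row][1] * p[1]
                     + m[row][2] * p[2] + m[row][3]; // w = 1 picks up the translation column
        }
        return out;
    }

    public static void main(String[] args) {
        // World transform: translation by (10, 0, 0), no rotation or scale.
        float[][] world = {
            { 1, 0, 0, 10 },
            { 0, 1, 0, 0 },
            { 0, 0, 1, 0 },
            { 0, 0, 0, 1 }
        };
        float[] local = { 1, 2, 3 };
        float[] worldPos = transformPoint(world, local);
        System.out.printf("(%.0f, %.0f, %.0f)%n", worldPos[0], worldPos[1], worldPos[2]);
        // (11, 2, 3)
    }
}
```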

And… on the subject of motion blur… can you post a vid to show an example of what the final effect looks like? I would love to see this as an influencer with settings.


@abies
I have another question for you.

Billboard modes…

These may need to be extended slightly. For instance billboarding along the X or Z axis relative to Velocity or Normal.

Use case:

Velocity is perfect for an explosion moving outward…
However, a particle with a flaming trail behind it would want to billboard along the relative X of its velocity to display properly.

Thoughts?


One more thing for people to mull over…

One of the issues I currently see with how particles are generated on a mesh is that they start from the absolute center of the triangle.

This needs to change to random placement: the particle should store its offset from the center and apply it as it updates. Until that happens, particle placement is far too uniform to be useful for generating grass, leaves, fur, etc.
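One common way to get a uniformly distributed random point on a triangle is the square-root barycentric trick. This is an illustrative sketch only (plain arrays, hypothetical names), not the emitter’s actual code:

```java
import java.util.Random;

// Sketch: random placement on a triangle, plus the center offset
// that would be stored per particle and re-applied on update.
public class TrianglePlacement {

    // Uniform random point inside triangle (a, b, c); each vertex is {x, y, z}.
    static float[] randomPoint(float[] a, float[] b, float[] c, Random rand) {
        float r1 = (float) Math.sqrt(rand.nextFloat());
        float r2 = rand.nextFloat();
        float wa = 1f - r1;           // the three barycentric weights sum to 1
        float wb = r1 * (1f - r2);
        float wc = r1 * r2;
        float[] p = new float[3];
        for (int i = 0; i < 3; i++) {
            p[i] = wa * a[i] + wb * b[i] + wc * c[i];
        }
        return p;
    }

    // Offset from the triangle's centroid, to be stored on the particle.
    static float[] centerOffset(float[] point, float[] a, float[] b, float[] c) {
        float[] off = new float[3];
        for (int i = 0; i < 3; i++) {
            off[i] = point[i] - (a[i] + b[i] + c[i]) / 3f;
        }
        return off;
    }
}
```

Without the square root, points cluster toward the first vertex; with it, the distribution is uniform over the triangle’s area.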

Also, the larger a triangle is, the more weight it should be given when choosing which triangle emits a particle.

This last may not happen before submitting the code, but I wanted to mention it for anyone who decides to contribute to this.

EDIT: Actually, the only way of doing this in a semi-efficient manner would be to walk the triangle list of the mesh added as an emitter shape and classify each triangle by size group. Then, when randomly selecting a triangle to emit from, it would be weighted by size classification.
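A standard alternative to size-group classification is exact area weighting: build a running total of triangle areas once, then binary-search a random value into it. A hypothetical sketch (names and structure are mine, not the emitter’s):

```java
import java.util.Arrays;
import java.util.Random;

// Sketch: area-weighted triangle selection. Larger triangles cover a
// larger slice of the cumulative range, so they are picked more often.
public class WeightedTrianglePicker {

    private final float[] cumulative; // cumulative[i] = total area of triangles 0..i
    private final Random rand = new Random();

    WeightedTrianglePicker(float[] triangleAreas) {
        cumulative = new float[triangleAreas.length];
        float sum = 0f;
        for (int i = 0; i < triangleAreas.length; i++) {
            sum += triangleAreas[i];
            cumulative[i] = sum;
        }
    }

    int pick() {
        return pickFor(rand.nextFloat() * cumulative[cumulative.length - 1]);
    }

    // Separated out so the search is testable with a fixed value.
    int pickFor(float r) {
        int idx = Arrays.binarySearch(cumulative, r);
        return idx >= 0 ? idx : -idx - 1; // insertion point when not an exact hit
    }
}
```

The precomputation is O(n) and each pick is O(log n), so it only needs redoing when the emitter shape’s mesh changes.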

Anyways… just jotting down notes.

@t0neg0d said: Hey... @nehon Is this the potential issue with Android?

[java]
int abgr = p.color.asIntABGR();
colors.putInt(abgr);
colors.putInt(abgr);
colors.putInt(abgr);
colors.putInt(abgr);
[/java]

It will work, but writing to buffers is slow on Android whatever you do.
Also, as much as possible, try to prefer bulk puts instead of individual puts (it’s faster even on desktop), even if you have to keep a temporary array.
Here, for 4 elements… I’m not sure it will have a big impact though.
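To illustrate the bulk-put suggestion with plain java.nio buffers (a heap buffer here for simplicity; the emitter writes into the mesh’s own vertex buffers):

```java
import java.nio.FloatBuffer;

// Sketch: individual puts vs one bulk put via a reusable scratch array.
public class BulkPutExample {
    public static void main(String[] args) {
        // Individual puts, one call per component:
        FloatBuffer individual = FloatBuffer.allocate(12);
        for (int i = 0; i < 12; i++) {
            individual.put(i);
        }

        // Bulk put: stage the quad's 12 position components in a reusable
        // scratch array, then copy them across in a single call.
        float[] scratch = new float[12];
        for (int i = 0; i < 12; i++) {
            scratch[i] = i;
        }
        FloatBuffer bulk = FloatBuffer.allocate(12);
        bulk.put(scratch); // one bulk copy instead of twelve calls

        individual.flip();
        bulk.flip();
        System.out.println(individual.equals(bulk)); // true
    }
}
```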

@nehon said: It will work, but writing to buffers is slow on android whatever you do. Also as much as possible try to prefer bulk put instead of individual puts (it's faster even on desktop) even if you have to keep a temporary array. Here for 4 elements...i'm not sure it will have a big impact though.

Oh… good to know. This is how it was done in the original. I’ll update this for sure.
