Animated 2D Sprites

Hi again,



Before I start with my question: I have searched the forums, but everything I found on this topic was either outdated or didn't provide the information I was actually looking for.



The problem is quite simple. I have a sprite - in my case a Picture class (even though I realized during my search that this might not be the best solution) - which I want to animate. I also have a png file containing several pictures, aligned left to right, all with the same height and width (e.g. two 40x40 frames => an 80x40 png file), which would resemble an animation if played one after another.



Research suggested that one would load the png file into a texture and use a source rectangle as an overlay that defines the area of the texture to draw onto whatever object the texture is applied to. Translating the source rectangle by x pixels (to the next "frame") on a frame change would therefore provide the illusion of an animated sprite. What I found out, though, is that this doesn't seem to be possible in jME3 (at least with an Image/Texture/Picture), am I correct? If so, how would someone create an animated sprite? I heard something about quads that used to have setTextureBuffer, but apparently that method has since been removed.



Thanks,

wiki :slight_smile:


Thanks for pointing me in the right direction! I rummaged through the files for the particle shader and found what I was looking for (or so I hope).

That said, I don't understand all of what I'm using now, which is quite frankly always a very bad thing! Nonetheless - for test purposes - I created a Sprite class extending the Picture class. All it does for now is call its super constructor, set the image, and retrieve the TexCoord VertexBuffer of the mesh.
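Roughly, the class looks like this (a sketch reconstructed from my description above, not the exact code):

[java]
// rough sketch of the Sprite class described above
public class Sprite extends Picture {
    public Sprite(String name, AssetManager assetManager, String imagePath) {
        super(name);
        // load the png into the Picture's material (true = use alpha)
        setImage(assetManager, imagePath, true);
        // grab the mesh's texture-coordinate buffer
        VertexBuffer vb = getMesh().getBuffer(VertexBuffer.Type.TexCoord);
        FloatBuffer texcoords = (FloatBuffer) vb.getData();
        // ... texcoord manipulation goes here (see below)
    }
}
[/java]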

As I understand it, that buffer stores the texture coordinates (is this right?). Then I get the FloatBuffer from the VertexBuffer, and now I'm a bit at a loss. I just copied code that is executed in ParticleTriMesh, which reads as follows:

[java]
texcoords.clear();

// I understand this part
texcoords.put(0.0f).put(0.0f);
texcoords.put(0.5f).put(0);
texcoords.put(0.5f).put(0.5f);
texcoords.put(0).put(0.5f);

// why?
texcoords.clear();

// this one's also clear to me
vb.updateData(texcoords);
[/java]

I could understand clearing the texcoords buffer the first time (even though I'm not sure what it actually does), but I don't know why I would clear it a second time afterwards. If someone could be so kind as to explain?

Also, if I misunderstood the VertexBuffer, please point me to information about VertexBuffers. As far as I understand, it's a buffer that stores information about the texture's vertices (i.e. what shape the texture has).

As far as I understand, this



texcoords.clear();

// I understand this part
texcoords.put(0.0f).put(0.0f);
texcoords.put(0.5f).put(0);
texcoords.put(0.5f).put(0.5f);
texcoords.put(0).put(0.5f);



code has no effect at all if it gets cleared right after.

Believe me, it does. Only the lower-left part of the picture is shown. That's why I want to know what the clear function of a FloatBuffer actually does.
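Edit: I think I found it in the java.nio docs. clear() doesn't erase a buffer's contents at all; it just resets the position to 0 and the limit to the capacity. The second clear() therefore rewinds the buffer after writing, so that updateData() reads the four freshly written pairs from the start. Annotating the snippet above:

[java]
texcoords.clear();               // position = 0, limit = capacity; data untouched
texcoords.put(0.0f).put(0.0f);   // every put() advances the position
texcoords.put(0.5f).put(0);
texcoords.put(0.5f).put(0.5f);
texcoords.put(0).put(0.5f);      // position is now 8
texcoords.clear();               // rewind again; the data is still there
vb.updateData(texcoords);        // reads from position 0 up to the limit
[/java]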

When I last looked into the shaders for the particle material, they were the ones doing the work; the texture coordinates were not responsible for that behaviour.

No, don’t modify the texture coordinates in the code, modify them in the vertex shader.

Something like this. Let's say you have an image with 10 frames, laid out along the width.

The vert shader:



uniform mat4 g_WorldViewProjectionMatrix;
uniform int m_Index;

attribute vec3 inPosition;
attribute vec4 inTexCoord;

varying vec2 texCoord;

const float nbFrames = 10.0;

void main(){
    gl_Position = g_WorldViewProjectionMatrix * vec4(inPosition, 1.0);
    float x = inTexCoord.x / nbFrames + float(m_Index) / nbFrames;
    texCoord = vec2(x, inTexCoord.y);
}
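On the Java side you then just bump the index over time; a minimal sketch (time and frame being fields you keep yourself, and assuming the material parameter is declared as Index):

[java]
// in simpleUpdate(): step through the 10 frames at roughly 10 fps
time += tpf;
if (time >= 0.1f) {
    time -= 0.1f;
    frame = (frame + 1) % 10;
    material.setInt("Index", frame);
}
[/java]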

Hello!

Now, prefacing this: take it with a grain of salt. What Nehon posted is the core of what you'd need to make a shader do this. Having said that, there are TONS OF WAYS™ to do this, and it really depends on exactly "how much control do I need?" :smiley: It also depends on whether you want the sprite to be affected by lighting; in most cases you don't, but I don't know your application. In the end, you'll be making some sort of variation of the SimpleTextured shader or the Lighting shader.



Going the shader route is intimidating, but it’s most definitely the easiest to work with once you have your shader working properly. At the least, you’d just need to tell the shader what the dimensions are and what frame you’re on and bam, done.



Another snippet for your coding pleasure:



[java]
float uv_XDim = m_FrameDimensions[0] / m_SheetDimensions[0];
float uv_YDim = m_FrameDimensions[1] / m_SheetDimensions[1];
float possibleColumns = floor(1.0 / uv_XDim);
float possibleRows = floor(1.0 / uv_YDim);
float currCol = m_Sprite_X / m_FrameDimensions[0];
float currRow = m_Sprite_Y / m_FrameDimensions[1];
texCoord = vec2((inTexCoord[0] / possibleColumns) + (uv_XDim * currCol),
                (inTexCoord[1] / possibleRows) + ((1.0 - uv_YDim) - (uv_YDim * currRow)));
[/java]



This is what I'm using currently, though I've been somewhat distracted lately so I haven't been working much on refining my animations yet :< This snippet is meant to be used with a spritesheet. If you're not going that route it's useless, but I figured I'd throw it in the ring for the funsies of it all. :stuck_out_tongue:
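On the app side, those parameters get pushed in roughly like so (the names mirror my snippet above; they come from my own material definition, not a stock one):

[java]
// all parameter names are custom ones from my j3md, not stock jME3
mat.setVector2("FrameDimensions", new Vector2f(40f, 40f));  // one frame, in pixels
mat.setVector2("SheetDimensions", new Vector2f(160f, 80f)); // whole sheet, in pixels
mat.setFloat("Sprite_X", 40f); // pixel X of the current frame in the sheet
mat.setFloat("Sprite_Y", 0f);  // pixel Y of the current frame in the sheet
[/java]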



The real biatch is controlling how animations are stored and maintained for spritesheets. It hurts my poor little monkey brain :cry:.



Cheers!

~FlaH

Well, with JME3 the way to do it is shaders.

There is something similar done for the particles of a particle emitter; maybe you could look into the particle shader to get an idea of how it works.

The basic idea is to have a quad displayed on screen using a shader material. You pass the shader the texture containing the several images, and the index of the image you want to display. Then you just offset the texture coordinates in the vert shader according to the index.
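Setup-wise that's just a few lines; a sketch (the material and texture paths are placeholders for whatever you name yours):

[java]
// a 40x40 quad in the GUI node with a custom sprite material;
// "Materials/Sprite.j3md" and "Textures/anim.png" are placeholder paths
Quad quad = new Quad(40, 40);
Geometry sprite = new Geometry("sprite", quad);
Material mat = new Material(assetManager, "Materials/Sprite.j3md");
mat.setTexture("m_Texture", assetManager.loadTexture("Textures/anim.png"));
mat.setInt("Index", 0); // which frame to show
sprite.setMaterial(mat);
guiNode.attachChild(sprite);
[/java]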

Look at the animated particles; they are actually a 2D sprite animation (at least some parts of them, like the flames in the explosion test).



The basic way to go would be to have all frames of the animation in a texture atlas, and use a shader to switch between the image parts in it.

Okay, thanks for all the answers!

But what is the advantage of a shader over manipulating the texcoords directly? It's a simple adventure game in the style of Monkey Island or Kyrandia. Is it speed, or something else? I don't know jack about shader programming, and the code snippets you guys provided are not really helping at all. To me it just seems over-the-top to use shaders for a simple spritesheet animation.

That's faster, and IMO it's easier. But anyway, if you want to go the "CPU-side texcoord" way, look at this doc:

https://wiki.jmonkeyengine.org/legacy/doku.php/jme3:advanced:custom_meshes
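The CPU-side version boils down to rewriting the quad's texcoord buffer whenever the frame changes; roughly like this (a sketch, with the frames laid out left to right; index, nbFrames and mesh are yours):

[java]
// select frame 'index' out of 'nbFrames' frames laid out left to right
float u0 = index / (float) nbFrames;
float u1 = (index + 1) / (float) nbFrames;
VertexBuffer vb = mesh.getBuffer(VertexBuffer.Type.TexCoord);
FloatBuffer tc = (FloatBuffer) vb.getData();
tc.clear(); // rewind; nothing is erased
tc.put(u0).put(0f).put(u1).put(0f).put(u1).put(1f).put(u0).put(1f);
tc.clear(); // rewind again so updateData() reads from the start
vb.updateData(tc);
[/java]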

Thanks for the link, and yeah, after reading the doc on shaders it seems easier-ish. But am I correct that I'll have to write my own *.vert and *.frag file + my own *.j3md file (for the MaterialDef), which I would need to load into a Material, which again would be loaded into a … mesh, right?



Also, the *.vert file would need to accept an image (the spritesheet) and an integer (the current frame), calculate the position of the image in the spritesheet, update the vertexBuffer for the texCoords, then (still) do the standard transforms, and that's it? Does that sound anywhere near correct?

Hello again!

wickermoon said:
Thanks for the link, and yeah, after reading the doc on shaders it seems easier-ish. But am I correct that I'll have to write my own *.vert and *.frag file + my own *.j3md file (for the MaterialDef), which I would need to load into a Material, which again would be loaded into a ... mesh, right?
Also, the *.vert file would need to accept an image (the spritesheet) and an integer (the current frame), calculate the position of the image in the spritesheet, update the vertexBuffer for the texCoords, then (still) do the standard transforms, and that's it? Does that sound anywhere near correct?


For the first part: You'd only need to modify a vert and a j3md. The vert would contain the texture manipulation you're doing for the sprite, and the j3md is like a pass-through for the additional variables you'll need to be passing from your application to the material (frame index, image, etc etc). No need to have a modified frag. Fragment shaders are for when you want to actually have the shader affect the geometry's uhhh.... geometry :p for like deformations and such. I've seen people do some crazy shit with frag shaders though, but they tend to get a bit specialized so I wouldn't worry about it just yet. You can link to an existing frag shader from your j3md as well; it doesn't need to be custom or a copy-paste of an existing one.

For the second part: I think? I'll wait for @nehon on this one. The lucky thing is, once you get the quad geometry ready and load the parameters into the shader, the only thing you should be worrying about for updating the animation is the index. The rest is done by the shader (or should be, anyway).

Hope that this helps some.
~FlaH

EDIT: The forum's edit feature has decided it has a ravenous hunger for line breaks. Seems like when I go to edit all my line breaks/enter keys are CONSUMED BY THE VOID! ooOoOoOOoooOoooooh... :p
wickermoon said:
Thanks for the link, and yeah, after reading the doc on shaders it seems easier-ish. But am I correct that I'll have to write my own *.vert and *.frag file + my own *.j3md file (for the MaterialDef), which I would need to load into a Material, which again would be loaded into a ... mesh, right?

Yes, though it gets loaded into a Geometry rather than a mesh; but you got the idea.
For this you can copy an existing material like the SimpleTextured one (j3md, .vert, .frag). Then you'll just have to modify the associated .vert file and some uniform declarations in the j3md file. The geometry will be a simple quad with standard texcoords (0,0; 0,1; 1,1; 1,0).

wickermoon said:
Also, the *.vert file would need to accept an image (the spritesheet) and an integer (the current frame), calculate the position of the image in the spritesheet, update the vertexBuffer for the texCoords, then (still) do the standard transforms, and that's it? Does that sound anywhere near correct?

The image will be passed to the .frag file, not the .vert one. Remember, the vertex shader is responsible for computing the position of a vertex on screen (by matrix multiplication), and the fragment shader is responsible for calculating pixel colors on screen.
So you should focus on the vert shader, because that's where your calculation will take place. For each vertex you are going to compute the texcoord. You don't have to recompute the texcoord buffer, because for each vertex the texCoord will be provided as a vec2 in the shader; you just have to change it on the fly (see my last post for how).

tehflah said:
For the first part: You'd only need to modify a vert and a j3md. The vert would contain the texture manipulation you're doing for the sprite, and the j3md is like a pass-through for the additional variables you'll need to be passing from your application to the material (frame index, image, etc etc). No need to have a modified frag. Fragment shaders are for when you want to actually have the shader affect the geometry's uhhh.... geometry :p for like deformations and such. I've seen people do some crazy shit with frag shaders though, but they tend to get a bit specialized so I wouldn't worry about it just yet. You can link to an existing frag shader from your j3md as well; it doesn't need to be custom or a copy-paste of an existing one.

You must be confusing it with the geometry shader, which is not supported by jME3 yet. The fragment shader (also called the pixel shader) is responsible for the color of a pixel.
nehon said:
`
uniform mat4 g_WorldViewProjectionMatrix;
uniform int m_Index;
attribute vec3 inPosition;
attribute vec4 inTexCoord;
varying vec2 texCoord;
const float nbFrames = 10.0;
`

g_WorldViewProjectionMatrix, inPosition and inTexCoord have to be there for the position calculation and texture-coordinate manipulation.
texCoord is the parameter that's going to be passed to the fragment shader, so it knows what to draw, right?
The constant isn't necessary/desired, especially if I want sprites that can have a different number of frames per animation, correct?
m_Index is the material attribute I have to set in my Sprite class via
material.setInt("Index", currentFrame);
and it would be the frame I want to show (e.g. the second frame in the animation), also correct?
This would also mean that I'd need to create a MaterialParameter
Integer m_Index
in the j3md file, right?

Also, I'd need to create another MaterialParameter for the maximum number of frames (e.g. m_maxFrames), as a substitute for the nbFrames constant?

Do I need the m_YCoCg, m_LATC and m_Normalize parameters? (I do use alpha blending, so m_ShowAlpha is probably necessary.)
Sorry for all the questions, I just want to check that I really got it and don't waste time on some stupid mistakes.
nehon said:
You must be confusing it with the geometry shader, which is not supported by jME3 yet. The fragment shader (also called the pixel shader) is responsible for the color of a pixel.


Whoops. My bad. I don't know why I do that, but it's not the first time I've confused the two. XD

~FlaH
wickermoon said:
g_WorldViewProjectionMatrix, inPosition and inTexCoord have to be there for the position calculation and texture-coordinate manipulation.
texCoord is the parameter that's going to be passed to the fragment shader, so it knows what to draw, right?
The constant isn't necessary/desired, especially if I want sprites that can have a different number of frames per animation, correct?
m_Index is the material attribute I have to set in my Sprite class via
material.setInt("Index", currentFrame);
and it would be the frame I want to show (e.g. the second frame in the animation), also correct?
This would also mean that I'd need to create a MaterialParameter
Integer m_Index
in the j3md file, right?

Also, I'd need to create another MaterialParameter for the maximum number of frames (e.g. m_maxFrames), as a substitute for the nbFrames constant?

Do I need the m_YCoCg, m_LATC and m_Normalize parameters? (I do use alpha blending, so m_ShowAlpha is probably necessary.)
Sorry for all the questions, I just want to check that I really got it and don't waste time on some stupid mistakes.

Well, every statement in your post is correct, except:
- In the j3md file, the parameter must be called Index and not m_Index; m_Index will be its name in the shader.
- You don't need the m_YCoCg, m_LATC and m_Normalize parameters, nor m_ShowAlpha (it's just a debug param to display the alpha channel on screen). Just use a texture with an alpha channel (like png), put your quad in the transparent bucket, and set the BlendMode of the material to Alpha.
This can be done in the j3md file: in the Technique block, add a RenderState block:

RenderState {
    Blend Alpha
}


Except for that...you got it ;)
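In Java, that boils down to something like this (sketch; geom being your quad's Geometry):

[java]
material.setInt("Index", currentFrame); // "Index" in the j3md, m_Index in the shader
geom.setQueueBucket(RenderQueue.Bucket.Transparent); // render with alpha blending
[/java]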

Well, so far so good. I got only a portion of the actual image to show. Unfortunately, it is always the leftmost portion and won’t change, no matter what frameNumber I set.



For testing purposes I set everything once in the constructor, so there's no actual animation, just a still image of a portion of the original image.

Sprite.java:



[java]
Material material = new Material(assetManager, "Materials/Sprite.j3md");
material.setInt("Index", 1);
material.setInt("MaxFrames", 4);
setMaterial(material);

TextureKey key = new TextureKey(animationFiles.get("stdAnim"), true);
Texture2D tex = (Texture2D) assetManager.loadTexture(key);

material.getAdditionalRenderState().setBlendMode(BlendMode.Alpha);
material.setTexture("m_Texture", tex);
[/java]



Sprite.j3md:

[java]
MaterialDef Sprite {

    MaterialParameters {
        Texture2D m_Texture // was called m_ColorMap in the original shader
        Int Index
        Int MaxFrames
    }

    Technique {
        VertexShader GLSL100:   Shaders/Sprite.vert
        FragmentShader GLSL100: Shaders/Sprite.frag

        WorldParameters {
            WorldViewProjectionMatrix
        }

        RenderState {
            Blend Alpha
        }

        Defines {
            TEXTURE : m_Texture
        }
    }

    Technique FixedFunc { }
}
[/java]



Sprite.vert:



[java]
uniform mat4 g_WorldViewProjectionMatrix;
uniform int m_Index;
uniform int m_MaxFrames;

attribute vec3 inPosition;
attribute vec2 inTexCoord;

varying vec2 texCoord;

void main()
{
    // build a four-component vector out of the 3D position and 1.0,
    // and multiply it with the World-View-Projection matrix to get
    // the actual position of the vertex
    gl_Position = g_WorldViewProjectionMatrix * vec4(inPosition, 1.0);

    // here we'll need to manipulate the texCoords so the vertex
    // only shows the part of the picture we want
    float x = float(m_Index) / float(m_MaxFrames);
    texCoord = vec2(x, inTexCoord.y);
}
[/java]





and the Sprite.frag:



[java]
#import "Common/ShaderLib/Texture.glsllib"

uniform sampler2D m_Texture;
varying vec2 texCoord;

void main(){
    // these commented-out lines were already in the original shader
    //Texture_GetColor(m_Texture, texCoord)
    //vec4 color = texture2D(m_Texture, texCoord);
    //color.rgb *= color.a;
    //gl_FragColor = vec4(color.a);

    gl_FragColor = Texture_GetColor(m_Texture, texCoord);
}
[/java]



So what am I doing wrong, exactly?

Hello again!



Try this, from what @nehon posted earlier:



[java]
float x = inTexCoord.x / float(m_MaxFrames) + float(m_Index) / float(m_MaxFrames);
[/java]



for the line that sets float x. Got to remember that the texCoord is calculated for every pixel… I think… I've always been a little fuzzy on that; @nehon would know, heh. But in any case, x is always equal to 0.25f in your case, and inTexCoord.y is being passed through as-is. So inTexCoord.x is never referenced.
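To put numbers on it: with Index = 1 and MaxFrames = 4, the old line gives x = 1/4 = 0.25 for every vertex, so the whole quad samples a single vertical sliver. The fixed line maps the quad's left edge (inTexCoord.x = 0) to 0/4 + 1/4 = 0.25 and its right edge (inTexCoord.x = 1) to 1/4 + 1/4 = 0.5, which is exactly the second frame.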



~FlaH

Tehflah is right.

However, as for the "texCoord being calculated for every pixel, that Nehon would know"… mhhh… honestly, it's fuzzy for me too. The only thing I know is that texCoords are computed for every vertex, and that for each vertex there is a corresponding block of rasterized pixels. The texture coordinates of the vertex are sent to each pixel of this block… but honestly it's all hardware stuff, and you don't have to worry about it.

You just have to be confident that the pixels will receive the right coordinates :stuck_out_tongue: