I’m finally starting to fully integrate the 2D framework into the GUI library… and… what the hell does that mean?
Well… how about a breakdown of what the 2D framework does first? That should make it a bit more apparent what you can do with it.
The framework is a port (of sorts) of LibGDX’s functionality in a JME-friendly way. It allows you to:
Create AnimElements - an AnimElement is a collection of quads (much like a particle emitter’s particle mesh) that you can manipulate on a per-quad basis or as a whole.
Manipulate these Transformables (both AnimElements and their quads) by running TemporalActions against them (moving, scaling, rotating, or altering texCoords over time), or by directly calling one of the Transformable methods for altering transforms.
Set the parent linkage of the quads of an AnimElement to get a cascading effect: when you alter one quad, its transform affects those of its children as well.
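To make the cascading parent linkage concrete, here is a minimal plain-Java sketch of the idea. The class and method names here are hypothetical (this is not the actual tonegodGUI API): a quad resolves its world position by adding its local offset to every ancestor’s, so moving a parent implicitly moves all of its children.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of cascading quad transforms, not tonegodGUI's real classes.
class Quad {
    float x, y;                // local translation, relative to the parent
    Quad parent;
    final List<Quad> children = new ArrayList<>();

    void setParent(Quad p) {
        this.parent = p;
        p.children.add(this);
    }

    // Walk up the parent chain to resolve the world-space position,
    // so any change to an ancestor cascades down automatically.
    float worldX() {
        return x + (parent != null ? parent.worldX() : 0f);
    }

    float worldY() {
        return y + (parent != null ? parent.worldY() : 0f);
    }
}
```

Moving the root quad then shifts every linked child by the same amount, which is what makes effects like the fanned card deck cheap to animate.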
On the screen level, you can:
Create different AnimLayers to group like AnimElements in a particular ZOrder range
Use tonegodGUI Listeners to automatically add mouse/touch events to AnimElements and individual QuadData. This includes Mouse Left Down, Mouse Left Up, Mouse Right Down, Mouse Right Up, Mouse Wheel Down, Mouse Wheel Up, Mouse Focus, Mouse Lost Focus, and all Android (hopefully iOS as well) touch events.
The card deck in the video is a single mesh:
Limit ZOrder effect to AnimElement, QuadData, Both or None per AnimElement
Limit Mouse/Touch Movement to AnimElement or QuadData per AnimElement
In short: pretty much all the interaction you need to build, in short order, a game like Plants vs Zombies (a good example of the types of animations you can produce with AnimElement), a card game, and so on.
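The per-AnimElement ZOrder control described above boils down to deciding the draw order of the batched quads. Here is a rough sketch of that idea in plain Java (the class names are hypothetical, not tonegodGUI’s actual implementation): quads are re-sorted by their zOrder value before the mesh’s triangles are rebuilt, so a higher zOrder draws on top.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Hypothetical sketch of per-quad ZOrder sorting, not the real tonegodGUI classes.
class QuadData {
    final String name;
    float zOrder;
    QuadData(String name, float zOrder) { this.name = name; this.zOrder = zOrder; }
}

class AnimElementSketch {
    final List<QuadData> quads = new ArrayList<>();

    // Returns quads in draw order, back (lowest zOrder) to front (highest),
    // which is the order the batched mesh's triangles would be rebuilt in.
    List<QuadData> drawOrder() {
        List<QuadData> sorted = new ArrayList<>(quads);
        sorted.sort(Comparator.comparingDouble((QuadData q) -> q.zOrder));
        return sorted;
    }
}
```

Limiting the effect to AnimElement, QuadData, both, or none then just means choosing which of these lists participates in the sort.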
Here are some other things I have done while testing this, just to help better show off what you can do with it:
Level scoring, as seen in most mobile games (makes use of the AnimManager for sequencing TemporalActions over a timeline):
Image shaped 2D particle emission:
Animated Bitmap Text
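The level-scoring example above relies on sequencing actions over a timeline. The sketch below shows the general shape of that idea in plain Java (the Timeline class and its methods are hypothetical, standing in for what the AnimManager does): each action is scheduled at a start time, and an update(tpf) call advances the clock and fires entries once as the clock passes them.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical timeline sequencer, in the spirit of the AnimManager.
class Timeline {
    private static class Entry {
        final float startTime;
        final Runnable action;
        boolean fired;
        Entry(float startTime, Runnable action) {
            this.startTime = startTime;
            this.action = action;
        }
    }

    private final List<Entry> entries = new ArrayList<>();
    private float time;

    void addAction(float startTime, Runnable action) {
        entries.add(new Entry(startTime, action));
    }

    // Advance the clock by tpf (time-per-frame) and fire each due action once.
    void update(float tpf) {
        time += tpf;
        for (Entry e : entries) {
            if (!e.fired && time >= e.startTime) {
                e.fired = true;
                e.action.run();
            }
        }
    }
}
```

Chaining the score tally, star pops, and button fade-ins is then just a matter of scheduling each TemporalAction at the right offset.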
I’ll be adding some basics for how to use the 2D framework to this thread until I am able to finally start cleaning up, updating, and expanding the Wiki. Prior to the Screen-level support, the framework was semi-limited in its application, as you would have had to write all of the input handling yourself. So… I’m pretty excited about this addition!
On a related side note:
The technique I am using for reordering the mesh quads is a viable solution for particle emitter meshes as well. What the hell does THIS mean?? It means I’ll be able to update my particle emitter to properly sort the particles based on world position, so materials like Lighting.j3md become a completely viable option, without the need for AlphaAdditive blending to hide the fact that the particles are not rendered in the correct depth order.
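The depth-sorting idea can be sketched in a few lines of plain Java (this is an illustration of the technique, not the emitter’s actual code): sort the particles farthest-first by their distance from the camera, so alpha-blended quads render back-to-front and overlap correctly.

```java
import java.util.Comparator;
import java.util.List;

// Illustrative depth sort for particle quads, not the actual emitter code.
class Particle {
    final float x, y, z;
    Particle(float x, float y, float z) { this.x = x; this.y = y; this.z = z; }

    // Squared distance to the camera; squaring avoids a sqrt per particle
    // and preserves the ordering.
    float distSq(float cx, float cy, float cz) {
        float dx = x - cx, dy = y - cy, dz = z - cz;
        return dx * dx + dy * dy + dz * dz;
    }
}

class ParticleSorter {
    // Sort farthest-first (back-to-front) so nearer particles
    // are drawn over farther ones.
    static void sortBackToFront(List<Particle> ps, float cx, float cy, float cz) {
        ps.sort(Comparator
                .comparingDouble((Particle p) -> p.distSq(cx, cy, cz))
                .reversed());
    }
}
```

Re-running this sort whenever the camera moves (and rebuilding the index buffer in the resulting order) is what makes lit, non-additive particle materials workable.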