Overlapping textures in 2D

Hi everyone,

I’m new to jMonkeyEngine and, all in all, new to game development as well. But my entire life I’ve wanted to implement a game, so I decided to start with a simple 2D tile map game. All of that works very well with jMonkeyEngine (I know it’s designed for 3D games). But now I have an issue with textures/geometries (not) overlapping other textures/geometries.

The game is similar to Prison Architect or RimWorld. It has an x and a y coordinate. The z coordinate is only used for scaling with the camera; for all other graphical components it is never used. So when I have two overlapping textures of two different objects, sometimes the wrong texture/geometry overlaps the other one. Is there any way to manage which texture/geometry is visible in front, especially when the textures/geometries are attached to different nodes?

Thank you very much,


The Z coordinate.


Yes, but this is exactly what I don’t want to use, so let me ask again:
Is there any way to manage which texture/geometry is visible in front, especially when different nodes attach the textures/geometries, and the z coordinate should not be used?

You can manually do exactly what the Z is doing by adding your own custom geometry comparator… or you can use Z for what it’s supposed to be used for, and handle whatever else you’re currently using Z for with whatever you’d use to keep track of layers.
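Engine aside, the layer-sorting idea boils down to an ordinary comparator. A minimal, engine-agnostic sketch (the `Sprite` class and `layer` field here are made up for illustration, not jME API):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Each sprite carries a "layer" value; a comparator decides draw order.
// Sorting back-to-front means things drawn later appear on top.
class Sprite {
    final String name;
    final float layer;
    Sprite(String name, float layer) { this.name = name; this.layer = layer; }
}

public class LayerSortDemo {
    public static void main(String[] args) {
        List<Sprite> drawList = new ArrayList<>(List.of(
                new Sprite("tree", 1.0f),
                new Sprite("floor", 0.0f),
                new Sprite("npc", 2.0f)));
        // Lower layers first: floor, then tree, then npc on top.
        drawList.sort(Comparator.comparingDouble(s -> s.layer));
        for (Sprite s : drawList) {
            System.out.println(s.name);
        }
    }
}
```

A render-queue comparator in jME works the same way, except it compares `Geometry` objects instead of a made-up sprite class.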


If that’s what you’re getting then you don’t have something set up right for your 2D camera. You can probably just do it all in the guiNode which is 2D by default really.

And then what @pspeed said.

@pspeed thank you. I’ll try that or I’ll try a workaround with the z coordinate.

@MoffKalast Sorry, I don’t understand exactly what you mean. Right now I use the regular camera class and I disabled the FlyByCamera. Scaling just increases or decreases the z location of the camera. Sure, that’s not the best way to use it, but it seems to work well. Also, the graphics I use are really crappy, so I don’t think I’ll have to worry about performance or resources.

Are you using a perspective or an orthographic camera? Z coordinate shouldn’t make any difference (sort of) if you’re using an orthographic camera that’s unrotated.

Note that these are, very, very, very, very different things. A geometry is the physical shape of something. A texture is nothing but a 2D image that gets smashed onto that shape and is usually used by shaders to give the shape nice colors. If you’re interested in 2D specifically, that probably means lots of moving quad geometries around. Note that textures themselves never “overlap” - but shapes do. If you move textured quads around with the right x/y/z coordinates then all the issues you mentioned above totally go away (and if you ever want to do 2.5D or top-down 3D or something like that it’s a straightforward transition).

@danielp I use an orthogonal camera. I already tried to use quads, but the interesting point is that I immediately had performance issues while my pathfinding algorithm runs. No idea why. It runs smoothly with boxes. But I’ll try that again.

So what overlaps? The texture, the geometry or the mesh?

Do this. There are more topics in this forum - search “GeometryComparator”.

No you don’t. An orthogonal camera does not do perspective and would not scale geometries depending on their Z distance from the camera. Which apparently yours does.

Edit: Dammit, orthographic not orthogonal. Oops.

Well, to be fair … orthogonal just means “perpendicular” (90°). What we are discussing here is orthographic (projection). It’s also called a parallel projection.
The theory can be found on Wikipedia (and elsewhere):
[Orthographic projection - Wikipedia]
[Parallel projection - Wikipedia]

It’s good to understand the theory, but you can also just toy with this:
just call something like cam.setParallelProjection(true);
then move camera around and call cam.setParallelProjection(false); again.


@MoffKalast & @Ogli mhm, ok, sorry. Like I said, I’m new to this branch of software development and thought you just meant the mathematical definition of orthogonal. I have to read some information about that first, but I never knowingly set the parallelProjection boolean in the camera class to false, so I assume parallel projection is active.

Edit: Ahh, orthographic :joy: Sorry. I misunderstood that.

Well, usually it should be false by default (when the camera is created).

You can just couple an input key to it (like KeyInput.KEY_O for example).
Then just toggle the flag:
cam.setParallelProjection( !cam.isParallelProjection() );
Now, if you press the O-key then it toggles to the other projection mode of the cam.
Usually you get the cam by this.getCamera() or something like that.
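As a concrete sketch of that suggestion, assuming a jME `SimpleApplication` context where `inputManager` and `cam` are available (the mapping name “ToggleProjection” is made up):

```java
// Inside simpleInitApp() of a com.jme3.app.SimpleApplication:
// bind the O key to a mapping and flip the projection mode on each press.
inputManager.addMapping("ToggleProjection", new KeyTrigger(KeyInput.KEY_O));
inputManager.addListener((ActionListener) (name, isPressed, tpf) -> {
    if (isPressed && "ToggleProjection".equals(name)) {
        // Toggle between perspective and parallel (orthographic) projection.
        cam.setParallelProjection(!cam.isParallelProjection());
    }
}, "ToggleProjection");
```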

Basically it’s either perspective or parallel (orthographic).
Perspective is how humans see their usual environment, you can change the field of view (fov) too.
Parallel (orthographic) is good if you want to look at 2D worlds or blueprints / paper pages.

Best explore it with “learning by doing”, and if you are interested, there’s always the theory stuff.

And no need to be sorry, you did nothing wrong. :slight_smile:

Cameras are perspective by default.

If you’re new to all of this then parallel projection is going to feel weird.

Or you can just use the gui node (or gui bucket) which is already parallel projection, already sorts properly on Z, etc…

At this point you’ll reaaaaally want to do yourself a big favor and go through the tutorials in the wiki in detail: https://jmonkeyengine.github.io/wiki. That covers a lot of information that I think you’ll find helpful right about now.

I’ll give a brief answer to your question here though. When you render something, you’re describing some geometry and material properties and getting a (hopefully) pretty picture out. A mesh is the convention used on modern graphics hardware to describe geometry. Note that in jME, a geometry and a mesh are not at ALL the same thing. The Mesh class is just a “math” class that’s used to store information about the vertices, U/V buffers, etc. The Geometry class is what goes in jME’s scenegraph, and the Geometry contains a mesh (describing the shape of the geometry) and a material (describing the surface of the geometry). Now, so far I haven’t mentioned textures yet. Textures are just part of the material. Materials are rendered with shaders, and textures are just one of the possible inputs to a shader (you can also use flat colors and really anything you can think of if you’re writing your own custom shaders). A texture is traditionally just a 2D image that gets wrapped onto a mesh based on the U/V coordinates of each vertex in the mesh, but now the term “texture” can mean a whole lot more and shaders can use them for far more than just coloring a mesh. If you were writing your own shaders, you could define multiple sets of U/V coordinates on the mesh and “overlap” textures in the shader, but

  1. I’ve never seen this done in practice.
  2. I’m not sure why you would want to.
  3. This would definitely be the wrong way to go about what you’re trying to accomplish.

If I understand correctly, you want some textured 2D sprite-like things to overlap nicely. By far the easiest way to do this, as others have pointed out, is to start by putting everything in the guiNode rather than the main scene root node. The guiNode is set up for exactly this sort of thing, and you’ll save yourself lots of trouble and headaches if you go this way. The best part about the guiNode is that the “overlapping” effect you want is trivial - just use different Z coordinates to stack things. Things with a higher Z coordinate go on top, things with a lower Z coordinate go below (Pitch for the wiki: when writing this post I forgot which way the Z coordinates were oriented, but it was really easy to find this information on the wiki.).
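For example, stacking two sprites in the guiNode might look roughly like this (a sketch assuming a `SimpleApplication`; the texture paths and sizes are made up):

```java
// Two Picture sprites in the guiNode; the one with the higher Z is drawn on top.
Picture tree = new Picture("tree");
tree.setImage(assetManager, "Textures/tree.png", true); // true = use alpha channel
tree.setWidth(64);
tree.setHeight(64);
tree.setPosition(100, 100); // position in screen pixels
guiNode.attachChild(tree);

Picture npc = new Picture("npc");
npc.setImage(assetManager, "Textures/npc.png", true);
npc.setWidth(32);
npc.setHeight(32);
npc.setPosition(110, 90);
npc.move(0, 0, 1); // Z = 1 puts the npc in front of the tree (which sits at Z = 0)
guiNode.attachChild(npc);
```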

Two parting thoughts:

  1. Read the wiki from beginning tutorials through the advanced section, then read the whole thing again until you understand everything. It’s extremely informative and you’ll get not just jME specific information but tons of background information about 3D in general. Doing this would be time extremely well spent and will save you tons of time later. It takes time to understand 3D programming, even with an awesome engine like jME. A little reading now could save you 10s of hours of frustration later.

  2. Pretty much all 3D programming is an enormous sleight-of-hand trick. The muzzle flash you see in an FPS game isn’t a muzzle flash at all, it’s a clever use of particles (or some other technique). Overlapping 2D sprites aren’t overlapping textures, they’re quads drawn over other quads using alpha channels to show what’s behind them, and maybe animated sprite sheets to give the illusion of a walking/jumping/crouching/etc figure. When you use jME for 2D like this, even the 2D is an illusion - it’s 3D that’s cleverly arranged so that it appears 2D, with the 3rd coordinate used to stack objects on top of each other. You might be surprised how many nifty effects you can accomplish with just a few simple meshes used in clever ways.


In case you have problems with the pathfinding, you can use the Slick2d library (Slick-Util | Slick2D) that works very well or use my modified code which adapts the A-STAR to 3d. (GitHub - thoced/PathFinder3d).


@danielp thank you for that long answer. I already started with learning more about that through reading.

@thoced thank you too for your info, but the pathfinding is one of the implemented functionalities that works better than expected. It runs with A* too.

I have another question related to the GeometryComparator. To give you a better overview, here is a picture which should explain my issue:


The geometry of my npc has a lower y coordinate than the tree. So the npc geometry should lie above the tree, so it seems like “3D”. I think you know what I mean. When the npc is behind the tree, i.e. the y coordinate of the npc is higher than the y of the tree, then the tree should lie above. To solve that I wanted to experiment with the RenderQueue, and therefore I tried to implement my own YComparator like this:

public class YComparator implements GeometryComparator {

    private Camera cam;

    @Override
    public void setCamera(Camera camera) {
        this.cam = camera;
    }

    @Override
    public int compare(Geometry o1, Geometry o2) {
        float y1 = o1.getLocalTranslation().y;
        float y2 = o2.getLocalTranslation().y;
        if (y1 < y2) {
            return -1;
        } else if (y1 > y2) {
            return 1;
        } else {
            return 0; // equal Y: keep relative order unchanged
        }
    }
}
Now I don’t have any clue how to attach the new GeometryComparator to jMonkeyEngine, especially to this RenderQueue.Bucket. I found those calls but I do not know how to use them:

    RenderQueue renderQueue = viewPort.getQueue();
    renderQueue.setGeometryComparator(RenderQueue.Bucket.XYZ, new YComparator());

Additionally, I do not understand how I could attach my custom GeometryComparator to the objects where I want to use it. I found this call:


Of course I know XYZ stands for e.g. Transparent or Opaque, but how can I use them with custom GeometryComparators, and how can I attach them to specific objects, like trees or npcs, that I want to use them with? All the tiles in my tile map, of course, must not use this GeometryComparator.

You will be better off making your geometry stretch into z as if it’s 3D even though the corners are using 2D x, y style coordinates.

So in the case of your tree, the bottom would be at say z=0 and the top at z=1. It will look exactly the same except that if you render your character in a similar way then when it’s in front it will look like it’s in front and when it’s behind it will look like it’s behind.
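A rough sketch of that idea with a custom Mesh (the vertex positions are illustrative: a unit-sized sprite whose bottom edge sits at z=0 and top edge at z=1):

```java
// A quad that "leans" into Z: the depth buffer then sorts sprites by the
// position of their base, giving correct front/behind overlap for free.
Mesh leaningQuad = new Mesh();
leaningQuad.setBuffer(VertexBuffer.Type.Position, 3, new float[] {
    0, 0, 0,   // bottom-left  (z = 0)
    1, 0, 0,   // bottom-right (z = 0)
    1, 1, 1,   // top-right    (z = 1)
    0, 1, 1    // top-left     (z = 1)
});
leaningQuad.setBuffer(VertexBuffer.Type.TexCoord, 2, new float[] {
    0, 0,   1, 0,   1, 1,   0, 1
});
leaningQuad.setBuffer(VertexBuffer.Type.Index, 3, new short[] {
    0, 1, 2,   0, 2, 3
});
leaningQuad.updateBound();
```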

As to the other:
You can’t just make up your own buckets. All you can do is change the comparator of an existing bucket.
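Sticking with the existing buckets, registering a comparator like the YComparator from the earlier post might look something like this (a sketch; Transparent is just an example bucket, and treeGeometry/npcGeometry are made-up variable names):

```java
// Swap in the custom comparator for the Transparent bucket of the main viewport...
viewPort.getQueue().setGeometryComparator(
        RenderQueue.Bucket.Transparent, new YComparator());

// ...then opt individual spatials into that bucket; everything else
// (e.g. the tile map) keeps its default bucket and default sorting.
treeGeometry.setQueueBucket(RenderQueue.Bucket.Transparent);
npcGeometry.setQueueBucket(RenderQueue.Bucket.Transparent);
```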

You should really be using the guiNode/gui bucket and doing pseudo 3D as I mention, though. It’s way easier.