Small texture UV coordinate issue

I’m a new user, so I can’t embed more than one image or two links in this post, which is annoying. I’m generating quad meshes for my terrain chunks and currently using just one texture, since, as I understand it, I’d need one mesh per texture that I add. My texture is 1024x1024, divided into 256x256 tiles. I’ve tinkered with the UV coordinates a lot, but I’m still getting bleeding into neighboring tiles. My texture:

Imgur

I’ve confirmed that each tile is exactly 256x256. Up close, the texture looks fine:

Imgur

But, as you look at the horizon…

Originally, I tried UV coordinates of x: 0.0–0.25 and y: 1.0–0.75. Then, after reading up on what they should be set to, I learned that the UV coordinate is supposed to land in the middle of the desired texel (I think?), so the formula to address texel i out of n is (2i + 1)/(2n). That led me to these coordinates for that image:

// Other JSON for texture defs
"textureCoordUpperLeft":  { "x": 0.25048828125, "y": 0.99951171875 },
"textureCoordLowerRight": { "x": 0.49951171875, "y": 0.75048828125 }
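For reference, the half-texel formula above can be sketched in plain Java. The inputs are tile indices plus tile and atlas sizes in pixels; the class and method names here are made up for illustration, not from any engine API:

```java
// Plain-Java sketch of the half-texel formula (2i + 1) / (2n): sample the
// center of the edge texels instead of the shared border between tiles.
// All sizes are in pixels; names are illustrative only.
public class AtlasUv {
    /** Returns { uLeft, vTop, uRight, vBottom } for the tile at (col, row). */
    public static double[] tileUvs(int col, int row, int tileSize, int atlasSize) {
        double half = 0.5 / atlasSize;               // half a texel in UV space
        double tile = (double) tileSize / atlasSize; // one tile in UV space
        double uLeft  = col * tile + half;
        double uRight = (col + 1) * tile - half;
        // V runs bottom-to-top in GL, so row 0 (top of the image) is near v = 1
        double vTop    = 1.0 - row * tile - half;
        double vBottom = 1.0 - (row + 1) * tile + half;
        return new double[] { uLeft, vTop, uRight, vBottom };
    }

    public static void main(String[] args) {
        // Tile (1, 0) of a 4x4 grid in a 1024x1024 atlas of 256x256 tiles
        System.out.println(java.util.Arrays.toString(tileUvs(1, 0, 256, 1024)));
    }
}
```

Running it for tile (1, 0) reproduces the same values as the JSON above.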

Still, none of this seems to help. I also tried fooling with GL clamping modes, but none of them fixed the issue. Is there something obvious I missed in the docs?

Set the min filter of your texture to “nearest” and you will avoid the bleeding from distance filtering. Then you will also look all 8-bit blocky like Minecraft.

The other alternative is to give your atlas a buffer area and stretch the cell edges into it.

Or don’t use a texture atlas.

I’ve unhid this thing like 3 times. Do the thing, forum :confused:

Thanks @pspeed! That fixed my problem:

Code I added for others:

texture.setMinFilter(Texture.MinFilter.NearestLinearMipMap); // nearest within a mip level, linear between levels

The textures within the atlas are big enough that you can’t really see that it’s blocky anyway, which is good.

I’ve given thought to not using an atlas – the primary issue I foresee is that, unless I’m misunderstanding, I’d have to have one mesh per type of block present in my chunk. Totally doable, just wondering if that won’t get pretty crazy for complex chunks. That said, it’d allow for more complex effects since I would have one material per type of block. Something to consider for the future I guess, but for now, this is perfect. I’ll exhaust the size of even an 8192x8192 atlas eventually, so…

Anyway, interestingly, when I turn antialiasing on, the lines still seem to be there. There are a few different min filters; I’m wondering if I picked the wrong one.

Mythruna uses one mesh per texture type (not block type because I reuse some textures for different block types). It let me size them differently, repeat them differently, etc… I even custom sort the partially transparent meshes to prevent the most obvious artifacts. ie: within a chunk, blocks like the thatched roof tiles are always sorted after everything so that the semi-transparent fringe of the hanging down straw doesn’t block the other transparent meshes… since that would be very common when looking out of a thatched roof house.

Yeah, I’ve done z-sorting before when I wrote my own little OpenGL-only engine in C++. Was a pain in the butt, especially with C++ prims (though I really do miss stack allocation for game development, hate having to use member vars everywhere).

Mythruna looks awesome by the way – looks like you are way, way farther along on a project that is similar to mine in a few ways. Your posts helping others elsewhere in this forum were already of great use to me :slight_smile:

Now that you mention it… I forgot how that z-sorting worked. I can’t remember how I decided which polygon was farther than another. Oh right, it was a 2D game so I had to define it manually. How do you handle that in 3D? The origin of the object containing the mesh?

In 3D, z-sorting is not a truly solvable problem. For example, suppose you have two triangular faces intersecting at some angle. You can’t truly z-sort them because neither one is fully in front of the other: parts of each obscure parts of the other. What you described only works in specific limited cases (or in 2D). That’s why things like the depth buffer are used, but with transparent rendering this is all quite an issue.
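For the limited case that does work, the usual heuristic is the one asked about above: sort whole objects back-to-front by distance from the camera to each object’s origin. A minimal plain-Java sketch (the `Item` type is a made-up stand-in for a renderable, not an engine class):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Minimal back-to-front ("painter's") sort by squared distance from the
// camera to each object's origin. As noted above, this heuristic fails
// when geometry intersects or interleaves; it only orders whole objects.
public class PainterSort {
    public record Item(String name, double x, double y, double z) {}

    static double distSq(Item it, double cx, double cy, double cz) {
        double dx = it.x() - cx, dy = it.y() - cy, dz = it.z() - cz;
        return dx * dx + dy * dy + dz * dz;
    }

    /** Farthest objects first, so nearer transparent objects draw over them. */
    public static List<Item> backToFront(List<Item> items,
                                         double cx, double cy, double cz) {
        List<Item> out = new ArrayList<>(items);
        out.sort(Comparator.comparingDouble(
                (Item it) -> distSq(it, cx, cy, cz)).reversed());
        return out;
    }
}
```

Squared distance is used to skip the square root, since only the ordering matters.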