Terrain, Tileable textures and normal maps

First the issue, and then the general question.



The issue: when not using normal maps everything looks quite OK, but when I add a normal map everything looks pretty messed up. I assume it’s due to wrong/messed-up texCoords, or is it something else? (I have already tried so many combinations for setting the texCoords.)



http://i.imgur.com/tfHt9.jpg



The next question: how do I correctly texture a world with overhangs and tunnels without visible tiling? Should I use special textures which are tileable in all directions, even when they are flipped, or does someone have a better approach?



In general I would prefer world-position-based texturing, but I don’t know how to deal with changes on the third axis. I would be very thankful for any ideas…

Can you give a tiny bit more info about how you split tiles and how you’re applying the texCoords?

@t0neg0d said:
Can you give a tiny bit more info about how you split tiles and how you're applying the texCoords?



Sure, but I don't know if it helps.

I have a 3D grid of nodes.

The current node in the following function is always

index[0], index[1], index[2]

I get the neighbour nodes through:

[java]
Vector2f t1 = new Vector2f(0, 0);
Vector2f t2 = new Vector2f(0, 1);
Vector2f t3 = new Vector2f(1, 1);
Vector2f t4 = new Vector2f(1, 0);
ElasticSurfaceNode n1 = this.surfaceMap.getElasticSurfaceNode(index[0], index[1], index[2], type);
ElasticSurfaceNode n2 = this.surfaceMap.getElasticSurfaceNode(index[0], index[1], index[2] - 1, type);
ElasticSurfaceNode n3 = this.surfaceMap.getElasticSurfaceNode(index[0] + 1, index[1], index[2] - 1, type);
ElasticSurfaceNode n4 = this.surfaceMap.getElasticSurfaceNode(index[0] + 1, index[1], index[2], type);
[/java]

This gets me a horizontal x,z quad, from which I can then generate two triangles.
I have to check the summed normals of all four vertices to know which side faces outward.
p1 to p4 are the vertices, no1 to no4 the normals, t1 to t4 the texCoords, and n1 to n4 the actual nodes.

[java]
if (n1 != null && n2 != null && n3 != null && n4 != null) {
    Vector3f normal = n1.getNormal().add(n2.getNormal()).add(n3.getNormal()).add(n4.getNormal());
    normal.normalizeLocal();
    if (normal.y > 0) {
        Vector3f p1 = n1.getSmoothed();
        Vector3f p2 = n2.getSmoothed();
        Vector3f p3 = n3.getSmoothed();
        Vector3f p4 = n4.getSmoothed();

        Vector3f no1 = n1.getNormal();
        Vector3f no2 = n2.getNormal();
        Vector3f no3 = n3.getNormal();
        Vector3f no4 = n4.getNormal();

        info.addPosition(p1);
        info.addPosition(p3);
        info.addPosition(p2);

        info.addPosition(p1);
        info.addPosition(p4);
        info.addPosition(p3);

        info.addNormal(no1);
        info.addNormal(no3);
        info.addNormal(no2);

        info.addNormal(no1);
        info.addNormal(no4);
        info.addNormal(no3);

        info.addTexCoord(1, t1);
        info.addTexCoord(1, t3);
        info.addTexCoord(1, t2);

        info.addTexCoord(1, t1);
        info.addTexCoord(1, t4);
        info.addTexCoord(1, t3);
    } else {
        Vector3f p1 = n1.getSmoothed();
        Vector3f p2 = n2.getSmoothed();
        Vector3f p3 = n3.getSmoothed();
        Vector3f p4 = n4.getSmoothed();

        Vector3f no1 = n1.getNormal();
        Vector3f no2 = n2.getNormal();
        Vector3f no3 = n3.getNormal();
        Vector3f no4 = n4.getNormal();

        info.addPosition(p1);
        info.addPosition(p2);
        info.addPosition(p3);

        info.addPosition(p1);
        info.addPosition(p3);
        info.addPosition(p4);

        info.addNormal(no1);
        info.addNormal(no2);
        info.addNormal(no3);

        info.addNormal(no1);
        info.addNormal(no3);
        info.addNormal(no4);

        info.addTexCoord(1, t1);
        info.addTexCoord(1, t2);
        info.addTexCoord(1, t3);

        info.addTexCoord(1, t1);
        info.addTexCoord(1, t3);
        info.addTexCoord(1, t4);
    }
}

[/java]


Then I repeat that for all three axis combinations: x,z = top and bottom; x,y = front and back; y,z = sides.

I use triplanar texturing: calculating UV coordinates for all three planes from the world coordinates, then blending them according to weights calculated from the world normal.

It’s pretty neat. Slow, but I guess the fastest option available. And it looks so freakin’ nice. :slight_smile:
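A minimal CPU-side sketch of the blend-weight part of this technique (a hypothetical helper, not KuroSei’s actual shader): the absolute components of the world normal, normalized to sum to 1, give the blend factor for each of the three planar lookups.

```java
// Hypothetical sketch of the triplanar blend weights described above:
// each axis-aligned plane contributes in proportion to how much the
// world normal points along that axis.
public class TriplanarWeights {

    /** Blend weights for the y/z, x/z and x/y planes; they sum to 1. */
    public static float[] weights(float nx, float ny, float nz) {
        float ax = Math.abs(nx), ay = Math.abs(ny), az = Math.abs(nz);
        float sum = ax + ay + az;
        return new float[] { ax / sum, ay / sum, az / sum };
    }

    public static void main(String[] args) {
        // A face pointing straight up samples only the x/z (top) plane.
        float[] w = weights(0f, 1f, 0f);
        System.out.println(w[0] + " " + w[1] + " " + w[2]); // 0.0 1.0 0.0
    }
}
```

In the fragment shader the same math runs per pixel; the weights are often raised to a power first to sharpen the transitions between planes.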





I’m using a simple setup in my project. I didn’t quite figure out an efficient way to blend multiple materials seamlessly. It’s quite a pain. My attempts were either way too slow or didn’t offer satisfying results.





Here’s an example of how it looks. It’s a screen with highly repeating textures, since I used 256x256 here. It looks pretty nice with higher- or lower-scaled textures, too. But meh… my laptop is too crappy to handle even my (multithreaded) terrain generation and physics at the same time. (I’m investigating per-voxel physics atm…)







If you need some (GLSL) shader code, I came up with some. :stuck_out_tongue:

@KuroSei said:
I use triplanar texturing: Calculating UV Coordinates for all three planes from world coordinates and then blend them according to weights calculated from the world normal.
Its pretty neat. Slow, but i guess the fastest option available. And: It looks so freakin nice. :)


I'm using a simple setup in my project. I didnt quite figure an efficient way to blend multiple materials seemlessly. Its quite a pain. My attempts were either way to slow or didnt offer satisfying results.


Heres an example how it looks. Its a screen with highly repeating textures, since i used 256x256 here. It looks pretty nice with higher or lower scaled textures, too. But meh.. my laptop is to crappy to handle even my (mutlithreaded) terraingeneration and physics at a time. ( Im investigating per Voxel physics atm... )



If you need some (GLSL) Shader code i came up with some. :P


Thank you. One question: are you blending the UV coords, or, as described in GPU Gems, the colors after the lookup?
Because up to now I have only seen methods blending the textures. Sure, if you have some code you want to share, it would be a big help.

@KuroSei very cool looking! Is this based off the article from GPUGems? The terrain looks very similar to the example there. (Which I thought was awesome then too!)



@zzuegg

If I read this correctly, you are not stretching the texcoords over larger areas, correct? Which negates scaling of textures (seamlessly, anyway). Ummm… wow. The first thing that came to mind is pretty much out the window, I think. The only other option that comes to mind is potentially slow… but here it is anyway.



You could walk the vertices and apply texcoords based on distance from the neighbor AND (this is the important part) distance from the start coord. This way, each tile (chunk, whatever) would cover 0.0 to 1.0 (potentially slightly beyond, depending on whether the chunk/tile ends at an overhang) and use texture scaling… which will add insane amounts of up-close detail as well. The idea should account for all axes so there is no stretching, and be gobs faster than other options… just a little more confusing to set up the method that maps the coords.



There is the problem of potential seams at the crossover point from vertical to reversed horizontal direction… but I am sure it’s something that could be accounted for and fixed… it just isn’t hitting me atm.



EDIT: actually… this would not be any slower than standard grid tiles.



EDIT 2: Just wanted to mention… I think texture scaling is critical when it comes to producing highly detailed texture maps at small texture resolution…



EDIT 3: actually… the seam issue I talked about is imaginary. That’s why I mentioned needing both the last neighboring vertex’s texcoords and the distance from origin.

I failed to mention this will resolve your issue with normal maps as well, because it takes a complex 3D model and maps the textures much like you do when using Blender’s unwrap. Actually… this isn’t a great comparison, as the reversed horizontal positions double back over previously used texcoord space.



Side note: You might need to be aware of the last texcoord used in one chunk to set up the texcoords for the adjacent chunks.



EDIT: Gawd… sorry about this. If you give it some thought, you should be able to use the same method you use to create vertex positions to also map the texcoords. The only difference is the scale at which you are looking at distance.
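The walk described above might look roughly like this (a hypothetical helper using plain float arrays; real code would work on the mesh’s vertex buffers): the U coordinate grows with the accumulated 3D distance along a row of vertices, so vertical drops and overhangs keep the same texel density as flat ground.

```java
// Hypothetical sketch of distance-based texcoord mapping: U advances
// by the true 3D distance between neighbouring vertices, divided by
// the world size of one texture repeat.
public class DistanceTexCoords {

    /** row = vertices as {x, y, z}; tileSize = world units per texture repeat. */
    public static float[] uCoords(float[][] row, float tileSize) {
        float[] u = new float[row.length];
        float travelled = 0f;
        for (int i = 1; i < row.length; i++) {
            float dx = row[i][0] - row[i - 1][0];
            float dy = row[i][1] - row[i - 1][1];
            float dz = row[i][2] - row[i - 1][2];
            travelled += (float) Math.sqrt(dx * dx + dy * dy + dz * dz);
            u[i] = travelled / tileSize; // overhangs advance U just like flat ground
        }
        return u;
    }

    public static void main(String[] args) {
        // One flat step followed by one vertical step: U keeps advancing evenly.
        float[] u = uCoords(new float[][] { {0, 0, 0}, {1, 0, 0}, {1, 1, 0} }, 1f);
        System.out.println(u[0] + " " + u[1] + " " + u[2]); // 0.0 1.0 2.0
    }
}
```

The same accumulation run along the second surface direction would give V; carrying the last U/V of one chunk into the next is what keeps adjacent chunks seamless.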

TerrainLighting material uses tri-planar mapping, so you can borrow the code from there if you would like.

@zzuegg: I compared a bit with the GPU Gems article, and I use almost the same technique they do. The only thing I did further was different pairs of UVs for the diffuse, normal and some blended (overlays, so to speak) textures. I removed that though, partially for performance reasons, but mostly because I still hadn’t found a satisfying solution for multi-material terrain.



@t0neg0d: I had a short glance at it when I started my project, but I didn’t really read it until just now, to be honest. Most of my work is trial-and-error based. I got the idea of triplanar shading from there, though. The article is great!



Furthermore, your idea for calculating the UVs is interesting.

A method to fix the seams would be to use shared volumes and take the adjacent volumes as reference when creating the edges.









A little off topic:

Does anyone have experience with 3D textures? Could one simply store material IDs per voxel in a 3D texture (of, let’s say, 64x64x64) and use that to calculate things in the shader? (I’ve got the voxel position corresponding to the world position of the pixels. With some math this should work, but it would be slightly more space consuming…)

Any ideas if that would work out? * Shameless thread hijacking *

@KuroSei said:
@zzueqq: I compared a bit with the GPU-Gems article and i use almost the same technique they do. Only thing i did further was different pairs of UV for diffuse, normal and some blended ( Overlays so to speak ) textures. I removed that tho. Partially for performance reasons, mostly for the reason, that i still didnt get a satisfying solution for multimaterial terrain.

@t0neg0d: I had a short glance at it when i started my project, but i didnt really read it until just now to be honest. Most of my work is trial and error based. I got the idea of triplanar shading from there, tho. The article is great!

Further your idea to calc UVs is interesting.
A method to fix stuff would be to use shared volumes and take the adjacent volumes for reference when creating the edges.

A little off topic:
Does anyone have experience with 3D textures. Could one simply store Material IDs per Voxel in a 3D Texture ( of lets say 64x64x64 )and use that to calculate things in the shader? ( I got the voxelposition corresponding to the world position of the pixels. With some math this should work, but would be slightly more space consuming... )
Any ideas if that would work out? * Shameless thread hijacking *


What were your thoughts on extracting the data on the shader side? Passing the information in this way is a big yes; it's getting it back out in some proper order/usable form that I would struggle with. Guess I should mention that I struggle with environment maps past the standard usage >.<

I looked it up.

It seems the 3D texture would work like a 3D float array. The only problem is that it lives in an identity cube, meaning I would have steps of 1/64. It’s a bit confusing, and I guess I would have to use an offset of 1/128 to prevent rounding errors.
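That half-texel offset works out like this (a hypothetical helper; 64 is just the texture size from the example): normalized texture coordinates address texel centers at (i + 0.5) / size, which for index 0 of a 64-wide texture is exactly the 1/128 mentioned.

```java
// Hypothetical sketch: map an integer voxel index to the normalized
// coordinate of that texel's center in a 3D texture, so nearest
// lookups never land exactly on a texel boundary.
public class VoxelTexCoord {

    public static float toTexCoord(int voxelIndex, int textureSize) {
        return (voxelIndex + 0.5f) / textureSize; // center of the texel
    }

    public static void main(String[] args) {
        System.out.println(toTexCoord(0, 64));  // 0.0078125  (= 1/128)
        System.out.println(toTexCoord(63, 64)); // 0.9921875
    }
}
```

The same formula applies per axis for the three components of the 3D lookup coordinate.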



What I’m thinking about atm is a bit more problematic. I’ll try to make a little illustration now. See you after a short commercial break… *bad reporter grin*





Edit:

Fear my mad paint skills!





What I tried to show here is a typical voxel cube (pretty minimalistic and so on :P) used to determine the visual output (mesh buffer). The different colors on the edges represent the voxel IDs.

I have some ideas on how to work with the data on the GPU, but I don’t have a clue how to determine which voxels to use for which triangle. One idea would be to calculate something like a weighted median of ALL neighbors, but that’s too expensive if you ask me.

I’m thinking about an approach using the normal and some approximations on planes, but I guess I’ve reached a dead end here…



The problem is how to calculate the blending from only the world position of the pixel (it all happens in the fragment shader, since that looks WAY better…) and the normal of the triangle. It’s irritating to have ideas you can’t work with because you lack the knowledge :confused:







Greetings,

A frustrated monkey.





PS: I know, the image is plain wrong. :smiley: This case would give significantly different results than this simple triangle :<







Edit 2:

@androlo: It’s not a problem of working on the data… it’s more the working WITH the data. >.>

I’ve done a fair share of 3D texture work, including dynamically updating and interpolating between several of them. There are no problems working with them in jME, afaik.

@androlo said:
I've done a fair share of 3D texture work, including dynamically updating and iterpolation between several ones. There are no problems working with them in jME afaik.



With what program did you create the 3D textures? I have looked around, but I did not find much information about 3D texture generation at all.

Depending on how many material types you have, you could just set a material value at each vertex and look at the interpolation results.



i.e.

Vert1

Stone: 0

Grass: 1

Dirt: 0



Vert2

Stone: 0

Grass: 0

Dirt: 1



The graphics card will automatically interpolate between these values, so your fragment shader can just pick whichever has the largest value. You could even do fancy stuff like having a vert at grass: 0.5, stone: 0.5, which would then pull the boundary between stone and grass closer to that vert while dirt stays further away.



This works fine for 3 or 4 material types… not so good for large numbers of materials, due to the number of variables needed, but maybe you can build on the idea somehow…
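The “pick the largest interpolated weight” step could be sketched like this (a hypothetical helper mirroring what the fragment shader would do with the interpolated per-vertex values):

```java
// Hypothetical sketch of zarch's idea: given the per-material weights
// the rasterizer interpolated for this fragment, select the material
// with the largest weight.
public class MaterialPick {

    /** weights = interpolated per-material weights (e.g. stone, grass, dirt). */
    public static int dominantMaterial(float[] weights) {
        int best = 0;
        for (int i = 1; i < weights.length; i++) {
            if (weights[i] > weights[best]) best = i;
        }
        return best;
    }

    public static void main(String[] args) {
        // stone = 0.1, grass = 0.3, dirt = 0.6: dirt (index 2) wins.
        System.out.println(dominantMaterial(new float[] { 0.1f, 0.3f, 0.6f })); // 2
    }
}
```

A soft variant would blend the top two materials by their weights instead of hard-picking one, at the cost of extra texture lookups.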

@zarch said:
Depending on how many material types you have you could just set a material value at each vertex and look at the interpolation results.

i.e.
Vert1
Stone: 0
Grass: 1
Dirt: 0

Vert2
Stone: 0
Grass: 0
Dirt: 1

Graphics card will automatically iterate between these values so your fragment shader can just pick whichever has the largest value. Could even do fancy stuff like having a vert at grass: 0.5, Stone: 0.5 - which will then move the boundary for stone and grass closer to that vert but dirt would stay further away.

Works fine for 3 or 4 material types...not so good for large numbers of materials due to the number of variables needed but maybe you can build on the idea somehow...


Yep, I have a similar system in mind; it would allow me to use 28 material weights applied to each mesh.
Additionally, I’ll probably add material parameters for each one, like NormalUpperLimit/LowerLimit and such, and maybe even use some noise for the interpolation. Should work well, I think. But since the basic terrain stuff works, I have moved texturing a bit lower on my todo list; I want to see some action going on the terrain first.

@zzuegg

I used jME… I made the 3D textures procedurally.



I honestly have no idea what y’all are talking about here; I just saw 3D textures mentioned and wanted to say I have worked with them and have no complaints, neither about the jME nor the OpenGL support. :slight_smile: