Hi forum,
My next game requires (mostly for replayability) a randomly generated dungeon to crawl, and a princess to save from it.
Now, I have no problem with the mathematics, probabilities, randomness of it, but my question is this:
Do I use shapes (quads) or models for the rooms and corridors?
My first idea for the rooms was to make a “Room” class and, inside it, make six quads for the walls, floor, and ceiling, positioned around a “room node”. That way I could make an array of rooms and fill it with new Room()s.
But with this method I think it would be difficult to have corridors with holes for doorways in them, for example, because in my experience there are still tiny gaps in between the quads even when they are positioned exactly (0f apart) next to each other.
However, using quads I could make the length of the corridor and the size of the room another random factor.
If I used models, my guess is they would use more memory? But it would be easier to do; I wouldn’t have to bother with all the positioning.
I’d like to know if anyone else has done anything like this and if anyone has any advice to offer?
Thanks in advance,
glcheetham
You want to generate a mesh from the environment that you then texture using a texture atlas image:
https://wiki.jmonkeyengine.org/legacy/doku.php/jme3:advanced:custom_meshes
It should be only the surface of the final “landscape”. That is, if you were to create a box world, you would only put the faces that together form the surface into the mesh. This is the most efficient way; basically, it’s also what “voxel worlds” do. How you create the mesh depends on the complexity of your environment. Boxes etc. are relatively easy; using a “marching cubes” algorithm to create smoother surfaces is more complex.
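To make the “surface only” idea concrete, here is a minimal sketch in plain Java (no jME3 types; the class and method names are made up for illustration). It walks a 2D grid of solid cells and emits a wall quad only where a solid cell borders empty space, which is the face-culling step a voxel-style mesher does before handing the vertex data to a jME3 Mesh:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: emit only the exposed north-facing walls of a grid of
// solid cells (1 = solid, 0 = empty). A full mesher would do the same test for
// all six directions, then pack the quads into position/index buffers.
class SurfaceMesh {

    // Returns one float[12] (four corners, x/y/z each) per wall quad that
    // borders empty space; walls between two solid cells are skipped.
    static List<float[]> exposedNorthWalls(int[][] grid) {
        List<float[]> quads = new ArrayList<>();
        for (int z = 0; z < grid.length; z++) {
            for (int x = 0; x < grid[z].length; x++) {
                boolean solid = grid[z][x] == 1;
                boolean neighborEmpty = (z == 0) || grid[z - 1][x] == 0;
                if (solid && neighborEmpty) {
                    quads.add(new float[] {
                        x,     0f, z,   // bottom-left
                        x + 1, 0f, z,   // bottom-right
                        x + 1, 1f, z,   // top-right
                        x,     1f, z    // top-left
                    });
                }
            }
        }
        return quads;
    }
}
```

In jME3 you would then flatten these quads into one FloatBuffer for VertexBuffer.Type.Position and add two triangles per quad (indices 0,1,2 and 0,2,3) to the index buffer, as the custom-meshes tutorial linked above shows.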
I’m thinking of dungeons akin to the original Wolfenstein or Doom; would this be possible using your method? Remember, I don’t want anything more complicated than a bunch of cuboids stuck together with occasional holes in them.
@glcheetham said:
I'm thinking of dungeons akin to the original Wolfenstein or Doom; would this be possible using your method? Remember, I don't want anything more complicated than a bunch of cuboids stuck together with occasional holes in them.
Sure, but you still have to do what I mentioned to get optimum performance. Stacking geometries will not get you there.
OK, I’ve just finished making a corridor mesh out of an array of Vector3fs, but I have another problem:
I want the walls, floor, and ceiling to all have different materials.
I suppose I have to use an “atlas image”?
If you could tell me how, or point me to a tutorial, that would be great.
An atlas is just a texture containing lots of different textures within it (e.g. stone, wood, floor, etc.).
You adjust the UV mappings of the vertices to point at the correct parts of the texture atlas so that all the sections get the right image.
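As a small illustration of that adjustment, assuming the atlas is laid out as a uniform grid of equally sized tiles (class and method names are hypothetical), the UV rectangle of one tile is just its grid position divided by the grid size:

```java
// Hypothetical sketch: compute the UV sub-rectangle of one tile in a texture
// atlas laid out as a uniform grid (e.g. a 4x4 grid of stone, wood, floor...).
class AtlasUV {

    // Returns {u0, v0, u1, v1}: the corners of tile (col, row) in 0..1 space.
    static float[] tileRect(int col, int row, int tilesAcross, int tilesDown) {
        float tileW = 1f / tilesAcross;
        float tileH = 1f / tilesDown;
        return new float[] {
            col * tileW, row * tileH,             // lower corner
            (col + 1) * tileW, (row + 1) * tileH  // upper corner
        };
    }
}
```

A quad that should show the tile at column 1, row 0 of a 4×4 atlas would then get its texCoords from tileRect(1, 0, 4, 4), i.e. u from 0.25 to 0.5 and v from 0 to 0.25.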
@glcheetham said:
OK, I've just finished making a corridor mesh out of an array of Vector3fs, but I have another problem:
I want the walls, floor, and ceiling to all have different materials.
I suppose I have to use an "atlas image"?
If you could tell me how, or point me to a tutorial, that would be great.
I did. The tutorial also shows you how to set texture coordinate buffers. You have to set them so that each vertex has the correct location in the texture, in x/y coordinates from 0 to 1.
Normen, you’re the best
Zarch, thanks for joining in
I’m still having trouble with the materials. How can I have multiple materials on one geometry? And how can I set these materials to render only on certain faces of my mesh?
@glcheetham said:
I'm still having trouble with the materials. How can I have multiple materials on one geometry? And how can I set these materials to render only on certain faces of my mesh?
You don't need that; you need only one material. The other "looks" come from what the texture atlas looks like at that point. The diffuse, normal, and light maps do a lot; you don't need special material settings.
And I was planning on having varied corridor lengths by generating a random corridorLength float. Am I right in saying that a texture atlas is designed to wrap around one size of corridor and wouldn’t work if the corridor was longer or wider?
@glcheetham said:
And I was planning on having varied corridor lengths by generating a random corridorLength float. Am I right in saying that a texture atlas is designed to wrap around one size of corridor and wouldn't work if the corridor was longer or wider?
Depends on how you set your vertices; you don’t have to be too frugal with them if you build one mesh anyway. Say you have a corridor and want to repeat one wall tile along it: you’d have to subdivide it so that you can set the texture coordinates in a way that makes the texture repeat at the correct locations.
I would simply abuse the texCoords to mark what kind of vertex each one is (top, left, right, bottom, front, back) and modify the shader…
Might sound more complicated at first, but it isn’t, since you already build your own mesh.
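A sketch of the “subdivide it” approach (plain Java, hypothetical names): split the corridor wall into unit-length segments and map each segment back to the tile’s 0..1 range, so a tile repeats along a corridor of any random length. Vertices at the seams have to be duplicated, because the segment on the left needs u = 1 there while the segment on the right needs u = 0:

```java
// Hypothetical sketch: split a corridor wall of arbitrary length into
// unit-length segments so one wall tile can repeat along it. Each entry is
// {startX, u0, endX, u1}; the last segment may be partial, so its u1 is the
// leftover fraction of the tile.
class CorridorUV {

    static float[][] wallSegments(float corridorLength) {
        int count = (int) Math.ceil(corridorLength);
        float[][] segments = new float[count][];
        for (int i = 0; i < count; i++) {
            float x0 = i;
            float x1 = Math.min(i + 1, corridorLength);
            segments[i] = new float[] { x0, 0f, x1, x1 - x0 };
        }
        return segments;
    }
}
```

If the tile lives inside an atlas rather than filling the whole texture, each segment’s 0..1 u range would then be remapped into the tile’s sub-rectangle of the atlas.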
@glcheetham said:
OK, I've just finished making a corridor mesh out of an array of Vector3fs, but I have another problem:
I want the walls, floor, and ceiling to all have different materials.
I suppose I have to use an "atlas image"?
If you could tell me how, or point me to a tutorial, that would be great.
I personally use tiled images and texture scaling/splatting to achieve this effect. You get insanely great detail up close, but… it requires producing good tileable images and mapping texture coords appropriately to divide walls from floors from ceilings. Using a noise function to randomize your overlay opacity maps a bit (moss, dirt, grease, etc. on the walls) goes a long way towards realism with tiled textures. This also removes seams, because you can texture LARGE areas (e.g. every corridor with a single material at the same time).
EDIT: Let me explain the technique a little further, in case you find it interesting.
1. Designate texCoords 0.0, 0.0 through 0.5, 0.25 as your coord space for all floors, no matter whether they are one object at the moment or not.
2. Determine the width/height scales you will use when applying texCoords to the Mesh (e.g. 2f in world units = 0.025 in texCoords). This way, when you lay out a long vs. a short corridor, you know how to map the texCoords into the floor space of your texture, and you ensure that one corridor’s scale doesn’t differ from another’s.
3. Create a base alpha map that only shows from 0 to halfway across the X pixels and from 0 to a quarter of the way down the Y pixels. (This mirrors your floor space in texCoords.)
4. Create random overlay alpha maps for splatting other textures. This is fairly easy to do using approximate shapes overlaid with noise.
Oops… forgot to mention: you can pre-create the overlay alpha maps if you always use the same layouts for floors, walls, and ceilings. You can also pre-create a series of splat alpha maps to choose from randomly.
Anyway… hope this idea helps.
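For step 2 above, the scale bookkeeping could look like this (a sketch under the assumptions in the post: floors own the texCoord region 0..0.5 × 0..0.25, and 2 world units map to 0.025 in texCoords, i.e. 0.0125 per unit; the class and method names are made up):

```java
// Hypothetical sketch of mapping a floor vertex's world position into the
// floor region of the texture (u in 0..0.5, v in 0..0.25) at a fixed scale,
// so long and short corridors get the same texel density.
class FloorUV {

    static final float SCALE = 0.0125f; // 2 world units -> 0.025 in texCoords

    // Wraps inside the floor region; note that for correct interpolation the
    // mesh must be subdivided wherever the coordinate wraps around.
    static float[] floorTexCoord(float worldX, float worldZ) {
        float u = (worldX * SCALE) % 0.5f;
        float v = (worldZ * SCALE) % 0.25f;
        return new float[] { u, v };
    }
}
```

Every floor vertex run through the same function lands in the same quarter-by-half region of the texture, so one big material can cover all corridors without visible scale changes between them.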
Thanks for all the help guys, you’ve been great!
But I still don’t understand how a Vector2f texCoord can express where a texture is rendered on the mesh.
I even tried changing all the numbers in my texCoord array to ridiculous values, and I was still left with an unshaded blue cuboid.
I know every vertex in a polygon is assigned a texture coordinate, but my texCoords are Vector2fs and my vertices Vector3fs, so how can the computer match them up?
The only thing the docs give me on the subject is “Next, define the Quad’s 2D texture coordinates for each vertex”, and as for your replies, though I’m sure they make perfect sense, I think I need to know more about the whole texCoord idea before I can understand them.
Thanks for the time!
@glcheetham said:
Thanks for all the help guys, you've been great!
But I still don't understand how a Vector2f texCoord can express where a texture is rendered on the mesh.
I even tried changing all the numbers in my texCoord array to ridiculous values, and I was still left with an unshaded blue cuboid.
I know every vertex in a polygon is assigned a texture coordinate, but my texCoords are Vector2fs and my vertices Vector3fs, so how can the computer match them up?
The only thing the docs give me on the subject is "Next, define the Quad's 2D texture coordinates for each vertex", and as for your replies, though I'm sure they make perfect sense, I think I need to know more about the whole texCoord idea before I can understand them :)
Thanks for the time!
Did you set the texture coordinates buffer?
Did you set a colour map on the material?
Textures are two-dimensional images.
Texture coordinates map a point on the model to a point within that 2D image (the range is always 0→1, so the resolution of the image doesn’t affect it).
@zarch said:
Textures are two-dimensional images.
Texture coordinates map a point on the model to a point within that 2D image (the range is always 0->1, so the resolution of the image doesn't affect it).
Little bit more about this...
You have an image that is 512 x 512
In texCoords....
texCoord 0.0f, 0.0f = pixel 1,1
texCoord 1.0f, 1.0f = pixel 512, 512
Say you wanted to find pixel 200, 200 of the above image.
1 pixel in texCoords = 1.0f / 512.0f (or 1 divided by the width of the image as a float)
Same process for the height (though usually the images used as textures are square).
200 pixels in texCoords = 1.0f / 512.0f * 200.0f
Basically, 1.0f divided by the width of the image, times the pixel you're looking for.
The reason they are handled this way is...
1) Shader texture samples can reside between multiple pixels.
2) No matter what the dimensions of the image used as a texture are, the calculations in the shader are always the same.
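That arithmetic, written out as code (the class and method names are made up for illustration):

```java
// The pixel-to-texCoord conversion described above: 1.0f divided by the image
// size, times the pixel you are looking for.
class TexCoordMath {

    static float pixelToTexCoord(int pixel, int imageSize) {
        return 1.0f / imageSize * pixel;
    }
}
```

So pixel 200 of a 512-pixel-wide texture sits at texCoord 1.0f / 512.0f * 200.0f = 0.390625f, and the same call works unchanged for a 256 or 1024 pixel image.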