Creating a simple terrain


    While there is no terrain system in jMonkeyEngine 3 yet, I'm trying to implement a simple one by myself.

  My idea is to have additional m_diffuseMaps and several m_alphaMaps in the shader.

  I want to use these textures to compute the final colors for the fragments.

  If I create a model (I'm using Blender) and I do not "repeat" the texture, everything works fine, because I can use the texCoord variable for the original texture, the additional diffuse textures, and the alpha texture to blend the diffuse textures.

  But if I have a texture that is "repeated", I cannot use the same texCoords for the alphaMap, because the alphaMap is supposed to be stretched over the whole mesh.

  Is there a way to have another "texCoord" that is not repeated?


  For example, I can export a model from Blender twice: once with texCoords for the base texture, and once for alpha texturing. But how can I make the second set of texCoords available in the shader?


P.S. A basic terrain system is essential. I think we could create a pluggable terrain system and implement a very basic one while the final terrain system is in development.


Ah I see. Well, currently jME3 only supports one set of texture coordinates per Geometry.

Instead of using multiple texcoords, you can just specify a scale value in the shader to use for each texture.
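A minimal fragment-shader sketch of that idea (the uniform names m_DiffuseMap2, m_AlphaMap, m_DiffuseScale and m_DiffuseScale2 are illustrative, not part of any stock jME3 material definition):

```glsl
// Sketch: one set of texture coordinates, a per-texture scale parameter.
uniform sampler2D m_DiffuseMap;    // first tiling diffuse texture
uniform sampler2D m_DiffuseMap2;   // second tiling diffuse texture
uniform sampler2D m_AlphaMap;      // blend mask, stretched over the whole mesh
uniform float m_DiffuseScale;      // hypothetical: how often texture 1 repeats
uniform float m_DiffuseScale2;     // hypothetical: how often texture 2 repeats

varying vec2 texCoord;             // assumed to run 0..1 once over the mesh

void main() {
    // Multiplying the coordinates makes the diffuse textures tile.
    vec4 diffuse1 = texture2D(m_DiffuseMap,  texCoord * m_DiffuseScale);
    vec4 diffuse2 = texture2D(m_DiffuseMap2, texCoord * m_DiffuseScale2);

    // The alpha map uses the unscaled coordinates, so it is
    // stretched exactly once over the entire mesh.
    float mask = texture2D(m_AlphaMap, texCoord).r;

    gl_FragColor = mix(diffuse1, diffuse2, mask);
}
```

This only needs the single texCoord attribute the mesh already has; the tiling is done arithmetically in the shader instead of in the UV layout.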

  Could you please explain your idea?

I have the base texture which is repeated over the mesh.

I have an alpha texture which is not repeated.

I assume that the texCoord variable goes from 0.0 to 1.0 several times (once per repetition of the texture).

And as I understand it, I need texCoords for the alpha texture that go from 0.0 to 1.0 only once (over the entire mesh).

How can I scale texCoord to get coordinates for the alpha texture?

P.S. Or should I apply the alpha texture without tiling in Blender and scale the base texture coordinates in the shader by the number of times I want it repeated?
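Both directions of the scaling work; which one applies depends on how the mesh was exported. A sketch of each, assuming the tiling is uniform, starts at 0, and the repeat count is known (m_RepeatCount is a hypothetical parameter):

```glsl
uniform float m_RepeatCount; // hypothetical: how many times the base texture tiles
varying vec2 texCoord;

// Case 1 (the P.S. approach): texCoord runs 0..1 once over the mesh.
// Multiply to tile the base texture, sample the alpha map unscaled.
vec2 baseCoord  = texCoord * m_RepeatCount;
vec2 alphaCoord = texCoord;

// Case 2: texCoord was exported already tiled, running 0..N over the mesh.
// Divide to get coordinates that run 0..1 once, suitable for the alpha map.
// vec2 alphaCoord = texCoord / m_RepeatCount;
```

Case 1 is the simpler setup, since the UV layout in Blender stays a plain 0..1 unwrap and all tiling lives in the shader.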

I guess he means the P.S. approach.

Yeah, I have tested it. It works pretty nicely.

So right now, for a simple terrain, I can use Blender with the proportional editing tool to create a mesh and "multi-texture" it with alpha textures made using Blender's "vertex paint" tool.

Why isn't our shader integration API flexible enough to handle several texcoords?

Is it a technical issue, or has it just not been implemented?

Thank you.

Actually, there are very few cases in newer games where you need more than one set of texture coordinates; also, you need more texture units that way.

 Sorry, I did not get it.

 Do you mean that newer games use more texCoords compared to regular games?

 I also have another question.

  Is it possible to use per vertex color as a substitute for the alpha map to blend several textures for the terrain?

  What technique is considered standard for texturing terrains?

  It seems to be working…

  Right now I'm creating a terrain in Blender.
  In vertex paint mode I'm using the red, green, and blue colors as blending masks for textures 1, 2, and 3.
  It is pretty convenient to be able to see where you are painting the "texture masks".
  It's almost like a real terrain editor :)

  Right now I'm using modified versions of the Lighting.{j3md,vert,frag} shaders.
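A sketch of what the vertex-colour blending in such a modified fragment shader could look like — the uniform and varying names are illustrative, not the actual ones from Lighting.frag, and the vertex colour is assumed to be passed through from the vertex shader:

```glsl
uniform sampler2D m_DiffuseMap1;   // texture blended by the red channel
uniform sampler2D m_DiffuseMap2;   // texture blended by the green channel
uniform sampler2D m_DiffuseMap3;   // texture blended by the blue channel
uniform float m_TexScale;          // hypothetical tiling factor for all three

varying vec2 texCoord;
varying vec4 vertColor;            // painted in Blender's vertex paint mode

void main() {
    vec2 uv = texCoord * m_TexScale;
    vec4 tex1 = texture2D(m_DiffuseMap1, uv);
    vec4 tex2 = texture2D(m_DiffuseMap2, uv);
    vec4 tex3 = texture2D(m_DiffuseMap3, uv);

    // The painted red, green and blue channels act as blend weights
    // for textures 1, 2 and 3 (assuming they roughly sum to one).
    gl_FragColor = tex1 * vertColor.r
                 + tex2 * vertColor.g
                 + tex3 * vertColor.b;
}
```

This is the same splatting arithmetic as with an alpha-mask texture, except the weights are interpolated per vertex instead of sampled from a mask texture.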

  How is the actual jME3 terrain system (in terms of texturing) going to be implemented?


  Is it possible to use per vertex color as a substitute for the alpha map to blend several textures for the terrain?

Why not specify a mask texture and let the shader do the blending? Try not to think so much in terms of the fixed pipeline.


  A mask texture is fine. This is what I tried first.

  But in that case I have to use a separate application to create the texture mask.
  I can use GIMP to edit it, but it is not visual: you can't see where on the terrain you are actually painting the mask.

  But with vertex colour, I can use blender for everything and export only one OGRE mesh for the terrain.

  Now I'm thinking about dividing the mesh into blocks before adding it to the scene.
  That is a different approach compared to what we had in jME2, where the terrain had to be divided into blocks before loading it into the program.
  The advantages are:
      - the size of the block can be dynamic.
      - the terrain mesh itself can be edited in any mesh editor including Blender.

This is how it looks.

How about a special loader for an ogre.terrain.xml that automatically creates a texture out of the vertex colors in the XML?

Yep, that is very easy to implement.
And I feel like a texture as an alpha mask is better than vertex colors, but I can't say why.
Could somebody explain why vertex colors are bad as an alpha mask?

Right now I'm able to use the fracplanet application to generate really nice terrains. It exports the mesh as a Python script for Blender that actually includes vertex colors, and I can use it directly in jME3 as an OGRE mesh. The terrain looks great because it is generated using fractals.