Another Planet Renderer

@okelly4408 said: What does your erosion filter look like if it does not depend on neighboring chunks?

It does… that’s why I don’t use it. :slight_smile: Though it’s just a 3x3 convolution kernel as I recall.

I have similar filters in Mythruna for fixing up the terrain types as a post-pass (I don’t want one little block of sand in the middle of grass or one little block of grass in the middle of sand). That’s why I generate my terrain as 2 cells wider for terrain classification.

Regarding Voronoi, I’m really quite happy with the Perlin noise solutions so far. For different biomes, I will probably have multiple fractals and then blend between them based on some simpler noise (like the 1-factor turbulence above).
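
A minimal sketch of what I mean by that blending (not code from Mythruna; the three inputs are assumed to come from whatever noise functions you already have, in roughly the -1..1 range):

[java]
// Minimal sketch: blend a gentle "plains" height and a rough "mountain"
// height using a low-frequency selector noise value.
float blendedHeight( float plainsHeight, float mountainHeight, float biomeNoise ) {
    // Remap the selector to 0..1 and smooth the transition band (smoothstep).
    float blend = (biomeNoise + 1) * 0.5f;
    blend = blend * blend * (3 - 2 * blend);

    // Weighted mix: low selector = plains, high selector = mountains.
    return plainsHeight * (1 - blend) + mountainHeight * blend;
}
[/java]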

Note: one interesting trick for smoothing out the relatively flat+low areas while leaving the mountains jaggy is to run them through a parabolic function. Presuming your sea level is at 0, something like:

[java]
if( h > 0 ) {
    // Normalize to the 0..1 range
    float f = h / maxHeight;

    // Square it; the result is still 0..1
    f = f * f;

    // Scale back up: low areas get flattened, peaks keep their full height
    h = f * maxHeight;
}
[/java]

This flattens the low lands and exaggerates the differences in high lands a bit without actually changing the maximum height. I used this to make my terrain more dramatic for my low poly renders I posted a week or two back.

Hello all, I am kind of new to this community, but I have been working with LWJGL for a year to get this kind of thing. I actually first thought about a “cube-generated sphere”, but I found out you can get a much better one using a golden-rectangle-based icosahedron. I started making a seed-based random geography subdivision system, which uses a formula that subdivides each seed into a new one each time. I got this (yes, I messed up the vertex coloring)…

and here is the code (feel free to use it as you want, since this is basic math):
[java]
public static void drawgeodesic( Color col, float r, Position p, float sub, long se){
	// Width and height of the golden rectangles that span the icosahedron,
	// scaled by the sphere radius.
	float L = (float) Math.cos(Util.getGoldenAngle()) * r * 2;
	float H = (float) Math.sin(Util.getGoldenAngle()) * r * 2;
	float L2 = L/2;
	float H2 = H/2;
	Random ran = new Random(se);

	// The 12 icosahedron vertices, four per golden rectangle.
	point a1 = new point(p.x+L2, p.y+H2, p.z, col.red, col.green-20, col.blue, col.alpha);
	point a2 = new point(p.x-L2, p.y+H2, p.z, col.red, col.green, col.blue, col.alpha);
	point a3 = new point(p.x-L2, p.y-H2, p.z, col.red, col.green, col.blue, col.alpha);
	point a4 = new point(p.x+L2, p.y-H2, p.z, col.red, col.green, col.blue, col.alpha);

	point b1 = new point(p.x+H2, p.y, p.z+L2, col.red, col.green, col.blue, col.alpha);
	point b2 = new point(p.x+H2, p.y, p.z-L2, col.red, col.green, col.blue, col.alpha);
	point b3 = new point(p.x-H2, p.y, p.z-L2, col.red, col.green, col.blue, col.alpha);
	point b4 = new point(p.x-H2, p.y, p.z+L2, col.red, col.green, col.blue, col.alpha);

	point c1 = new point(p.x, p.y+L2, p.z+H2, col.red, col.green, col.blue, col.alpha);
	point c2 = new point(p.x, p.y-L2, p.z+H2, col.red, col.green, col.blue, col.alpha);
	point c3 = new point(p.x, p.y-L2, p.z-H2, col.red, col.green, col.blue, col.alpha);
	point c4 = new point(p.x, p.y+L2, p.z-H2, col.red, col.green, col.blue, col.alpha);
	long ra = ran.nextLong();
	
	subdiv.draw(a1, b1, c1, sub , p,r, se*ra*100);
	subdiv.draw(a1, b1, b2, sub , p,r, se*ra*100);
	subdiv.draw(a1, a2, c1, sub , p,r, se*ra*100);
	subdiv.draw(a1, a2, c4, sub , p,r, se*ra*100);
	subdiv.draw(a1, b2, c4, sub , p,r, se*ra*100);
	subdiv.draw(a2, c1, b4, sub , p,r, se*ra*100);
	subdiv.draw(a2, c4, b3, sub , p,r, se*ra*100);
	subdiv.draw(a2, b4, b3, sub , p,r, se*ra*100);
	subdiv.draw(a3, b4, b3, sub , p,r, se*ra*100);
	subdiv.draw(a3, c3, b3, sub , p,r, se*ra*100);
	subdiv.draw(a3, c2, b4, sub , p,r, se*ra*100);
	subdiv.draw(a3, c3, a4, sub , p,r, se*ra*100);
	subdiv.draw(a3, c2, a4, sub , p,r, se*ra*100);
	subdiv.draw(a4, b1, b2, sub , p,r, se*ra*100);
	subdiv.draw(a4, b2, c3, sub , p,r, se*ra*100);
	subdiv.draw(a4, c2, b1, sub , p,r, se*ra*100);
	subdiv.draw(b1, c1, c2, sub , p,r, se*ra*100);
	subdiv.draw(b2, c3, c4, sub , p,r, se*ra*100);
	subdiv.draw(b4, c1, c2, sub , p,r, se*ra*100);
	subdiv.draw(b3, c4, c3, sub , p,r, se*ra*100);
}

public static void draw(point a, point b, point c, float s, Position p, float r, long se){
	sub2(a, b, c, s, p, r, se);
}

public static void sub(point a,point b,point c, float s, Position p,float r){
	
	point d = Util.getmiddlesphere(a, b, p, r);
	point e = Util.getmiddlesphere(b, c, p, r);
	point f = Util.getmiddlesphere(c, a, p, r);
	if(s-1>0){
	sub(f, e, d, s-1,p,r);
	sub(a, d, f, s-1,p,r);
	sub(b, e, d, s-1,p,r);
	sub(c, e, f, s-1,p,r);
	}else{
		
		Line.draw(d, f);
		Line.draw(f, e);
		Line.draw(e, d);
		Line.draw(c, f);
		Line.draw(b, e);
		Line.draw(c, e);
		Line.draw(a, d);
		Line.draw(b, d);
		Line.draw(a, f);
	}
}
public static void sub2(point a,point b,point c, float s, Position p,float r, long se){
	Random ran = new Random(se);
	point g=Util.getmiddle(a, b);
	point h=Util.getmiddle(b, c);
	point i=Util.getmiddle(c, a);
	Long ra = ran.nextLong();
	Long ar = ran.nextLong();
	point d = Util.getmiddleirregularsphere(a, b, p, r, (long) (ra+(g.x+g.y+g.z)*100));
	point e = Util.getmiddleirregularsphere(b, c, p, r, (long) (ra+(h.x+h.y+h.z)*100));
	point f = Util.getmiddleirregularsphere(c, a, p, r, (long) (ra+(i.x+i.y+i.z)*100));
	if(s-1>0){
	sub2(f, e, d, s-1,p,r,ar*100);
	sub2(a, d, f, s-1,p,r,ar*100);
	sub2(b, e, d, s-1,p,r,ar*100);
	sub2(c, e, f, s-1,p,r,ar*100);
	}else{
		
		Line.draw(d, f);
		Line.draw(f, e);
		Line.draw(e, d);
		Line.draw(c, f);
		Line.draw(b, e);
		Line.draw(c, e);
		Line.draw(a, d);
		Line.draw(b, d);
		Line.draw(a, f);
	}
}

public static float getGoldenAngle(){

	final float a = (float) Math.atan(1.5/Math.cos(Math.PI/8));
	return a;

}
public static float getGoldenAngle2(){
	
	final float a = (float) ((Math.PI-2*(Math.PI/2-getGoldenAngle()))/2);
	return a;

}[/java] 

and to load it use this method:
[java]IcoSphere.drawgeodesic(new Color(0, 1, 1, 1f), 2,new Position(0, 0, 0),0,252);[/java]

By the way, the trick is to get each side of the triangle to be the same as the others, and that’s a bit complicated but possible. After that, there is a way to do mountain-like things with a seeded, formula-based sine/cosine plane wrapped around the sphere (180 high and 360 wide; the top edge must all be the same, likewise the bottom edge, and the two side edges must match each other).

I will try to get that to work in my brand-new jMonkeyEngine project.

Hope I helped you.

PS: I think the icosahedron-based approach is the most efficient, since all geometry in Java is triangle-based anyway, so it gives a much smoother rendering.
PPS: for textures you might want to make them depend on the terrain variation / distance from the center, so you get stone textures on the slopes, sand on the beaches, and other cool stuff.
PPPS: what you already did is awesome!

@toolforger said: Unless procedural generation exists twice, once as Java and once as shader. Though that would be hard to do because round-off errors would be different.

Actually, you can: you have to take the fractal generator, get the triangles, create the normals and fuse them with those of the neighbor triangles, so that the light and the terrain can be rendered with the exact same renderer:

Just to explain it better:
a vertex that is shared by multiple triangles has to get the normals of all attached triangles added together, and the result then brought back to length 1, like this:

Warning: you have to bring all the normals to a length of 1 before adding them to get this to work well.

This image shows, if you look closely, that if you add the blue vector and the red vector, and they have the same length, you get the vector in the middle between the two.

Here is the formula:
vector A + vector B + other vectors = new vector
new vector.setLength(1)
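
A minimal sketch of that with jME’s Vector3f (the vertex and triangle arrays here are hypothetical, not from my actual code):

[java]
// Minimal sketch: smooth per-vertex normals by summing the normals of all
// faces that share a vertex, then bringing the sum back to length 1.
// 'vertices' and 'triangles' (index triples) are hypothetical inputs.
public static Vector3f[] smoothNormals( Vector3f[] vertices, int[][] triangles ){
	Vector3f[] normals = new Vector3f[vertices.length];
	for( int i = 0; i < normals.length; i++ ) {
		normals[i] = new Vector3f(0, 0, 0);
	}

	for( int[] tri : triangles ) {
		Vector3f a = vertices[tri[0]];
		Vector3f b = vertices[tri[1]];
		Vector3f c = vertices[tri[2]];

		// Face normal = cross product of two edges, brought to length 1
		// before it is added (that's the warning above).
		Vector3f face = b.subtract(a).cross(c.subtract(a)).normalize();

		// Accumulate the face normal on each of the three shared vertices.
		normals[tri[0]].addLocal(face);
		normals[tri[1]].addLocal(face);
		normals[tri[2]].addLocal(face);
	}

	// setLength(1) on the sum, i.e. normalize it.
	for( Vector3f n : normals ) {
		n.normalizeLocal();
	}
	return normals;
}
[/java]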

Adding length-1 vectors and then setting the sum to length 1 is exactly the same as simply taking the arithmetic mean of all of them. You don’t even need to do setLength (which involves an expensive square root), simply divide by N after summing up N vectors…

What I do not see is how you’re avoiding problems with duplicated code in Java and GLSL.

@toolforger said:

What I do not see is how you’re avoiding problems with duplicated code in Java and GLSL.

Huh? There is no duplicated code; each of these repetitions of the formula is one of the 20 faces of the icosahedron… btw the second subdivide method is not used, sorry for posting that. I actually had problems and tried to solve them by making a clone of the method, but I found out it was another problem, fixed it, and forgot to delete the clone. If you want the formula for the normals, here it is:
[java]Vector3f.cross( otherVector3f ) [/java]

and yes, good idea for the division, thanks, gonna use that.

You’re doing terrain generation in the shader.
Which means the Java side doesn’t know about the terrain, and has to regenerate it to have a heightmap (or whatever is best). Since the Java side can’t easily peek into the shader results, that means coding the same heightmap generation code in Java as well.
Unless you’re doing things very differently than I’m imagining, which is entirely possible :slight_smile:

Yep, the last part is right. My idea is:

- make an icosahedron-based sphere
- set a terrain variation based on longitude & latitude
- set a random variation so the terrain doesn’t look artificial
- get all generated vertices and set their normals
- apply shaders and textures to the triangles

Here the heightmap is based on a “3D function”. Example:

f(x, z) = 14·x·z

which the code reads as:
height = 14 × longitude × latitude

and then wraps it around the sphere.

Sadly this is a bad example, since there is a rule to follow if you want to use that kind of thing:

the function must meet these criteria:

--------------
#OOOOOOOOOOOO#
#OOOOOOOOOOOO#
#OOOOOOOOOOOO#
#OOOOOOOOOOOO#
==============

(this is how the sides of the function look: the format is 360 of longitude by 180 of latitude; ‘O’ is some weird geometrical stuff)

all ‘-’ must have the same height
all ‘#’ must be symmetrical to the opposite side (so it is seamless)
all ‘=’ must have the same height.

Then the computer reads this function and tells each point to lengthen its vector by the height (the Cartesian coordinates have to be converted to geographic ones first, so the computer knows how much height to apply).
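
Roughly, what I mean looks like this (just a sketch with a placeholder height function, not my real code; it assumes the point already lies on a sphere of radius r around the origin and exposes its position and color fields):

[java]
// Sketch: convert a Cartesian point on the sphere to longitude/latitude,
// evaluate a seamless height function there, and push the point outward
// along its own direction.
point displace( point v, float r ){
	// Cartesian -> geographic
	float lat = (float) Math.asin(v.y / r);      // -PI/2 .. PI/2
	float lon = (float) Math.atan2(v.z, v.x);    // -PI .. PI

	// Placeholder "3D function": periodic in longitude (so the 360° seam
	// matches) and fading to 0 at the poles (so all pole samples agree).
	float h = (float) (Math.sin(3 * lon) * Math.cos(2 * lat) * Math.cos(lat)) * 0.1f * r;

	// Lengthen the vector by h: scale the point away from the center.
	float scale = (r + h) / r;
	return new point(v.x * scale, v.y * scale, v.z * scale,
	                 v.red, v.green, v.blue, v.alpha);
}
[/java]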

If someone has a more efficient method to do the same, or knows things about “3D functions”, please tell me.

Edit:
I am also thinking about making a new kind of file format dedicated to function-based heightmaps. How would that work: each line defines a function on a specific region of the 180x360 plane that is applied to the sphere. This technique is seamless, since if 2 points overlap they will be merged, so the triangles are glued together and there is nothing to see other than a continuous terrain.

I think this could be the best thing to replace image-based heightmaps, since those are limited: functions are infinitely precise, can have an unlimited number of different combinations, and can be evaluated at any level of detail, so one file will never be outdated (even in 2000 years, when there are 20G displays and quantum computers with literally infinite calculation power, in the optimistic case). So an actual Java-based game really could have the code necessary for doing stuff like that; the only limit is the power of the machine…

I would be grateful if someone knows a procedural, Java-based fractal texture maker program (just to “inspire” myself a bit :D)

The issue with planet generators is that any irregularity in the mesh can mislead the generation algorithm into artifacts. E.g. if you have a generator that works well for squares, and apply it on a latitude-longitude mesh, you’ll find that the generated terrain will get squished in east-west direction as you go towards the poles, making the terrain more jagged.

If you use an icosahedron-based triangular mesh, this effect will be distributed around the 12 vertices (“poles”), so it may be less noticeable. The effect will still be there, however.
If your algorithm is good, the generated terrain will not have any bias around these special points, or along any of the principal axes of the grid. You should be able to perceive such irregularities if you let your program generate new terrains at a rate of, say, 60 per second so it’s a blur, and let the whole thing rotate: if you don’t see a change in the blur it’s uniform, if you see lines rotating you have an axis bias, and if you notice points those are likely the poles.

actually you might have misunderstood:
the function based hightmap and the random based irregularities are made so it is the same if it is the same position (sadly my code is only cartesian so it cannot rotate, ill poste the new code once i’v done it) so every point, independently of ant kind of formation (if applied in an already spherical environnement) should look same from far between the techniques but the points will be a bit ofset but becose the points are not on the same position, btw, if you apply the function hightmap on a sphere, you can set randomly where it starts or where it stops, the only thing is that you have to define where the equator and the 0 meridian are, but it cant (if the heightmap is good) get wrong becose it is simply independent: example:

Take a sheet of paper and wrap it around a cylinder: you will see there is no difference whether you start at one point or another; the paper will wrap around the same amount of the cylinder and the image will be the same (if there is one on the sheet). It’s the same for the sphere.

I think you’re talking about how to make adjacent mesh cells generate the same heights at the shared border.
As far as I understood it, you’re generating each vertex once per mesh area that’s adjacent to the vertex, then averaging the values.
This can work. However, it’s a bit of unnecessary work since you’re generating every mesh vertex as many times as there are adjacent mesh areas.
Also, averaging tends to cancel out deviations, so you’ll get a smoother landscape than you otherwise would. That’s not a problem if you do the averaging the same number of times for each mesh vertex, but at the poles you’ll do it more often and get a landscape that’s smoother around that area, simply because those height values were smoothed out more often than the ones at normal vertices. (In the case of an icosahedron and trigonal submeshes, five times instead of three times.) Or maybe you’re simply assuming the poles to have zero height, in which case they will have different distribution characteristics than ordinary mesh points no matter what :slight_smile:
Personally, I’d generate each point using the same RNG state, i.e. reinitialize the PRNG from whatever data goes into your calculation. For example, if midX = (mid1X + mid2X) / 2 + rnd(), reinitialize rnd() with mid1X xor mid2X. (Initializing random number generators is somewhat expensive, so I’d probably strike some compromise between “it’s random enough” and “it’s fast”.)
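
A minimal sketch of what I mean (the hashing helper and the offset scale are made up for illustration):

[java]
import java.util.Random;

// Derive the random midpoint offset only from the two endpoints, so any
// triangle sharing this edge computes exactly the same displacement.
public class EdgeNoise {

    public static float midpointOffset(float x1, float y1, float z1,
                                       float x2, float y2, float z2,
                                       float scale) {
        // Build a seed purely from the endpoint coordinates (order-independent
        // thanks to xor), then reinitialize the PRNG from it.
        long seed = hash(x1) ^ hash(y1) ^ hash(z1)
                  ^ hash(x2) ^ hash(y2) ^ hash(z2);
        Random rnd = new Random(seed);

        // Same edge -> same seed -> same offset, no matter who asks.
        return (rnd.nextFloat() - 0.5f) * scale;
    }

    private static long hash(float f) {
        return Float.floatToIntBits(f) * 0x9E3779B97F4A7C15L;
    }
}
[/java]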

Anyway. I’ve been talking about two points that are quite different:

  1. Make sure that the randomness is uniform. If you are not doing it right, you’ll get smoother (or rougher) areas near the large mesh boundaries than near the small ones. You’ll notice that as you zoom out or in, or if you do the continuous terrain generation thing I described.
  2. (Not relevant if you’re not running the terrain generation once on CPU in Java and once on GPU.) The terrain generation might result in different results (that’s hardware-dependent). You need to check whether the differences average out or accumulate; in the latter case, you need to nail down the generation process to happen on one side and one side only. If you have a multiplayer game and let each client generate its terrain independently, assuming that they must get to the same results, you’ll get differences between machines. One way to deal with inaccuracies is using integer arithmetic for everything that needs to be 100% accurate. (See the sketch below.)
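
A minimal sketch of the integer-arithmetic idea (the unit choice and names are arbitrary):

[java]
// Store heights as whole millimeters in a long instead of as a float, so
// every machine computes bit-identical results.
public final class FixedHeight {

    public static final long MM_PER_UNIT = 1000;   // 1 world unit = 1000 mm

    // Average two heights; pure integer math, identical on every JVM/CPU.
    public static long midpoint(long aMm, long bMm) {
        return (aMm + bMm) / 2;
    }

    // Convert to float only at the very end, for rendering.
    public static float toWorldUnits(long mm) {
        return mm / (float) MM_PER_UNIT;
    }
}
[/java]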
@toolforger said: I'd probably strike some compromise between "it's random enough" and "it's fast".)
yep, that is why config files / HUDs exist :D
@toolforger said:
  1. Make sure that the randomness is uniform. If you are not doing it right, you’ll get smoother (or rougher) areas near the large mesh boundaries than near the small ones. You’ll notice that as you zoom out or in, or if you do the continuous terrain generation thing I described.

In a perfect icosahedron-based sphere all triangles have the same boundaries :stuck_out_tongue: (unless that’s not what you mean), but yes, I agree.
@toolforger said:

If you have a multiplayer game and let each client generate its terrain independently, assuming that they must get to the same results, you’ll get differences between machines. One way to deal with inaccuracies is using integer arithmetic for everything that needs to be 100% accurate.

Seeds? This is why I use Java’s random generator: it can take seeds (I tried it without, and it gave me a beautiful unstable sun :P), and the seeds are computer-independent (unless Minecraft’s source code lies :P (it uses it for map generation… so…)).

@toolforger said:
  2. (Not relevant if you’re not running the terrain generation once on CPU in Java and once on GPU.) The terrain generation might result in different results (that’s hardware-dependent). You need to check whether the differences average out or accumulate; in the latter case, you need to nail down the generation process to happen on one side and one side only.

Huh? How? I don’t understand (yep, not studying computer science, nor math). How could it occur? How do I solve it? What should I do?

Java’s RNG is repeatable, yes.
It’s a linear congruential generator (LCG). That’s a time-honored one, good enough for proof-of-concept and even production code (as Minecraft demonstrates). However, it is possible to get regularities with it, and if you see any regularities, you’ll never know whether it’s the RNG or something you’re doing wrong.
For a project where such things need to be done rigorously, I’d probably use a Marsaglia algorithm. It’s 7 operations instead of the 2 of an LCG, but it passes far more randomness tests (i.e. has fewer kinds of repeating patterns), and 7 64-bit operations is still pretty darn fast given the overall overhead.
(Mersenne is an often-recommended alternative, but it uses a large state space so reinitialization becomes slow, and it also stresses the CPU cache.)
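
For illustration, a minimal sketch of one of Marsaglia’s xorshift generators (the 13/7/17 shift triple is from his paper; treat this as a sketch, not production code):

[java]
// Marsaglia xorshift64: cheap to reseed, passes far more statistical tests
// than a plain LCG.
public final class XorShift64 {

    private long state;

    public XorShift64(long seed) {
        // State must never be zero for xorshift.
        this.state = (seed == 0) ? 0x9E3779B97F4A7C15L : seed;
    }

    public long nextLong() {
        long x = state;
        x ^= x << 13;   // the three shift/xor steps from Marsaglia's paper
        x ^= x >>> 7;
        x ^= x << 17;
        state = x;
        return x;
    }

    public float nextFloat() {
        // Use the top 24 bits for a float in [0, 1).
        return (nextLong() >>> 40) / (float) (1 << 24);
    }
}
[/java]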

Don’t worry too much about the CPU/GPU thing; I thought you were generating the terrain using OpenGL (i.e. inside a shader), but that was probably something I misunderstood.


Oh, thanks for the RNG, gonna implement it (players choose the seed in single player, but on servers it is chosen by the program). I saw there were some regularities in my sphere (sadly).
Actually I was only using basic OpenGL stuff, so there were no shaders, only the triangle strips. So only the CPU is used for generation, and the GPU only for rendering at the end.

Any chance to see the source?

I am not sure why one would think that an icosahedron approach to sphere creation is more accurate or more efficient than what I am doing. I take a cube, with each face of the cube acting as a quadtree. I then transform the position vectors to those of a sphere with the following code:

[java]
vec3 spherize( in vec3 v ){
	// denom1 = radius * radius * 2.0
	// denom2 = radius * radius * radius * radius * 3.0
	float x = v.x * sqrt(1.0 - (v.y*v.y)/denom1 - (v.z*v.z)/denom1 + ((v.y*v.y)*(v.z*v.z))/denom2);
	float y = v.y * sqrt(1.0 - (v.x*v.x)/denom1 - (v.z*v.z)/denom1 + ((v.x*v.x)*(v.z*v.z))/denom2);
	float z = v.z * sqrt(1.0 - (v.y*v.y)/denom1 - (v.x*v.x)/denom1 + ((v.y*v.y)*(v.x*v.x))/denom2);

	return vec3(x, y, z);
}

[/java]

The distortion is extremely minimal using this method and I am able to do LOD fairly easily…
I am also unsure why you would use a method that demands replicating GPU code on the CPU (or vice versa)… That seems like it would be very inconvenient and prone to errors. I generate all elevation data on the GPU and then just download the data to a buffer in a separate thread so that I can have access to the terrain data on the CPU…

Also, if you’re interested here is a video showing the current state of progress on the planet:
[video]Java Planet Renderer Update - YouTube


Well, the issue with GPU generation is that it’s somewhat difficult to get at height data.
You can’t place labels or buildings on a surface if you don’t know the terrain height after all.
Is there a way to get a heightmap out of the generated surface?

Icosahedron is a way to distribute the distortions across a larger area.
Actually it might be smarter to use a tetrahedron - any unhandled distortions would stick out more clearly.
But essentially the question is whether you handwave the distortions away (“not large enough to matter”) or want to split hairs over them (“the randomness MUST be independent of scale and rotation”). It’s essentially a matter of your project’s goals, which means it’s a matter of taste. (Personally, I’m towards the hair-splitting side of the spectrum, if only to make sure that any distortions I see later aren’t a mesh artifact.)

@okelly4408

Very cool! How’s the performance coming along?

I also used the same cube normalized into a sphere technique for jmeplanet.

It’s a shame we have at least 3 separate planet renderer projects now. We’d end up with something truly amazing if we worked together.

Yep, that’s true, and different people have different ways of doing things, but every project that does roughly the same thing has common parts, so we should share them. For example, I thought that a triangle vertex index could be useful for making seamless planets: before generating a new vertex, it checks the index with a key to see whether that vertex has already been generated. I am currently working on it, so no real example for now… but the rough idea looks like the sketch below.
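
A minimal sketch of that vertex index (made-up names, not my actual work-in-progress code):

[java]
import java.util.HashMap;
import java.util.Map;

// Shared-vertex index: quantize the position into a key and only create a
// vertex if that key is new, so neighboring triangles reuse the same vertex.
public class VertexIndex {

    private final Map<Long, Integer> keyToIndex = new HashMap<>();
    private int nextIndex = 0;

    // Returns the existing index for this position, or registers a new one.
    public int getOrCreate(float x, float y, float z) {
        long key = key(x) * 73856093L ^ key(y) * 19349663L ^ key(z) * 83492791L;
        Integer existing = keyToIndex.get(key);
        if (existing != null) {
            return existing;            // vertex already generated by a neighbor
        }
        keyToIndex.put(key, nextIndex);
        return nextIndex++;
    }

    // Quantize to avoid float round-off producing two keys for "the same" point.
    private static long key(float v) {
        return Math.round(v * 10000.0);
    }
}
[/java]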

@toolforger said: Well, the issue with GPU generation is that it's somewhat difficult to get at height data. You can't place labels or buildings on a surface if you don't know the terrain height after all. IS there a way to get a heightmap out of the generated surface?

Icosahedron is a way to distribute the distortions across a larger area.
Actually it might be smarter to use a tetraeder - any unhandled distortions will stick out better.
But essentially the question is whether you handwave the distortions away (“not large enough to matter”) or want to split hairs over them (“the randomness MUST be independent of scale and rotation”). It’s essentially a matter of your project’s goals, which means it’s a matter of taste. (Personally, I’m towards the hair-splitting side of the spectrum, if only to make sure that any distortions I see later aren’t a mesh artifact.)


Yes, there is a way to access the terrain data. You can simply download the framebuffer into CPU memory and read it there. I download this buffer in a separate thread and then look up the center pixel of the heightmap to offset the center of each quad, which is what my LOD system needs…
And distortion is not a huge problem for me. I have not seen anything that made me worry so far, and I have looked fairly hard.
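
Simplified, the readback looks something like this (a sketch, not my exact code; plain LWJGL calls here for illustration, with heights assumed to be in the red channel):

[java]
import java.nio.FloatBuffer;
import org.lwjgl.BufferUtils;
import org.lwjgl.opengl.GL11;
import org.lwjgl.opengl.GL30;

// Read the heightmap FBO back to the CPU and sample its center pixel.
public class HeightmapReadback {

    public static float readCenterHeight(int fboId, int width, int height) {
        // Bind the FBO that the height shader rendered into.
        GL30.glBindFramebuffer(GL30.GL_READ_FRAMEBUFFER, fboId);

        // Read back one float per pixel.
        FloatBuffer pixels = BufferUtils.createFloatBuffer(width * height);
        GL11.glReadPixels(0, 0, width, height, GL11.GL_RED, GL11.GL_FLOAT, pixels);

        GL30.glBindFramebuffer(GL30.GL_READ_FRAMEBUFFER, 0);

        // Center pixel = height used to offset the center of the quad.
        return pixels.get((height / 2) * width + width / 2);
    }
}
[/java]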

@aaronperkins said: @okelly4408

Very cool! How’s the performance coming along?

I also used the same cube normalized into a sphere technique for jmeplanet.

It’s a shame we have at least 3 separate planet renderer projects now. We’d end up with something truly amazing if we worked together.


The program is performing fairly well. Implementing frustum culling helped a lot.
And I wouldn’t say the cube is normalized into a sphere… the equation more or less wraps the cube around a sphere… I did a study a while back and found that simply normalizing the vectors gave a maximum distortion of around 35%, while the above method gave around 12% (I wrote a paper on it; I will have to reread it to see how I calculated those values).
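
I’d have to reread the paper for the exact methodology, but one simple way to compare the two mappings is to map a uniform grid from a cube face onto the sphere and measure how much the horizontal grid edges stretch or shrink, something like this sketch:

[java]
// Sketch only (not the methodology from my paper): compare plain
// normalization against the spherize mapping on one cube face.
public class DistortionCheck {

    public static void main(String[] args) {
        System.out.printf("normalize: %.1f%%%n", edgeVariation(false) * 100);
        System.out.printf("spherize : %.1f%%%n", edgeVariation(true) * 100);
    }

    // Ratio between the longest and shortest mapped grid edge, minus 1.
    static double edgeVariation(boolean useSpherize) {
        int n = 64;
        double min = Double.MAX_VALUE, max = 0;
        for (int j = 0; j <= n; j++) {
            for (int i = 0; i < n; i++) {
                double[] a = map(-1 + 2.0 * i / n,       -1 + 2.0 * j / n, useSpherize);
                double[] b = map(-1 + 2.0 * (i + 1) / n, -1 + 2.0 * j / n, useSpherize);
                double len = dist(a, b);
                min = Math.min(min, len);
                max = Math.max(max, len);
            }
        }
        return max / min - 1;
    }

    // Map a point (x, y) on the z = 1 cube face onto the unit sphere.
    static double[] map(double x, double y, boolean useSpherize) {
        double z = 1;
        if (useSpherize) {
            // Same formula as the shader above, for a unit radius.
            double sx = x * Math.sqrt(1 - y*y/2 - z*z/2 + y*y*z*z/3);
            double sy = y * Math.sqrt(1 - x*x/2 - z*z/2 + x*x*z*z/3);
            double sz = z * Math.sqrt(1 - y*y/2 - x*x/2 + y*y*x*x/3);
            return new double[]{ sx, sy, sz };
        }
        double len = Math.sqrt(x*x + y*y + z*z);
        return new double[]{ x/len, y/len, z/len };   // plain normalization
    }

    static double dist(double[] a, double[] b) {
        double dx = a[0]-b[0], dy = a[1]-b[1], dz = a[2]-b[2];
        return Math.sqrt(dx*dx + dy*dy + dz*dz);
    }
}
[/java]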

And I like to think of the different planet renderers as sort of a competition… a race, so to speak, to see who can make the best renderer.

While the projects are doing things differently, they will gather experience on what works how well.
Once they’re through with that, they’ll look for improvements, and they’ll look at the competitors and reimplement what those are doing - or adapt their code so that they can simply copy it. In the end, such projects tend to converge - unless there are fundamental differences, in which case it is okay to still have multiple projects, because the different projects will then appeal to different needs.