Far render

Hi everybody,



I’m currently working on a space simulation game. I have stars, with planets orbiting around them.

My problem is the following: I always need a populated space environment (i.e. you always see a lot of stars when in space, even more than you see from Earth). As an optimisation, I set the view frustum’s far plane to 1000, so that planets, asteroids etc. are only rendered when relatively “close” (we’re still in space, after all). So I imagined I could tell the engine to render any loaded star, whatever its distance, so as to keep the environment populated at all times. But I then face two problems:

  • I call
    [java]
    renderManager.renderGeometry(entity.getGeometry());
    [/java]
    on stars that I have checked to be beyond the view frustum, but they are still not rendered. (When they eventually appear, they are already “big” - still far away, but now inside the frustum - so they could clearly have been seen from farther out.)
  • Some of my randomly generated stars are “tiny” - and that’s intended - but they don’t show up very well; I was thinking of a particle system, but I suspect it would be too heavy; is there any way to achieve a better, yet still realistic, render of those?



Thanks in advance.

You are not supposed to call the renderManager this way…? Did you check the tutorials?

Hmm, which one do you mean in particular? I’ve been looking for something on rendering, even custom rendering, but wasn’t able to find anything that could solve my problem (or at least, I didn’t see how it could be used to solve it).

You should write your own shader based on the existing sky shader.

Hmm, I’m a bit lost here, though I know what a shader is. I mean, how am I supposed to use a shader for this? Should I flatten the rendering of far objects?

I think your particle idea isn’t that bad… A combination of what the particle system and the sky shader do would probably work.

Basically, if you write your own shader based on the existing skybox one, you will be able to add the stars in the shader itself. That will run much faster, as there are no extra objects cluttering up the scene and needing to be processed all the time, and it also solves all your “popping” problems.



The position of the stars could be random (easy) or even specified using a texture or similar passed in.



Take a look at the shader tutorials (and wezrule even did a bunch of video tutorials for them recently).
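
If you go that route, the Java side is simple; most of the work is in the shader. A rough, untested sketch (“MatDefs/StarSky.j3md” is just a placeholder name for the material definition you would write, starting from Common/MatDefs/Misc/Sky.j3md):

[java]
// Untested sketch: an inside-out sphere that uses your custom star-sky material.
// "MatDefs/StarSky.j3md" is a placeholder for a matdef derived from Common/MatDefs/Misc/Sky.j3md.
Sphere skyMesh = new Sphere(10, 10, 900f, false, true); // last flag = true: faces point inwards
Geometry starSky = new Geometry("starSky", skyMesh);
starSky.setMaterial(new Material(assetManager, "MatDefs/StarSky.j3md"));
starSky.setQueueBucket(RenderQueue.Bucket.Sky); // drawn before (i.e. behind) the rest of the scene
starSky.setCullHint(Spatial.CullHint.Never);    // never frustum-culled
rootNode.attachChild(starSky);
[/java]

In your vertex shader you can then drop the camera translation, like the stock Sky shader does, so the sphere always surrounds the viewer, and place the stars however you like.
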

I’ve implemented chunks, so as to have the biggest space possible (since I want to be able to travel through a randomly generated space, through star systems, through galaxies, and even through clusters and superclusters). Isn’t there a contradiction in using a skybox?

About the sky shader, I’ve been searching this particular topic a bit, and mainly found this:

http://hub.jmonkeyengine.org/groups/development-discussion-jme3/forum/topic/jme3-dynamic-skydome/

a thread that has been dead for a year.



So, if I understand correctly, I will have to take all my very far objects, render them onto a skybox, and only switch back to real objects when I get close to them (as if I detached them from the skybox)? I’ve been wondering about such a system, but I don’t see how to manage the mapping of positions from space to skybox and back.

You could also represent the stars with a point cloud of sprites and just readjust as the player moves from one “zone” to another.



No matter what, the key is that you will render far away objects at one scale and near objects at another. The mechanisms used may vary but that’s the general idea.

So I don’t add the farthest objects to the scene, but compute their positions to draw them on a skybox? But then how do I know when to load them?

If they are not close enough to be within the camera’s frustum… I would think you could use:


  1. A predefined texture map passed to a shader… a pixel per star for placement, with the pixel’s opacity defining the intensity of the light rendered by the shader.



    or


  2. (my preference) If your game is divided into sectors/zones/etc., calculate the placement of stars (3D vectors to 2D pixel placement) at load time per zone/sector and create your shader’s texture map to account for the player’s perspective.



    Anything far enough away to be rendered this way isn’t going to be affected by minor movement inside a given sector (rough sketch of what I mean below).
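
Very rough, untested sketch of option 2; Star, farStars, sectorCenter and intensityOf() are placeholders for your own data. At sector load time you bake the far stars into an equirectangular texture that the background shader (or a textured sky sphere) can sample:

[java]
// Untested sketch: bake far stars into an equirectangular RGBA texture at sector load time.
int w = 2048, h = 1024;
Image img = new Image(Image.Format.RGBA8, w, h, BufferUtils.createByteBuffer(w * h * 4));
ImageRaster raster = ImageRaster.create(img);

for (Star s : farStars) {
    Vector3f dir = s.position.subtract(sectorCenter).normalizeLocal();
    // direction -> longitude/latitude -> pixel coordinates
    float u = FastMath.atan2(dir.z, dir.x) / FastMath.TWO_PI + 0.5f;
    float v = FastMath.asin(dir.y) / FastMath.PI + 0.5f;
    int x = Math.min(w - 1, (int) (u * w));
    int y = Math.min(h - 1, (int) (v * h));
    // the pixel's opacity carries the star's brightness, as described above
    raster.setPixel(x, y, new ColorRGBA(1f, 1f, 1f, intensityOf(s)));
}

Texture2D starMap = new Texture2D(img);
// then pass starMap to your background material, e.g. material.setTexture("StarMap", starMap);
[/java]
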
@lightmax said:
So I don't add the farthest objects to the scene, but compute their positions to draw them on a skybox? But then how do I know when to load them?


Depending on the amount of data in the background (I'd suspect LOD will cull everything but the stars, right?), you could just draw it as a point cloud. No need to flatten it into a texture. Just put the point cloud at 1/1000th scale or whatever and make sure it's drawn first. When you move within the zone don't move the camera, move the scene. Move the local zones by a normal amount and the point cloud by 1/1000th the amount.

Use separate viewports if Z fighting becomes an issue... but really the point cloud can probably be drawn with no z write or z test.
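
Roughly what that looks like in code (untested sketch; “starMesh” is a placeholder for whatever point mesh you build):

[java]
// Untested sketch: the far star cloud is drawn in the Sky bucket (before the opaque scene)
// and ignores the depth buffer entirely, so it can't z-fight with the near chunks.
Material starMat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
starMat.setColor("Color", ColorRGBA.White);
starMat.getAdditionalRenderState().setDepthWrite(false); // no z write
starMat.getAdditionalRenderState().setDepthTest(false);  // no z test

Geometry starCloud = new Geometry("starCloud", starMesh); // starMesh = your point mesh
starCloud.setMaterial(starMat);
starCloud.setLocalScale(1f / 1000f);              // the background lives at 1/1000th scale
starCloud.setQueueBucket(RenderQueue.Bucket.Sky); // rendered first
starCloud.setCullHint(Spatial.CullHint.Never);
rootNode.attachChild(starCloud);
[/java]
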

Yeah, a common trick for space games is to put the player’s ship at 0,0,0 and then move everything else around them rather than the other way around.



(Attach the local world into a node, move that node).
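
In code, something like this untested sketch (localWorld and starCloud being your own nodes):

[java]
// Untested sketch: the ship/camera stays at the origin and the world moves around it.
// localWorld holds everything "near"; starCloud is the far point cloud mentioned above.
private Node localWorld = new Node("localWorld");

// call this instead of moving the camera by "delta" each frame
private void applyShipMovement(Vector3f delta) {
    localWorld.move(delta.negate());         // near scene: full amount
    starCloud.move(delta.mult(-1f / 1000f)); // far point cloud: 1/1000th of it
}
[/java]
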

Keep in mind the star meshes would still pop into view at the far viewing distance if you use a static background. That’s normally ok tho. The amount of programming work to get a full 360° dynamic impostor system working, with all the stitching along borders etc., would be extreme.



You might also wanna look into point anti-aliasing btw. That will become an issue.

It helps to blend the stars in with a shader once they are larger than 2 pixels: no aliasing effects, and no popping up.

@EmpirePhoenix said:
It helps to blend the stars in with a shader once they are larger than 2 pixels: no aliasing effects, and no popping up.

Ohh..
@zarch said:
Yeah, a common trick for space games is to put the player's ship at 0,0,0 and then move everything else around them rather than the other way around.

(Attach the local world into a node, move that node).


I implemented a chunk system, each chunk containing a part of the universe, whatever that happens to be (so it could simply be a planet or a part of a galaxy). When the camera leaves the central chunk (I've got a 3x3x3 chunk-divided 3D space), all the chunks and their content are moved, so that the camera is always in the central chunk.
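
Simplified, this is roughly what I do (untested sketch; CHUNK_SIZE and the "universe" node are my own names): when the camera wanders into a neighbouring chunk, the whole grid and the camera are shifted back by a whole number of chunks, so the camera always ends up in the central chunk again.

[java]
private static final float CHUNK_SIZE = 1000f;

private void recenterIfNeeded(Camera cam, Node universe) {
    Vector3f p = cam.getLocation();
    // which neighbouring chunk the camera has wandered into (0,0,0 = still central)
    int dx = (int) Math.floor(p.x / CHUNK_SIZE + 0.5f);
    int dy = (int) Math.floor(p.y / CHUNK_SIZE + 0.5f);
    int dz = (int) Math.floor(p.z / CHUNK_SIZE + 0.5f);
    if (dx != 0 || dy != 0 || dz != 0) {
        Vector3f shift = new Vector3f(dx, dy, dz).multLocal(CHUNK_SIZE);
        universe.move(shift.negate());      // slide the whole grid back...
        cam.setLocation(p.subtract(shift)); // ...and the camera with it, so nothing visibly jumps
        // then drop the chunks that fell off the far side and generate the new ones
    }
}
[/java]
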

@pspeed said:
Depending on the amount of data in the background (I'd suspect LOD will cull everything but the stars, right?), you could just draw it as a point cloud. No need to flatten it into a texture. Just put the point cloud at 1/1000th scale or whatever and make sure it's drawn first. When you move within the zone don't move the camera, move the scene. Move the local zones by a normal amount and the point cloud by 1/1000th the amount.

Use separate viewports if Z fighting becomes an issue... but really the point cloud can probably be drawn with no z write or z test.


So I only load my central chunk, and for the other chunks I use a point cloud, where the point size varies with the distance and size of the star? I've been searching, and there doesn't seem to be a primitive for a point cloud (and I don't know whether I should use a texture to draw them)? Furthermore, when reaching the limit of the central chunk, some stars won't show up before I cross it, even if they are very close, no?

@EmpirePhoenix said:
It helps to blend the stars in with a shader once they are larger than 2 pixels: no aliasing effects, and no popping up.


Using the same shader?


Actually, I'm getting a bit lost - with all those solutions, which one should I finally go with?
@lightmax said:
So I only load my central chunk, and for the other chunks I use a point cloud, where the point size varies with the distance and size of the star? I've been searching, and there doesn't seem to be a primitive for a point cloud (and I don't know whether I should use a texture to draw them)? Furthermore, when reaching the limit of the central chunk, some stars won't show up before I cross it, even if they are very close, no?


http://code.google.com/p/jmonkeyengine/source/browse/trunk/engine/src/test/jme3test/effect/TestPointSprite.java

The TestPointSprite class may help you see how a point cloud might be used with images. You'll have to dig a little. Mythruna's night sky star field is a point sprite cloud.
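
As a starting point, here is a rough, untested sketch of such a point-sprite star field (the exact Particle.j3md parameters may differ a little between versions, and the flash.png texture is just a placeholder for any small, soft glow image):

[java]
// Rough, untested sketch of a point-sprite star field: one Mesh in Points mode,
// with one position, color and size per star.
int count = 5000;
FloatBuffer positions = BufferUtils.createFloatBuffer(count * 3);
FloatBuffer colors = BufferUtils.createFloatBuffer(count * 4);
FloatBuffer sizes = BufferUtils.createFloatBuffer(count);
for (int i = 0; i < count; i++) {
    // random direction, pushed out to just inside the far plane
    Vector3f p = new Vector3f(FastMath.nextRandomFloat() - 0.5f,
                              FastMath.nextRandomFloat() - 0.5f,
                              FastMath.nextRandomFloat() - 0.5f).normalizeLocal().multLocal(900f);
    positions.put(p.x).put(p.y).put(p.z);
    colors.put(1f).put(1f).put(1f).put(1f);
    sizes.put(1f + FastMath.nextRandomFloat() * 3f);
}

Mesh mesh = new Mesh();
mesh.setMode(Mesh.Mode.Points);
mesh.setBuffer(VertexBuffer.Type.Position, 3, positions);
mesh.setBuffer(VertexBuffer.Type.Color, 4, colors);
mesh.setBuffer(VertexBuffer.Type.Size, 1, sizes);
mesh.updateBound();

Material mat = new Material(assetManager, "Common/MatDefs/Misc/Particle.j3md");
mat.setTexture("Texture", assetManager.loadTexture("Effects/Explosion/flash.png")); // placeholder texture
mat.setBoolean("PointSprite", true);
mat.setFloat("Quadratic", 0.25f);
mat.getAdditionalRenderState().setBlendMode(RenderState.BlendMode.AlphaAdditive);

Geometry starField = new Geometry("starField", mesh);
starField.setMaterial(mat);
rootNode.attachChild(starField);
[/java]
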

You would just have to make sure that your relatively near chunks cover enough space so that the transition works from point sprite to real star or whatever.

If you mean that no star should be at the border of my chunk, that will be hard, since I also intend to make space objects move (I want the game to span a universe’s lifetime). And stars can be spawned at a chunk border too.



Do I really need a shader if I use a point cloud, given that the point cloud would be computed at chunk loading time?

@lightmax said:
If you mean that no star should be at the border of my chunk, that will be hard, since I also intend to make space objects move (I want the game to span a universe's lifetime). And stars can be spawned at a chunk border too.

Do I really need a shader if I use a point cloud, given that the point cloud would be computed at chunk loading time?


No, I mean that your near border will be so far away that you will not notice the transition. Usually you have at least a 3x3 grid of "near" chunks so there is always at least one near chunk between you and the point cloud.

You do not need a special shader if you use a point cloud. These are sort of separate techniques.