Sky looks distorted when using SkyFactory.createSky with spheremap

Hi, when using the SkyFactory with a spheremap, the sky looks heavily distorted. I searched the tests for an example and found a similar issue in the TestEverything test. There too, the cathedral looks distorted.

I'll try to attach an image to show what I see on my screen.

Here is the view for the TestEverything test. As you can see, part of the sky is relatively recognizable, but the rest is heavily distorted.

Could you post a little bit of code? That would help in finding the error, and I’m interested in using that effect ^^ :smiley:

That test is part of jME; just look at the jME sources.

I can confirm that this test looks that way.

It always did. I don’t think it’s a bug, but rather the wrong type of map.

normen, do you know of a texture that would work? I tried different spheremaps and it always looks distorted.

That’s actually why I took one of the samples, expecting it to work and to be well understood.

If I did I would know it and not just think it ^^

point made :slight_smile:

So basically, at this point I see no proof of the spheremaps working in combination with the SkyFactory.

OK, it’s a bug. I tested with another spheremap where I noticed repeating patterns (I saw two suns in my world where there was only one in the spheremap).

In fact, I only posted a single screenshot of the distorted part before (which I call north; it is the default cam orientation). If I turn around in my world, I see some other distortions, but they are more subtle.

In the image, I labeled 4 screenshots: north, south, ceiling, and floor. If you look at the north screenshot, although it is difficult to spot, it actually repeats the scene. You can see the cathedral’s dome, the floor, and the monument repeated in there.

It seems that for the “north” we are looking at the outside of the sphere, while for the other parts we are looking at the inside of the sphere.

OK, more analysis.

The north hemisphere is actually the spheremap texture itself. The distortion is where the spheremap is black, or in other words, empty.

Actually, I found an issue in the sky shader.

You have this in the frag shader:

```glsl
vec3 dir = normalize(direction);
gl_FragColor = Optics_GetEnvColor(m_Texture, direction);
```

The dir variable is never used…

Then in Optics_GetEnvColor you have:

```glsl
// compute 1/2p
// NOTE: this simplification only works if dir is normalized.
float inv_two_p = 1.414 * sqrt(dzplus1);
```

The problem is that if I use the normalized direction, the distortion is worse…
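To make the mapping concrete, here is a small Python sketch of the sphere-map UV computation as I read it from the shader snippets above. The function names and call shape are my own; jME’s Optics_GetEnvColor does this on the GPU. It shows why the 1.414 shortcut only agrees with the full formula when the direction is unit length:

```python
import math

def sphere_coord(dx, dy, dz):
    """Sphere-map UV from a view direction, full formula:
    uv = dir.xy / (2 * sqrt(dx^2 + dy^2 + (dz+1)^2)) + 0.5."""
    dzplus1 = dz + 1.0
    two_p = 2.0 * math.sqrt(dx * dx + dy * dy + dzplus1 * dzplus1)
    return (dx / two_p + 0.5, dy / two_p + 0.5)

def sphere_coord_fast(dx, dy, dz):
    """The shader's shortcut (1.414 * sqrt(dz+1)), which is only
    equivalent to the full formula when (dx, dy, dz) is unit length."""
    two_p = 2.0 * 1.414 * math.sqrt(dz + 1.0)
    return (dx / two_p + 0.5, dy / two_p + 0.5)
```

For a normalized direction like (0, 0.6, 0.8) the two agree; feed the same direction scaled by 2, i.e. (0, 1.2, 1.6), and they diverge, which is exactly the kind of UV error that would repeat or smear the texture.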

@Momoko_Fan any input on this?

Damn, you just beat me to it. I tried the same thing.


```glsl
// NOTE: this simplification only works if dir is normalized.
float inv_two_p = 1.414 * sqrt(dzplus1);
//float inv_two_p = sqrt(dir.x * dir.x + dir.y * dir.y + dzplus1 * dzplus1);
```


What happens if you flip it back to the old “unoptimized” way?

I can’t really make that math work out to be the same… but I haven’t tried very hard and I’m sleepy.
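For what it’s worth, the math does line up once dir is unit length: dx² + dy² + (dz+1)² expands to (dx² + dy² + dz²) + 2·dz + 1, and the parenthesized part is exactly 1 for a normalized vector, so the whole thing is 2·(dz+1) and its square root is sqrt(2)·sqrt(dz+1) ≈ 1.414·sqrt(dz+1). A quick numeric spot check (my own sketch, not jME code; the function names are mine):

```python
import math
import random

def p_full(dx, dy, dz):
    # The "unoptimized" form from the commented-out shader line.
    dzplus1 = dz + 1.0
    return math.sqrt(dx * dx + dy * dy + dzplus1 * dzplus1)

def p_fast(dz):
    # The simplified form, valid only for unit-length dir.
    return math.sqrt(2.0) * math.sqrt(dz + 1.0)

# Spot-check on random unit vectors, skipping the dz == -1 pole
# where dz + 1 goes to zero and both forms degenerate.
random.seed(42)
for _ in range(1000):
    x, y, z = (random.uniform(-1.0, 1.0) for _ in range(3))
    n = math.sqrt(x * x + y * y + z * z)
    if n < 1e-6 or z / n < -0.99:
        continue
    dx, dy, dz = x / n, y / n, z / n
    assert abs(p_full(dx, dy, dz) - p_fast(dz)) < 1e-9
```

So the simplification itself is fine; the bug is that the fragment shader feeds the un-normalized direction into it.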

Different distortion towards the pole, similar to when using the normalized direction in the fragment shader (it looks exactly the same, but it’s difficult to tell, of course). That would be in line with your comment regarding normalization.

I’m new to shaders and everything, but I don’t get how normalizing the normal in the fragment shader could have any effect. The vertex shader already normalizes that parameter in Sky.vert. So it seems normalization does not carry over from the vertex shader? Does it need an out modifier for that parameter, perhaps? (I tried it; nothing changes.)

Why does normalizing the vector in the vertex shader not affect the fragment shader?

Never mind that last question; I’m guessing the answer will be interpolation.

Because from one vertex to another, the normals are interpolated linearly. They have to be renormalized to bring them back onto the unit sphere.

I hope that makes sense. That is my understanding anyway.
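A small numeric illustration of that point, assuming nothing jME-specific: the GPU interpolates each component of a varying linearly across the triangle, and a lerp of two unit vectors is shorter than unit length everywhere except at the endpoints, so the fragment shader has to renormalize:

```python
import math

def lerp(a, b, t):
    """Component-wise linear interpolation, which is what the GPU
    does to a varying between the vertex and fragment stages."""
    return tuple(av + (bv - av) * t for av, bv in zip(a, b))

def length(v):
    return math.sqrt(sum(c * c for c in v))

a = (1.0, 0.0, 0.0)   # unit vector out of the vertex shader
b = (0.0, 1.0, 0.0)   # unit vector out of the vertex shader
mid = lerp(a, b, 0.5)  # what the fragment shader receives halfway
print(length(mid))     # ≈ 0.707, no longer unit length

n = length(mid)
renorm = tuple(c / n for c in mid)
print(length(renorm))  # ≈ 1.0 after renormalizing in the fragment shader
```

That is why normalize() in the vertex shader alone is not enough: the interpolated value between vertices drifts inside the unit sphere and has to be normalized again per fragment.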