Rendering with multiple viewports and filters

See the part after that; you can see I was already guessing that he meant exactly that and only needed a “yes” to confirm. Since other games do the same (e.g. SSAO and shadows only for the closest objects), this might be something that others solve in a similar way.

The question with the modern z buffer is still interesting for me btw.
A 24-bit z buffer sounds like less than the 32-bit integer z buffer that I used to know.
Are those 24 bits already floating-point numbers, or some sort of logarithmically mapped integer (like the old hardware-supported z buffers, i.e. 32-bit integers mapped via a formula to get a logarithmic z distribution)?

The z buffer is in hardware. In my experience it has traditionally been 16 or 24 bits; the ability to have 32-bit z buffers is relatively new on my timeline.

This is a pretty good overview of what’s in a Z buffer:
https://www.sjbaker.org/steve/omniv/love_your_z_buffer.html
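
In a nutshell, the mapping that article describes is not logarithmic or floating point but a 1/z integer mapping: the stored N-bit value is proportional to a + b/z, so most of the precision piles up near the near plane. A quick sketch of that formula (just an illustration; the near/far values are placeholders):

```java
// Sketch of the value an N-bit integer z buffer stores for an eye-space distance z,
// using the a + b/z mapping described in the article above.
public class ZBufferValues {

    public static void main(String[] args) {
        double near = 1.0;
        double far = 10000.0;
        int bits = 24;

        double a = far / (far - near);
        double b = far * near / (near - far);
        double scale = Math.pow(2, bits);

        for (double z : new double[]{1, 2, 10, 100, 1000, 10000}) {
            long stored = Math.round(scale * (a + b / z));
            System.out.printf("eye-space z = %8.1f  ->  stored depth = %d%n", z, stored);
        }
        // Most of the 24-bit range is spent on distances close to the near plane,
        // which is why pushing the near plane out (or splitting the scene into
        // multiple viewports) helps so much against z-fighting.
    }
}
```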

I thought the idea went back to the TNT2, which means 1999. Afaik that was the first to unify the z buffer with the stencil buffer: 24 bit + 8 bit = 32 bit. That is enough for the z buffer, the stencil is easy to manage that way, and both can be cleared quickly at the same time. The Voodoo 3 did not have a stencil buffer but had a 32-bit depth buffer. The Riva128 might have had the option to use either of these.

I am not that sure a 32-bit z buffer is that useful, though. In my case, having large z values in the scene also means there is a large difference in the detail that can be displayed at that distance. And both for avoiding z-fighting and for managing far and close resources, it seems more convenient to organize the scene around multiple viewports and multiple z buffers… But my knowledge is still small in this area, so I might be wrong.

I agree with you. There are plenty of benefits to splitting a large (really large) scene this way… z-buffer optimization is just one of them.
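
For reference, the simplest form of that split in jME looks roughly like this (a rough sketch only, assuming a SimpleApplication; the frustum ranges and scene node names are placeholders, not from your app):

```java
import com.jme3.app.SimpleApplication;
import com.jme3.renderer.Camera;
import com.jme3.renderer.ViewPort;
import com.jme3.scene.Node;

// Minimal sketch of the near/far split: the default viewport renders the far
// range (rootNode), then a second main view renders the near range on top,
// clearing only the depth buffer so each range gets its own z precision.
public class SplitFrustumExample extends SimpleApplication {

    private final Node nearScene = new Node("nearScene");
    private Camera nearCam;

    public static void main(String[] args) {
        new SplitFrustumExample().start();
    }

    @Override
    public void simpleInitApp() {
        float aspect = (float) cam.getWidth() / cam.getHeight();

        // Default camera/viewport covers the distant range; far geometry goes in rootNode.
        cam.setFrustumPerspective(45f, aspect, 100f, 50000f);

        // Second viewport covers the close range and is drawn after the far one.
        nearCam = cam.clone();
        nearCam.setFrustumPerspective(45f, aspect, 0.1f, 100f);
        ViewPort nearView = renderManager.createMainView("nearView", nearCam);
        nearView.setClearFlags(false, true, false); // keep color, clear depth only
        nearView.attachScene(nearScene);
    }

    @Override
    public void simpleUpdate(float tpf) {
        // Keep the near camera glued to the main camera, and update the manually
        // attached scene (SimpleApplication only does this for rootNode).
        nearCam.setLocation(cam.getLocation());
        nearCam.setRotation(cam.getRotation());
        nearScene.updateLogicalState(tpf);
        nearScene.updateGeometricState();
    }
}
```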


I am not sure if I am getting closer, but there is some progress.

In a test application I was able to render two pre-views into two separate textures and blend them together in the main viewport with the ComposeFilter. There are still some issues, especially with the water filter.

I will try to apply this solution to my main app and see if this works.
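
For reference, the setup in my test app looks roughly like this, simplified here to a single pre-view; the texture format, frustum values, and node names are just placeholders:

```java
import com.jme3.app.SimpleApplication;
import com.jme3.math.ColorRGBA;
import com.jme3.post.FilterPostProcessor;
import com.jme3.post.filters.ComposeFilter;
import com.jme3.renderer.Camera;
import com.jme3.renderer.ViewPort;
import com.jme3.scene.Node;
import com.jme3.texture.FrameBuffer;
import com.jme3.texture.Image.Format;
import com.jme3.texture.Texture2D;

// Sketch: render the near range into an off-screen texture via a pre-view,
// let the main viewport draw the far range, then lay the near texture on top
// with a ComposeFilter (which blends based on the texture's alpha).
public class ComposeViewportsExample extends SimpleApplication {

    private final Node nearScene = new Node("nearScene");
    private Camera nearCam;

    public static void main(String[] args) {
        new ComposeViewportsExample().start();
    }

    @Override
    public void simpleInitApp() {
        int w = cam.getWidth();
        int h = cam.getHeight();
        float aspect = (float) w / h;

        // Main viewport covers the far range; far geometry goes in rootNode.
        cam.setFrustumPerspective(45f, aspect, 100f, 50000f);

        // Off-screen target for the near pass; RGBA8 so empty pixels keep alpha = 0.
        Texture2D nearTexture = new Texture2D(w, h, Format.RGBA8);
        FrameBuffer nearBuffer = new FrameBuffer(w, h, 1);
        nearBuffer.setDepthBuffer(Format.Depth);
        nearBuffer.setColorTexture(nearTexture);

        nearCam = cam.clone();
        nearCam.setFrustumPerspective(45f, aspect, 0.1f, 100f);

        ViewPort nearView = renderManager.createPreView("nearView", nearCam);
        nearView.setClearFlags(true, true, true);
        nearView.setBackgroundColor(new ColorRGBA(0, 0, 0, 0)); // transparent background
        nearView.setOutputFrameBuffer(nearBuffer);
        nearView.attachScene(nearScene);

        // Compose the near texture over the main (far) viewport.
        FilterPostProcessor fpp = new FilterPostProcessor(assetManager);
        fpp.addFilter(new ComposeFilter(nearTexture));
        viewPort.addProcessor(fpp);
    }

    @Override
    public void simpleUpdate(float tpf) {
        // The pre-view scene is not managed by SimpleApplication, so update it here.
        nearCam.setLocation(cam.getLocation());
        nearCam.setRotation(cam.getRotation());
        nearScene.updateLogicalState(tpf);
        nearScene.updateGeometricState();
    }
}
```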

The water filter is one of the filters that won’t work well with multiple viewports.
Since the water is infinite, it will render well beyond your “closest” viewport, and thus you will experience artifacts for objects at far distances.

Maybe we can try to use the old “projected grid water” from jME 2 that someone ported to jME 3. That looked sweet and might have some pros that the filter water doesn’t have. I never really liked the idea of using post-processing just to get a water surface anyway.

In my experience it looks like the water filter kills the alpha in the viewport, and also, I don’t think it should extend beyond the frustum, should it?

In the video I posted, the water is in the further frustum but it is visible in the closer one, and if I apply it the other way around, the big box is not visible at all (that is, there is no alpha in the texture applied as the second texture).

In that case the water filter needs to be modified to only write alpha where water exists… In the code, alpha is hardcoded to 1.0.
https://github.com/jMonkeyEngine/jmonkeyengine/blob/master/jme3-effects/src/main/resources/Common/MatDefs/Water/Water15.frag#L426

Wow, just wow. Thanks for the quick reply. I am not sure if I fully understand this :smiley: Shaders are beyond my level as of now. Is this a bug or is it intended this way? Maybe mine is a very atypical scenario.

Are there any JMETests that demonstrate the multiple-viewport near/far technique?