Rendering with multiple viewports and filters

Hi,

I am trying to solve z-buffer issues in my flight game, and from browsing the forum, the easiest solution looked to be using multiple viewports. I also found this:

https://bitbucket.org/aaronperkins/jmeplanet/src/bd5fff780f24e6fba090ec0f6435e9894e57b12f/src/jmeplanet/PlanetAppState.java?at=default&fileviewer=file-view-default

So, based on my findings I thought this was going to be trivial, and I set up 2 fullscreen main viewports with different frustums.
All worked well, up until I added post-processing filters. From that point on, the near view always overwrites the far view.

May I ask whether this should work at all? I have already put a few hours into this, based on all the code samples I found, but it doesn't work. If I set

   nearViewPort.setClearFlags(false, true, true);

I expect this to work, but the nearViewPort completely overwrites the background.
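For reference, here is a minimal sketch of the setup I mean, written inside SimpleApplication's simpleInitApp() - the camera names, the 500-unit split distance and the farRoot/nearRoot scene nodes are just placeholders for my actual code:

    float aspect = (float) cam.getWidth() / cam.getHeight();

    Camera farCam = cam.clone();
    farCam.setFrustumPerspective(45f, aspect, 500f, 1000000f);
    ViewPort farViewPort = renderManager.createMainView("FarView", farCam);
    farViewPort.setClearFlags(true, true, true);   // first pass clears everything
    farViewPort.attachScene(farRoot);

    Camera nearCam = cam.clone();
    nearCam.setFrustumPerspective(45f, aspect, 0.1f, 500f);
    ViewPort nearViewPort = renderManager.createMainView("NearView", nearCam);
    nearViewPort.setClearFlags(false, true, true); // keep colour, clear depth + stencil
    nearViewPort.attachScene(nearRoot);

    // Note: scenes attached to custom viewports are not updated by
    // SimpleApplication; farRoot/nearRoot need updateLogicalState() and
    // updateGeometricState() called every frame.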

I plan to continue on Tuesday by getting jmeplanet (the project linked above) working, and if that works, then I will assume the issue is in my code.

However, could anyone confirm that multiple fullscreen main viewports with multiple independent filter processors should work at all?

My other idea is that preViews are needed for this, and maybe I can combine the results in the colour buffer…

Hey,

the guys from this thread have some code for this situation: [Solved] Bloom Filter applied to near view port does not create transparent background - #21 by Ogli

I did not finish my little demo using their code yet, but you should be able to find your way using the code of @zissis and @Tryder - just download it from the other thread. Tryder's version has a bloom filter and filter post-processing. I did not test their code yet, which means that you have the honor of doing it yourself now… :chimpanzee_closedlaugh:

Be aware that multiple frustums reduce the framerate (in theory). Each new frustum should need one complete render pass - so using 4 frustums would reduce the framerate from 100 percent to around 25 percent. How big the effect is depends on how much the frustums show.

Happy coding,
:chimpanzee_smile:

Thanks for the advice. I took a look at the code, and I hope I can tailor it to my needs. I want to have something simple, yet performant.

What is interesting to me is that aaronperkins' code works with the same framebuffer and main viewports, and it is a very convenient and logical solution.

If I understand correctly, the framebuffer is the same for all viewports. If I clear the depth and stencil buffers and use a transparent background, I expected the effect to be applied correctly. That is: where the scene is inside the frustum, it overwrites the farther layer; where the scene is outside the frustum, it blends with the background colour.

For shadows I think this should be the case, and maybe even the FogFilter should work perfectly with this setup.

I will look at the solutions provided, but what I was planning to do, if the main views do not work, is this:
Create 2 preViews, blend them together into 1 texture, and render that to the colour buffer of the main view. Probably I am wrong.

Actually, I am using 4 right now in my game and saw no performance loss … I am on the 3.1 alpha 3 build. The reason there is no apparent performance loss is that nodes are not rendered on more than one viewport at any one time, due to culling. At first glance you would think you would get a drop in performance, but in reality you don't.

Here is a screenshot of my 3D navigation grid across a scene that spans over 10,000 units with ZERO z-fighting. My game is frame-locked at 60 fps, by the way, and this is running on a tablet with an integrated video card on shared memory.

That planet in the distance has a cloud layer that animates independently, which would give me MASSIVE z-fighting without the multiple viewports.

Bloom is on for all viewports by the way.


Yes, my theory was wrong.
It would only reduce the framerate if you “stack” new frustums behind one another.
If you “split” one frustum into 4, then it should give ~ the same frames per second.


zissis:

Do you apply your filters through a FilterPostProcessor? What I understand from Tryder's post is that the FilterPostProcessor won't be friendly with multiple overlapping viewports.

And not just that: if I understand this correctly, a FilterPostProcessor will create a new framebuffer for its viewport, so any framebuffer assigned to a viewport should be handled cautiously.

I am still far away from a solution unfortunately, but trying :slight_smile:

FilterPostProcessor changes the target framebuffer for the ViewPort it is attached to. Hence you will need to apply that same framebuffer to all other viewports you’re using. Also remember that any filters that rely on depth will not work, because you’re clearing the depth buffer to allow a higher depth range.
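Something like this in the update loop should keep them in sync - a sketch only, assuming the FilterPostProcessor is attached to nearViewPort and the far viewport is rendered first; since the FPP creates its internal framebuffer only once rendering starts, the check runs every frame:

    @Override
    public void simpleUpdate(float tpf) {
        // Once the FPP has initialized, nearViewPort's output framebuffer
        // is the FPP's internal buffer; point the far view at it too.
        FrameBuffer fppBuffer = nearViewPort.getOutputFrameBuffer();
        if (farViewPort.getOutputFrameBuffer() != fppBuffer) {
            farViewPort.setOutputFrameBuffer(fppBuffer);
        }
    }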


Important hint. I think ShadowFilter, SSAO and Depth of Field might be among the candidates.

Could this be done somehow?:
1st render the far-away view frustum with post-processing
2nd render the middle view frustum with post-processing
3rd render the closest view frustum with post-processing
It would reduce performance quite a lot, but would it work?

Or could this be done?:
Render depth to a 16-bit or 32-bit buffer
(all view frustums share it, or hand it over to the next one)
Since z-buffers are mostly 8 bit, we need to use a trick to get that many bits
Use that custom buffer for custom variants of the ShadowFilter, SSAO, DoF

Maybe you have more ideas (besides not using depth-dependent filters)?

To be honest, I’m not that much into modern 3D rendering at the moment.
Got quite enough to do with all the other stuff - graphics is quite low on my todo list.

Thanks for the feedback, Momoko_Fan.

What I was thinking about is using separate preViews with separate framebuffers/FilterPostProcessors and combining their results in the main viewport. There is only one important requirement for this to work: the FilterPostProcessor must keep the background alpha, so I can combine the colour buffers into one colour buffer in the main view. I hope this will work, but I am still in the phase of figuring out how to implement it.
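Roughly what I have in mind, as a sketch only - farRoot and farFpp are placeholders, there would be a second preView just like it, and the open part is still how exactly to composite the textures in the main view:

    // An off-screen preView for the far scene, with its own processor.
    Camera farCam = cam.clone();
    ViewPort farView = renderManager.createPreView("FarView", farCam);
    farView.setClearFlags(true, true, true);
    farView.setBackgroundColor(ColorRGBA.BlackNoAlpha); // alpha = 0 where nothing is drawn
    farView.attachScene(farRoot);

    // Render into a texture instead of the screen.
    Texture2D farColor = new Texture2D(cam.getWidth(), cam.getHeight(), Image.Format.RGBA8);
    FrameBuffer farBuffer = new FrameBuffer(cam.getWidth(), cam.getHeight(), 1);
    farBuffer.setDepthBuffer(Image.Format.Depth);
    farBuffer.setColorTexture(farColor);
    farView.setOutputFrameBuffer(farBuffer);
    farView.addProcessor(farFpp); // this preView's own FilterPostProcessor

    // The result could then be shown as a full-screen quad, blended by
    // its alpha channel - where to hook this into the main view is the
    // part I still have to figure out.
    Picture farPicture = new Picture("FarComposite");
    farPicture.setTexture(assetManager, farColor, true);
    farPicture.setWidth(cam.getWidth());
    farPicture.setHeight(cam.getHeight());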

So you apply post-processing three times instead of once. It could work, but all filters will need to start writing correct alpha so that you can blend the results from the 3 post-processed scenes. I am not sure if it is worth the trouble.

You mean just use a 32-bit (floating point) depth buffer? Yeah, you could do that. By default jME uses a 24-bit depth buffer.
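Setting one up for an off-screen framebuffer would look something like this (sizes illustrative; the formats come from com.jme3.texture.Image.Format):

    FrameBuffer fb = new FrameBuffer(width, height, 1);
    fb.setColorTexture(new Texture2D(width, height, Image.Format.RGBA8));
    fb.setDepthTexture(new Texture2D(width, height, Image.Format.Depth32F));
    viewPort.setOutputFrameBuffer(fb);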

All 3 filters that you listed could live without depth information. That means you would see shadows / SSAO only on the parts of the scene which are closest, and DoF could only focus on that part of the scene. Not a bad tradeoff in my opinion.

Guys, @Tryder and I solved this months ago. The file you are looking for is here → MultiViewportAppState.java - Google Drive

zissis, I looked at Tryder's solution before, but I am not yet convinced that it's going to work for me, and I still need to find the details in that code that I can reuse.

If I understand correctly, Tryder's solution provides a custom implementation instead of the jME filters. I want to reuse the filters that already exist, especially because there are so many useful ones among them, and I would like a solution where I don't have to deal with the code of those components.

Maybe my jME knowledge is not good enough yet, but Tryder's solution does not look like it satisfies this requirement.

As for the bloom filter, you cannot use it on overlapping viewports out of the box, simply because of how it's designed. This is why @Tryder added his code to my multi-viewport app state. I had the exact same problem you are facing with my open-world, true-scale space game. I solved the z-fighting with my multi-viewport app state, and I solved my glow and bloom requirements when @Tryder added his code to it. I don't understand why you want to go through the pains we have already endured, especially when we released the solution … but it's your call.
By the way, our solution also gives you a FilterPostProcessor for every viewport, so using out-of-the-box post-processing filters with it is a no-brainer … except for one like bloom, which will never work on multiple viewports anyway.


my 2c…

  1. The majority of filters will only need to be applied to the closest viewport (e.g. SSAO will only really be visible in the near field).

  2. Make sure to leverage downsampling on the distant viewports; if the filter doesn't support it, add it - it's only a few lines of code (see the sketch below)…
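For example, BloomFilter already ships with that knob (farFpp standing in for the distant viewport's own processor):

    BloomFilter bloom = new BloomFilter(BloomFilter.GlowMode.Objects);
    bloom.setDownSamplingFactor(2f); // run the bloom passes at reduced resolution
    farFpp.addFilter(bloom);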


I am not sure we are talking about the same thing. My main issue is not the bloom filter; actually, I haven't tried that one yet. I have an issue with every filter, though what I originally wanted to use was the WaterFilter.

If I add any filter to the closest viewport, I cannot see any of my underlying viewports. It doesn't matter whether there is an individual filter post processor on each individual viewport; the last filter processor on the last view will create its own framebuffer and overwrite everything behind it. I tried FogFilter, DirectionalLightShadowFilter, and WaterFilter.

If I look at the FilterPostProcessor code, it looks like this is the intended behaviour, but I am not sure I have looked at every corner case in that code.

But maybe I am doing something very wrong in my code. Please give me some time to create a new, clean project and check this with a simpler example, and I will get back with the test results.

Did I really write 8 bit? Isn't there a 32-bit hardware integer z-buffer in graphics cards?
Or is that the past now, and can we select any buffer we want today? (Is it all “software z-buffer” and no “hardware z-buffer” anymore?)

Hm … strange, my OpenGL bible says that SSAO uses ray marching and needs depth information. Maybe there are other techniques for it.

What do you mean by “only on parts of the scene which are closest” - do you mean they use the depth values of the closest frustum, which was rendered last? Hm, interesting… Now I'd really like to try that (and I've seen videos from other games where shadows only appear on near objects but not on the far objects of a high-z-range scene).

:chimpanzee_confused:

Yeah, ray marching isn't used in the jME SSAO, or in mine, or in any other method I've come across (that is useful); it's more about multi-sampling the depth buffer to approximate “close” geometry.

He means that you’d do depth-based filters only on the nearest viewport because there is no reason to do them for the far viewports as the effects are unnoticeable.
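In code terms that amounts to something like this, with nearFpp as the nearest viewport's own processor and the filter choice purely illustrative:

    // Depth-dependent filters go only on the nearest viewport's FPP;
    // the far viewports' processors keep filters that do not read depth.
    nearFpp.addFilter(new SSAOFilter());
    nearFpp.addFilter(new DepthOfFieldFilter());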

Maybe I understand this wrong, but the behaviour I expected was that you should not have to care about these details:

  1. The far viewport writes to the colour buffer.
  2. You clear the depth and stencil buffers.
  3. The close viewport has a closer frustum: there is a large area where zbuffer = zmax → those pixels should be BlackNoAlpha, which is transparent. The colour buffer of the near viewport should mask the colour buffer of the far viewport nicely.

But this is not the case.
Actually, I have a suspicion which I need to check in the evening: the FilterPostProcessor uses a non-transparent dark grey background…

Hmmmm… I think I fundamentally misunderstood how BlackNoAlpha works in this case…