[SOLVED?] Filters on overlapping Viewports

Hello,

I’m making a program which is very UI heavy. It relies on multiple UI layers switching contextually to let the user work efficiently. One of these layers is dedicated to 3D controls for translation / rotation / scale, as depicted below (around 0:35 (sorry about the music)):

https://youtu.be/CPvVwW3rRAk?t=34

I’d like to be able to add filters (particularly bloom) to these controls. However, they are rendered on a separate ViewPort, which is then applied as a texture to a Quad, and any attempt to apply a bloom (or any other) filter to the ViewPort responsible for these controls seems to toss the buffers below it into an abyss (the handles do glow, but everything else turns black :cry:).

Is there a proper way to accomplish this without fighting the framework? Ultimately, I am looking to be able to do this type of per-ViewPort filtering fairly flexibly, as there are multiple layer types which demand different effects.

Thanks!

(Also on a somewhat off-topic note which I should probably make a separate thread for, the video above is of a fairly robust Inverse Kinematics system I’ve developed and am looking to either contribute to JME or make available as a plugin (I’m not sure what’s preferable for this sort of thing), and I’d greatly appreciate any advice or help with doing that intelligently).

I worked with multiple ViewPorts a while ago but changed direction (I think possibly due to issues like filters), so I probably don’t have an answer, but why do you use separate ViewPorts for this? Also, a screenshot of the problem might help others understand what the issue is. Seems like a cool project idea.

@rufsketch1
That vid looks pretty nice. Those handles are a great visualization of constraints.

I’m open to alternative approaches if there’s a better way. Really anything that will allow me to render different elements of a scene onto different targets, and let me selectively apply filters to some targets and not to others.

Right! So for example, I have some rotation handles which the user can hover the mouse over. Currently, mousing over an interactive element of that handle makes it change color to indicate what sort of interaction would occur.

What I would like, instead of the colors simply changing, is a glow effect over the hovered element, as manually doctored below.

But when I try to apply a glow filter to the handles ViewPort, what I actually get is this:

Everything behind the filtered ViewPort goes black :frowning:.

Thanks! The actual constraint type itself is a novelty of my own devising (I call them Kusudama constraints), and generalizes the concept of reach cones to allow for almost arbitrarily shaped, smoothly varying boundaries. The math is kinda neat.


I infer that the handles are in a PostView viewport. Why not render them in the default viewport?
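For reference, a PostView viewport in jME3 is created through the RenderManager. A minimal sketch of what the suggestion above would look like (the scene node name and clear-flag choices are my assumptions, not from the post):

```java
// Sketch: a post-view ViewPort that renders the handle scene after the main view.
// "handleNode" is a placeholder for whatever node holds the handles.
ViewPort handleView = renderManager.createPostView("Handle View", cam);
handleView.setClearFlags(false, true, false); // keep color, clear depth, keep stencil
handleView.attachScene(handleNode);
```

Clearing only the depth buffer is what lets the handles draw on top of the already-rendered main view without blacking it out.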

Why don’t you use a shader to “highlight” certain parts of your scene instead of filters? Are you after a highlighting effect or something similar?

Not exactly (I think? Is the GUINode considered PostView?).

My current setup is basically as follows:

  • I have 5 ViewPorts, each responsible for its own type of visualization. These are:

    1. HandleVP (displays manipulation handles for rotation / translation / scale [and a few other informational things situated in 3D space with weird transform rules for easy selection])
    2. Front_Inactive_AssetVP (displays all inactive 2D assets which are to be rendered in front of the currently selected 2D asset)
    3. ActiveAssetVP (displays all active / selected 2D assets, or their children)
    4. Back_Inactive_AssetVP (displays all inactive 2D assets which are to be rendered behind the currently selected 2D asset)
    5. SpaceVP (displays floor grids / bones / any 3D objects that don’t require any fancy transformation or occlusion tricks).

Each of these ViewPorts is rendered onto its own dedicated Quad, and each of these Quads is attached to the GUINode and rendered on top of one another such that the HandleVP is always the foremost quad. Depending on context, SpaceVP is either the furthest quad (so that the user’s view isn’t occluded by the 3D armature when they are focusing on 2D assets) or the second-to-foremost quad (so that the 2D assets don’t occlude the armature when the user is manipulating a rig). That’s a mouthful to describe, but basically I’m always in one of the two situations illustrated by the frames of the following schematic gif:
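The ViewPort-to-Quad wiring described above might look roughly like this in jME3 (a sketch; the names handleVP and handleScene and the 1280x720 size are illustrative assumptions):

```java
// Render a ViewPort offscreen into a texture, then show that texture
// on a quad attached to the GUI node.
Texture2D offTex = new Texture2D(1280, 720, Image.Format.RGBA8);
FrameBuffer offBuffer = new FrameBuffer(1280, 720, 1);
offBuffer.setDepthBuffer(Image.Format.Depth);
offBuffer.setColorTexture(offTex);

ViewPort handleVP = renderManager.createPreView("HandleVP", cam);
handleVP.setClearFlags(true, true, true);
handleVP.setBackgroundColor(new ColorRGBA(0, 0, 0, 0)); // transparent clear color
handleVP.setOutputFrameBuffer(offBuffer);
handleVP.attachScene(handleScene);

// The quad that displays this layer, stacked in the GUI node.
Geometry quad = new Geometry("HandleQuad", new Quad(1280, 720));
Material quadMat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
quadMat.setTexture("ColorMap", offTex);
quadMat.getAdditionalRenderState().setBlendMode(RenderState.BlendMode.Alpha);
quad.setMaterial(quadMat);
guiNode.attachChild(quad);
```

The transparent clear color plus alpha blending on the quad is what allows the layers beneath to show through wherever this layer draws nothing.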

What I would like to be able to do with regard to filters is something like this:

Essentially, I’d like some way to attach filters to particular ViewPort layers or groups of ViewPort layers so that the Asset Filters only affect the *AssetVP ViewPorts, and the Handle Filters only affect the HandleVP ViewPort, so that my handles can still glow when a user hovers over them, and the user can still see fog and blur and colorizing effects on the 2D assets, but the handles manipulating those assets do not get fogged or blurred.

However, what currently happens is either this:

or, if I resign myself to not having a glow effect on HandleVP, this:

Basically because I’m not sure how to manage this layering scheme in the default ViewPort.

Isn’t this what the BloomFilter is already doing?

Custom geometry comparator to give you layers? (Lemur has one you can use ‘out of the box’.)

I’m not entirely sure what the motivation for going to all of this trouble is, though… there may be some aspect we miss about the “why” in your explanation above.

So stepping back, what were the problems you were trying to solve if these were all in the same viewport?

Does this help? (I’ve transcribed the text, so please enable captions if my handwriting is too atrocious) (also, apologies for the terrible framerate, I can re-record if it isn’t sufficient to illustrate what’s going on)

I am currently using Custom Comparators to allow for manual ordering of 2D assets relative to other 2D assets, however, using Comparators for everything in the same ViewPort would create very difficult issues with transparency of focused vs unfocused Assets. I might be willing to tough that out if it will allow me to selectively apply filters (like Depth of Field Blur) on some objects and not on others, but I don’t really see how it could allow for that.
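For what it’s worth, the core of a layer-based comparator (the idea behind Lemur’s LayerComparator) is just an integer sort key per geometry. Here is a self-contained sketch with a stand-in Geo class instead of jME’s Geometry; in jME this would implement GeometryComparator, read the layer from the geometry’s userData, and be installed via viewPort.getQueue().setGeometryComparator(...):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Stand-in for jME's Geometry: just a name plus an integer layer.
class Geo {
    final String name;
    final int layer; // higher layer = sorted later = drawn in front

    Geo(String name, int layer) {
        this.name = name;
        this.layer = layer;
    }
}

// Sketch of the ordering logic a layer comparator applies. In jME this would
// implement com.jme3.renderer.queue.GeometryComparator, read the layer from
// geometry.getUserData("layer"), and be installed with
// viewPort.getQueue().setGeometryComparator(Bucket.Transparent, comparator).
public class LayerSort {
    static final Comparator<Geo> BY_LAYER = Comparator.comparingInt(g -> g.layer);

    // Returns names in draw order: lowest layer first (back), highest last (front).
    public static List<String> drawOrder(List<Geo> geos) {
        List<Geo> sorted = new ArrayList<>(geos);
        sorted.sort(BY_LAYER);
        List<String> names = new ArrayList<>();
        for (Geo g : sorted) {
            names.add(g.name);
        }
        return names;
    }
}
```

As the poster notes, though, this only controls draw order within one ViewPort; it says nothing about which geometries a post filter touches.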

Well, depth of field blur is for distant objects. It’s not supposed to have exceptions: blur is a function of distance. Post effects are composed after everything else, as the name suggests. So in my view, trying to alter that path is wrong.

If you want to focus on an object then do that. Animate closer to it. Or alter the focus distance.

In respect of your ordering techniques, disabling the depth buffer will render it over everything. So when a speech bubble becomes active, disable depth on that bubble alone and enable it on the rest.
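In jME terms, the depth trick described above is a one-liner on the material’s additional render state (bubbleGeom is a placeholder name for the active geometry):

```java
// Draw this geometry over everything by ignoring the depth buffer for it:
bubbleGeom.getMaterial().getAdditionalRenderState().setDepthTest(false);

// ...and restore normal occlusion when it becomes inactive:
bubbleGeom.getMaterial().getAdditionalRenderState().setDepthTest(true);
```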

What I would do in regard to “active speech bubbles in front and bones in front when they are active” is put the bubbles in a node that is offset from the center and imitates the rotation of the camera. So if the speech bubbles are active bring them -1 on the z axis (toward the camera) and rotate the parent with the camera. And when they aren’t active put them +1 on the z axis (away from the camera).

So as the camera rotates they are either always in front if active or always behind if inactive.

At least that’s how I see how to overcome your video. I’m not certain the video explains your situation entirely.

(Just to avoid possible confusion about my use case: those aren’t speech bubbles, they’re vector graphics with deformable textures. They are intended to serve as lines and fill shapes for drawing/animating cartoon characters [which can be posed by the armature], I just happened to write on them for the purposes of this demonstration).

To clarify, I do not want the active asset to render over everything, I only want it to render over every armature, and over every 2d asset which it is in front of. However, it should still render behind any 2d assets which are in front of it. In other words, with regard to the render ordering functionality, the behavior shown in the video is the desired behavior. The issue is more to do with an inability to selectively apply filters over this otherwise correct behavior.

Yeah, I guess it must have been confusing. Sorry about that. I’ll remake the video tomorrow using my voice instead of my bad handwriting.

The answer to that is you can’t apply a post processor selectively. It has no concept of a scene graph or objects, only texture lookups of various kinds.

Bloom can work if you apply a glow color to a material, which will give you the selective bloom.
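Concretely, selective bloom in jME3 means running the BloomFilter in GlowMode.Objects and giving a GlowColor only to the materials that should glow. A sketch (handleMat is a placeholder for the handle’s material):

```java
// In GlowMode.Objects, only materials that define a GlowColor will bloom.
FilterPostProcessor fpp = new FilterPostProcessor(assetManager);
fpp.addFilter(new BloomFilter(BloomFilter.GlowMode.Objects));
viewPort.addProcessor(fpp);

// This handle glows; everything without a GlowColor is left alone.
handleMat.setColor("GlowColor", ColorRGBA.Yellow);
```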

Since you’re using the GUI node, you could draw a quad the size of the GUI behind the selected item and use a blur shader on it so it appears as though the items behind are blurred.

At this point though I’d really need to see exactly what you’re after. Are you maybe imitating something you saw? Pictures say a thousand words. Maybe I just don’t get what you are trying to achieve. I just can’t “see” your vision. It might just be me. I can see you are trying :stuck_out_tongue:

I think that’s why he’s using layered viewports.

The issue is that the FilterPostProcessor (FPP) is causing the background to clear. There is no concept of chaining viewport FPPs in this way.

Sorry, I haven’t had a chance to make the new video yet (I can tonight if the following doesn’t turn out to be sufficient), but

I guess a very close approximation to this particular subdomain of problems I’m facing would be something like what’s available in Unreal 4’s editor, which manages to display stuff like this:

Here we have a depth of field blur on one layer for the scene, and clean, crisp interaction handles, object outlines, and bounding boxes on a separate layer for interactable and symbolic elements. The outlines are presumably generated with something akin to a bloom filter, and yet they are still occluded by elements in the scene, while also not interfering with the DoF filter applied to that scene. I want to be able to achieve basically that: a crisp, clean bloom or outline filter on the Handles in one layer / ViewPort, and a DoF blur on assets in a separate layer / ViewPort.

So in my case, and the image above, I think exceptions are definitely desirable. (And I imagine this extends to many games that want depth of field blur, but not on symbolic things like character outlines, occlusion outlines, or waymarkers, because symbolic things shouldn’t be subject to physical effects.) I agree that post effects should be the second-to-last step, but the last step should definitely be compositing. I don’t know that it’s sensible to prevent the user from compositing multiple post effects. Is this by design for some reason I’m not seeing, or is it more of a known problem?

Yup!

So, one thing I noticed is that even though the background goes black behind my transformation handles when I apply a bloom filter to them, the blackness seems to be restricted to that particular viewport / quad. So if I move the HandleVP quad to the left or right, I can still see the scene rendered where the HandleVP isn’t in the way. Similarly, if I change the Quad’s blend mode to something like BlendMode.AlphaAdditive, the scene behind it adds up over the black portions and shows through (though, it also adds to the non-black portions, making the glowing handles too bright).

This would imply the SpaceVP and *AssetVPs are getting rendered, but the HandleVP is just occluding them with its black background. It seems like it should be simple to have the BloomFilter give the HandleVP a clear, transparent background instead of a black one, but I can’t figure out where in the code the background is being set to opaque black. I think I’m very confused as to how exactly background clearing works. I tried reading through / mucking about with the code, but clearly my intuitions must be wrong, because if anything it has just left me more confused :sob:.

Is there any good documentation on multi-pass filters or multiple render targets in JME3? Maybe some advanced guide to designing my own multi-pass filters? I see something called ComposeFilter that I would expect to do what I want, but it specifically says that it’s important for the alpha value to be 0 in order for it to work… which apparently isn’t the case here?

They don’t look like post processing effects to me. They look like materials.

That yellow rim around the wineglass is post-processed, I’m pretty sure. I might be wrong and Unreal 4’s editor might be doing something trickier but it can definitely be done postprocess too, by the method described in this youtube tutorial.

I’ve included some screenshot examples from the tutorial below. Note particularly how the outline stays crisp and easily discernible despite the motion blur around everything else.

How would one accomplish this in JME?

I didn’t watch it all; I scanned through it, but from what I can see, he’s using a penumbra effect to detect edges on a shape using materials.

Having done a “utility” type project with JME (rather than gaming), and running into similar types of blockers, this kind of challenge is interesting to me. I didn’t watch that entire video but I can see your logic here.

In the technique in the video, it seems like they basically have one or more special ViewPorts which don’t get rendered directly, but which feed data into a post filter for a ViewPort which does get rendered?

I do suspect that, since JME is fundamentally targeted at gaming rather than 3D utilities, this may not be as straightforward as one would hope. I had to do some not-so-typical lower level shader work to get the features I needed.

Unfortunately I’m a little rusty with the JME internals so I probably can’t come up with many useful suggestions.

I managed to get it working by making a copy of the FilterPostProcessor class and tweaking it in the smallest way imaginable. Specifically, I changed this line

private Format fbFormat = Format.RGB111110F;

to

private Format fbFormat = Format.RGBA8;

(I think there was another line in the code where it was conditionally set to something else, and I changed it there too).

So basically everything was rendering with a black background because the FilterPostProcessor is set to draw on a FrameBuffer with no alpha channel. I don’t know why this is a private variable the user isn’t allowed to change or extend, or if changing it is going to come back to bite me. But so far the modification has been working EXTREMELY well.
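For later readers: newer jME3 releases appear to expose this setting directly on FilterPostProcessor, so the copy-and-edit may no longer be necessary. If your version has setFrameBufferFormat, the same fix would look roughly like this (a sketch; handleVP is a placeholder, and you should verify the method exists in your jME version):

```java
// Ask the FilterPostProcessor to render into an RGBA framebuffer so the
// cleared background keeps alpha = 0 instead of becoming opaque black.
FilterPostProcessor handleFpp = new FilterPostProcessor(assetManager);
handleFpp.setFrameBufferFormat(Image.Format.RGBA8); // instead of the default RGB111110F
handleFpp.addFilter(new BloomFilter(BloomFilter.GlowMode.Objects));
handleVP.addProcessor(handleFpp);
```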


Ha, that kind of thing (custom post filter “transparency”) is what first occurred to me, but I assumed it couldn’t be that simple because someone would have mentioned it. :sweat_smile:

Sometimes you have to override or use internals in ways that weren’t originally intended. But I don’t want to imagine what we’d have to do to work around a big engine like Unity when the capabilities aren’t quite there.
