I’m trying to make some shaders that apply transparent/translucent effects to some objects. I tried the filter approach (SimpleRefractionFilter applied to objects instead of the whole scene) and the SceneProcessor approach (SimpleRefraction) from the examples that come with the ShaderBlow project, but both systems have the same problem (with different effects): they both use the objects in front of the target object to produce their effect.
What I’m looking for is closer to what the “WaterFilter” does, but that one is just too complicated for me to understand right now, so I was wondering whether there is a simpler way to achieve what I’m after.
The water processor and filters work perfectly; what produces the unwanted effect is the refraction effects in the ShaderBlow project. The “filter” one (the one that uses a Filter instead of a SceneProcessor) uses the same “pattern” as here.
In that topic there is a heat haze example that has the same issue: it distorts any object that crosses “the line” between the camera and the distorting object, which is not realistic at all.
Ok, I tried the filter posted here, whose code is here. It’s true that this one only distorts the “red object’s” edges. The TestSimpleRefractionFilter_Objects test in the ShaderBlow project, however, has the problem I described. I don’t know whether it comes from the Java code or from the shaders themselves; could you point me to where I should look? (I compared the Java code, but I can’t see where they differ in a way that would explain the different behaviour.)
The shaders used by the filter (defined in its MatDef) are: frag, vert (for GLSL100) and: frag, vert (for GLSL150).
The shaders used by the refractor are frag and vert when the refraction technique is used (forced by the filter), and empty shaders when it is not forced.
You could probably fix that by doing a depth check before sampling for the distortion. There would be some stretching at the edges, but it would be minimal.
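To illustrate the idea, here is a minimal CPU-side sketch of that depth check, written as plain Java rather than GLSL so it can run standalone. This is not actual ShaderBlow or jME code; all names (`naiveSample`, `depthCheckedSample`, the 1-D “framebuffer”) are hypothetical. The point is: only use the offset (distorted) sample if the fragment found there lies behind the refracting surface; otherwise fall back to the undistorted sample, so foreground objects are never smeared.

```java
public class DepthCheckedDistortion {

    // Naive distortion: always samples the scene at the offset position,
    // so objects in front of the refractor get smeared too (the artifact
    // described above).
    static float naiveSample(float[] sceneColor, int x, int offset) {
        int sx = clamp(x + offset, 0, sceneColor.length - 1);
        return sceneColor[sx];
    }

    // Depth-checked distortion: accept the offset sample only if it is
    // BEHIND the refracting surface; otherwise keep the undistorted pixel.
    // Depth convention here: 0 = near the camera, 1 = far.
    static float depthCheckedSample(float[] sceneColor, float[] sceneDepth,
                                    float refractorDepth, int x, int offset) {
        int sx = clamp(x + offset, 0, sceneColor.length - 1);
        if (sceneDepth[sx] > refractorDepth) {
            return sceneColor[sx]; // sample is behind the glass: distort it
        }
        return sceneColor[x];      // foreground object: leave it untouched
    }

    static int clamp(int v, int lo, int hi) {
        return Math.max(lo, Math.min(hi, v));
    }

    public static void main(String[] args) {
        // 1-D "framebuffer": pixel 2 holds a foreground object (depth 0.2),
        // everything else is background (depth 0.9). Refractor sits at 0.5.
        float[] color = {0.1f, 0.1f, 1.0f, 0.1f, 0.1f};
        float[] depth = {0.9f, 0.9f, 0.2f, 0.9f, 0.9f};
        float refractorDepth = 0.5f;
        int offset = 1; // distortion shifts the lookup one pixel to the right

        // Naive sampling at pixel 1 grabs the foreground object's colour...
        System.out.println(naiveSample(color, 1, offset));
        // ...while the depth check keeps the undistorted background colour.
        System.out.println(depthCheckedSample(color, depth, refractorDepth, 1, offset));
    }
}
```

In an actual fragment shader this would translate to sampling the depth texture at the distorted UV and comparing it against the refractor’s depth before deciding which colour sample to use; the residual edge stretching the post above mentions comes from the fallback to the undistorted pixel along object silhouettes.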