Projective Texture Mapping

Update:
“Projective Texture Mapping” is now available as a jME3 SDK plugin. The source code has also moved to the contributions repository. Don’t expect too much of it. You’ll most likely want to create your own solution if you need something like that, but it might be a good starting point.

The plugin also contains a sample project template to demonstrate the features:

Update:

  • Fixed the RC2 incompatibility @haze pointed out (caused by jME3 r9956).
  • Fixed a model space vs. world space brainbug (thanks to @the_accidental and @nehon for finding it and providing a solution).

Snapshot: ProjectiveTextureMapping-2012-11-24.zip

Update:

  • “(Single)TextureProjectorRenderer” is default again. I said multi would be faster, but that was a mistake.
  • Some refactoring:
    — “ProjectiveTextureMapping1pass” -> “ProjectiveMultiTextureMapping”
    — “TextureProjectorRenderer1pass” -> “MultiTextureProjectorRenderer”
  • Z-fighting fix “setPolyOffset(-0.1f, -0.1f)” by default
  • Different CombineMethods in “MultiTextureProjectorRenderer”
  • Bugfixes
  • ul and li don’t work btw :smiley:

Snapshot: ProjectiveTextureMapping-2012-03-15.zip


I wrote a post processor which does Projective Texture Mapping. See also this paper by nVidia. It is related to the thread Projected Texture on this forum.

Features:

  • Can project an arbitrary number of textures onto scene geometry
  • You can choose which geometry shall be affected (GeometryList)
  • Perspective and parallel projection
  • Back projection and backside projection fixed
  • Fall-off distance and fall-off power to limit the projection

Video demonstration of latest version (note the rune is just projected onto one geometry):


Motivation:
Typical use cases of projected textures are the targeting circles and ground effects known from popular MMOs by BlizZzard and Buy-o-ware. Projected textures can also be used to simulate the light cone of a flashlight or torch.


The code including a test application can be checked out here:
http://code.google.com/p/survivor-jme/source/browse/#svn%2Ftrunk%2FProjectiveTextureMapping

Please note that this is work in progress. It might change frequently and it will surely have bugs.
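For orientation, here is a very rough wiring sketch. The constructor signatures and the getTextureProjectors() accessor below are illustrative guesses rather than the confirmed API; “TestProjectiveTextureMapping.java” in the repository shows the actual usage.

[java]
// Hypothetical wiring sketch: constructor signatures and getTextureProjectors()
// are assumptions; only the class names come from this thread.
Texture2D decal = (Texture2D) assetManager.loadTexture("Textures/rune.png"); // placeholder path
SimpleTextureProjector projector = new SimpleTextureProjector(decal);        // assumed constructor
TextureProjectorRenderer ptr = new TextureProjectorRenderer(assetManager);   // assumed constructor
ptr.getTextureProjectors().add(projector);                                   // assumed accessor
viewPort.addProcessor(ptr);                                                  // standard jME3 SceneProcessor registration
[/java]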

Future work:
I’m planning to write a shader that can do all in one pass like this:
[java]
uniform vec3 m_ProjectorLocation[NUM_PROJECTORS];
uniform mat4 m_ProjectorViewProjectionMatrix[NUM_PROJECTORS];

for (int i = 0; i < NUM_PROJECTORS; i++)
{
    vec3 pl = m_ProjectorLocation[i];
    mat4 pm = m_ProjectorViewProjectionMatrix[i];

    // process an arbitrary number of projectors in a single pass
}
[/java]
I don’t know if this makes sense, though. I can’t yet estimate performance bottlenecks.

I’d also love to see this in the GL1Renderer.


Can you try projecting over a hill to see the threshold where the normals become back-facing (with some front-facing ones too)? The per-pixel normal check might not be too necessary with normal maps; with parallax mapping it could be very noticeable, however.

Ok, so I tried the transforms; they didn’t seem to make any difference.



cosAngle = dot( inNormal, normalize((g_WorldMatrixInverse * vec4(m_ProjectorLocation,1.0) - inPosition));



nor:

vec4 worldPos = g_WorldMatrix * vec4(inPosition, 1.0);

cosAngle = dot(vec3(g_WorldMatrixInverseTranspose * vec4(inNormal, 0.0)), normalize(m_ProjectorLocation - worldPos.xyz));



Ok, so here’s projecting over a hill:





And here’s the same, but using the ProjectorDirection to calculate cosAngle:







In case it’s not clear, in the first picture it’s not projecting on any /wrong/ faces, but it’s also not on all of the /right/ faces.

In the second, it’s nearly there, but obviously not quite right because it’s using the direction of the centre of the camera and not accounting for the angle of the projection as you stray from the centre.

Looks like z-fighting. Disabling the depth test could solve the issue, but unfortunately it would have bad side effects.

I guess you could add a polygon offset to the projective material; try a polygon offset of 1, 1.
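Something like this on the projective material (the material variable is just a placeholder):

[java]
// Apply a small depth offset to the projected pass to avoid z-fighting
// with the underlying geometry.
projectiveMaterial.getAdditionalRenderState().setPolyOffset(1f, 1f);
[/java]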

I just spotted the missing bracket above:

cosAngle = dot( inNormal, normalize( (g_WorldMatrixInverse * vec4(m_ProjectorLocation,1.0)) - inPosition));



didn’t help either.



An offset of 1,1 makes nothing visible, -1,-1 has no effect (the default is 0.1, 0.1)



Time for me to stop this for today, if no one suggests anything by the time I get back to it tomorrow I’ll try get some example code up for people to play around with.

Hi @survivor



very nice work!!!



I want to create an animated ground effect, so I want to animate the projected texture. I think the way to animate it is to change the projected texture every half a second or so. I know that I can use the SimpleTextureProjector.setProjectiveTextureMap() method to change the texture.

But I’ve taken a look at @mifth’s simpleSprite shader. You can set a tiled texture in this shader and the shader will switch the tile to animate the texture.

Do you think it would be a good idea to mix the simpleSprite shader with projective texture mapping? I will try to do it, but I want to know whether it is possible or not. I mean, perhaps you’ve already tried something like that and it is not possible (and the only way to do it is using the setProjectiveTextureMap method).
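Something like this is what I mean, as a rough sketch (assuming a SimpleApplication; the frames array and the projector field are placeholders, only setProjectiveTextureMap() is from the plugin, and whether it takes a Texture2D is an assumption):

[java]
private Texture2D[] frames;   // pre-loaded animation frames (placeholder)
private int currentFrame = 0;
private float elapsed = 0f;

@Override
public void simpleUpdate(float tpf) {
    elapsed += tpf;
    if (elapsed >= 0.5f) {   // advance roughly every half second
        elapsed = 0f;
        currentFrame = (currentFrame + 1) % frames.length;
        projector.setProjectiveTextureMap(frames[currentFrame]); // swap the projected texture
    }
}
[/java]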



Thanks


excellent man, really nice work :slight_smile:



What is performance like in its current form?

Thanks very much for this contribution! I am sure many people will be happy to see this :slight_smile:

At the moment, a projected texture costs approximately 5% of the frame rate. I suppose it’s the render*() calls. I’m curious whether the one-pass shader can improve things. Unfortunately, holidays are over for me tomorrow. :confused:

This is really cool!



I am currently trying to apply it as a flashlight in a game I’m messing around with.



Is it possible in the current state to limit the projection distance, possibly with some kind of degradation?



And is it possible to mark objects as casting and receiving shadows of the projection? In effect, having geometry block parts of the projection for surfaces behind it?


You can use “projector.setTargetGeometryList(gl)” to limit the projector to that geometry. It might also be possible to limit the projection by “projCoord.w” in “ProjectiveTextureMapping.frag”, but that’s not yet implemented. Edit: Just implemented. Shadow-mapping-like occlusion is also not (yet?) implemented.
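For example (“OpaqueComparator” is just one possible comparator for the list, and “floor” stands for whatever Geometry should receive the projection):

[java]
// com.jme3.renderer.queue.GeometryList / OpaqueComparator
GeometryList targets = new GeometryList(new OpaqueComparator());
targets.add(floor);                       // only this geometry receives the projection
projector.setTargetGeometryList(targets);
[/java]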



@udoprog: Added support for fall off. Usage:

[java]
projector.setFallOffDistance(2.1f);
projector.setFallOffPower(4f);
[/java]



“TestProjectiveTextureMapping.java” demonstrates it.

I made some optimizations, especially to “TextureProjectorRenderer1pass” which can now handle an arbitrary number of projectors by using multiple passes with up to 8 projectors per pass (“1pass” is a bit misleading). Since it’s ~3.3x faster than “TextureProjectorRenderer”, “TestProjectiveTextureMapping1pass” has become the new default run target of the project.



Unfortunately, these decals will not work with displacement / parallax mapped surfaces. Does anyone have an idea how to handle that?

Wonderful! Wonderful!

Wondering how it would feel to play with this in the fully fledged editor of jME :slight_smile:

@survivor could you just run the displacement/parallax methods after the projective texturing?

No, since I need the displaced texture coordinates to do the lookup in the projective texture map. Normal mapping would be fine as it is, since it does not offset the diffuse texture anyway. But correct parallax of decals would require passing the offset texture coordinates to the projective texture shader somehow.



Here’s a video demonstrating the problem:

http://www.youtube.com/watch?v=8vhWf2xfSjg



Edit: I think the way to do this is a pre-pass that combines the decal / projective texture and the diffuse texture and replaces the diffuse texture of the original material. This way, one could even create decals that have normal and parallax maps themselves. This would enable bumped and parallaxed decals on bumped and parallaxed surfaces without altering the material / shader (it is altered on-the-fly). I don't know if it's doable and the performance might be horrible due to many texture switches, but the idea is interesting.

Edit2: Related stuff:
http://www.unitymagic.com/shop/en/unity-decal-framework/
@survivor said:

Edit: I think the way to do this is a pre-pass that combines the decal / projective texture and the diffuse texture and replaces the diffuse texture of the original material. This way, one could even create decals that have normal and parallax maps themselves. This would enable bumped and parallaxed decals on bumped and parallaxed surfaces without altering the material / shader (it is altered on-the-fly). I don't know if it's doable and the performance might be horrible due to many texture switches, but the idea is interesting.


@survivor That's kinda what I was getting at. Shaders definitely aren't my forte however.... I was assuming that if you projected the texture onto a flat surface then the later pass of the parallax shader would already have the appropriately colored frag information. The stuff that I have read isn't very clear on this.

@survivor, I added a link to your code, because it’s very vital for the community.



http://code.google.com/p/jme-glsl-shaders/



http://i.imgur.com/fy0oz.png

Thanks! I’m currently fixing bugs and doing some cleanup. I’ll let you know when I’m done and there’s an initial version that can be put into another repository. My Google Code repository will stay my playground for preliminary versions.

Update:

  • “(Single)TextureProjectorRenderer” is default again. I said multi would be faster, but that was a mistake.
  • Some refactoring:
    — “ProjectiveTextureMapping1pass” → “ProjectiveMultiTextureMapping”
    — “TextureProjectorRenderer1pass” → “MultiTextureProjectorRenderer”
  • Z-fighting fix “setPolyOffset(-0.1f, -0.1f)” by default
  • Different CombineMethods in “MultiTextureProjectorRenderer”
  • Bugfixes
  • ul and li don’t work btw :smiley:



Snapshot: ProjectiveTextureMapping-2012-03-15.zip

I’d like to discuss two things:


  1. How to be a “good guy” SceneProcessor that works together with others, because at the moment mine doesn’t. HDRRenderer, for example: if HDR is before PTM, there are no projective textures; if HDR is after PTM, there are completely black parts of geometry (where “gl_FragColor.a = 0.0” in the PTM shader); see the ordering sketch after this list. PSSM shows similar behavior in combination with HDR. My question: am I doing something wrong, or are SceneProcessors not meant to work together?


  2. How to combine multiple textures? MultiTextureProjectorRenderer enables advanced methods of combining multiple textures that aren’t possible if I render one texture per pass (standard TextureProjectorRenderer). But it would be possible if PTM was implemented as Filter that first renders / combines all textures to an offscreen buffer and then blends this combined texture into the main frame buffer. Filter also seems to be better designed for working together with others.
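For reference, the two orderings from point 1 look like this (“hdrRenderer” and “ptmRenderer” are placeholder variables; SceneProcessors run in the order they are added to the ViewPort):

[java]
viewPort.addProcessor(hdrRenderer); // HDR before PTM: no projective textures visible
viewPort.addProcessor(ptmRenderer);

// ...versus...

// viewPort.addProcessor(ptmRenderer); // PTM before HDR: black geometry where gl_FragColor.a = 0.0
// viewPort.addProcessor(hdrRenderer);
[/java]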



Perhaps someone (@nehon?) could review my code and let me know which things I could do better or what one wouldn’t do the way I do. Other than that, this snapshot should work.

Great! You are a real shader guru. The nehon guru will probably accept another guru :smiley: