Texture issue with multiple cameras/renderers

I'm working on an application with multiple views of the same scene displayed in separate windows using LWJGLCanvas.

I have a very basic 3D world working, with simple terrain and some boxes representing my actors. Views are generally from the perspective of the actors.

I had originally run into a problem with TerrainBlock and rendering the textures in the multiple windows. The problem was that I was trying to create one TextureState from a single renderer and use that in the other renderers. I worked through that by creating a wrapper around the state of scene elements that creates a separate scene graph for each renderer and creates separate TextureStates for each renderer.
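A minimal sketch of that wrapper idea, using plain-Java stand-ins (the `Renderer`, `TextureState`, and `SharedElementState` classes here are hypothetical names of my own, not the actual jME API):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-ins for the engine classes; illustrative only.
class Renderer {}

class TextureState {
    final Renderer owner;
    TextureState(Renderer owner) { this.owner = owner; }
}

// Wrapper that lazily creates one TextureState per renderer, so a state
// built for one GL context is never reused in another.
class SharedElementState {
    private final Map<Renderer, TextureState> states = new HashMap<>();

    TextureState stateFor(Renderer r) {
        return states.computeIfAbsent(r, TextureState::new);
    }
}
```

Asking twice for the same renderer returns the same cached state, while each distinct renderer gets its own.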

That worked great for TerrainBlocks and everything seemed fine. The problem is that when I now try to add a Skybox, I run into the same issue again.

I looked at the code for Skybox and saw that it was getting the Renderer through DisplaySystem calls, which I figured was the issue. So I hacked Skybox to allow me to set a specific renderer explicitly, but I am still seeing the issue.
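The shape of that hack, sketched with hypothetical class names (these are my stand-ins, not the actual jME source):

```java
class Renderer {}

// Stand-in for the global DisplaySystem lookup the original code used.
class DisplaySystemStub {
    static final Renderer GLOBAL_RENDERER = new Renderer();
}

// Hacked element: prefer an explicitly assigned renderer over the
// global one, falling back only when none was set.
class HackedSkybox {
    private Renderer renderer; // null means "use the global renderer"

    void setRenderer(Renderer r) { this.renderer = r; }

    Renderer rendererToUse() {
        return renderer != null ? renderer : DisplaySystemStub.GLOBAL_RENDERER;
    }
}
```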

So, next I noticed that TerrainBlock extends AreaClodMesh, while Skybox is basically a group of Quads, which extend TriMesh directly. I tried hacking Quad to extend AreaClodMesh (and removed the warnings that started firing off), which didn't make any difference (not really surprising).

I played around some more with a SkyDome class I found on the forums and got the same behavior.

Does anyone have advice or pointers to where I should look to figure out what is going wrong? I feel this should be possible since I can get it to work with TerrainBlock. Is something about AreaClodMesh or ClodMesh the key, or am I completely on the wrong track? Does anyone know specifically why this is occurring? I've tried tracing the code to see where the TextureState is being rendered and I'm struggling to follow it. Even just a pointer to where the image is rendered would help.



I did some more debugging and put an override of draw() in the SkyDome. The renderer being passed to draw() is always the same one that created the TextureState.
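The debugging override looked roughly like this (a hypothetical reconstruction with stand-in class names, not the actual SkyDome code; in the app the recorded value was a log statement):

```java
class Renderer {}

// Stand-in base class for the scene element being drawn.
class SceneElement {
    void draw(Renderer r) { /* normal rendering would happen here */ }
}

class DebugSkyDome extends SceneElement {
    Renderer lastRenderer; // records which renderer actually calls draw()

    @Override
    void draw(Renderer r) {
        lastRenderer = r;
        super.draw(r);
    }
}
```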

Any hints welcome.  Thanks.

:slight_smile: I solved it, sort of. The initial problem is related to the TextureManager. It was returning the same Texture for the two instances of the Image created from the same file. As a hack, I created two copies of the file and loaded the Textures from the two files. It seems that the ProceduralTextureGenerator's composed Image instances don't get the same key in the TextureManager. I think I just need to do something to make this the case for the sky image too.
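The cache behaviour I'm describing can be sketched like this (a plain-Java mock of a texture cache keyed by file path, not the actual TextureManager code):

```java
import java.util.HashMap;
import java.util.Map;

class Texture {}

// Mock of a texture cache keyed only by the source file: two loads of
// the same file hand back the same Texture instance, which is exactly
// wrong when the copies must live in different GL contexts.
class MockTextureCache {
    private final Map<String, Texture> cache = new HashMap<>();

    Texture load(String key) {
        return cache.computeIfAbsent(key, k -> new Texture());
    }
}
```

The two-file hack works because it forces two distinct cache keys, and so two distinct Texture instances.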

So now the second sky looks good but the first one has a new problem (where the first one used to look OK).  The first sky now appears green and grey most of the time (like the terrain, coincidentally) and flashes the proper colors.  It's kind of cool, kind of like a thunderstorm during a hurricane, but not really the desired effect.

I think this might have something to do with these Textures being enabled immediately. The only reason I suspect this is that the terrain textures were not enabled at first, and they seem to work fine.

Any help is appreciated and thanks to anyone who spent any time thinking about my first question.

More info:

Is it possible that the ZBufferState is doing something to disable the Textures?  I tried moving my little guys around to see if I could force it to be correct and at one point everything does look good.  When one of the boxes is off in the distance, in the view of the other one, both looking in the same direction, both skies are perfect.  But if I move the one in back up a little, the sky goes white (and I think is disabled).  Then if I drive that one in front of the other, it switches so that the other view no longer sees the sky but the one in front does.  I can at this point move them back and forth and reliably produce this switch.  The green sky is gone unless I turn them around to face each other.

Any thoughts?