Memory problem with TextureRenderer and MipMaps

Hey there, I am currently rendering 13 small scenes to textures once at startup. The texture size is 512x512 and everything works just fine, except that the resulting textures don't look good when applied to a quad, because MipMaps are not enabled.



When I set the MinificationFilter to Trilinear or BilinearNearestMipMap, only 3 to 6 of the rendered textures actually show any content; all the others are fully transparent. The funny thing is that it's always the first two scenes and the last scene that are rendered correctly, or the first three and the last three, and so on. The scenes in the middle are always the ones that go missing.
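For reference, here is roughly how each render target is set up (a simplified sketch along the lines of the jME 2 API; the actual scene contents and per-renderer camera setup are left out, and sceneNode just stands for a scene's root node):

  // One Texture2D plus one TextureRenderer per scene (simplified sketch).
  Texture2D tex = new Texture2D();
  tex.setMinificationFilter(Texture.MinificationFilter.Trilinear); // or BilinearNearestMipMap
  tex.setMagnificationFilter(Texture.MagnificationFilter.Bilinear);

  TextureRenderer tRenderer = DisplaySystem.getDisplaySystem()
      .createTextureRenderer(512, 512, TextureRenderer.Target.Texture2D);
  tRenderer.setBackgroundColor(new ColorRGBA(0f, 0f, 0f, 0f)); // transparent background
  tRenderer.setupTexture(tex);
  tRenderer.render(sceneNode, tex); // rendered once at startup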



I tried a lot of things, but only one helped: changing the texture size to 256x256, after which every scene gets rendered correctly. Is it possible that memory runs out and previously rendered textures get discarded, or something like that?



I really need the textures to be at least 512x512, so how can I achieve that? Can I compress textures that are drawn by a TextureRenderer? What else could be the problem?



Thanks in advance, guys.

Andy

How are you loading the textures? What is their format? What's your video card? Latest drivers?

  1. I'm not loading them; I create Texture2D objects and render to them with TextureRenderers (one for each texture)
  2. Standard Format (RenderToTextureType.RGBA)
  3. GeForce 7600
  4. Yes, latest drivers.



    This all works fine when I DON'T enable MipMapping for these textures.

I was wondering about memory too…



Perhaps an OpenGL profiler would help.

Just for the heck of it, did you try 1024x1024? I strongly doubt it's a memory issue… You could try seeing if anything shows up in the performance stats with jNVPerf.

Interesting, with 1024x1024 only 0 - 2 textures are rendered correctly. With 512x512 it was 3 - 6 images.



When I switched back to 512x512 again, suddenly all images were rendered correctly!  :? Still, I don't see this as the solution  :expressionless:



Furthermore, I noticed that some textures, when rendered incorrectly, show some noise, which appears to be random (it is noise, after all). The grey area in the image below is the background, and the incorrectly rendered texture is transparent apart from the noise.







Now I am really confused. I'll try to take a look at jNVPerf, but if you can think of a solution before I have to torture myself with that, I'd be glad.

Is this Windows? And does anybody else out there have a 7600 (or something from the same family)?

Ubuntu Linux

This is really odd… How long have you been using this card? This could be caused by faulty VRAM… Can you test it on any other machines?

I've seen this too, and it does seem to be related to card memory somehow. It only happens with RTT textures, though, probably because the card can't reload them after they've been swapped out, or something similar.

I'm having this same problem. I upgraded my graphics card as well; nothing changed.

Do both LWJGL and JOGL do the same thing?

basixs said:

Do both LWJGL and JOGL do the same thing?


I have no idea, and I guess I don't have the time to test it, since my game is quite mature and probably won't run on JOGL at all.

Well, one day and a computer reboot later, the 512x512 textures still work most of the time (sometimes one or two are missing). :?

A temporary fix: if I render the textures again at a later point in the application, they are all rendered correctly, without exception. I first thought this might be a threading issue, with the TextureRenderer and TextureManager writing to the graphics card's RAM simultaneously at startup, but I made sure every TextureRenderer and TextureManager operation is done in the OpenGL thread, so that can't be it, right?
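This is how I queue such operations into the OpenGL thread (sketch; renderAllScenes() is just a placeholder for the actual call that re-renders every texture):

  // Runs the re-render inside the OpenGL thread via jME's task queue.
  GameTaskQueueManager.getManager().update(new Callable<Object>() {
    public Object call() throws Exception {
      symbolSceneManager.renderAllScenes(); // placeholder name, not the real method
      return null;
    }
  });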

Also, I tried it at home with a 6800 graphics card and the behavior is similar.

Well, I narrowed it down (I guess).



I have two GameStates.



GameState 1 renders the various Geometries to Textures.

GameState 2 uses the Textures that GameState 1 renders.



The problem described above occurs when I attach GS1 to the GameStateManager before GS2 (which would be logically correct, since the textures need to be rendered before being used).



When I attach GS2 before GS1 (which means GS2 is actually using textures from one render cycle earlier), every texture shows up fine.
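In code, the two variants look roughly like this (sketch; gs1 and gs2 are just placeholder names for the two GameState instances):

  // Order that triggers the bug (GS1 before GS2, the logically correct order):
  GameStateManager.getInstance().attachChild(gs1); // renders the textures
  GameStateManager.getInstance().attachChild(gs2); // uses the textures

  // Order that works around it (GS2 ends up using last cycle's textures):
  // GameStateManager.getInstance().attachChild(gs2);
  // GameStateManager.getInstance().attachChild(gs1);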



Does that give anyone an idea of what could go wrong?

Is this just a normal StandardGame implementation?



That almost sounds like the buffers aren't being swapped correctly…



Maybe it would help if you post what your render functions look like for those two states…



And out of curiosity, are you using the newest jME (which has the newest LWJGL)? It just got updated (around a week ago, maybe less), and from their forum it seems the update was mainly targeted at Mac and Linux (which might explain my jump in framerate ;)).

And have you asked over there for help?



(and does OS make a difference? I also wonder if this might be a driver issue…)

basixs said:

Is this just a normal StandardGame implementation?

yes

basixs said:

And out of curiosity are you using the newest jME (which has the newest LWJGL)?

yes

basixs said:

And have you asked over there for help?

not yet ;)

basixs said:

(and does OS make a difference? I also wonder if this might be a driver issue...)

The bug appears under Ubuntu Linux and also under Windows Vista Business, on several machines.

Here's some code:



The first GameState is the one that takes care of rendering the textures.



A SymbolSceneManager object holds a collection of SymbolSceneRenderer objects.



The class SymbolSceneRenderer works much like ImposterNode: it renders a scene to a texture on every draw call. The difference is that it additionally renders the initial state of the scene to a separate texture, once, in its init() method.



The init() method is called inside the OpenGL thread during an update() cycle.
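In simplified form, the relevant part of SymbolSceneRenderer looks roughly like this (sketch; tRenderer is the renderer, and initialTexture and liveTexture stand for its two target textures):

  public void init() {
    tRenderer.setupTexture(initialTexture);
    tRenderer.setupTexture(liveTexture);
    tRenderer.render(sceneRoot, initialTexture); // initial state, rendered exactly once
  } // init

  public void draw(Renderer r) {
    tRenderer.render(sceneRoot, liveTexture); // updated scene, rendered every draw call (like ImposterNode)
  } // draw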



SymbolSceneManagerGameState:


  /** Called from the OpenGL thread; (re)initializes the manager that renders the symbol scenes to textures. */
  public void init(Game game) {

    cleanup();

    symbolSceneManager.init(standardGame, game, texWidth);
    symbolSceneManager.setName("symbolSceneManager");
  } // init


  /** Deletes all previously rendered textures, if any. */
  @Override
  public void cleanup() {
    if (symbolSceneManager != null) {
      symbolSceneManager.deleteTexturesAndClear();
    } // if
  } // cleanup


  /** Lets the manager draw, i.e. render the symbol scenes to their textures. */
  @Override
  public void render(float tpf) {
    if (symbolSceneManager != null) {
      symbolSceneManager.draw(standardGame.getDisplay().getRenderer());
    } // if
  } // render


  /** Updates the geometric state of the managed scenes. */
  @Override
  public void update(float tpf) {
    if (symbolSceneManager != null) {
      symbolSceneManager.updateGeometricState(tpf, true);
    } // if
  } // update



Any ideas?

I think the best thing you could do is create a test (or series of tests) that isolates the problem (possibly LWJGL-based rather than jME-based) and see if that reveals anything…
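Something along these lines could be a starting point for such a test (untested sketch against plain jME 2; the per-TextureRenderer camera setup is omitted and the class and field names are made up):

  import com.jme.app.SimpleGame;
  import com.jme.image.Texture;
  import com.jme.image.Texture2D;
  import com.jme.math.Vector3f;
  import com.jme.renderer.ColorRGBA;
  import com.jme.renderer.TextureRenderer;
  import com.jme.scene.shape.Box;
  import com.jme.scene.shape.Quad;
  import com.jme.scene.state.TextureState;

  // Minimal RTT + mipmap stress test: render several small scenes to 512x512
  // mipmapped textures at startup and show each result on a quad.
  public class RttMipMapTest extends SimpleGame {

    private static final int COUNT = 13;
    private static final int SIZE = 512;

    @Override
    protected void simpleInitGame() {
      for (int i = 0; i < COUNT; i++) {
        Texture2D tex = new Texture2D();
        tex.setMinificationFilter(Texture.MinificationFilter.Trilinear);

        TextureRenderer tr =
            display.createTextureRenderer(SIZE, SIZE, TextureRenderer.Target.Texture2D);
        tr.setBackgroundColor(new ColorRGBA(0f, 0f, 0f, 0f));
        tr.setupTexture(tex);

        // A trivial scene per texture: a single box.
        Box box = new Box("box" + i, new Vector3f(), 1, 1, 1);
        box.updateRenderState();
        box.updateGeometricState(0, true);
        tr.render(box, tex);
        tr.cleanup();

        // Put the result on a quad so missing/transparent textures are easy to spot.
        Quad quad = new Quad("quad" + i, 2, 2);
        quad.setLocalTranslation(i * 2.5f - COUNT, 0, 0);
        TextureState ts = display.getRenderer().createTextureState();
        ts.setTexture(tex);
        quad.setRenderState(ts);
        rootNode.attachChild(quad);
      }
    }

    public static void main(String[] args) {
      new RttMipMapTest().start();
    }
  }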