So one of my developers is running into an issue running our game engine's client:
java.lang.ArrayIndexOutOfBoundsException: Index 18 out of bounds for length 16
at com.jme3.renderer.opengl.GLRenderer.setVertexAttrib(GLRenderer.java:2868)
at com.jme3.renderer.opengl.GLRenderer.setVertexAttrib(GLRenderer.java:2921)
at com.jme3.renderer.opengl.GLRenderer.renderMeshDefault(GLRenderer.java:3153)
at com.jme3.renderer.opengl.GLRenderer.renderMesh(GLRenderer.java:3191)
at com.jme3.material.logic.DefaultTechniqueDefLogic.renderMeshFromGeometry(DefaultTechniqueDefLogic.java:70)
at com.jme3.material.logic.SinglePassAndImageBasedLightingLogic.render(SinglePassAndImageBasedLightingLogic.java:260)
at com.jme3.material.Technique.render(Technique.java:166)
at com.jme3.material.Material.render(Material.java:1028)
at com.jme3.renderer.RenderManager.renderGeometry(RenderManager.java:614)
at com.jme3.renderer.queue.RenderQueue.renderGeometryList(RenderQueue.java:266)
at com.jme3.renderer.queue.RenderQueue.renderQueue(RenderQueue.java:305)
at com.jme3.renderer.RenderManager.renderViewPortQueues(RenderManager.java:877)
at com.jme3.renderer.RenderManager.flushQueue(RenderManager.java:779)
at com.jme3.renderer.RenderManager.renderViewPort(RenderManager.java:1108)
at com.jme3.renderer.RenderManager.render(RenderManager.java:1158)
at com.jme3.app.SimpleApplication.update(SimpleApplication.java:273)
at com.jme3.system.lwjgl.LwjglWindow.runLoop(LwjglWindow.java:537)
at com.jme3.system.lwjgl.LwjglWindow.run(LwjglWindow.java:639)
at com.jme3.system.lwjgl.LwjglWindow.create(LwjglWindow.java:473)
at com.jme3.app.LegacyApplication.start(LegacyApplication.java:481)
at com.jme3.app.LegacyApplication.start(LegacyApplication.java:441)
at com.jme3.app.SimpleApplication.start(SimpleApplication.java:128)
at io.tlf.outside.client.Client$2.run(Client.java:96)
I can run it just fine on my two dev machines. I am running a GTX 1080 Ti, and he is running a GTX 1660 Ti on the machine that is having issues.
I believe the issue happens when we load our test character model; it does not occur before the model is loaded, and in our test env it is the only model being loaded right now.
This issue occurs on master using lwjgl3, OpenGL 3.2. I will see if I can get his logging output for the driver info, but it is late where he is.
In RenderContext.java, which GLRenderer uses where you get the exception, there is:
public final Image[] boundTextures = new Image[16];
Could that be related? But I'm really not sure why it only fails on some GPUs.
Edit:
Looking more carefully, it's about attribIndexList in RenderContext. But I'm not sure what exactly it counts. For example, TexCoord 1…12? Maybe this GPU has its own limits.
Does a simple blue box work for him? (some new default jME project with a blue box)
> I believe the issue happens when we load our test character model; it does not occur before the model is loaded, and in our test env it is the only model being loaded right now.
Also, a last guess: it could be related to morph indexes, if this character has them. (I've seen there was a limit anyway.)
Since you know anyway that it's character related, it would be worth checking "animation amount" / "morph index amount" / "texture amount in the shader" related things, and whether it works with one of them removed.
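The morph guess is at least plausible arithmetic-wise: vertex attributes share a fixed pool of slots (the OpenGL-guaranteed minimum for GL_MAX_VERTEX_ATTRIBS is 16), and each bound buffer takes at least one. A back-of-envelope sketch, with purely illustrative numbers that are my assumptions and not jME internals:

```java
// Illustrative only: if morph targets are bound as extra vertex attributes,
// a model can overflow a 16-slot attribute pool even though no single
// buffer looks unusual. The slot counts below are assumptions.
public class AttribBudget {
    // baseBuffers: position, normal, tangent, texcoord, bone data, etc.
    // slotsPerMorph: how many attributes each morph target consumes.
    public static int slotsUsed(int baseBuffers, int morphTargets, int slotsPerMorph) {
        return baseBuffers + morphTargets * slotsPerMorph;
    }

    public static void main(String[] args) {
        // e.g. 6 base buffers plus 4 morph targets at 3 slots each = 18,
        // which is past a 16-slot pool.
        System.out.println(AttribBudget.slotsUsed(6, 4, 3));
    }
}
```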
Hmm. So our model has 18 textures total, but not on the same geometry. There are 8 geometries, and each has its own material with 1 to 3 textures. All of the materials use the PBRLighting matdef.
I too am not sure how it is possible that this behavior would be different between machines.
I also have tested on my laptop which has a NVIDIA Quadro K2100m without any issue.
It does have morphs, and quite a few of them, but no animations yet.
I will have him run the blue box and see what occurs. This is a new machine for him, his old one ran our engine fine, but he did not ever try loading the player model on his old machine.
For those following along at home, the exception happens at the array access in setVertexAttrib (GLRenderer.java:2868).
`loc` comes indirectly from the vertex buffer type:
Attribute attrib = context.boundShader.getAttribute(vb.getBufferType());
int loc = attrib.getLocation();
`attribs` comes from the context:
VertexBuffer[] attribs = context.boundAttribs;
That's as far as I've looked. For some reason, the context only has 16 bound attributes but the vertex buffer type is trying to hit index 18.
Makes me wonder what these values are normally… and whether the shader is adding its own vertex attributes that JME doesn't define, or something. Just random things that come to mind.
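To make the failure mode concrete, here is a minimal pure-Java sketch (not jME's actual code, just the shape of the bug as described above): a fixed 16-slot array indexed by a shader-reported attribute location of 18.

```java
// Sketch of the failure mode: the render context keeps a fixed-size array
// of bound attributes, but the shader reports an attribute location past
// the end of it. jME's code does the raw access and throws; this version
// bounds-checks so the mismatch is visible instead.
public class BoundAttribsSketch {
    static final int MAX_ATTRIBS = 16;                  // like context.boundAttribs.length
    static final Object[] boundAttribs = new Object[MAX_ATTRIBS];

    // Returns true if the attribute could be bound; false when loc is past
    // the array, which in GLRenderer surfaces as the AIOOBE instead.
    static boolean tryBind(int loc, Object vb) {
        if (loc < 0 || loc >= boundAttribs.length) {
            return false;                               // loc 18 lands here
        }
        boundAttribs[loc] = vb;
        return true;
    }

    public static void main(String[] args) {
        System.out.println(tryBind(3, "position"));     // in range
        System.out.println(tryBind(18, "morph0"));      // out of range, would have thrown
    }
}
```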
OK, I do not know if it is morph related. But I did learn some things that are interesting.
First, he has two GPUs, one of which is integrated. It is the integrated GPU that is having the issue.
I got the logs from him:
OpenGL Renderer Information
* Vendor: NVIDIA Corporation
* Renderer: GeForce GTX 1660 Ti with Max-Q Design/PCIe/SSE2
* OpenGL Version: 3.2.0 NVIDIA 445.75
* GLSL Version: 1.50 NVIDIA via Cg compiler
* Profile: Core
The Nvidia card works just fine for him; the crash is on the AMD card.
We have not yet tested any more than this, we will be testing not having any morphs in the next couple days.
EDIT: The GLSL version difference between them is very interesting…
Also, is there a way to get the information printed here from jME? I did some digging, and I see it comes from the GL object that LWJGL allocates, but I did not see anywhere that the GL object is exposed, and from what I can find jME does not store this information; it simply prints it out. I would like to include this in the Sentry.io info we send back on crashes.
EDIT: Also, is there a way to force which GPU jME picks? By default Windows is using the AMD GPU.
No. And Iāve always wanted to add it. You have to go directly to lwjgl to get it.
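Since the strings come from the driver anyway, a hedged sketch of going straight to LWJGL: `org.lwjgl.opengl.GL11.glGetString(int)` must be called on the render thread while the context is current. The constants are the standard GL enums; the supplier indirection below is just so the capture/formatting can be shown standalone, and the class name is my own invention.

```java
// Hypothetical helper: capture the GL strings once on the render thread
// (e.g. in simpleInitApp) and keep them around for crash reports.
import java.util.function.IntFunction;

public class RendererInfo {
    final String vendor, renderer, version, glsl;

    // Pass GL11::glGetString (LWJGL) on the render thread; a fake supplier
    // works for testing off-thread.
    RendererInfo(IntFunction<String> glGetString) {
        // Standard GL enums: GL_VENDOR=0x1F00, GL_RENDERER=0x1F01,
        // GL_VERSION=0x1F02, GL_SHADING_LANGUAGE_VERSION=0x8B8C
        vendor   = glGetString.apply(0x1F00);
        renderer = glGetString.apply(0x1F01);
        version  = glGetString.apply(0x1F02);
        glsl     = glGetString.apply(0x8B8C);
    }

    @Override
    public String toString() {
        return "Vendor: " + vendor + "\nRenderer: " + renderer
             + "\nOpenGL Version: " + version + "\nGLSL Version: " + glsl;
    }
}
```

On the render thread you would build it as `new RendererInfo(org.lwjgl.opengl.GL11::glGetString)` and attach `toString()` to the Sentry context.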
As far as I remember: no on that, too. As I recall, Minecraft has a similar issue, so you may be able to find stuff on the web about it.
I'd love to learn differently, though.
Edit: I mean there is no programmatic way. The OS itself should provide a way for certain apps to be forced to use a particular GPU… but the user has to initiate it, as I recall.
However, if you use (or can create) a native .exe that launches the JVM via JNI (hence the JVM is in the same process as the .exe), you can get away with having the native launcher export a couple of C symbols (the driver-recognized NvOptimusEnablement and AmdPowerXpressRequestHighPerformance globals) to request the discrete GPU.
We are only using the PBRLighting shader in our test scene right now. I have not tested with others. I also have only been testing with our player character, no other models have been loaded into the scene.
Today I will see if we can arrange more extensive testing where we swap the shader and the model. I will also test reducing the textures.
Maybe, as a troubleshooting measure, patch the renderer context to print the VertexBuffer.Type and VertexBuffer.Format, as well as their index in attribs[], right around line 2868? At least it would give you an idea of how the first few compare to each other under the different cards…
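For what it's worth, the print could look something like this. This is only a sketch of the formatting; in an actual patch the values would come from `vb.getBufferType()`, `vb.getFormat()`, and `attrib.getLocation()` inside GLRenderer, which I've replaced with plain strings/ints here so it stands alone.

```java
// Hypothetical diagnostic for the suggested patch: before the attribs[loc]
// access, log each buffer's type, format, and resolved location, flagging
// any location that would fall outside the bound-attribs array.
public class AttribDiag {
    public static String describe(String type, String format, int loc, int maxAttribs) {
        String flag = (loc >= maxAttribs || loc < 0) ? "  <-- OUT OF RANGE" : "";
        return type + " (" + format + ") -> loc " + loc + "/" + maxAttribs + flag;
    }

    public static void main(String[] args) {
        System.out.println(describe("Position", "Float", 0, 16));
        System.out.println(describe("MorphTarget0", "Float", 18, 16));
    }
}
```

Running that on both machines and diffing the output would show exactly which buffer type maps to location 18 on the failing GPU.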