Crash in GLRenderer: Index 18 out of bounds for length 16

So one of my developers is running into an issue running our game engine's client:

java.lang.ArrayIndexOutOfBoundsException: Index 18 out of bounds for length 16
    at com.jme3.renderer.opengl.GLRenderer.setVertexAttrib(GLRenderer.java:2868)
    at com.jme3.renderer.opengl.GLRenderer.setVertexAttrib(GLRenderer.java:2921)
    at com.jme3.renderer.opengl.GLRenderer.renderMeshDefault(GLRenderer.java:3153)
    at com.jme3.renderer.opengl.GLRenderer.renderMesh(GLRenderer.java:3191)
    at com.jme3.material.logic.DefaultTechniqueDefLogic.renderMeshFromGeometry(DefaultTechniqueDefLogic.java:70)
    at com.jme3.material.logic.SinglePassAndImageBasedLightingLogic.render(SinglePassAndImageBasedLightingLogic.java:260)
    at com.jme3.material.Technique.render(Technique.java:166)
    at com.jme3.material.Material.render(Material.java:1028)
    at com.jme3.renderer.RenderManager.renderGeometry(RenderManager.java:614)
    at com.jme3.renderer.queue.RenderQueue.renderGeometryList(RenderQueue.java:266)
    at com.jme3.renderer.queue.RenderQueue.renderQueue(RenderQueue.java:305)
    at com.jme3.renderer.RenderManager.renderViewPortQueues(RenderManager.java:877)
    at com.jme3.renderer.RenderManager.flushQueue(RenderManager.java:779)
    at com.jme3.renderer.RenderManager.renderViewPort(RenderManager.java:1108)
    at com.jme3.renderer.RenderManager.render(RenderManager.java:1158)
    at com.jme3.app.SimpleApplication.update(SimpleApplication.java:273)
    at com.jme3.system.lwjgl.LwjglWindow.runLoop(LwjglWindow.java:537)
    at com.jme3.system.lwjgl.LwjglWindow.run(LwjglWindow.java:639)
    at com.jme3.system.lwjgl.LwjglWindow.create(LwjglWindow.java:473)
    at com.jme3.app.LegacyApplication.start(LegacyApplication.java:481)
    at com.jme3.app.LegacyApplication.start(LegacyApplication.java:441)
    at com.jme3.app.SimpleApplication.start(SimpleApplication.java:128)
    at io.tlf.outside.client.Client$2.run(Client.java:96)

I can run it just fine on my two dev machines. I am running a GTX 1080 Ti, and he is running a GTX 1660 Ti on the machine that is having issues.

I believe the issue happens when we load our test character model; it does not occur before the model is loaded, and in our test environment it is the only model being loaded right now.

This issue occurs on master using LWJGL 3 with OpenGL 3.2. I will see if I can get his logging output for the driver info, but it is late where he is.

Any ideas?
Thanks,
Trevor

Just a guess: could it be this, in the RenderContext.java that GLRenderer (where you get the exception) uses?

public final Image[] boundTextures = new Image[16];

But I'm really not sure why it only fails on some GPUs.

Edit:

Looking more carefully, it's actually about attribIndexList in RenderContext.

But I'm not sure what exactly that covers. For example, TexCoord 1…12? Maybe this GPU has its own limits.

Does a simple blue box work for him? (i.e. a new default jME project with the blue box, as sketched below)
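
For reference, the stock jME "blue box" starter is just this (a minimal sketch; the class name is made up):

import com.jme3.app.SimpleApplication;
import com.jme3.material.Material;
import com.jme3.math.ColorRGBA;
import com.jme3.scene.Geometry;
import com.jme3.scene.shape.Box;

// The default new-project test: one blue box with the Unshaded material.
public class BlueBoxTest extends SimpleApplication {
    public static void main(String[] args) {
        new BlueBoxTest().start();
    }

    @Override
    public void simpleInitApp() {
        Box b = new Box(1, 1, 1);
        Geometry geom = new Geometry("Box", b);
        Material mat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
        mat.setColor("Color", ColorRGBA.Blue);
        geom.setMaterial(mat);
        rootNode.attachChild(geom);
    }
}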

I believe the issue happens when we load our test character model; it does not occur before the model is loaded, and in our test environment it is the only model being loaded right now.

Also, one last guess: it could be related to morph indexes, if this character has them. (I have seen that there is a limit on those anyway.)

Since you already know it is character-related anyway, it would be worth checking the things tied to the animation count, the morph index count, and the number of textures the shader applies, to see whether it works with one of them removed.
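
A rough sketch of the morph check (assuming jME 3.3's Mesh.getMorphTargets(); the class and method names are made up):

import com.jme3.scene.Geometry;
import com.jme3.scene.Spatial;

public class MorphCheck {
    // Walk a loaded model and print how many morph targets each mesh carries.
    public static void printMorphCounts(Spatial model) {
        model.depthFirstTraversal(s -> {
            if (s instanceof Geometry) {
                Geometry g = (Geometry) s;
                System.out.println(g.getName() + ": "
                        + g.getMesh().getMorphTargets().length + " morph targets");
            }
        });
    }
}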


Hmm. So our model has 18 textures total, but not on the same geometry. There are 8 geometries, and each has its own material with 1 to 3 textures. All of the materials use the PBRLighting matdef.

I too am not sure how it is possible that this behavior would be different between machines.
I also have tested on my laptop which has a NVIDIA Quadro K2100m without any issue.

It does have morphs, and quite a few of them, but no animations yet.

I will have him run the blue box and see what occurs. This is a new machine for him, his old one ran our engine fine, but he did not ever try loading the player model on his old machine.

The 18 textures total seems to be a coincidence, then.

No animations? Then I would first check whether it works without the morphs :slight_smile:


For those following along at home, the exception happens at GLRenderer.java line 2868, in setVertexAttrib.

'loc' comes indirectly from the vertex buffer type:

        Attribute attrib = context.boundShader.getAttribute(vb.getBufferType());
        int loc = attrib.getLocation();

'attribs' comes from the context:

        VertexBuffer[] attribs = context.boundAttribs;

That's as far as I've looked. For some reason, the context only has 16 bound attributes but the vertex buffer type is trying to hit 18.

Makes me wonder what these values are normally… and if the shader is adding its own vertex attributes that JME doesn't define or something. Just random things that come to mind.

It's definitely vertex attribute related, though.
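
One quick sanity check would be to ask the driver how many attribute slots it actually exposes; a sketch, assuming LWJGL 3 and that it runs on the render thread:

import org.lwjgl.opengl.GL11;
import org.lwjgl.opengl.GL20;

public class AttribLimitCheck {
    // Must be called on the render thread, once the GL context exists.
    // The GL spec guarantees at least 16 slots.
    public static int queryMaxVertexAttribs() {
        int maxAttribs = GL11.glGetInteger(GL20.GL_MAX_VERTEX_ATTRIBS);
        System.out.println("GL_MAX_VERTEX_ATTRIBS = " + maxAttribs);
        return maxAttribs;
    }
}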

Like you said.

And boundAttribs comes later from attribIndexList in RenderContext, if I'm not wrong, but I'll stop there because I'm not sure what fills it.

Anyway, I still think it might be related to the morph count.

OK, I do not know if it is morph related. But I did learn some things that are interesting.
First, he has two GPUs, one of which is integrated. It is the integrated GPU that is having the issue.
I got the logs from him:

OpenGL Renderer Information
 * Vendor: ATI Technologies Inc.
 * Renderer: AMD Radeon(TM) RX Vega 10 Graphics
 * OpenGL Version: 3.2.13560 Core Profile Forward-Compatible Context 26.20.11026.3002
 * GLSL Version: 4.60
 * Profile: Core  
OpenGL Renderer Information
 * Vendor: NVIDIA Corporation
 * Renderer: GeForce GTX 1660 Ti with Max-Q Design/PCIe/SSE2
 * OpenGL Version: 3.2.0 NVIDIA 445.75
 * GLSL Version: 1.50 NVIDIA via Cg compiler
 * Profile: Core  

The Nvidia card works just fine for him; the crash is on the AMD card.
We have not tested any further than this; we will be testing without any morphs in the next couple of days.
EDIT: The GLSL version difference between them is very interesting…


It is interesting because OpenGL 3.2 should report GLSL version 1.50.

Good ol' ATI being ATI.


Also, is there a way to get the information printed here from jME? I did some digging and I see it comes from the GL object that is allocated by lwjgl, but I did not see anywhere that the GL object is exposed, and from what I can find jME does not store this information; it simply prints it out. I would like to include this in the Sentry.io info we send back on crashes.

EDIT: Also, is there a way to force which GPU jME picks? By default Windows is using the AMD GPU.

No. And I've always wanted to add it. You have to go directly to lwjgl to get it.
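
Something like this sketch, going straight at the GL string queries (assuming LWJGL 3; it must run on the render thread, where the context is current):

import org.lwjgl.opengl.GL11;
import org.lwjgl.opengl.GL20;

public class GpuInfo {
    // Reads the same strings jME prints at startup, straight from the driver.
    public static String describe() {
        return GL11.glGetString(GL11.GL_VENDOR) + " / "
                + GL11.glGetString(GL11.GL_RENDERER) + " / GL "
                + GL11.glGetString(GL11.GL_VERSION) + " / GLSL "
                + GL11.glGetString(GL20.GL_SHADING_LANGUAGE_VERSION);
    }
}

For the Sentry use case, you could call it once from simpleInitApp() and attach the string to your crash reports.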

As far as I remember: no on that, too. As I recall, Minecraft has a similar issue, so you may be able to find stuff on the web about it.

I'd love to learn differently, though.

Edit: I mean there is no programmatic way. The OS itself should provide a way to force certain apps to use a particular GPU… but the user has to initiate it, as I recall.


Not in java-land, as far as I can tell.

However, if you use (or can create) a native .exe that launches the JVM via JNI (so the JVM is in the same process as the .exe), you can get away with having the native launcher export a couple of C symbols.

See Force using high performance GPU for AMD Switchable Graphics by peterix · Pull Request #132 · LWJGL/lwjgl · GitHub for the lwjgl team's take on this.

Nvidia version: http://developer.download.nvidia.com/devzone/devcenter/gamegraphics/files/OptimusRenderingPolicies.pdf (about page 4)
For AMD Cards: Solved: Can an OpenGL app default to the discrete GPU on a... - AMD Community

(You could easily include both versions)

NOTE: these are just the results of a quick search; I have not actually tried the technique.

Yes, I am currently in the process of doing this:


#include <iostream>
#include <jni.h>
#include <string>

using namespace std;

int main() {

    JavaVM* jvm;                      // Pointer to the JVM (Java Virtual Machine)
    JNIEnv* env;                      // Pointer to native interface
        //================== prepare loading of Java VM ============================
    JavaVMInitArgs vm_args;                        // Initialization arguments
    JavaVMOption* options = new JavaVMOption[1];   // JVM invocation options
    char *opt = (char*) "-Djava.class.path=OutsideClient.jar";
    options[0].optionString = opt;   // where to find java .class
    vm_args.version = JNI_VERSION_1_6;             // minimum Java version
    vm_args.nOptions = 1;                          // number of options
    vm_args.options = options;
    vm_args.ignoreUnrecognized = false;     // invalid options make the JVM init fail
        //=============== load and initialize Java VM and JNI interface =============
    jint rc = JNI_CreateJavaVM(&jvm, (void**)&env, &vm_args);  // YES !!
    delete[] options;  // we no longer need the initialisation options (array delete to match new[])
    if (rc != JNI_OK) {
        // TO DO: error processing... 
        cin.get();
        exit(EXIT_FAILURE);
    }
    //=============== Display JVM version =======================================
    cout << "JVM load succeeded: Version ";
    jint ver = env->GetVersion();
    cout << ((ver >> 16) & 0x0f) << "." << (ver & 0x0f) << endl;

    jclass cls2 = env->FindClass("io/tlf/outside/client/Main");  // try to find the class (JNI uses '/' separators, not '.')
    if (cls2 == nullptr) {
        cerr << "ERROR: class not found !";
    }
    else {                                  // if class found, continue
        cout << "Class MyTest found" << endl;
        jmethodID mid = env->GetStaticMethodID(cls2, "main", "([Ljava/lang/String;)V");
        if (mid == nullptr)
            cerr << "ERROR: method void mymain() not found !" << endl;
        else {
            cout << "Main found, loading outside";
            jobjectArray arr = env->NewObjectArray(0,      // constructs an empty Java String array
                env->FindClass("java/lang/String"),    // element class: String
                env->NewStringUTF(""));   // default element value (unused for length 0)
            env->CallStaticVoidMethod(cls2, mid, arr);   // call the method with the arr as argument.
            env->DeleteLocalRef(arr);     // release the object
        }
    }


    jvm->DestroyJavaVM();
    return 0;
}

Code mostly taken from here: Calling Java from C++ with JNI - CodeProject


OK, here is my finished wrapper; it has been tested and did work at selecting the non-integrated GPU.

// WindowsOutsideLauncher.cpp : This file contains the 'main' function. Program execution begins and ends there.
//
#include <windows.h>
#include <iostream>
#include <jni.h>
#include <string>

using namespace std;


// high performance gpu for nvidia and amd
extern "C"
{
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}


int main() {
    SetDllDirectory(L"./jvm/bin/server");
    SetEnvironmentVariable(L"JAVA_HOME", L"./jvm");
    cout << "Prepairing to launch Outside Client" << endl;
    JavaVM* jvm;                      // Pointer to the JVM (Java Virtual Machine)
    JNIEnv* env;                      // Pointer to native interface

    JavaVMInitArgs vm_args;                        // Initialization arguments
    JavaVMOption* options = new JavaVMOption[1];   // JVM invocation options
    char *opt = (char*) "-Djava.class.path=OutsideClient.jar";
    options[0].optionString = opt;   // where to find java .class
    vm_args.version = JNI_VERSION_10;             // minimum Java version
    vm_args.nOptions = 1;                          // number of options
    vm_args.options = options;
    vm_args.ignoreUnrecognized = false;     // invalid options make the JVM init fail

    cout << "Creating jvm" << endl;
    jint rc = JNI_CreateJavaVM(&jvm, (void**)&env, &vm_args);  // Build JVM
    delete[] options;  // we no longer need the initialisation options (array delete to match new[])

    if (rc != JNI_OK) {
        // TO DO: error processing...
        cerr << "Failed to create jvm" << endl;
        cin.get();
        exit(EXIT_FAILURE);
    }

    cout << "JVM load succeeded: Version ";
    jint ver = env->GetVersion();
    cout << ((ver >> 16) & 0x0f) << "." << (ver & 0x0f) << endl;

    jclass cls2 = env->FindClass("io/tlf/outside/client/Main");  // try to find the class
    if (cls2 == nullptr) {
        cerr << "ERROR: class not found !";
    }
    else {                                  // if class found, continue
        cout << "Class MyTest found" << endl;
        jmethodID mid = env->GetStaticMethodID(cls2, "main", "([Ljava/lang/String;)V");
        if (mid == nullptr)
            cerr << "ERROR: method void mymain() not found !" << endl;
        else {
            cout << "Main found, loading outside" << endl;
            jobjectArray arr = env->NewObjectArray(0,      // constructs an empty Java String array
                env->FindClass("java/lang/String"),    // element class: String
                env->NewStringUTF(""));   // default element value (unused for length 0)
            env->CallStaticVoidMethod(cls2, mid, arr);   // call the method with the arr as argument.
            env->DeleteLocalRef(arr);     // release the object
        }
    }


    jvm->DestroyJavaVM();
    return 0;
}

With that said, we did test the model with all mesh morphs removed, and it did not fix the issue on the AMD card.


Still no idea why the behavior differs depending on the card, but here is something odd:

RenderContext.boundAttribs is a final array that is hard-coded to new VertexBuffer[16];

So why is it not throwing errors for index 16 or 17?
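
One guess: the location ultimately comes from the driver via glGetAttribLocation, and nothing forces the assigned locations to be contiguous, so a driver can hand out 18 while 16 and 17 go unused. In LWJGL terms (sketch; 'program' and the attribute name are placeholders):

import org.lwjgl.opengl.GL20;

// The driver picks the location; it only has to be < GL_MAX_VERTEX_ATTRIBS.
int loc = GL20.glGetAttribLocation(program, "inPosition");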

Have you narrowed it down to a specific object and shader yet?

On Linux I use this environment variable to switch between the integrated and discrete graphics:

env DRI_PRIME=1 // this chose AMD for me
env DRI_PRIME=0 // and this chose Intel


There is also an environment variable for Nvidia, I think, which could maybe be set from Java in the main() method.

We are only using the PBRLighting shader in our test scene right now; I have not tested with others. I have also only been testing with our player character; no other models have been loaded into the scene.

Today I will see if we can arrange more extensive testing where we swap the shader and the model. I will also test reducing the textures.

Maybe, as a troubleshooting measure, patch GLRenderer to print the VertexBuffer.Type and VertexBuffer.Format, as well as their index into attribs[], right around line 2868? That would at least give you an idea of how the first few compare to each other under the different cards…
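
Something along these lines, as a sketch (variable names taken from the snippets quoted earlier; the exact surrounding code in GLRenderer is an assumption):

// Hypothetical debug patch just before the attribs[loc] access:
System.err.println("attrib type=" + vb.getBufferType()
        + " format=" + vb.getFormat()
        + " loc=" + loc
        + " attribs.length=" + attribs.length);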


Can you confirm that you are using the unmodified engine exactly as we released it, and that you did not modify the shader?