How to avoid a crash (finished with non-zero exit value, atio6axx.dll)

Update:
To avoid this error, use a direct buffer for the data in: new Image(format, data, colorspace)

Android is not affected by this issue.
So, use ByteBuffer.allocateDirect to allocate any ByteBuffer that will be handed to a nativeObject.
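A minimal sketch of the difference, using only the plain JDK (no engine classes); the size 128*128*4 matches an RGBA8 texture of 128x128:

```java
import java.nio.ByteBuffer;

public class DirectBufferDemo {
    public static void main(String[] args) {
        int size = 128 * 128 * 4; // RGBA8 pixels for a 128x128 texture

        // Heap buffer: backed by a Java byte[]; the GC may move it,
        // so native code (e.g. an OpenGL driver) cannot safely keep
        // a pointer into it.
        ByteBuffer heap = ByteBuffer.allocate(size);

        // Direct buffer: allocated outside the Java heap; safe to
        // pass down to native code.
        ByteBuffer direct = ByteBuffer.allocateDirect(size);

        System.out.println("heap.isDirect()   = " + heap.isDirect());   // false
        System.out.println("direct.isDirect() = " + direct.isDirect()); // true
    }
}
```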

======original post=======
What confused me is:
My Android app and desktop app share the SAME core game-engine lib, and the Android app runs fine.
It seems that if a custom texture is used on a material, the app crashes on PC but works fine on Android.
The received error message:

#
# A fatal error has been detected by the Java Runtime Environment:
#
#  EXCEPTION_ACCESS_VIOLATION (0xc0000005) at pc=0x00007fff0344ee6c, pid=13600, tid=10452
#
# JRE version: OpenJDK Runtime Environment (17.0.7) (build 17.0.7+0-b2043.56-10550314)
# Java VM: OpenJDK 64-Bit Server VM (17.0.7+0-b2043.56-10550314, mixed mode, tiered, compressed oops, compressed class ptrs, g1 gc, windows-amd64)
# Problematic frame:
# C  [atio6axx.dll+0x1f6ee6c]
#
# No core dump will be written. Minidumps are not enabled by default on client versions of Windows
#

> Task :desktop:DesktopApp.main() FAILED

Execution failed for task ':desktop:DesktopApp.main()'.
> Process 'command 'D:\android\Android Studio\jbr\bin\java.exe'' finished with non-zero exit value -1073740791

the texture :

        ArrayList<ByteBuffer> data = new ArrayList<>();
        .......
        ima = new Image(Image.Format.RGBA8, 100, 100, 32, data, ColorSpace.Linear);
        t2d = new Texture2D(ima);

and

public void update(float tpf) {
    ......
    ima.setUpdateNeeded();
}

Did you already test with a 128 * 128 texture? Possibly your graphics card doesn’t support textures whose dimensions are not a power of two.
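If you want to rule that out before uploading, a quick bit-trick check (a plain helper, not part of the engine API):

```java
public class PowerOfTwoCheck {
    // True when n is a positive power of two (1, 2, 4, 8, ...).
    // A power of two has exactly one bit set, so n & (n - 1) is zero.
    static boolean isPowerOfTwo(int n) {
        return n > 0 && (n & (n - 1)) == 0;
    }

    public static void main(String[] args) {
        System.out.println(isPowerOfTwo(100)); // false: 100 is not a power of two
        System.out.println(isPowerOfTwo(128)); // true
    }
}
```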

See “NonPowerOfTwoTextures”.


Thank you. I’ll try it later and post the result in this thread.

Update:
Just tried 128x128. It failed again.

BTW, some Minecraft players seem to have experienced this issue as well:
“When I try to play Minecraft it crashes” (AMD tech forum)

That’s because it’s a graphics card bug and not a game engine bug.

        ArrayList<ByteBuffer> data = new ArrayList<>();
        //data.add(ByteBuffer.allocate(128*128*4));       // android: ok   PC: failed
        data.add(ByteBuffer.allocateDirect(128*128*4));   // android: ok   PC: ok
        ima = new Image(Image.Format.RGBA8, 128, 128, 32, data, ColorSpace.Linear);
        t2d = new Texture2D(ima);

and the data eventually goes to:

final class TextureUtil {

    private void uploadTextureLevel(GLImageFormat format, int target, int level, int slice, int sliceCount, int width, int height, int depth, int samples, ByteBuffer data) {
        if (format.compressed && data != null) {
            if (target == GL2.GL_TEXTURE_3D) {
                // For 3D textures, we upload the entire mipmap level.
                gl2.glCompressedTexImage3D(target,
                                           level,
                                           format.internalFormat,
                                           width,
                                           height,
                                           depth,
                                           0,
                                           data);
Android would not be affected by this because all memory is ‘direct’ there.
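If you already have pixel data sitting in a heap buffer, one way to make it safe for the native upload path is to copy it into a direct buffer first. A sketch using only the JDK (the engine may also provide its own buffer helpers):

```java
import java.nio.ByteBuffer;

public class ToDirectBuffer {
    // Copy the remaining bytes of any ByteBuffer into a new direct buffer.
    static ByteBuffer toDirect(ByteBuffer src) {
        if (src.isDirect()) {
            return src; // already safe to hand to native code
        }
        ByteBuffer direct = ByteBuffer.allocateDirect(src.remaining());
        direct.put(src.duplicate()); // duplicate() leaves src's position untouched
        direct.flip();               // reset position so the data can be read
        return direct;
    }

    public static void main(String[] args) {
        ByteBuffer heap = ByteBuffer.wrap(new byte[] {1, 2, 3, 4});
        ByteBuffer direct = toDirect(heap);
        System.out.println(direct.isDirect());  // true
        System.out.println(direct.remaining()); // 4
    }
}
```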

Point of note: we might have been able to spot the problem for you if the critical piece of code had been included in the original post instead of ‘…’ For maximum help, be careful what you snip out.
