My experience with jME on iOS

I’ve managed to solve this issue on mine with your help.

I don't know if this problem is specific to my iPad, which is an iPad 3, as I don't have access to (and therefore can't test on) any other physical iOS device, and I find the simulators are not true to life.

My iPad 3 has a resolution of 2048x1536. When the screen is initially drawn, the scene seems to be rendered at the full resolution, but for some reason the camera is only set up at 1024x768.

The effect of this is that you can only see the bottom-left quarter of the scene; the spatials are only partially visible. As you said, you can force a reshape by physically changing the orientation of the device.

To force the same behaviour programmatically, I copied the reshape code in the harness from the application's `didRotate:` handler into `didBecomeActive:`.

This works, but it exposed another related problem with the input handler.

What appears to be happening is that the scene is rendered at 1024x768 (points) and then stretched to 2048x1536 (pixels) using the content scale factor, which is 2 on this device.

This is the code in the harness that shows it; the `scale` variable is set to 2 in my case, stretching the frame to 2048x1536. It also explains why the initial screen renders in the bottom-left corner: until the harness applies this scale factor, the scene is only being rendered at 1024x768 and doesn't display properly.

```objc
- (void)didRotate:(NSNotification *)notification
{
    CGRect originalFrame = [[UIScreen mainScreen] bounds];
    CGRect frame = [self.glview convertRect:originalFrame fromView:nil];

    JNIEnv* e = getEnv(self.vm);
    if (e) {
        float scale = _glview.contentScaleFactor;

        (*e)->CallVoidMethod(e, self.harness, self.reshapeMethod,
                             (int)(frame.size.width * scale),
                             (int)(frame.size.height * scale));
        if ((*e)->ExceptionCheck(e)) {
            NSLog(@"Could not invoke iOS Harness reshape");
            (*e)->ExceptionDescribe(e);
            (*e)->ExceptionClear(e);
        }
    }
}
```
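For clarity, the arithmetic the harness performs is simple: the view bounds reported in points are multiplied by the content scale factor to get framebuffer pixels. A standalone Java sketch of just that calculation (the names here are my own, not the harness's):

```java
// Sketch: converting the view size in points to framebuffer pixels,
// as the harness does before invoking reshape over JNI.
public class ReshapeSizes {
    public static void main(String[] args) {
        int pointW = 1024, pointH = 768; // bounds in points on an iPad 3
        float scale = 2.0f;              // the contentScaleFactor on a retina iPad
        int pixelW = (int) (pointW * scale);
        int pixelH = (int) (pointH * scale);
        // The harness passes these pixel dimensions to jME's reshape.
        System.out.println(pixelW + "x" + pixelH); // prints 2048x1536
    }
}
```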

Therefore I just copied this same code into the `didBecomeActive:` method as well. Obviously, if you patch it this way, you have to re-apply the change whenever the iOS plugin is updated.

However, I then found another problem related to the input handler.

It seems this scale factor is not applied correctly, or at all, to the coordinates returned when the screen is touched. This causes some strange results when ray casting in 3D space.

The mouse coordinates are fine on OS X and Windows, and Android handles it correctly, so this points to the Avian/iOS side.

The way I got around it was to manually calculate the scale factor and apply it to the X and Y coordinates returned from the input handler before ray casting, and it works fine.
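The workaround can be sketched roughly like this in plain Java. The `correct` helper is hypothetical (not part of jME's API); the idea is just to derive the scale from the framebuffer size versus the size the input handler reports, and multiply the touch coordinates before using them for ray casting:

```java
// Hypothetical helper: rescale touch coordinates reported in points
// to framebuffer pixels before ray casting against the scene.
public class TouchScaleFix {
    static float[] correct(float x, float y,
                           int inputW, int inputH,   // size the input handler reports
                           int fbW, int fbH) {       // actual framebuffer size
        float scaleX = (float) fbW / inputW;
        float scaleY = (float) fbH / inputH;
        return new float[] { x * scaleX, y * scaleY };
    }

    public static void main(String[] args) {
        // iPad 3 case: touches come back in a 1024x768 space,
        // but the framebuffer is 2048x1536.
        float[] p = correct(512f, 384f, 1024, 768, 2048, 1536);
        System.out.println(p[0] + "," + p[1]); // prints 1024.0,768.0
    }
}
```

The corrected coordinates are then what you feed into your pick ray, instead of the raw values from the input event.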

I can give code examples if anyone hits this and can't get it to work.