My experience with jME on iOS

Hi, does that mean we can't use networking at all on iOS?

Is this an Avian thing or an iOS thing?

You can force register objects even if they don’t have the @Serializable annotation. You can see I do it here in Zay-ES-net to avoid having JME dependencies in the core (core classes don’t need to import Serializable and therefore are 100% JME-independent):

https://code.google.com/p/jmonkeyplatform-contributions/source/browse/trunk/zay-es/extensions/Zay-ES-Net/src/com/simsilica/es/net/EntitySerializers.java?spec=svn1198&r=1198

Edit: note that this may or may not fix the issue since I guess isAnnotationPresent() will still be in the code.

Avian or jME-Avian? I simply don’t know, sorry. Maybe it’s just a missing flag.

Thanks Paul. Registering the messages manually with specific (known) serialisers solved the problem.

Thanks again.

Glad it worked. It’s a good thing to know how to do but it’s unfortunate that you had to.

9 times out of 10, registering a message with FieldSerializer is what you want.
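For context, the difference between the annotation-driven path and explicit registration can be sketched with a self-contained Java example. Note that `Registry`, `PlainMessage` and the local `@Serializable` here are made-up names for illustration only; jME's real API lives in com.jme3.network.serializing (see the Zay-ES-Net link above for actual usage):

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch only - not jME's actual Serializer API.
public class Registry {
    @Retention(RetentionPolicy.RUNTIME)
    @interface Serializable {}

    static final Map<Class<?>, String> registered = new LinkedHashMap<>();

    // Annotation-driven path: depends on isAnnotationPresent(), which can
    // break on a stripped-down runtime such as Avian + ProGuard.
    static void autoRegister(Class<?> c) {
        if (c.isAnnotationPresent(Serializable.class)) {
            registered.put(c, "auto");
        }
    }

    // Explicit path: no annotation reflection involved at all.
    static void forceRegister(Class<?> c) {
        registered.put(c, "forced");
    }

    static class PlainMessage {}  // deliberately has no @Serializable annotation

    public static void main(String[] args) {
        autoRegister(PlainMessage.class);   // no-op: annotation is absent
        forceRegister(PlainMessage.class);  // registers it anyway
        System.out.println(registered.get(PlainMessage.class));  // forced
    }
}
```

The explicit path is what "force registering" buys you: no reflection on annotations at runtime, so nothing for the VM or ProGuard to break.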

1 Like

Were my traces helpful in any way or is there something else I still can do to narrow down the problem?

I’ve managed to solve this issue on mine with your help.

I don't know if this problem is specific to my iPad (an iPad 3), as I don't have access to any other physical iOS device to test on, and I find the simulators are not true to life.

My iPad 3 has a resolution of 2048x1536. When the screen is initially rendered, it seems to render the scene at the full resolution, but for some reason the camera is only rendered at 1024x768.

The effect of this is that you can only see the bottom-left quarter of the scene; the spatials are only partially visible. As you said, you can cause a reshape by physically changing the orientation of the device.

To force this same behaviour, I copied the reshape code in the harness from the didRotate: handler to applicationDidBecomeActive:.

This works but I had another related problem regarding the input handler.

What appears to be happening is that it renders the scene at 1024x768 and then stretches it to 2048x1536 using a scale factor of 2.

This is the code in the harness which shows this; the 'scale' variable is set to 2 in my case, stretching it to 2048x1536.
This explains why the initial screen renders in the bottom-left corner: until the harness applies this scale factor, it is only rendering at 1024x768 and doesn't display properly.

    - (void)didRotate:(NSNotification *)notification
    {
        CGRect originalFrame = [[UIScreen mainScreen] bounds];
        CGRect frame = [self.glview convertRect:originalFrame fromView:nil];

        JNIEnv* e = getEnv(self.vm);
        if (e) {
            float scale = _glview.contentScaleFactor;

            (*e)->CallVoidMethod(e, self.harness, self.reshapeMethod, (int)(frame.size.width * scale), (int)(frame.size.height * scale));
            if ((*e)->ExceptionCheck(e)) {
                NSLog(@"Could not invoke iOS Harness reshape");
                (*e)->ExceptionDescribe(e);
                (*e)->ExceptionClear(e);
            }
        }
    }

Therefore I just copied this same code into the applicationDidBecomeActive method as well. Obviously, if you do it this way, you have to re-apply the change whenever the iOS plugin is updated.

However, I then found another problem related to the input handler.

It seems this scale factor is not applied correctly (or at all) to the co-ordinates returned when the screen is touched. This causes some strange results when ray casting in 3D space.

The mouse co-ordinates are fine on OS X and Windows, and Android handles it correctly, so this points to Avian.

The way I got around it was to manually calculate the scale factor and apply it to the X and Y co-ordinates returned from the input handler before ray casting, and it works fine.

I can give code examples if anyone has this problem and can't get it to work.
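A minimal sketch of that correction in Java (the method and parameter names are mine, for illustration): raw touch co-ordinates arrive in view points, so multiplying by framebufferSize / viewSize puts them in the pixel space the camera uses for ray casting.

```java
// Hypothetical helper mirroring the workaround described above: scale touch
// co-ordinates from view points up to framebuffer pixels before building a
// picking ray. All names here are illustrative, not jME API.
public class TouchScale {
    public static float[] scaleTouch(float x, float y,
                                     float viewWidth, float viewHeight,
                                     float fbWidth, float fbHeight) {
        // e.g. on an iPad 3: framebuffer 2048x1536, view 1024x768 -> scale 2
        float sx = fbWidth / viewWidth;
        float sy = fbHeight / viewHeight;
        return new float[] { x * sx, y * sy };
    }

    public static void main(String[] args) {
        // A touch at (512, 384) in a 1024x768 view maps to (1024, 768)
        // in a 2048x1536 framebuffer.
        float[] p = scaleTouch(512f, 384f, 1024f, 768f, 2048f, 1536f);
        System.out.println(p[0] + "," + p[1]); // 1024.0,768.0
    }
}
```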

"Scale factor of 2" sounds familiar :wink: I used it to correct the Nifty coordinates. As I did not need "mouse picking", I wasn't sure whether this workaround was required here too.

For horizontal orientation the situation is even worse. The coordinate system is shifted somewhat.

This needs to be fixed in the iOS renderer. Sorry, @normen, it’s your turn :grin:

Good news: jME runs on iOS 64 bit!

Avian fixed ARM64 exactly one month ago!

I think this is the relevant fix: Fixes for arm64, new clang, new ios SDK by joshuawarner32 · Pull Request #430 · ReadyTalk/avian · GitHub
It explains the problems in my logs above.

NOTE: The bug fix is not contained in Avian 1.2.0, you need to get main.

2 Likes

The SDK iOS plugin has already supported that for a while now.

Sorry, but the Avian bug fix mentioned above is essential. My apps don’t run without it. See my logs above.

Updated workaround for encoding ISO_8859_1 on my list. (Item 4.)
Example:

Class.forName("sun.nio.cs.ISO_8859_1"); // required on iOS
Charset.forName("ISO-8859-1");         // will crash without previous line on iOS

UPDATE: I found out that it works even better if I exclude all charset classes from ProGuard. Well, that was sooooo obvious… :stuck_out_tongue_winking_eye:
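For anyone hitting the same crash, the exclusion would look something like this in the ProGuard configuration (the wildcard is my guess at "all charset classes"; adjust the package if Avian's class library places them elsewhere):

```
-keep class sun.nio.cs.** { *; }
```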

1 Like

To support annotations just add these ProGuard exclusions:

-keep public class java.lang.reflect.Proxy { *; }
-keep public class java.lang.reflect.InvocationHandler { *; }

Can’t believe that it was that easy… :joy:
List updated.

1 Like

Hi Guys,

I finally got around to doing more testing of 3.1 on iOS.

I have some more observations.

  1. Sound is definitely not working in any format (OGG, WAV, etc.).
     During the init stage it says "Failed to load audio library".

I have tried it in its most basic form and all the variations in the jME test examples, but none produce sound.

See log below.


Registered loader: BitmapFontLoader for extensions [fnt]
DesktopAssetManager created.
IGLESContext getRenderer
Camera created (W: 640, H: 480)
Camera created (W: 640, H: 480)
Failed to load audio library
Loaded material definition: Unshaded
Loaded Common/MatDefs/Misc/Unshaded.j3md with J3MLoader
Loaded Interface/Fonts/Default.png (Flipped) with IosImageLoader
Loaded Interface/Fonts/Default.fnt with BitmapFontLoader

  2. As you have only included the 64-bit simulator in the newly compiled files, you have to change the Other Linker Flags section of the iOS project to reference the 64-bit simulator files.
     I.e. change
     …/…/build/ios-i386/libs.list
     to
     …/…/build/ios-arm64/libs.list

This will allow it to compile and run on 64-bit simulators. If you change it in the project section / build settings, it will be inherited by the targets.

  3. Change Build Active Architecture Only to Yes for Debug.

  4. In the touch start/move/end events in JMEAppDelegate, cast touch.timestamp to (jlong), or it corrupts the x and y co-ordinates before they reach the iOS harness on 64-bit devices.
     Casting it to a jlong reduces the timestamp's accuracy, but that way it works on all devices and the touch co-ordinates are fine.
     See below.

    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
        NSLog(@"touchesBegan");
        JNIEnv* e = getEnv(self.vm);
        if (e) {
            UITouch *touch = [touches anyObject];
            CGPoint position = [touch locationInView: nil];
            (*e)->CallVoidMethod(e, self.harness, self.injectTouchBegin, 0, (jlong)touch.timestamp, position.x, position.y);
            if ((*e)->ExceptionCheck(e)) {
                NSLog(@"Could not invoke iOS Harness injectTouchBegin");
                (*e)->ExceptionDescribe(e);
                (*e)->ExceptionClear(e);
            }
        }
    }

    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
        NSLog(@"touchesMoved");
        JNIEnv* e = getEnv(self.vm);
        if (e) {
            UITouch *touch = [touches anyObject];
            CGPoint position = [touch locationInView: nil];
            (*e)->CallVoidMethod(e, self.harness, self.injectTouchMove, 0, (jlong)touch.timestamp, position.x, position.y);
            if ((*e)->ExceptionCheck(e)) {
                NSLog(@"Could not invoke iOS Harness injectTouchMove");
                (*e)->ExceptionDescribe(e);
                (*e)->ExceptionClear(e);
            }
        }
    }

    - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
        NSLog(@"touchesEnded");
        JNIEnv* e = getEnv(self.vm);
        if (e) {
            UITouch *touch = [touches anyObject];
            CGPoint position = [touch locationInView: nil];
            (*e)->CallVoidMethod(e, self.harness, self.injectTouchEnd, 0, (jlong)touch.timestamp, position.x, position.y);
            if ((*e)->ExceptionCheck(e)) {
                NSLog(@"Could not invoke iOS Harness injectTouchEnd");
                (*e)->ExceptionDescribe(e);
                (*e)->ExceptionClear(e);
            }
        }
    }
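In Java terms, the trade-off of that (jlong) cast is simply integer truncation (the value below is illustrative): the fractional seconds of the timestamp are dropped, which is harmless for ordering touch events but loses sub-second accuracy.

```java
public class TimestampCast {
    public static void main(String[] args) {
        double timestamp = 12345.678;      // UITouch.timestamp is a double (seconds)
        long truncated = (long) timestamp; // what the (jlong) cast does
        System.out.println(truncated);     // 12345 - fractional part lost
    }
}
```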

I also had to add some extra code to pass over the touch scaling factor. If you are having any problems getting the ray casting to work, let me know and I'll post my code.

I suppose I could create a pull request for all this, but I don't know how to and can't be bothered reading the instructions.

If this is not the correct place to post this, tell me where I should.

Thanks a lot. And of course you should post your scaling-factor code here. Unfortunately I cannot verify it at the moment, but you should not keep it to yourself.