Unexpected (asynchronous?) game initialisation

I’m getting an odd problem with JME CVS (and 0.10) that I’m sure I didn’t get before.



I’m setting up a game using code very similar to (and based on) SimpleGame, but with some extra nodes, lights etc.

I then extend this to add some terrain, and a plane that flies along in its update(time) method.



In older versions, when I started the game it would set up the level, there would be a short delay, and then I would see the plane start where I expected and fly along, with everything (terrain, skybox, plane) visible from the first frame.



The problem I have now is that when the game starts, I first see plain blue (the background colour), then my crosshair texture appears in the centre of the screen, then the skybox fills itself in, then the plane appears, then the terrain. By the time all this happens (it takes a few seconds) my plane has normally flown a good way, into a mountain. It’s only at this point that the update handler on my camera node kicks in, so I see the camera view fly across the level to where my plane is sitting in a mountain, and then the collision detection lifts the plane up out of it.



The order might be slightly different from how I’ve described it (it's hard to see the exact order in which the terrain and plane appear), but the basic point is that the init code seems to be executing asynchronously with the update, or possibly the meshes are just being loaded as the game runs. Either way it’s really confusing me, since I had thought everything was single threaded.



After the initial weirdness, the game runs as expected.



If anyone wants to give it a go, the code is at aircarrier.dev.java.net and will run (oddly) with JME CVS or 0.10; with minor alterations it should run with earlier versions of JME for comparison. It needs some textures from jmetest as well. The aircarrier CVS is set up as an eclipse project, you just need to run CarrierSimpleLevel with the right libs etc. Just let me know if you have problems  :slight_smile:

That is very odd indeed. I can't really think of any real reason for behavior like you mention… Can you give us some stats on your computer and card? Is it a dual processor or HT CPU? You are not seeing this with the jmetests?

My PC is pretty standard - AMD64 3000 (just the one), GF 6800GS, 1GB, running linux (ubuntu hoary).



I think I've actually tracked down the cause of the problem, and I was jumping the gun when I said it was a loading problem, doh :slight_smile: However, it is something that might be worked around in the engine.



I put in some more debugging to monitor the startup, and it seems that what is breaking my code is that when the game starts there is an initial frame with a small elapsed time on the timer, and then no more frames for about 2 seconds. When I was testing before, I missed this first frame before the 2-second gap, so I didn't realise that initialisation was complete before the gap.



When the update is done with a time of 2 seconds, the plane moves a considerable distance. This in turn causes the spring tracking of the camera to fling the camera off into the far distance (as it simulates two seconds of acceleration, with the spring far more stretched than it ever should be). Somehow this stops everything displaying, even the cursor on the HUD :slight_smile: Then, as the camera flings itself back in from the void, the cursor reappears, then the skybox, then finally the terrain. In fact, the first 2-second frame doesn't cause the problem all on its own - subsequent frames are smoothed down from those 2 seconds to the more realistic 1/200th of a second I get while running, over the course of the 32 frames averaged by the timer (which gives an exponential drop-off, I think):



1.72, 0.87, 0.6, 0.45, 0.36, 0.31, 0.26



Those are the frame times seen by update - each frame actually takes about 1/200th of a second, but those are the times reported by the timer.
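
To illustrate, here is a toy simulation of how a windowed average smears a single long startup frame across the readings that follow. This is not jME's actual Timer code (which has its own averaging scheme), just the general shape of the problem; the window size and frame times are taken from the description above:

```java
// Toy demo: one 2-second frame followed by steady ~5 ms frames,
// reported through a 32-sample windowed average.
public class SmoothingDemo {
    public static void main(String[] args) {
        final int WINDOW = 32;            // number of samples averaged
        double[] samples = new double[WINDOW];
        int count = 0;
        samples[count++] = 2.0;           // the single genuinely long frame
        for (int frame = 0; frame < 8; frame++) {
            samples[count % WINDOW] = 1.0 / 200.0;  // real frames: ~5 ms each
            count++;
            int n = Math.min(count, WINDOW);
            double sum = 0;
            for (int i = 0; i < n; i++) {
                sum += samples[i];
            }
            // prints a drop-off similar in shape to the list above
            System.out.printf("reported tpf: %.3f s%n", sum / n);
        }
    }
}
```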



To test this, I kept the plane still for 100 frames (past the moving average count for Timer) and it worked fine.



This does lead to an odd problem though - it seems like the first very long frame should be excluded from the averaging in Timer (I'm not that keen on the averaging in general, actually), since otherwise I get about 6 "extra" seconds of time at startup that don't really happen in the game but are reported to update, causing all the weird stuff I'm seeing. The plane travels quite a long way very quickly (a few seconds of game logic in a few hundredths of a second of real time, on each of the first few frames). This makes the camera spring go nuts, because it sees those large time updates itself, as well as large plane movements, and ends up oscillating wildly.
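
For the spring itself, one standard fix would be to integrate in fixed substeps, so a single huge tpf can't over-stretch it in one go. This is only a sketch, not the actual aircarrier code; the class, field names and tuning values are all invented:

```java
import com.jme.math.Vector3f;

/** Sketch: a damped spring tracker that integrates in fixed substeps
 *  so one enormous frame time can't fling the camera into the void. */
class SpringTracker {
    final Vector3f position = new Vector3f();
    final Vector3f velocity = new Vector3f();
    float stiffness = 40f, damping = 10f;   // invented tuning values

    void update(Vector3f target, float dt) {
        final float MAX_STEP = 1f / 60f;    // never integrate more than this at once
        while (dt > 0f) {
            float step = Math.min(dt, MAX_STEP);
            // F = k * (target - pos) - c * vel, explicit Euler per substep
            Vector3f force = target.subtract(position).multLocal(stiffness)
                                   .subtractLocal(velocity.mult(damping));
            velocity.addLocal(force.multLocal(step));
            position.addLocal(velocity.mult(step));
            dt -= step;
        }
    }
}
```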



I think the issue might also occur with some of the jmetests that take a little longer to start, but I am certain it is happening with flagrush lesson 9. I can drive the bike most of the way to the barrier before I can see it, during the first few very long frames (one of which is a genuine long frame during startup, the rest due to Timer smoothing), and it does some odd stuff when it appears - the display flickers about etc. Very similar to my app, except that I think the amount of setup is smaller, so the delay and weirdness are less. My app loads a relatively large multi-part model and a large paged terrain with a hi-res texture - nothing that causes any problems while the game is running, just at startup.



I'm going to make my code a little more robust against long frames, look at more spring damping etc., but the fundamental problem is that for the first 32-ish frames the frame times are complete nonsense :slight_smile: It seems like the best thing would be to get rid of the 2-second "setup" frame, but I imagine this might be due to some unavoidable transfer to the card - I'm not that good on technical details :wink: If that can't be done, it would be great to be able to disable or at least tweak the averaging behaviour of Timer, for example by performing no averaging until a certain number of frames have elapsed.
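
That last suggestion might look something like the sketch below: a timer that reports raw frame times until a fixed number of frames have elapsed, and only then starts smoothing. All names and thresholds are invented, and jME's real Timer works differently:

```java
// Sketch: delay smoothing until the startup spikes are likely past,
// so they never contaminate the running average.
public class WarmupSmoothingTimer {
    private static final int WARMUP_FRAMES = 60;   // invented threshold
    private static final float ALPHA = 1f / 32f;   // smoothing weight after warmup

    private long last = System.nanoTime();
    private long frames;
    private float smoothed;

    public float getTimePerFrame() {
        long now = System.nanoTime();
        float raw = (now - last) / 1e9f;
        last = now;
        frames++;
        if (frames <= WARMUP_FRAMES) {
            smoothed = raw;                        // no averaging yet: report raw time
        } else {
            smoothed += ALPHA * (raw - smoothed);  // exponential moving average
        }
        return smoothed;
    }
}
```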



Sorry for the very long reply - I wanted to make sure the explanation made some kind of sense, and it ended up a little rambling. Thanks for the help :slight_smile:

Hmm, thanks for the information… I've got some ideas on how we might help alleviate this that I'll try this week.

dunno if tpf should be smoothed at all… at least I would like a getTrueTimePerFrame() that isn't smoothed…

Not smoothing the timer causes any system stutter to show through terribly.  Also, looking at the code it already throws out the first frame's time…  Hmm.

Implementing a non-smoothed timer is very easy; you don't even need the Timer class for that. The Timer class was useful for getting an accurate timer on all platforms… now that jME is on Java 1.5 you can just use the nanotimer. Not that I'd recommend a timer without any smoothing, but still…



As for your problem, shingoki: you can always wait a few frames before you start invoking updateGeometricState(). I think irrisor also implemented a reset method for the timer recently.
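
A sketch of that suggestion, assuming a SimpleGame-style subclass with the usual protected timer field (the warmup counter is invented for this example, and timer.reset() is the reset method just mentioned):

```java
// Skip game logic for the first few frames so the long startup frame
// never reaches the simulation, then reset the timer before starting.
private int warmup = 5;   // invented value: frames to wait before starting

protected void simpleUpdate() {
    if (warmup > 0) {
        if (--warmup == 0) {
            timer.reset();   // begin timing cleanly once the long frames are past
        }
        return;              // plane, camera etc. stay put during warmup
    }
    // ... normal per-frame logic using timer.getTimePerFrame() ...
}
```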

cough

http://www.jmonkeyengine.com/jmeforum/index.php?topic=2900.msg22128#msg22128

cough



Now, what I just hacked up was the URL to my ridiculously small non-smoothed timer.  I use it as a Timer on the server as it doesn’t really matter how “smooth” the timer is.
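
The general idea is small enough to sketch here; this is not darkfrog's actual class from the link above, just the obvious unsmoothed version built on Java 1.5's System.nanoTime():

```java
// A minimal non-smoothed timer: reports the raw elapsed time per frame.
public class RawTimer {
    private long last = System.nanoTime();
    private float tpf;

    /** Call once per frame. */
    public void update() {
        long now = System.nanoTime();
        tpf = (now - last) / 1e9f;   // raw elapsed seconds, no smoothing at all
        last = now;
    }

    public float getTimePerFrame() {
        return tpf;
    }
}
```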



darkfrog

Thanks for the suggestions :slight_smile: I'll have a look at the alternate Timer, thanks DarkFrog :slight_smile: I had some kind of feeling there was something magical about the built-in timer, but if I'm free to ignore it, there's no problem.



Like Renanse says, the Timer does give 1/60th of a second on the first call; the problem is the first "real" frame interval, which you see on the second frame. This is the ~2-second interval, and it's actually the second value you get out of Timer. I didn't make that very clear before, and I'm not sure I have even now :slight_smile:



I can see the argument for smoothing to get rid of stutter, but I don't think a simple moving average is the best way to filter it. A better approach might be to monitor the average frame time and its standard deviation/range, and when an unusually long frame interval is seen, just return the average; the long interval would never be added to the timer's average or SD calculations. This would stop the game progressing too far without user response during a stutter, but wouldn't smear that stutter out over the following frames, which actually makes the problem worse. I'm going to have a look at implementing something like this in a replacement Timer. In addition, the first few frame intervals could be made very short (less than 1/60th, more like 1/500th) so that very little happens in the game logic until we're sure any initial long frames are past. At the moment I'm just manually capping frame lengths at 0.1 seconds, and it works OK, but an approach with an average and a range/SD would adapt better to different (really slow) hardware.
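
A sketch of that average-plus-deviation idea, using Welford's running mean/variance; the class name and thresholds are invented, and this is not a drop-in jME Timer:

```java
// Sketch: keep running statistics of frame times, and when a frame is far
// outside the usual range, report the mean instead and leave the stats alone.
public class FilteredTimer {
    private long last = System.nanoTime();
    private long count;
    private double mean, m2;   // running mean and sum of squared deviations

    public float getTimePerFrame() {
        long now = System.nanoTime();
        double dt = (now - last) / 1e9;
        last = now;
        double sd = count > 1 ? Math.sqrt(m2 / (count - 1)) : 0;
        // only trust the statistics once we have a reasonable sample
        boolean outlier = count > 30 && dt > mean + 3 * sd;
        if (!outlier) {            // normal frames update the running stats
            count++;
            double delta = dt - mean;
            mean += delta / count;
            m2 += delta * (dt - mean);
        }
        // (the very first frames still get in, so this would need combining
        // with a warmup or timer reset to handle the startup spike itself)
        return (float) (outlier ? mean : dt);
    }
}
```

The cap I'm using at the moment is of course just Math.min(dt, 0.1) in place of all the statistics.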



Does anyone know what causes the second frame to be so long after the first one?

Sounds like you're doing something after render? Or setting something up on the first update and then doing the work on the second? I implemented something similar to this when I was writing Roll-A-Rama, in order to get something up on the screen and then lock the update with the loading process. Sort of a hack, but it worked. ;) Is it possible you are accidentally doing something similar?
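
Something like this, for reference (a sketch of the hack, not the Roll-A-Rama code; the frame counting and the loading call are invented):

```java
// Frame 1: draw only the loading HUD. Frame 2: block on the heavy loading,
// then reset the timer so the load time never shows up as one giant tpf.
private int frame = 0;

protected void simpleUpdate() {
    frame++;
    if (frame == 1) {
        return;                 // let the first render put "Loading..." on screen
    }
    if (frame == 2) {
        loadTerrainAndModels(); // hypothetical heavy loading happens here
        timer.reset();
        return;
    }
    // ... normal game logic from frame 3 onwards ...
}
```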



darkfrog