Time per frame question on AbstractAppState

I ported some code to jMonkey and noticed things seemed to be moving a little slower than before, and I eventually figured out the source of the problem. In my old project's update code I was using System.currentTimeMillis() to get the elapsed time since the last frame, and now I'm using the 'tpf' passed into AbstractAppState's update(…). I thought these two methods of calculating a time delta would yield the same results, but they don't.



Here's the code I used to test this. updateTimeElapsed grows at a slower rate than systemTimeElapsed. Why?



[java]int updateTimeElapsed = 0;
int systemTimeElapsed = 0;
long prevUpdate = -1;

@Override
public void update(float tpf) {
    long currTime = System.currentTimeMillis();
    if(prevUpdate != -1) {
        updateTimeElapsed += (int)(tpf * 1000f);
        systemTimeElapsed += (currTime - prevUpdate);
    }
    prevUpdate = currTime;

    if(systemTimeElapsed % 200 == 0) { // Print when the elapsed system time lands on a multiple of 200 ms
        System.out.println("sys=" + systemTimeElapsed + ", update=" + updateTimeElapsed);
    }
}[/java]


Maybe because you are removing a bunch of precision in your test?

updateTimeElapsed += (int)(tpf * 1000f);



You're letting the nanosecond timer accumulate at its full precision, but you are tracking tpf at only millisecond accuracy.
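
For example, at a steady 60 FPS (illustrative numbers, not from the original test):

[java]float tpf = 1f / 60f;               // ~0.0166667 seconds per frame
int truncated = (int)(tpf * 1000f);  // == 16; the ~0.6667 ms remainder is thrown away every frame[/java]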



Also, what version of JME are you running?

@pspeed said:
Maybe because you are removing a bunch of precision in your test?
updateTimeElapsed += (int)(tpf * 1000f);

You're letting the nanosecond timer accumulate at its full precision but you are tracking tpf at only millisecond accuracy.

Also, what version of JME are you running?


D'oh... never mind.

I didn't notice you were using currentTimeMillis. Still, though... the millisecond timer updates sporadically. Compare against nanoTime instead.
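
A minimal sketch of what that comparison could look like, accumulating nanoseconds instead (variable names are mine, not from the original test):

[java]long prevNanos = -1;
long nanoTimeElapsed = 0; // accumulated nanoseconds

@Override
public void update(float tpf) {
    long now = System.nanoTime();
    if(prevNanos != -1) {
        nanoTimeElapsed += (now - prevNanos);
    }
    prevNanos = now;
    // Convert to milliseconds only for display
    System.out.println("nano=" + (nanoTimeElapsed / 1000000L) + " ms");
}[/java]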

The fact that the millisecond timer updates sporadically doesn't actually matter for what I'm testing, because it's the growth over a longer period that I'm worried about. I ended up with numbers around "sys=85000, update=70000", a difference of 15 seconds.



I'm using the jMonkeyEngine 3 SDK Beta, fully updated, but I will try updating to RC2.

Except System.currentTimeMillis() will occasionally return the same value even though the clock has actually advanced internally. In those same spots your updateTimeElapsed += (int)(tpf * 1000f); may never advance at all, because it has rounded off all of the sub-millisecond precision. I'd expect the two counters to diverge pretty fast, actually, even at 60 FPS; at higher FPS I'd expect them to go out of whack even faster.



The test as presented is flawed: it will definitely show the divergence you describe simply because of those flaws, so it's hard to say whether there is a real problem underneath.


Note: the quickest way to fix your test would be to change updateTimeElapsed to a double and just += tpf each frame. Then multiply by 1000 and cast to int only when you want to display it.
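
A sketch of that suggestion, assuming the same update() shape as the original test:

[java]double updateTimeElapsed = 0; // accumulate in seconds at full precision

@Override
public void update(float tpf) {
    updateTimeElapsed += tpf;
    // Convert to whole milliseconds only when displaying
    System.out.println("update=" + (int)(updateTimeElapsed * 1000.0));
}[/java]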

pspeed said:
Except System.currentTimeMillis() will occasionally return the same value but internally will always be advanced.


I can testify to that. Just yesterday I wrote a simple timing test to find out which database query was too slow.

[java]long time = System.currentTimeMillis();
// call the database
long timeSpent = System.currentTimeMillis() - time;
// write out the time spent[/java]

Many of these wrote out 0. Even though the queries were quick, I know 0 just isn't true, so accumulating a lot of 0s won't be accurate :)
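
For what it's worth, a sketch of the same measurement with System.nanoTime(), which doesn't suffer from those zero readings (the database call is a placeholder):

[java]long start = System.nanoTime();
// call the database (placeholder)
long spentNanos = System.nanoTime() - start;
System.out.println("query took " + (spentNanos / 1000000L) + " ms (" + spentNanos + " ns)");[/java]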

Are you using XP? Is HPET deactivated in the BIOS? Both can lead to precision as bad as 20 ms, which is a lot for a game.
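
If anyone wants to check what granularity System.currentTimeMillis() actually has on their machine, a quick throwaway loop like this will show it (my own sketch, not jME code):

[java]// Spin until the millisecond clock ticks and record the smallest step observed
long last = System.currentTimeMillis();
long smallestStep = Long.MAX_VALUE;
for (int ticks = 0; ticks < 50; ) {
    long now = System.currentTimeMillis();
    if (now != last) {
        smallestStep = Math.min(smallestStep, now - last);
        last = now;
        ticks++;
    }
}
System.out.println("observed granularity: " + smallestStep + " ms");[/java]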

While you guys are right, my fear is that you'll cause the OP to ignore this thread entirely because those concerns don't directly address his issue. He is right that System.currentTimeMillis() keeps marching on, so as a rough counter for testing his issue it is fine. You can expect some error, but as long as it stays within +/- some reasonably small amount, it proves that nothing is really wrong.



The issue is that he's quantized all of the sub-millisecond precision out of the tpf accumulator.



Let's take it to the extreme and say he quantized to 1 second: time would never advance in the tpf accumulator at all. As it is, at 60 FPS he removes all sub-millisecond accuracy, so the tpf accumulator advances by 16 ms per frame instead of 16.6666 ms per frame. In a single frame he has already lost two thirds of a millisecond, and it gets progressively worse the longer it runs. After 1 second, his tpf accumulator will have lost 40 milliseconds. After 10 seconds, 0.4 seconds has been lost.



At higher or more erratic frame rates this gets even worse. Imagine an FPS over 1000: tpf * 1000 is then less than 1, the cast truncates it to 0, and the accumulator may not advance at all.
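
To make the numbers concrete, here is a tiny standalone simulation of that drift at a perfectly steady 60 FPS (my own sketch, not part of the original test):

[java]public class TpfTruncationDemo {
    public static void main(String[] args) {
        float tpf = 1f / 60f;      // ~16.6667 ms per frame
        int truncatedMs = 0;       // the (int)(tpf * 1000f) accumulator
        double exactSeconds = 0;   // full-precision accumulator

        for (int frame = 0; frame < 600; frame++) { // simulate 10 seconds
            truncatedMs += (int)(tpf * 1000f);      // adds 16 every frame
            exactSeconds += tpf;
        }
        System.out.println("truncated: " + truncatedMs + " ms");            // 9600 ms
        System.out.println("exact:     " + (exactSeconds * 1000) + " ms");  // ~10000 ms
    }
}[/java]

The truncated accumulator ends up roughly 400 ms short after 10 simulated seconds, which matches the loss described above.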


The OP's test is a pretty good idea for a unit test, i.e. to check that tpf works as expected. So cjuillard, please ignore our off-topic rants as long as you listen to pspeed :)


Thanks for the replies!



pspeed summed it up pretty well in his last post. My "updateTimeElapsed += (int)(tpf * 1000f);" loses a tiny bit of precision every frame, which can add up to a large fraction of the elapsed time when 'tpf' is already very small. I was running without vsync at around 300 FPS, so this really added up. Here's the fixed version if anyone is curious.



[java]float updateTimeElapsed = 0;
long systemTimeElapsed = 0;
long prevUpdate = -1;
int frameCount = 0;

@Override
public void update(float delta) {
    long currTime = System.currentTimeMillis();
    if(prevUpdate != -1) {
        updateTimeElapsed += delta;
        systemTimeElapsed += (currTime - prevUpdate);
    }
    prevUpdate = currTime;

    if((frameCount++) % 60 == 0) { // Print every 60th frame
        System.out.println("sys=" + systemTimeElapsed + ", update=" + (updateTimeElapsed*1000f));
    }
}[/java]



The starting elapsed values are a little different, but they grow at the same rate.
