[SOLVED] Engine having lagspikes on high-end gaming pc

Hello,

I have a project with other students and we’ve been working on something using the jMonkeyEngine.
The thing is, the engine runs fine on all of our laptops and PCs, even on my (not so very good) laptop.
But when I run the project on my gaming PC it lags sometimes: roughly once every 3 seconds I get some frame drops.
Specs:

  • GeForce GTX 970
  • Intel® Core™ i5-6600K CPU @ 3.50GHz
  • 15.96 GB RAM
  • Windows 10 64-bit
  • Java version 1.8.0_131

Use a profiler to check whether those spikes occur while the GC is doing its job.

I do not know how to profile my Java programs. How do I do this, and with which tools?

(I’m not sure, but I think this experience is from developing OpenGL apps on Android, not specifically JME.)
Are you using a fixed framerate? If not, a higher framerate may make the GC run more often resulting in the spikes you see.

1 Like

Hello, it’s for a school project (a drone simulation). And no, we do not use a fixed framerate.

Exactly what happened to me: I had a very high frame rate, but it somehow slowed down my whole game. Setting VSync or a fixed frame rate like 60 Hz fixed my issue.

1 Like

OK. Try locking it to 60 or 90 or whatever framerate you need and see if that helps.
Intuition (that may not be entirely correct down to the numbers): 600 fps produces 10× as much “garbage” as 60 fps. The GC gets more work to do in the same amount of time, resulting in larger spikes, more often.

2 Likes
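In jME3 the cap can be applied through AppSettings before the application starts. A minimal sketch (60 is just an example value, and `app` stands in for your SimpleApplication instance):

```java
import com.jme3.system.AppSettings;

// Cap the update/render loop and/or sync to the monitor refresh.
AppSettings settings = new AppSettings(true); // true = load default settings
settings.setFrameRate(60);   // limit the loop to 60 fps
settings.setVSync(true);     // alternatively, sync to the display refresh
app.setSettings(settings);   // must be called before app.start()
```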

That fixed my problem! Thanks.

Any motion should always be multiplied by the frame rendering time. This will get rid of these problems.

What do you mean, could you explain your answer?

He means that if you want to rotate, move, or animate anything (among other things), you must multiply it by tpf to avoid framerate-dependent speed in favor of time-based speed. Otherwise a 500 fps client would animate at hyperspeed compared to a client running at 60 or 70 fps. You use tpf (time per frame) to avoid this problem.

It doesn’t have much to do with your issue, tbh, and if you don’t understand now, you will when you see things moving faster or slower as your frame rate changes.
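A plain-Java sketch (not jME-specific) of why scaling by tpf matters: two simulated clients run at different frame rates for one second of game time, and multiplying each step by tpf makes both cover the same distance.

```java
public class TpfDemo {
    // Simulate one second of game time at the given frame rate.
    static float simulate(int fps) {
        float tpf = 1.0f / fps;  // time per frame, in seconds
        float speed = 10.0f;     // units per second
        float position = 0.0f;
        for (int frame = 0; frame < fps; frame++) {
            position += speed * tpf; // frame-rate-independent step
        }
        return position;
    }

    public static void main(String[] args) {
        // Both end up near 10.0 units, regardless of frame rate.
        System.out.println(simulate(60));
        System.out.println(simulate(500));
    }
}
```

Without the `* tpf`, the 500 fps client would move more than eight times as far in the same second.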

Not sure if it is the same issue, but I know Nvidia cards run an option called “threaded optimization” which can cause frame drops every few seconds. Basically the option offloads the render calls to a separate thread, and the frame drop is caused by the synchronization. There is a way to set up the meshes “correctly” for Nvidia cards, but I haven’t spent the time yet figuring out how to do it in the engine. Setting VSync helped the issue but didn’t completely solve it. I didn’t get a clean frame rate until I disabled the threaded optimization.

I was referring to tpf, because I did not understand what kind of slowdown the topic was about.

Slowing down the game (decreasing fps) does not solve the problem, it only delays it. If the spikes are created by garbage collection (and I’m pretty sure they are), you need to rework your code: don’t create temporary objects, reuse them instead, avoid using iterators…

Managing the memory properly is a common problem.

1 Like
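To illustrate the reuse idea, here is a sketch with a hypothetical minimal Vec3 class standing in for jME’s Vector3f (which offers the same set/addLocal in-place pattern): scratch objects are allocated once and rewritten every frame instead of feeding the GC with per-frame allocations.

```java
public class ReuseDemo {
    // Hypothetical stand-in for com.jme3.math.Vector3f.
    static final class Vec3 {
        float x, y, z;
        Vec3 set(float x, float y, float z) { this.x = x; this.y = y; this.z = z; return this; }
        Vec3 addLocal(Vec3 o) { x += o.x; y += o.y; z += o.z; return this; }
    }

    // Allocated once, reused every frame -> no per-frame garbage.
    final Vec3 tmp = new Vec3();
    final Vec3 position = new Vec3();

    void update(float tpf, Vec3 velocity) {
        // Bad:  new Vec3(velocity.x * tpf, ...) every frame feeds the GC.
        // Good: write into the preallocated scratch vector instead.
        tmp.set(velocity.x * tpf, velocity.y * tpf, velocity.z * tpf);
        position.addLocal(tmp);
    }
}
```

In jME code the same pattern applies with Vector3f’s `set(...)` and `addLocal(...)` methods.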

Nice hardware. Skullstone works well without lags on a 7-year-old Core 2 Duo :slight_smile:

So, it seems that you need to learn it. It is easy. Google “java profiler”.

1 Like
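For a quick start without a full profiler, the JDK itself can show when and how long the GC runs. A sketch for Java 8 (the jar name is a placeholder for your own build):

```shell
# Log each GC event with details and timestamps (Java 8 flags):
java -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -jar yourgame.jar

# Or attach VisualVM (bundled with Oracle JDK 8) to the running process
# to watch heap usage and GC activity live:
jvisualvm
```

If the log shows a collection right before each spike, GC pressure is confirmed.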

Yes, I know; my teammate spent the entire day yesterday fixing GC problems. We worked on the project for the entire academic year and tomorrow is our due date :sweat_smile:… We had to make a drone simulation with objects in the world, and our drone has to move to the objects automatically (with AI) using stereovision (so it calculates all the object positions on its own) in a different thread, recreating the objects in that thread on a second jMonkeyEngine instance.

Alright, I understand what you mean now with tpf. Thanks for clearing that up; it will help me in any future game/engine projects!

Which GC algorithm do you use? I can recommend -XX:+UseConcMarkSweepGC

My teammate says we’re using the default garbage collection algorithm.

Change it and see if it solves the lag-spike problem.
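A sketch of how the collector is switched on the command line for Java 8 (the jar name is a placeholder):

```shell
# Use the CMS collector instead of the default:
java -XX:+UseConcMarkSweepGC -jar yourgame.jar

# G1 is another low-pause option worth comparing:
java -XX:+UseG1GC -jar yourgame.jar
```

Both collectors trade some throughput for shorter pauses, which is usually the right trade for a game.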