
Libgdx how to calculate elapsed time

I was trying to calculate elapsed time in libGDX by adding, in the render method, the delta value to a float field that measures the time since the play state started. The bug, or problem, whatever you want to call it, is that the time displayed this way is twice the real time. When I divide by 2 I get pretty close to the real time, which means that adding delta on every render call does not give me the real time in seconds. Why is this happening?

private float time=0;
public void render () {
    time += Gdx.graphics.getDeltaTime();
}

The code above doesn't track the real time, and my question is why? EDIT: I am using a Screen, but it doesn't matter; I tried both Gdx.graphics.getDeltaTime() and the delta argument of the render method.
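
For reference, a minimal sketch of the setup described above, assuming the timer lives in a Screen subclass (the class name PlayState and the drawing comment are just placeholders):

import com.badlogic.gdx.ScreenAdapter;

// Illustrative only: a screen that accumulates the frame delta into a timer.
public class PlayState extends ScreenAdapter {
    private float time = 0;

    @Override
    public void render (float delta) {
        time += delta; // same value as Gdx.graphics.getDeltaTime()
        // ... draw the scene and display `time` somewhere ...
    }
}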

You can use the wrapper class TimeUtils.

Get current time:

long startTime = TimeUtils.millis();

Get time elapsed since startTime:

long elapsedTime = TimeUtils.timeSinceMillis(startTime);
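
Putting those two calls together, a minimal sketch (the class and field names are just illustrative):

import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.utils.TimeUtils;

// Illustrative sketch: stamp the start once, then ask TimeUtils how much has passed.
public class ElapsedTimeDemo extends ApplicationAdapter {
    private long startTime;

    @Override
    public void create () {
        startTime = TimeUtils.millis(); // stamp once when the application starts
    }

    @Override
    public void render () {
        long elapsedMillis = TimeUtils.timeSinceMillis(startTime);
        float elapsedSeconds = elapsedMillis / 1000f; // convert to seconds for display
    }
}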

LibGDX is still Java at heart, so just use the standard methods for getting the time. At the start of the application use

startTime = System.currentTimeMillis();

then in the render you can do

System.out.println("Time elapsed in seconds = " + ((System.currentTimeMillis() - startTime) / 1000));

Make sure startTime is declared as a long.
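
Put together, a minimal sketch of that approach (the class name is illustrative):

import com.badlogic.gdx.ApplicationAdapter;

// Illustrative sketch of the plain-Java timestamp approach described above.
public class TimerExample extends ApplicationAdapter {
    private long startTime; // a long, as noted above

    @Override
    public void create () {
        startTime = System.currentTimeMillis();
    }

    @Override
    public void render () {
        System.out.println("Time elapsed in seconds = "
                + ((System.currentTimeMillis() - startTime) / 1000));
    }
}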

As to why it's not working with the method you posted, I'm not sure.

EDIT: If your LibGDX project isn't set to run at a fixed rate, then changes in the frame rate will change the value of Gdx.graphics.getDeltaTime(), because the delta time returned is the duration of the last frame.

Why not just take a system time stamp from the standard Java libraries? Take one when you start the application, and when you want the elapsed time, subtract that initial value from the current time.

long startTime = System.currentTimeMillis();

and then whenever you want to get the elapsed time do:

long elapsedTime = System.currentTimeMillis() - startTime;

I don't think adding the delta is very accurate, since some time is probably lost between when libGDX stops measuring one delta and starts measuring the next, in which case the total will drift over time.

I'm not sure why you get about twice the real time, though... I know the render call happens 60 times a second; maybe the delta is somehow assuming 30 frames per second...

BTW, there's also getRawDeltaTime(), which should give you something more accurate.

Update: As Christian commented, you need to divide by 1000 to get seconds rather than milliseconds.
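
For completeness, a minimal sketch of accumulating the raw delta mentioned above (the class name is illustrative):

import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;

// Illustrative sketch: accumulate the unsmoothed frame time instead of the averaged delta.
public class RawDeltaTimer extends ApplicationAdapter {
    private float time = 0;

    @Override
    public void render () {
        time += Gdx.graphics.getRawDeltaTime(); // unsmoothed time since the last frame, in seconds
    }
}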

TimeUtils is required for portability. Avoid using system classes if you are planning to deploy on multiple platforms. LibGDX ships its own array class for the same reason.

Example:

public long time;
time = TimeUtils.nanoTime();

From the wiki:

Class TimeUtils "Wrapper around System.nanoTime() and System.currentTimeMillis(). Use this if you want to be compatible across all platforms!"
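
A minimal sketch using the nanosecond variant, assuming you also want TimeUtils.timeSinceNanos() to read the elapsed value (the class and method names are illustrative):

import com.badlogic.gdx.utils.TimeUtils;

// Illustrative sketch: nanosecond stamps through the portable wrapper.
public class NanoTimer {
    private long start;

    public void begin () {
        start = TimeUtils.nanoTime(); // portable wrapper around System.nanoTime()
    }

    public float elapsedSeconds () {
        return TimeUtils.timeSinceNanos(start) / 1_000_000_000f; // nanoseconds to seconds
    }
}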
