
Calculating the frame-rate in a Unity scene

I'm making an augmented-reality project using Unity with the Vuforia extension. I'm new to C#; I was looking for a method similar to ARToolKit's getFrame(), but I'm really not finding anything.

My questions are:

  1. Is it possible to calculate the frame-rate that my scene is running at?
  2. Which scene object should I use to track the frame-rate?

That's as simple as:

public float avgFrameRate;

public void Update()
{
    avgFrameRate = Time.frameCount / Time.time;
}

Put this code in any MonoBehaviour and attach it to any GameObject in the scene hierarchy.
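For example, a minimal complete script built around that snippet could look like this (the class name is just an illustration):

using UnityEngine;

// Illustrative wrapper: attach this component to any GameObject in the scene.
public class AverageFrameRate : MonoBehaviour
{
    public float avgFrameRate;

    void Update()
    {
        // frames rendered so far divided by seconds since startup
        avgFrameRate = Time.frameCount / Time.time;
    }
}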

Please note: this will only give you an average frame-rate. For a more current frame-rate, other answers have addressed effective ways of accomplishing that.

You will want something like a timer that tracks how long it took to update the screen, and extrapolates from that how many frames are drawn per second.

I am fairly rusty with Unity, but I believe something like 1/Time.deltaTime should give you what you want.

So you'd have something like

float framerateThisFrame;

public void Update()
{
    framerateThisFrame = 1f / Time.deltaTime;
}

Next you would have to decide how often to update the displayed FPS, since framerateThisFrame can vary a lot from frame to frame. You might want to update it every two seconds, for example.
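As a rough sketch of that throttling idea (the fpsLabel field and the InstantFps class name are assumptions for illustration, not part of the original answer):

using UnityEngine;
using UnityEngine.UI;

public class InstantFps : MonoBehaviour
{
    public Text fpsLabel;              // hypothetical UI label used for display
    public float refreshInterval = 2f; // only refresh the shown value every two seconds

    float framerateThisFrame;
    float timeSinceRefresh;

    void Update()
    {
        framerateThisFrame = 1f / Time.deltaTime;

        timeSinceRefresh += Time.deltaTime;
        if (timeSinceRefresh >= refreshInterval)
        {
            fpsLabel.text = Mathf.RoundToInt(framerateThisFrame).ToString();
            timeSinceRefresh = 0f;
        }
    }
}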

EDIT

An improvement you might want to make is storing the past n frame times and using their average to calculate the FPS, then displaying that. You could end up with something like:

// requires: using System.Collections.Generic;
public int Granularity = 5; // how many frames to wait before re-calculating the FPS
List<double> times;
int counter = 5;

public void Start()
{
    times = new List<double>();
}

public void Update()
{
    if (counter <= 0)
    {
        CalcFPS();
        counter = Granularity;
    }

    times.Add(Time.deltaTime);
    counter--;
}

public void CalcFPS()
{
    double sum = 0;
    foreach (double frameTime in times)
    {
        sum += frameTime;
    }

    double average = sum / times.Count;
    double fps = 1 / average;

    times.Clear(); // only average over the most recent batch of frames

    // update a GUIText or something
}

EDIT

You might also want to factor in Time.timeScale, so the reported frame time stays consistent while you apply slow-down/time-altering effects.
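Keep in mind that Time.deltaTime is already scaled by Time.timeScale, so a sketch of how the scaled and unscaled frame times relate (an illustration, not the original answer's code) could look like this:

void Update()
{
    // Time.deltaTime is (roughly) Time.unscaledDeltaTime * Time.timeScale,
    // so with a non-zero time scale the real frame time can be recovered:
    float realFrameTime = Time.timeScale > 0f
        ? Time.deltaTime / Time.timeScale
        : Time.unscaledDeltaTime;

    float realFps = 1f / realFrameTime;    // wall-clock frame-rate
    float scaledFps = 1f / Time.deltaTime; // frame-rate in "game time"
}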

Since the framerate can vary constantly, it will change many times during a given second. I've used the following recommended approach to get the current framerate. Just put it in a new script and add it to a new, empty game object in your scene.

float deltaTime = 0f;

void Update() {
    // exponential moving average of the frame time
    deltaTime += (Time.deltaTime - deltaTime) * .1f;
}

Source, including display method: http://wiki.unity3d.com/index.php?title=FramesPerSecond
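If the wiki page is unavailable, a display method in the same spirit, reusing the deltaTime field from the snippet above, might look like this (a sketch, not the exact wiki code):

void OnGUI() {
    float msec = deltaTime * 1000f;
    float fps = 1f / deltaTime;

    // draw the smoothed values in the top-left corner of the screen
    GUI.Label(new Rect(10, 10, 200, 20), string.Format("{0:0.0} ms ({1:0.} fps)", msec, fps));
}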

You should look at Time.smoothDeltaTime. It returns a smoothed Time.deltaTime value, which you can use instead of smoothing it yourself with one of the techniques mentioned in other answers.
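A minimal sketch using the built-in smoothing:

float smoothedFps;

void Update()
{
    // Unity smooths deltaTime over several recent frames for you
    smoothedFps = 1f / Time.smoothDeltaTime;
}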

None of the answers here consider that the time scale can be modified in Unity, and if it is, all of the above approaches will be incorrect, because Time.deltaTime is influenced by Time.timeScale.

As such, you need to use Time.unscaledDeltaTime:

int fps = 0;
void Update () {
    fps = (int)(1f / Time.unscaledDeltaTime);
}

// Another option: log the average frame-rate once per second from a coroutine.
IEnumerator FramesPerSecond()
{
    while (true)
    {
        yield return new WaitForSeconds(1);
        // average frame-rate since the scene started
        Debug.LogFormat("Fps {0}", Time.frameCount / Time.time);
    }
}

private void Start()
{
    StartCoroutine(FramesPerSecond());
}
