
Android touch camera control with inertia in Unity3d

I'm working on a script for Android tablet devices in Unity3d, where the user drags to move the camera. I want the "ground" at the touch position to stay underneath the user's finger while panning. Here is my simplified working code so far:

using UnityEngine;

public class CameraMovement : MonoBehaviour
{
    // XY plane through the origin, used as the "ground" the user drags on.
    Plane plane = new Plane(Vector3.forward, Vector3.zero);
    Vector2 worldStartPoint;

    void Update()
    {
        if (Input.touchCount == 1)
        {
            Touch touch = Input.GetTouch(0);

            if (touch.phase == TouchPhase.Began)
            {
                this.worldStartPoint = GetWorldPoint(touch.position);
            }

            if (touch.phase == TouchPhase.Moved)
            {
                // Move the camera against the finger's world-space delta so the
                // point that was touched stays underneath the finger.
                Vector2 worldDelta = GetWorldPoint(touch.position) - worldStartPoint;
                transform.Translate(-worldDelta.x, -worldDelta.y, 0);
            }
        }
    }

    // Projects a screen position onto the ground plane.
    Vector2 GetWorldPoint(Vector2 screenPoint)
    {
        Ray ray = Camera.main.ScreenPointToRay(screenPoint);
        float rayDistance;
        if (plane.Raycast(ray, out rayDistance))
            return ray.GetPoint(rayDistance);

        return Vector2.zero;
    }
}

Now the problematic part: I would like the camera to keep moving like a physics object once the user lifts their finger. My idea is to calculate the current velocity while dragging and then, while not dragging, apply it with damping, inertia-like, until the camera stops. In theory I would do this:

Vector2 worldStartPoint;
Vector3 velocity;
float damping = 10f; // deceleration in world units per second²; tune to taste

void Update()
{
    if (Input.touchCount == 1)
    {
        Touch touch = Input.GetTouch(0);

        if (touch.phase == TouchPhase.Began)
        {
            this.worldStartPoint = GetWorldPoint(touch.position);
        }

        if (touch.phase == TouchPhase.Moved)
        {
            Vector2 worldDelta = GetWorldPoint(touch.position) - worldStartPoint;
            transform.Translate(-worldDelta.x, -worldDelta.y, 0);

            // Remember the drag velocity so it can be applied after release.
            velocity = worldDelta / Time.deltaTime;
        }
    }
    else
    {
        // No finger down: coast with the last known velocity and damp it to zero.
        transform.Translate(-velocity * Time.deltaTime);
        velocity = Vector3.MoveTowards(velocity, Vector3.zero, damping * Time.deltaTime);
    }
}

So I keep calculating the velocity while the finger moves, and once input stops, the camera should start from the last known velocity, keep applying it, and reduce it from there until it stands still. In practice, however, the last recorded velocity is usually zero: when dragging/swiping across the screen, the finger apparently comes to a brief stop and reports zero movement shortly before TouchPhase.Moved ends.
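One simple mitigation (a sketch of an idea, not code I actually shipped) would be to only overwrite the stored velocity when the finger moved a noticeable amount this frame, so a brief stop right before release does not erase the last meaningful value:

if (touch.phase == TouchPhase.Moved)
{
    Vector2 worldDelta = GetWorldPoint(touch.position) - worldStartPoint;
    transform.Translate(-worldDelta.x, -worldDelta.y, 0);

    // Only record a fresh velocity for noticeable movement; the threshold
    // is an arbitrary assumption and would need tuning.
    if (worldDelta.sqrMagnitude > 0.0001f)
        velocity = worldDelta / Time.deltaTime;
}

The downside is that a stale velocity would survive when the user drags, holds still, and then releases, flinging the camera unexpectedly, which is why averaging over several frames seemed more attractive to me.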

So my workaround is to keep a buffer of the velocities of the last few frames (maybe 30) and, once the finger is lifted, calculate the average velocity:

const int bufferSize = 30;      // number of frames to average over
Vector3[] velocityBuffer = new Vector3[bufferSize];
int nextBufferIndex;

// Average of all buffered per-frame velocities.
Vector3 GetAverage()
{
    Vector3 sum = Vector3.zero;
    for (int i = 0; i < bufferSize; i++)
        sum += velocityBuffer[i];

    return sum / bufferSize;
}

This works a little better, since more often than not it reports at least some velocity, but overall it's not much of an improvement and feels very hacky. Depending on the speed of the touch, I might end up with 20 zero-velocity entries, which makes the effective damping far too strong, and sometimes the averaged velocity becomes so large that the camera flicks away a few hundred units.
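A crude guard against those runaway flicks (an assumption on my part, not something from the code above) would be to clamp the averaged velocity before using it, although that only treats the symptom and the zero entries still skew the average:

// Hypothetical cap on the release velocity, in world units per second.
const float maxFlingSpeed = 50f;

Vector3 GetClampedAverage()
{
    // Keeps the direction but limits the magnitude of the average velocity.
    return Vector3.ClampMagnitude(GetAverage(), maxFlingSpeed);
}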

Is there anything wrong with my calculations, or am I overlooking something simple? Does anybody have a working solution for a smooth camera drag? I looked at a few mobile games, and many of them actually felt a little clumsy: they snap the camera to the finger position after a few pixels of delta movement and then suddenly release it without any damping.

I'm not feeling as if I've found the perfect solution, but by now I have something that works better and is also more "physically correct". First, I now store the camera position and deltaTime in my buffer instead of the velocity. This way I can calculate a rolling average over the last 10 frames with the correct time factor.

using System;
using UnityEngine;

/// <summary>
/// Stores Vector3 samples such as velocity or position and returns the average.
/// </summary>
[Serializable]
public class Vector3Buffer
{
    public const int minSize = 1;

    public readonly int size;

    Sample[] sampleData;
    int nextIndex;

    public Vector3Buffer(int size)
    {
        if (size < minSize)
        {
            size = minSize;
            Debug.LogWarning("Sample count must be at least one. Using minimum.");
        }

        this.size = size;
        sampleData = new Sample[size];
    }

    public void AddSample(Vector3 position, float deltaTime)
    {
        // Overwrite the oldest slot and advance the ring-buffer index.
        sampleData[nextIndex] = new Sample(position, deltaTime);
        nextIndex = (nextIndex + 1) % size;
    }

    public void Clear()
    {
        for (int i = 0; i < size; i++)
            sampleData[i] = new Sample();
    }

    public Vector3 GetAverageVelocity(Vector3 currentPosition, float currentDeltaTime)
    {
        // The slot about to be overwritten holds the sample furthest back in time.
        Sample previous = sampleData[nextIndex];
        Vector3 positionDelta = currentPosition - previous.position;

        // Total time covered by all buffered samples plus the current frame.
        float totalTime = currentDeltaTime;
        for (int i = 0; i < size; i++)
            totalTime += sampleData[i].deltaTime;

        return positionDelta / totalTime;
    }

    [Serializable]
    struct Sample
    {
        public Vector3 position;
        public float deltaTime;

        public Sample(Vector3 position, float deltaTime)
        {
            this.position = position;
            this.deltaTime = deltaTime;
        }
    }
}

I also noticed that I was recording a lot of zero-velocity values. That is now mitigated because I track the position, which I also want to sample while the finger is held still, not only while dragging:

if (touch.phase == TouchPhase.Moved || touch.phase == TouchPhase.Stationary)
{
    velocityBuffer.AddSample(transform.position, Time.deltaTime);
}

if (touch.phase == TouchPhase.Ended || touch.phase == TouchPhase.Canceled)
{
    // Negated to match the sign convention of the damping code above.
    velocity = -velocityBuffer.GetAverageVelocity(transform.position, Time.deltaTime);
}

Lastly, instead of snapping the camera to the finger's world position every frame, I use a tiny bit of Lerp/MoveTowards interpolation to smooth out any jitter. It's hard to find the best trade-off between crisp control and a smooth look, but I assume that's the way to go with user input, which can vary rapidly.
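As an illustration, a minimal sketch of that smoothing; the step limit is an assumed tuning value, not taken from my actual code:

// Ease towards the drag target instead of jumping straight to it.
// maxStep caps the camera speed in world units per second (example value).
const float maxStep = 30f;

void ApplyDragTarget(Vector3 targetPosition)
{
    transform.position = Vector3.MoveTowards(
        transform.position, targetPosition, maxStep * Time.deltaTime);
}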

Of course, I'd still be interested in other approaches, better solutions or opinions about my current implementation.
