
Arc length of curve from data points in Python

I'm working on a robot simulation and trying to calculate the robot's distance from the goal along some planned trajectory. The trajectory curves to avoid obstacles, and it is given by a list of coordinates. To find progress, I need to find the arc length from the current position to the goal. I'm familiar with the equation for arc length of a function:

L = ∫ sqrt(1 + (dy/dx)^2) dx

The approach I was planning was to fit a polynomial approximation of the trajectory to the data points using NumPy's polynomial.polyfit, take its derivative, square it, add 1, take the square root, and finally integrate. However, the square root of a polynomial is generally not itself a polynomial, so the final integration can't be done analytically with polynomial tools and this method wouldn't always work.
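
(For reference, a minimal sketch of that idea; since sqrt(1 + p'(x)^2) has no polynomial antiderivative, the last step is done numerically with scipy.integrate.quad instead, and the waypoint arrays below are just placeholders.)

import numpy as np
from numpy.polynomial import polynomial as P
from scipy.integrate import quad

# Placeholder waypoints for the planned trajectory (y as a function of x).
x = np.linspace(0.0, 5.0, 20)
y = np.sin(x)

coeffs = P.polyfit(x, y, 5)      # polynomial approximation of the trajectory
dcoeffs = P.polyder(coeffs)      # coefficients of its derivative dy/dx

def integrand(t):
    # sqrt(1 + (dy/dx)^2); not a polynomial, so integrate it numerically
    return np.sqrt(1.0 + P.polyval(t, dcoeffs) ** 2)

arc_length, _ = quad(integrand, x[0], x[-1])
print(arc_length)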

Is there some better way to approach this? I'm familiar with numerical integration, but not sure if/how it can be applied to this problem.

EDIT: I figured out how to do this numerically, which is much faster: compute the numerical derivative with numpy.gradient (or numpy.diff), plug each element of that derivative into sqrt(1 + (dy/dx)^2), then compute the integral numerically with numpy.trapz or scipy.integrate.simpson.
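
A minimal sketch of that pipeline, assuming the trajectory is given as x and y arrays with y a function of x (the waypoints below are placeholders):

import numpy as np
from scipy.integrate import simpson

# Placeholder waypoints for the planned trajectory (y as a function of x).
x = np.linspace(0.0, 5.0, 200)
y = np.sin(x)

dydx = np.gradient(y, x)                  # numerical derivative dy/dx at each waypoint
integrand = np.sqrt(1.0 + dydx ** 2)      # sqrt(1 + (dy/dx)^2)

arc_length = np.trapz(integrand, x)       # trapezoidal rule
# or: arc_length = simpson(integrand, x=x)
print(arc_length)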

How will your robot move from one point to another?

If it moves in straight lines between the waypoints, it suffices to do:

np.sum(np.sqrt(np.diff(x)**2 + np.diff(y)**2))
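
For example, to get the remaining distance from the robot's current position to the goal, apply the same formula to the waypoints from the current index onward (current_idx below is a hypothetical marker of where the robot is on the path):

import numpy as np

# Placeholder waypoints for the planned trajectory.
x = np.array([0.0, 1.0, 2.0, 3.5, 5.0])
y = np.array([0.0, 0.8, 1.1, 0.9, 0.0])

current_idx = 2   # hypothetical: the robot is currently at the third waypoint
remaining = np.sum(np.sqrt(np.diff(x[current_idx:]) ** 2 + np.diff(y[current_idx:]) ** 2))
print(remaining)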

If not, you should first figure out what path your robot will follow. Then, having those equations, you can integrate analytically, or approximate the length by sampling points along the curve and summing the straight segments between them. For smooth paths the error of this approximation tends to be O(1/n^2), where n is the number of sample points.
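
As a rough check of that error behaviour, here is a sketch that samples a quarter circle (exact arc length pi/2) at increasing resolutions and sums the straight segments:

import numpy as np

def sampled_length(n):
    # Sample n points on a quarter circle of radius 1 (exact arc length = pi / 2)
    # and sum the straight segments between consecutive samples.
    t = np.linspace(0.0, np.pi / 2, n)
    x, y = np.cos(t), np.sin(t)
    return np.sum(np.sqrt(np.diff(x) ** 2 + np.diff(y) ** 2))

exact = np.pi / 2
for n in (10, 100, 1000):
    print(n, exact - sampled_length(n))   # error shrinks ~100x per 10x more samples: O(1/n^2)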
