
Fit a line segment to a set of points

I'm trying to fit a line segment to a set of points, but I can't find an algorithm for it. I have a 2D line segment L and a set of 2D points C. L can be represented in any suitable way (I don't care), such as a support vector and a direction vector, two points, a linear equation with left and right bounds, … The only important thing is that the line has a start and an end, so it is not infinite.

I want to fit L to C, so that the sum of the distances from each c to L (where c is a point in C) is minimized. This is a least-squares problem, but I (think I) cannot use polynomial fitting, because L is only a segment. My mathematical knowledge in this area is a bit lacking, so any hints on further reading would also be much appreciated.
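For concreteness, the objective can be sketched as the sum of squared point-to-segment distances, where the distance uses the standard clamped-projection formula (an illustrative sketch, not code from the question; the function names are mine):

```python
import numpy as np

def point_segment_dist(p, a, b):
    """Distance from point p to segment a-b, via clamped orthogonal projection."""
    ab = b - a
    # Parameter of the projection of p onto the line a + t*ab,
    # clamped to [0, 1] so the closest point stays on the segment.
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def cost(points, a, b):
    """Sum of squared point-to-segment distances (the least-squares objective)."""
    return sum(point_segment_dist(p, a, b) ** 2 for p in points)

# Points lying on the x-axis, segment on the x-axis -> zero cost
pts = [np.array([0.2, 0.0]), np.array([0.7, 0.0])]
print(cost(pts, np.array([0.0, 0.0]), np.array([1.0, 0.0])))  # 0.0
```

This is the quantity the answers below minimize numerically (note they sum plain distances rather than squared ones; squaring inside the sum gives the stated least-squares objective).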

Here is an illustration of my problem:

[figure: a set of blue points with an orange line segment to be fitted]

The orange line should be fitted to the blue points so that the sum of squared distances from each point to the line is minimal. I don't mind whether the solution is in a different language, or not code at all, as long as I can extract an algorithm from it.

Since this is more of a mathematical question, I'm not sure whether SO is okay for it, or whether it should be moved to Cross Validated or Mathematics Stack Exchange.

Here is a proposition in Python. The distance between a point and the segment is computed using the method proposed here: Shortest distance between a point and a line segment (https://stackoverflow.com/questions/849211/shortest-distance-between-a-point-and-a-line-segment).

The fact that the segment has a finite length, which forces the use of min/max functions (or if-tests to decide whether to use the perpendicular distance or the distance to one of the endpoints), makes it really hard (impossible?) to obtain an analytical solution.

The proposed solution therefore uses an optimization algorithm to approach the best solution. It uses scipy.optimize.minimize, see: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html
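For readers new to it, scipy.optimize.minimize takes an objective function, an initial guess x0, and optional extra arguments; a minimal toy usage (unrelated to the segment problem) looks like:

```python
from scipy.optimize import minimize

# Toy objective with its minimum at (1, 2)
def f(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

res = minimize(f, x0=[0.0, 0.0], method="Nelder-Mead")
print(res.x)  # close to [1., 2.]
```

Nelder-Mead is derivative-free, which suits the segment-fitting objective: the clamping makes the cost non-smooth at the points where the closest point switches from the segment interior to an endpoint.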

Since the segment length is fixed, we only have three degrees of freedom. In the proposed solution I use the x and y coordinates of the segment's starting point, plus the segment slope, as free parameters. The getCoordinates function recovers the segment's start and end points from these 3 parameters and the length.
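The endpoint recovery can be sketched as below (same math as the getCoordinates function that follows: solve slope = dy/dx together with dx² + dy² = length²). One caveat worth noting: dx is always positive in this parameterization, so the start point is always the left endpoint; since the fitted segment is unoriented that costs no generality, but exactly vertical segments would need slope → ∞.

```python
import math

def endpoint(start_x, start_y, slope, length):
    """End point of a segment, from slope = dy/dx and dx^2 + dy^2 = length^2."""
    dx = length / math.sqrt(slope ** 2 + 1)  # dx > 0: start is the left endpoint
    dy = slope * dx
    return start_x + dx, start_y + dy

# slope = 1 and length = sqrt(2) give unit steps in x and y
print(endpoint(0.0, 0.0, 1.0, math.sqrt(2.0)))  # (1.0, 1.0) up to rounding
```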

import numpy as np
from scipy.optimize import minimize
import matplotlib.pyplot as plt
import math as m
from scipy.spatial import distance

# Plot the points and the segment
def plotFunction(points,x1,x2):
    'Plotting function for plane and iterations'
    plt.plot(points[:,0],points[:,1],'ro')
    plt.plot([x1[0],x2[0]],[x1[1],x2[1]])
    plt.xlim(0, 1)
    plt.ylim(0, 1)
    plt.show()

# Get the sum of the distance between all the points and the segment
# The segment is defined by guess and length, where:
# guess[0]=x coordinate of the starting point
# guess[1]=y coordinate of the starting point
# guess[2]=slope
# Since distance is always >0 no need to use root mean square values
def getDist(guess,points,length):
  start_pt=np.array([guess[0],guess[1]])
  slope=guess[2]
  [x1,x2]=getCoordinates(start_pt,slope,length)
  total_dist=0
  # Loop over each point to get the distance between the point and the segment
  for pt in points:
    total_dist+=minimum_distance(x1,x2,pt,length)

  return(total_dist)

# Return minimum distance between line segment x1-x2 and point pt
# Adapted from https://stackoverflow.com/questions/849211/shortest-distance-between-a-point-and-a-line-segment
def minimum_distance(x1, x2, pt,length):
  length2 = length**2  # i.e. |x1-x2|^2; reuse the known length to avoid a sqrt
  if length2 == 0.0:
    return distance.euclidean(pt, x1)
  # Consider the line extending the segment, parameterized as x1 + t (x2 - x1).
  # We find projection of point p onto the line.
  # It falls where t = [(pt-x1) . (x2-x1)] / |x2-x1|^2
  # We clamp t to [0,1] to handle points projecting outside the segment x1-x2.
  t = max(0, min(1, np.dot(pt - x1, x2 - x1) / length2))
  projection = x1 + t * (x2 - x1)  # Closest point on the segment
  return distance.euclidean(pt, projection)


# Get coordinates of start and end point of the segment from start_pt,
# slope and length, obtained by solving slope=dy/dx, dx^2+dy^2=length
def getCoordinates(start_pt,slope,length):
    x1=start_pt
    dx=length/m.sqrt(slope**2+1)
    dy=slope*dx
    x2=start_pt+np.array([dx,dy])
    return [x1,x2]

if __name__ == '__main__':
    # Generate random points
    num_points=20
    points=np.random.rand(num_points,2)

    # Starting position
    length=0.5
    start_pt=np.array([0.25,0.5])
    slope=0

    # Use scipy.optimize.minimize to find the best start_pt and slope combination
    res = minimize(getDist, x0=[start_pt[0],start_pt[1],slope], args=(points,length), method="Nelder-Mead")

    # Retrieve best parameters
    start_pt=np.array([res.x[0],res.x[1]])
    slope=res.x[2]
    [x1,x2]=getCoordinates(start_pt,slope,length)

    print("\n** The best segment found is defined by:")
    print("\t** start_pt:\t",x1)
    print("\t** end_pt:\t",x2)
    print("\t** slope:\t",slope)
    print("** The total distance is:",getDist([x1[0],x1[1],slope],points,length),"\n")

    # Plot results
    plotFunction(points,x1,x2)

This solution is relatively similar to the one already posted here, but I think it is more efficient, more elegant and easier to understand, which is why I posted it despite the similarity.

As has already been written, the min(max(...)) formulation makes this problem hard to solve analytically, which is why scipy.optimize is a good fit.

The solution is based on the mathematical formula for the distance between a point and a finite line segment outlined here: https://math.stackexchange.com/questions/330269/the-distance-from-a-point-to-a-line-segment

import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import minimize, NonlinearConstraint


def calc_distance_from_point_set(v_):
    #v_ is accepted as a 1d array to make it easier to use with scipy.optimize
    #Reshape into two points
    v = (v_[:2].reshape(2, 1), v_[2:].reshape(2, 1))

    #Calculate t* for s(t*) = v_0 + t*(v_1-v_0), for the line segment w.r.t each point
    t_star_matrix = np.minimum(np.maximum(np.matmul(P-v[0].T, v[1]-v[0]) / np.linalg.norm(v[1]-v[0])**2, 0), 1)
    #Calculate s(t*)
    s_t_star_matrix = v[0]+((t_star_matrix.ravel())*(v[1]-v[0]))

    #Take distance between all points and respective point on segment
    distance_from_every_point = np.linalg.norm(P.T - s_t_star_matrix, axis=0)
    return np.sum(distance_from_every_point)

if __name__ == '__main__':

    #Random points from bounding box

    box_1 = np.random.uniform(-5, 5, 20)
    box_2 = np.random.uniform(-5, 5, 20)
    P = np.stack([box_1, box_2], axis=1)
    segment_length = 3
    segment_length_constraint = NonlinearConstraint(fun=lambda x: np.linalg.norm(np.array([x[0], x[1]]) - np.array([x[2] ,x[3]])), lb=[segment_length], ub=[segment_length])
    point = minimize(calc_distance_from_point_set, (0.0,-.0,1.0,1.0), options={'maxiter': 100, 'disp': True},constraints=segment_length_constraint).x
    plt.scatter(box_1, box_2)
    plt.plot([point[0], point[2]], [point[1], point[3]])
    plt.show()
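As a side note (an addition of mine, not part of either answer): when the cloud is roughly segment-shaped, the principal axis of the points gives a cheap closed-form initial guess for either optimizer — a segment of the required length along the first principal direction, centred on the centroid. The helper name and the centred-on-centroid choice are assumptions for illustration:

```python
import numpy as np

def pca_segment_guess(P, length):
    """Segment of the given length along the first principal axis, centred on the centroid."""
    c = P.mean(axis=0)
    # First right-singular vector of the centred cloud = direction of maximum variance
    _, _, vt = np.linalg.svd(P - c)
    d = vt[0]
    return c - 0.5 * length * d, c + 0.5 * length * d

# Roughly horizontal cloud -> guess is a near-horizontal segment of length 3
P = np.array([[0.0, 0.0], [1.0, 0.1], [2.0, -0.1], [3.0, 0.0]])
a, b = pca_segment_guess(P, 3.0)
```

Feeding such a guess as x0 (or as the constrained optimizer's start) should reduce the risk of the local search stalling far from the points.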

Sample result:

[figure: the fitted segment plotted over the random points]
