Fit a 3D line to 3D point data in Java?

I've spent a decent amount of time trying to hunt down a simple way of doing this - ideally, a magical library exists out there somewhere that will take my set of 3D data points and return 2 points on the best fit line using either orthogonal regression or least squares, and also return the error of the fitted line. Does such a thing exist, and if so, where?

This is easy enough to do, but to write it yourself you will need an eigenvalue solver or a singular value decomposition. Create the nx3 matrix A of your (x-xbar, y-ybar, z-zbar) data as columns. Save those column means for later; I'll call them V0 = [xbar, ybar, zbar].
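For example, the centering step could look like this in plain Java (a minimal sketch; the layout of one {x, y, z} row per point and the method names are my own, illustrative choices):

```java
// points: n-by-3 array, one {x, y, z} row per data point.

// Column means V0 = [xbar, ybar, zbar].
static double[] columnMeans(double[][] points) {
    double[] mean = new double[3];
    for (double[] p : points) {
        for (int j = 0; j < 3; j++) {
            mean[j] += p[j];
        }
    }
    for (int j = 0; j < 3; j++) {
        mean[j] /= points.length;
    }
    return mean;
}

// Centered nx3 matrix A with columns (x - xbar), (y - ybar), (z - zbar).
static double[][] centered(double[][] points, double[] mean) {
    double[][] a = new double[points.length][3];
    for (int i = 0; i < points.length; i++) {
        for (int j = 0; j < 3; j++) {
            a[i][j] = points[i][j] - mean[j];
        }
    }
    return a;
}
```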

Now, compute the eigenvalues and eigenvectors of A'*A, i.e., the 3x3 matrix formed from A transpose multiplied by A.
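One way to do this in Java is with a linear algebra library such as Apache Commons Math 3; that library choice is my assumption, not something named in the answer. A sketch that forms A'*A, decomposes it, and picks out the eigenvector belonging to the largest eigenvalue (which is the vector used below):

```java
import org.apache.commons.math3.linear.Array2DRowRealMatrix;
import org.apache.commons.math3.linear.EigenDecomposition;
import org.apache.commons.math3.linear.RealMatrix;

// Eigenvector of A'*A belonging to the largest eigenvalue, i.e. the
// direction of the orthogonal regression line. a is the centered matrix.
static double[] dominantEigenvector(double[][] a) {
    RealMatrix A = new Array2DRowRealMatrix(a, false);
    RealMatrix ata = A.transpose().multiply(A);      // 3x3 matrix A' * A
    EigenDecomposition eig = new EigenDecomposition(ata);
    double[] values = eig.getRealEigenvalues();
    int best = 0;                                    // index of largest eigenvalue
    for (int i = 1; i < values.length; i++) {
        if (values[i] > values[best]) {
            best = i;
        }
    }
    return eig.getEigenvector(best).toArray();
}
```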

If this data lies on a line in R^3, then one of those eigenvalues will be significantly larger than the other two eigenvalues. If this is not true, then the orthogonal regression line will not be well estimated.

Take the eigenvector that is associated with the largest eigenvalue of A'*A. If V is that eigenvector, the orthogonal regression line is defined as

V(t) = V0 + t*V

Any point on that line can be given by some value of the parameter t.
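As a small sketch, with V0 and V computed as above, two concrete points for the questioner's "return 2 points on the best fit line" could simply be taken at t = 0 and t = 1 (names are illustrative):

```java
// Point on the orthogonal regression line for parameter t: V(t) = V0 + t * V.
static double[] pointOnLine(double[] v0, double[] v, double t) {
    double[] p = new double[3];
    for (int j = 0; j < 3; j++) {
        p[j] = v0[j] + t * v[j];
    }
    return p;
}

// e.g. pointOnLine(v0, v, 0.0) and pointOnLine(v0, v, 1.0) give two
// distinct points on the fitted line.
```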

Alternatively, compute the singular value decomposition of A, and take the right singular vector which corresponds to the largest singular value of A.
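With the same assumed library (Apache Commons Math 3), the SVD route might look like this; the singular values are scanned explicitly rather than relying on any particular ordering:

```java
import org.apache.commons.math3.linear.Array2DRowRealMatrix;
import org.apache.commons.math3.linear.SingularValueDecomposition;

// Right singular vector of the centered matrix A associated with its largest
// singular value (the same direction as the dominant eigenvector of A'*A).
static double[] dominantRightSingularVector(double[][] a) {
    SingularValueDecomposition svd =
            new SingularValueDecomposition(new Array2DRowRealMatrix(a, false));
    double[] sigma = svd.getSingularValues();
    int best = 0;                                    // index of largest singular value
    for (int i = 1; i < sigma.length; i++) {
        if (sigma[i] > sigma[best]) {
            best = i;
        }
    }
    return svd.getV().getColumn(best);               // column of V
}
```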

In either event, if you wish to compute the errors for the data points, the error for each point is simply its orthogonal distance to the line in question.
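For instance, the per-point error could be computed like this (plain Java; it assumes the direction vector has been normalized to unit length):

```java
// Orthogonal (perpendicular) distance from point p to the line V(t) = v0 + t * v.
// v must be a unit vector; normalize it first if it is not.
static double orthogonalDistance(double[] p, double[] v0, double[] v) {
    double dot = 0.0;
    double[] d = new double[3];
    for (int j = 0; j < 3; j++) {
        d[j] = p[j] - v0[j];
        dot += d[j] * v[j];              // projection of (p - v0) onto v
    }
    double sumSq = 0.0;
    for (int j = 0; j < 3; j++) {
        double r = d[j] - dot * v[j];    // component perpendicular to the line
        sumSq += r * r;
    }
    return Math.sqrt(sumSq);
}
```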

Google for "java linear least squares regression library" and you should find a few options. One is Drej. I have not used this myself, though.

EDIT - I'm not confident this answers the question - I don't know whether 3D data is supported.

It's easy enough to do it if you know the trick: http://www.scribd.com/doc/21983425/Least-Squares-Fit

More dimensions means more coefficients, but they're easy enough to add in. The ideas are all the same.
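The linked document isn't quoted here, so the exact trick is my guess; one common least-squares formulation of a 3D line fit treats x as the independent variable and regresses y and z on it separately, e.g. with Apache Commons Math's SimpleRegression (library choice assumed):

```java
import org.apache.commons.math3.stat.regression.SimpleRegression;

// Least-squares fit of a 3D line parametrized by x:
//   y = ay + by * x,   z = az + bz * x
// Returns {ay, by, az, bz}. This breaks down if the line is nearly
// perpendicular to the x axis; parametrize by y or z in that case.
static double[] fitLineByX(double[][] points) {      // rows of {x, y, z}
    SimpleRegression yOnX = new SimpleRegression();
    SimpleRegression zOnX = new SimpleRegression();
    for (double[] p : points) {
        yOnX.addData(p[0], p[1]);
        zOnX.addData(p[0], p[2]);
    }
    return new double[] {
        yOnX.getIntercept(), yOnX.getSlope(),
        zOnX.getIntercept(), zOnX.getSlope()
    };
}
```

Note that unlike the eigenvector approach above, this minimizes vertical (per-coordinate) residuals rather than orthogonal distances.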
