
Weight Vectors in the Perceptron Algorithm in Machine Learning

I'm studying the perceptron algorithm in machine learning. So far, I understand the following about the perceptron:

1) It's a supervised learning technique.
2) It tries to find a hyperplane that linearly separates the class labels; the perceptron converges when it finds one.
3) If the predicted output does not match the true label, it adjusts its weight vector and bias.
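The update rule described in point 3 can be sketched roughly as follows (a minimal NumPy sketch; the function name, the learning-rate parameter `lr`, and the `max_epochs` cap are my own illustration, not part of the question):

```python
import numpy as np

def perceptron_train(X, y, lr=1.0, max_epochs=100):
    """Minimal perceptron sketch. Labels y must be in {-1, +1}.
    Returns the learned weight vector w and bias b."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(max_epochs):
        updated = False
        for xi, yi in zip(X, y):
            # A point is misclassified when yi * (w·xi + b) <= 0;
            # only then are the weights and bias adjusted (point 3).
            if yi * (np.dot(w, xi) + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
                updated = True
        if not updated:   # a full pass with no updates: converged (point 2)
            break
    return w, b
```

On linearly separable data (e.g. the logical OR function), this loop reaches a pass with no updates and stops early; the `max_epochs` cap only matters when no separating hyperplane exists.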

However, I could not understand what happens to the weight vector if the perceptron does not achieve convergence. Does the algorithm keep updating the weight vector forever?

The perceptron can converge if and only if the classes are linearly separable. If they are, the algorithm will converge to a solution, but it makes no guarantees about the quality of that solution. The perceptron's hyperplane does not minimize any particular objective and has no unique solution; any line that separates the two classes is equally valid to the perceptron.

If the classes are not linearly separable, the perceptron will never converge to a final solution. Its weights may bounce around a reasonable region, but even that is not guaranteed; they could wander off somewhere useless. The algorithm will simply keep updating its weights until you force some stopping condition (most commonly a maximum number of passes through the data).
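You can see both behaviours with a small experiment (a sketch under my own assumptions: the helper name, the label encodings, and the epoch cap below are illustrative). XOR is the classic non-separable case: every pass over the data misclassifies at least one point, so a pass with zero updates never happens and only the forced stop ends training.

```python
import numpy as np

def runs_until_stop(X, y, lr=1.0, max_epochs=1000):
    """Run perceptron updates; report (converged, epochs_used).
    'Converged' means a full pass over the data made no updates."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for epoch in range(max_epochs):
        updated = False
        for xi, yi in zip(X, y):
            if yi * (np.dot(w, xi) + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
                updated = True
        if not updated:
            return True, epoch    # converged before the forced stop
    return False, max_epochs      # the stopping condition had to kick in

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y_or  = np.array([-1, 1, 1, 1])    # linearly separable
y_xor = np.array([-1, 1, 1, -1])   # NOT linearly separable

print(runs_until_stop(X, y_or))    # stops early: convergence
print(runs_until_stop(X, y_xor))   # updates until max_epochs forces a stop
```

The OR labels converge after a handful of passes, while the XOR labels exhaust the full epoch budget, which is exactly the "keeps updating until you force a stopping condition" behaviour described above.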
