
Perceptron learning algorithm: in the convergence proof, why is ||w*|| = 1, and is this condition necessary?

I am studying perceptron learning and following the convergence proof for the algorithm in these notes: https://www.cse.iitb.ac.in/~shivaram/teaching/cs344+386-s2017/resources/classnote-1.pdf — specifically Assumption 1 (Linear Separability), as shown in the figure. I don't understand why ||w*|| = 1, or why this condition is necessary. Could you help me understand it? Thanks!

[Figure: Assumption 1]

The norm assumption is there only to simplify the analysis; it is easy to show that it is not necessary, since the version of the assumption without the norm constraint implies the version with it.

Let's assume that there exists w with ||w|| = Z > 0 and gamma > 0 such that, for all i,

y_i <w, x_i> > gamma.

Since ||w|| = Z, we have Z w / ||w|| = w, so for the same gamma

y_i <Z w / ||w||, x_i> > gamma,

thus

Z y_i <w / ||w||, x_i> > gamma.

So for w* = w / ||w|| (hence ||w*|| = 1) and gamma* = gamma / Z > 0,

y_i <w*, x_i> > gamma*.

This proves that if any w (with arbitrary norm Z) and gamma satisfy the separability condition, then a w* with norm 1 also satisfies it, with the original gamma divided by Z.

The only reason to state the assumption this way is to make the constants in the proof simpler; the assumption itself is redundant.
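As a quick numeric sanity check, here is a minimal NumPy sketch (the data and the separator w are made up for illustration, not taken from the notes) that normalizes a separator to unit norm and confirms the margin scales by exactly 1/||w||:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical separator with non-unit norm.
w = np.array([3.0, -4.0])            # ||w|| = 5
Z = np.linalg.norm(w)

# Synthetic points labeled by the sign of <w, x>, so the data is
# linearly separable by w with some strictly positive margin gamma.
X = rng.normal(size=(100, 2))
y = np.sign(X @ w)
gamma = np.min(y * (X @ w))          # margin achieved by w

# Unit-norm separator w* = w / ||w|| and its margin, as in the answer.
w_star = w / Z
gamma_star = np.min(y * (X @ w_star))

print(f"||w||  = {Z:.1f}, margin(w)  = {gamma:.4f}")
print(f"||w*|| = {np.linalg.norm(w_star):.1f}, margin(w*) = {gamma_star:.4f}")
print("margin(w*) == margin(w) / ||w|| :", np.isclose(gamma_star, gamma / Z))
```

Running this prints a unit norm for w* and a margin equal to gamma / Z, matching the derivation above: rescaling w never changes which side of the hyperplane a point lies on, only the scale of the margin.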
