Training SVC from scikit-learn shows that using -h 0 may be faster?
I am training an SVC model on a large dataset, and since I have set verbose=True, it is showing a Warning: using -h 0 may be faster.
I have two questions here:

1. What does -h mean, and can I set it through the sklearn.svm.SVC interface?
2. Can the cache size in the sklearn.svm.SVC parameter settings affect the training speed? I have set it as cache_size=2000.

Your expert view is appreciated.
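For reference, a minimal sketch of the setup described above (the synthetic dataset is a stand-in for the large one; the libsvm hint itself only shows up on harder, longer-running problems):

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Synthetic stand-in for the large dataset.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# verbose=True forwards libsvm's console output, which is where the
# "using -h 0 may be faster" hint comes from; cache_size is in MB.
clf = SVC(verbose=True, cache_size=2000)
clf.fit(X, y)
```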
The -h parameter controls shrinking: whether to use the shrinking heuristics, 0 or 1 (default 1). You can set it in the SVC constructor using the shrinking argument. Shrinking is a heuristic to speed up the optimization problem.
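Concretely, disabling it from scikit-learn looks like this:

```python
from sklearn.svm import SVC

# Passing shrinking=False is the scikit-learn equivalent of libsvm's
# -h 0: it turns the shrinking heuristics off for this estimator.
clf = SVC(shrinking=False)
```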
Check the original paper and a similar question on shrinking.
Caching is a technique for reducing the computational time of the decomposition method, which is part of the training. This size is controlled via the cache_size parameter.
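To see whether the cache size matters on your data, a rough timing sketch like the following can help (the sample counts and cache sizes here are illustrative, not recommendations):

```python
import time

from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=50, random_state=0)

# A larger kernel cache (given in MB) lets libsvm reuse more kernel
# rows across decomposition iterations instead of recomputing them.
timings = {}
for mb in (50, 1000):
    clf = SVC(cache_size=mb)
    t0 = time.perf_counter()
    clf.fit(X, y)
    timings[mb] = time.perf_counter() - t0
print(timings)
```

Whether the larger cache wins depends on the dataset size relative to the cache; on small problems the whole kernel matrix fits either way and the timings will be close.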
I highly recommend reading the original libsvm paper, especially Section 5.