
Parameters for Support Vector Regression using LibSVM in Matlab

I am trying to use LibSVM for regression. I am trying to detect faces (10 classes of different faces). I labeled 1-10 as the face classes and 11 as non-face. I want to develop a script using LibSVM that gives me a continuous score between 0 and 1 if the test image falls into any of the 10 face classes, and -1 otherwise (non-face). From this score, I can predict the class. If the test image matches the 1st class, the score should be around 0.1. Similarly, if the test image matches class 10, the score should be around 1 (any continuous value close to 1). I am trying to use SVR via LibSVM to solve this problem. I can easily get the predicted class through classification, but I want a continuous score value, which I could get through regression. I have been looking online for the function or the function parameters for SVR using LibSVM, but I couldn't find anything. Can anybody please help me in this regard?

This is not a regression problem. Solving it through regression will not yield good results.

You are dealing with a multiclass classification problem. The best way to approach this is to construct 10 one-vs-all classifiers with probabilistic output. To get probabilistic output (e.g. in the interval [0,1]), you can train and predict with the -b 1 option for C-SVC (-s 0).
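A minimal training sketch with the LibSVM Matlab interface, assuming you already have a feature matrix trainData (N-by-D) and a label vector trainLabels (values 1-10 for faces, 11 for non-face); the -c and -g values are placeholders and should be tuned for your data:

    % One-vs-all training with probabilistic output (LibSVM Matlab interface).
    % Assumed inputs: trainData (N-by-D double), trainLabels (N-by-1, values 1..11).
    % The -c and -g values are placeholders; tune them, e.g. by cross-validation.
    numFaceClasses = 10;
    models = cell(numFaceClasses, 1);

    for k = 1:numFaceClasses
        % Relabel: +1 for the current face class, -1 for everything else
        binaryLabels = -ones(size(trainLabels));
        binaryLabels(trainLabels == k) = 1;

        % C-SVC (-s 0), RBF kernel (-t 2), probability estimates enabled (-b 1)
        models{k} = svmtrain(binaryLabels, trainData, '-s 0 -t 2 -c 1 -g 0.07 -b 1');
    end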

If any of the 10 classifiers yields a sufficiently large probability for its positive class, you use that probability (which is close to 1). If none of the 10 classifiers yields a positive label with high enough confidence, you can default to -1.

So in short: make a multiclass classifier containing one-vs-all classifiers with probabilistic output. Subsequently post-process the predictions as I described, using a probability threshold of your choice (for example 0.7).
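A sketch of the prediction and post-processing step, continuing from the training sketch above (it assumes the models cell array and numFaceClasses, plus a 1-by-D test feature vector testFeat; the 0.7 threshold is just an example value):

    % Predict with each one-vs-all model and keep the probability of its
    % positive class, then threshold the best one.
    probThreshold = 0.7;                 % example confidence threshold
    probs = zeros(numFaceClasses, 1);

    for k = 1:numFaceClasses
        % svmpredict with -b 1 returns per-class probability estimates;
        % the columns follow models{k}.Label, so locate the +1 column explicitly.
        [~, ~, p] = svmpredict(1, testFeat, models{k}, '-b 1');
        posCol = find(models{k}.Label == 1);
        probs(k) = p(posCol);
    end

    [bestProb, bestClass] = max(probs);
    if bestProb >= probThreshold
        score = bestProb;                % confident face prediction for class bestClass
    else
        score = -1;                      % no classifier is confident enough: non-face
        bestClass = -1;
    end

The first argument to svmpredict is a dummy label here (the true label is unknown at test time); it only affects the accuracy printout, not the probability estimates.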
