
Calculate equation of hyperplane using LIBSVM support vectors and coefficients in Python

I am using the LIBSVM library in Python and am trying to reconstruct the hyperplane equation (w'x + b) from the calculated support vectors.

The model appears to train correctly, but I am unable to manually calculate prediction results that match the output of svm_predict on the test data.

I have used the link below from the LIBSVM FAQ to try to troubleshoot, but I am still not able to calculate the correct results: https://www.csie.ntu.edu.tw/~cjlin/libsvm/faq.html#f804

My code is as follows:

from svmutil import *
import numpy as np

ytrain, xtrain = svm_read_problem('small_train.libsvm')
# Change labels from 0 to -1    
for index in range(len(ytrain)):
    if ytrain[index] == 0:
        ytrain[index] = -1.0
print ("Training set loaded...")

m = svm_train(ytrain, xtrain, '-q')
print ("Model trained...")

sv = np.asarray(m.get_SV())
sv_coef = np.asarray(m.get_sv_coef())
sv_indices = np.asarray(m.get_sv_indices())
rho = m.rho[0]

w = np.zeros(len(xtrain[0]))  # assumes dense, 0-indexed features
b = -rho
# weight vector w = sum over i ( coefsi * xi )
for index, coef in zip(sv_indices, sv_coef):
    ai = coef[0]
    for key in xtrain[index-1]:
        w[key] = w[key] + (ai * xtrain[index-1][key])

# From LIBSVM FAQ - Doesn't seem to impact results
# if m.label[1] == -1:
#     w = np.negative(w)
#     b = -b

print(np.round(w,2))

ytest, xtest = svm_read_problem('small_test.libsvm')
# Change labels from 0 to -1  
for index in range(len(ytest)):
    if ytest[index] == 0:
        ytest[index] = -1.0

print ("Test set loaded...")
print ("Predict test set...")
p_label, p_acc, p_val = svm_predict(ytest, xtest, m)

print("p_label: ", p_label)
print("p_val: ", np.round(p_val,3))

for i in range(len(ytest)):
    wx = 0
    for key in xtest[i]:
        wx = wx + (xtest[i][key] * w[key])
    print("Manual calc: ", np.round(wx + b,3))

My understanding is that my manually calculated results, using w'x + b, should match those contained in p_val. I have tried negating both w and b and have still not been able to get the same results as those in p_val.

The data sets (LIBSVM format) I am using are:

small_train.libsvm

0 0:-0.36 1:-0.91 2:-0.99 3:-0.57 4:-1.38 5:-1.54
1 0:-1.4 1:-1.9 2:0.09 3:0.29 4:-0.3 5:-1.3
1 0:-0.43 1:1.45 2:-0.68 3:-1.58 4:0.32 5:-0.14
1 0:-0.76 1:0.3 2:-0.57 3:-0.33 4:-1.5 5:1.84

small_test.libsvm

1 0:-0.97 1:-0.69 2:-0.96 3:1.05 4:0.02 5:0.64
0 0:-0.82 1:-0.17 2:-0.36 3:-1.99 4:-1.54 5:-0.31

Are the values of w being calculated correctly, and are the p_val results the correct values to compare against?

Any help as always is greatly appreciated.

I managed to get the values to match by changing:

m = svm_train(ytrain, xtrain, '-q')

to

m = svm_train(ytrain, xtrain, '-q -t 0')

From the documentation, the default kernel type is the non-linear radial basis function, so the model was never a linear w'x + b in input space. After switching to a linear kernel, the manual results align with p_val.
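Why -t 0 makes the manual calculation work: with a linear kernel, K(u, v) = u'v, so the kernel expansion f(x) = Σᵢ αᵢ (xᵢ · x) − ρ collapses into (Σᵢ αᵢ xᵢ) · x − ρ = w'x + b. A minimal sketch of that equivalence with made-up support vectors and coefficients (not taken from the model above):

```python
import numpy as np

# Hypothetical support vectors and dual coefficients (alpha_i = y_i * alpha),
# chosen only to illustrate the identity, not from the trained model.
svs = np.array([[1.0, 2.0], [-1.0, 0.5]])
alphas = np.array([0.3, -0.3])
rho = 0.1
x = np.array([0.5, -1.0])

# Kernel form: f(x) = sum_i alpha_i * (sv_i . x) - rho
f_kernel = sum(a * np.dot(sv, x) for a, sv in zip(alphas, svs)) - rho

# Primal form: w = sum_i alpha_i * sv_i, then f(x) = w . x - rho
w = (alphas[:, None] * svs).sum(axis=0)
f_primal = np.dot(w, x) - rho

print(np.isclose(f_kernel, f_primal))  # the two forms agree for a linear kernel
```

This is exactly the identity the FAQ's w-reconstruction relies on, and it only holds when the kernel is linear.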

Below are the available kernel types:

-t kernel_type : set type of kernel function (default 2)
    0 -- linear: u'*v
    1 -- polynomial: (gamma*u'*v + coef0)^degree
    2 -- radial basis function: exp(-gamma*|u-v|^2)
    3 -- sigmoid: tanh(gamma*u'*v + coef0)
    4 -- precomputed kernel (kernel values in training_set_file)
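For the default RBF kernel there is no explicit w in input space, but the decision value can still be reproduced manually from the kernel expansion f(x) = Σᵢ αᵢ exp(−γ‖xᵢ − x‖²) − ρ, using the same get_SV(), get_sv_coef(), and rho values (plus the model's gamma). A minimal sketch with made-up support vectors and coefficients (not the model above):

```python
import numpy as np

def rbf_decision_value(x, svs, coefs, gamma, rho):
    # f(x) = sum_i alpha_i * exp(-gamma * ||sv_i - x||^2) - rho
    total = 0.0
    for sv, alpha in zip(svs, coefs):
        diff = sv - x
        total += alpha * np.exp(-gamma * np.dot(diff, diff))
    return total - rho

# Hypothetical toy values, purely for illustration.
svs = np.array([[0.0, 1.0], [1.0, 0.0]])
coefs = np.array([0.5, -0.5])
print(round(rbf_decision_value(np.array([0.0, 1.0]), svs, coefs, gamma=0.5, rho=0.0), 4))
```

So even without -t 0, the p_val numbers could be matched this way; the w'x + b shortcut is simply not available for a non-linear kernel.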

