
Linear regression for OR operation in scikit-learn and plotting

To learn linear regression in scikit-learn, I wrote some code that models the OR operation and added a visualization, but the plot doesn't seem to give an intuitive picture of what's happening:

from sklearn import linear_model
X = [[0, 0], [1, 1], [0, 1], [1, 0]]
Y = [0, 1, 1, 1]
regr = linear_model.LinearRegression()
regr.fit(X, Y)

# check that the coefficients are the expected ones.
m = regr.coef_[0]
b = regr.intercept_
print(' y = {0} * x + {1}'.format(m, b))

print(regr.predict([[0,0]]))
print(regr.predict([[0,1]]))
print(regr.predict([[1,0]]))
print(regr.predict([[1,1]]))

With a threshold of 0.5, I think this works as expected. Now I tried to plot the results to visualize and understand what's happening:

%matplotlib inline
import matplotlib.pyplot as plt
import numpy as np

x = np.arange(0, 2, 0.1)
fig, ax = plt.subplots()
ax.plot(x, m * x + b, color='red')

x = [0, 0, 1, 1]
y = [0, 1, 0, 1]
ax.scatter(x, y)

fig.show()

(plot: the red regression line, with a positive slope, and the four scattered data points)

I expected the slope to be negative, but it isn't. What did I do wrong?

(Sorry, I don't know how to write mathematical notation in an answer.)

1. Your input X is a list of 2-D points. If we call X[0] = x, X[1] = y, and Y = z, then the fitted linear model is ax + by + c = z. Since the model has three variables, you need a 3-D plot to visualize it.

2. If you want a 2-D plot instead, fix z to a constant (0 or 1, whichever you like); the model then becomes the line y = -c/b - (a/b)x. With z = 0 and the fitted coefficients, this is y = -0.5 - x, so the slope is indeed negative.
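As a sketch of the answer's second point, using the question's own data: the line worth drawing in 2-D is the one where the prediction crosses the 0.5 threshold (a·x + b·y + c = 0.5), rather than treating coef_[0] alone as the slope of a 1-D line:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn import linear_model

X = [[0, 0], [1, 1], [0, 1], [1, 0]]
Y = [0, 1, 1, 1]
regr = linear_model.LinearRegression().fit(X, Y)

a, b = regr.coef_       # coefficients for BOTH features, not just the first
c = regr.intercept_

# The fitted model is z = a*x + b*y + c. Solving a*x + b*y + c = 0.5
# for y gives the 2-D line where the prediction crosses the threshold:
#   y = (0.5 - c)/b - (a/b)*x
slope = -a / b
intercept = (0.5 - c) / b
print('boundary: y = {0} * x + {1}'.format(slope, intercept))

x = np.arange(0, 2, 0.1)
fig, ax = plt.subplots()
ax.plot(x, slope * x + intercept, color='red')  # negative slope, as expected
ax.scatter([0, 0, 1, 1], [0, 1, 0, 1])
```

With the coefficients fitted here (a = b = 0.5, c = 0.25), the boundary comes out as y = 0.5 - x, i.e. a slope of -1.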

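The answer's first point (the model z = ax + by + c has three variables) can also be seen directly with a 3-D surface plot; here is a sketch assuming the same data:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn import linear_model

X = [[0, 0], [1, 1], [0, 1], [1, 0]]
Y = [0, 1, 1, 1]
regr = linear_model.LinearRegression().fit(X, Y)
a, b = regr.coef_
c = regr.intercept_

# Plot the fitted plane z = a*x + b*y + c over the unit square,
# with the four training points on top.
xx, yy = np.meshgrid(np.linspace(0, 1, 10), np.linspace(0, 1, 10))
zz = a * xx + b * yy + c

fig = plt.figure()
ax = fig.add_subplot(projection='3d')
ax.plot_surface(xx, yy, zz, alpha=0.4, color='red')
ax.scatter([p[0] for p in X], [p[1] for p in X], Y)
ax.set_xlabel('x')
ax.set_ylabel('y')
ax.set_zlabel('z')
```

The plane passes below 0.5 only near (0, 0), which is why thresholding the prediction at 0.5 reproduces OR.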