Python problems on Machine-Learning
import numpy as np
import pandas as pd
from matplotlib import pyplot as pt
def computeCost(X, y, theta):
    m = len(y)
    predictions = X * theta - y
    sqrerror = np.power(predictions, 2)
    return 1 / (2 * m) * np.sum(sqrerror)

def gradientDescent(X, y, theta, alpha, num_iters):
    m = len(y)
    jhistory = np.zeros((num_iters, 1))
    for i in range(num_iters):
        h = X * theta
        s = h - y
        theta = theta - (alpha / m) * (s.T * X).T
        jhistory_iter = computeCost(X, y, theta)
    return theta, jhistory_iter
data = open(r'C:\Users\Coding\Desktop\machine-learning-ex1\ex1\ex1data1.txt')
data1 = np.array(pd.read_csv(r'C:\Users\Coding\Desktop\machine-learning-ex1\ex1\ex1data1.txt', header=None))
y = np.array(data1[:, 1])
m = len(y)
y = np.asmatrix(y.reshape(m, 1))
X = np.array([data1[:, 0]]).reshape(m, 1)
X = np.asmatrix(np.insert(X, 0, 1, axis=1))
theta = np.zeros((2, 1))
iterations = 1500
alpha = 0.01
print('Testing the cost function ...')
J = computeCost(X, y, theta)
print('With theta = [0 , 0]\nCost computed = ', J)
print('Expected cost value (approx) 32.07')
theta=np.asmatrix([[-1,0],[1,2]])
J = computeCost(X, y, theta)
print('With theta = [-1 , 2]\nCost computed =', J)
print('Expected cost value (approx) 54.24')
theta,JJ = gradientDescent(X, y, theta, alpha, iterations)
print('Theta found by gradient descent:')
print(theta)
print('Expected theta values (approx)')
print(' -3.6303\n 1.1664\n')
predict1 = [1, 3.5] * theta
print(predict1*10000)
Result:

Testing the cost function ...
With theta = [0 , 0]
Cost computed = 32.072733877455676
Expected cost value (approx) 32.07
With theta = [-1 , 2]
Cost computed = 69.84811062494227
Expected cost value (approx) 54.24
Theta found by gradient descent:
[[-3.70304726 -3.64357517]
 [ 1.17367146  1.16769684]]
Expected theta values (approx)
 -3.6303
 1.1664
[[4048.02858742 4433.63790186]]
There are two problems: the first cost computation is correct, but the second one is wrong. Also, the theta from my gradient descent has 4 elements (it should have two).
When you mention "With theta = [-1 , 2]" and then write

theta=np.asmatrix([[-1,0],[1,2]])

I believe this is incorrect. Assuming you have one feature, have added a column of ones, and are trying to do simple linear regression, the correct way would be

np.array([-1,2])
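To illustrate the shape issue, here is a minimal sketch with made-up toy data (the values are purely illustrative, not the exercise's data): with one feature plus an intercept column, theta must hold exactly two parameters, one per column of X; a 2x2 theta yields two predictions per sample, which is why the gradient descent above returned 4 theta values.

```python
import numpy as np

# Toy design matrix: a column of ones (intercept) plus one feature
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([[1.0], [2.0], [3.0]])  # targets as a column vector

# Correct shape: a (2, 1) column vector, one parameter per column of X
theta = np.array([[-1.0], [2.0]])
print(np.dot(X, theta).shape)      # (3, 1): one prediction per sample

# A (2, 2) theta produces a (3, 2) result: two predictions per sample
theta_bad = np.array([[-1.0, 0.0], [1.0, 2.0]])
print(np.dot(X, theta_bad).shape)  # (3, 2)
```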
Also, where you have

predictions= X*theta-y

it would be better to do

np.dot(X,theta)-y

When you use multiplication, it does not do the same thing.
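A quick sketch of the difference (toy values for illustration): on plain NumPy arrays, `*` is element-wise multiplication with broadcasting, while `np.dot` is the matrix product. With `np.asmatrix`, `*` happens to mean matrix product, which is why mixing the two types can hide this kind of bug.

```python
import numpy as np

X = np.array([[1.0, 2.0],
              [3.0, 4.0]])
theta = np.array([[1.0], [1.0]])  # (2, 1) column vector

# Matrix product: (2, 2) x (2, 1) -> (2, 1), one prediction per row
print(np.dot(X, theta))  # [[3.], [7.]]

# Element-wise with broadcasting: theta is stretched across the columns,
# giving a (2, 2) result instead of predictions
print(X * theta)         # [[1., 2.], [3., 4.]]
```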