Machine learning model keeps giving the same result even with different inputs
My code gives a different result, whereas my machine learning assignment expects another result for the same code and inputs?
My code:
def lrCostFunction(theta, X, y, lambda_):
    m = y.size
    if y.dtype == bool:
        y = y.astype(int)
    tempt = theta
    tempt[0] = 0
    J = 0
    grad = np.zeros(theta.shape)
    hx = X.dot(theta.T)
    h = sigmoid(hx)
    J = (1/m) * np.sum(-y.dot(np.log(h)) - (1-y).dot(np.log(1-h)))
    J = J + (lambda_/(2*m)) * np.sum(np.square(tempt))
    grad = ((1/m) * (h - y).dot(X)) + (lambda_/m) * tempt
    return J, grad

# rand_indices = np.random.choice(m, 100, replace=False)
# sel = X[rand_indices, :]
theta_t = np.array([-2, -1, 1, 2], dtype=float)
X_t = np.concatenate([np.ones((5, 1)), np.arange(1, 16).reshape(5, 3, order='F')/10.0], axis=1)
y_t = np.array([1, 0, 1, 0, 1])
lambda_t = 3
cost, gradient = lrCostFunction(theta_t, X_t, y_t, lambda_t)
print("J= ", cost, "\nGrad= ", gradient)
OUTPUT:
J= 3.0857279966152817
Grad= [ 0.35537648 -0.49170896 0.88597928 1.66366752]
The assignment expects these results for the same inputs:
print('Cost : {:.6f}'.format(J))
print('Expected cost: 2.534819')
print('-----------------------')
print('Gradients:')
print(' [{:.6f}, {:.6f}, {:.6f}, {:.6f}]'.format(*grad))
print('Expected gradients:')
print(' [0.146561, -0.548558, 0.724722, 1.398003]');
I even searched for answers on the internet; everyone's code is the same as mine, and they say their results match the expected ones. I even copied their code into my PyCharm IDE, but I got the same answer again. The inputs are also the same, if you want to look up the problem "Vectorized regularized logistic regression".
A link to one of the solutions with the same code and the correct answer:
This also happened in part of my last assignment, and it is really frustrating, so I am asking for help.
Your code is correct. The problem is that when you change the value of tempt[0], you are also changing theta[0], because tempt = theta does not copy the array; it only binds a second name to the same object. Copying theta ensures the initial vector is left unchanged.
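The aliasing can be seen in a minimal standalone sketch (the variable names here just mirror the ones in your function):

```python
import numpy as np

theta = np.array([-2.0, -1.0, 1.0, 2.0])
tempt = theta           # no copy: tempt is another name for the same array
tempt[0] = 0
print(theta)            # theta[0] has also become 0

theta = np.array([-2.0, -1.0, 1.0, 2.0])
tempt = np.copy(theta)  # independent copy
tempt[0] = 0
print(theta)            # theta is unchanged
```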
def lrCostFunction(theta, X, y, lambda_):
    m = y.size
    if y.dtype == bool:
        y = y.astype(float)
    J = 0
    grad = np.zeros(theta.shape)
    hx = X.dot(theta.T)
    h = sigmoid(hx)
    tempt = np.copy(theta)  # copy of theta, so theta itself is not modified
    tempt[0] = 0
    J = (1/m) * np.sum(-y.dot(np.log(h)) - (1-y).dot(np.log(1-h)))
    J = J + (lambda_/(2*m)) * np.sum(np.square(tempt))
    grad = ((1/m) * (h - y).dot(X)) + (lambda_/m) * tempt
    print(theta, tempt)
    return J, grad
cost, gradient = lrCostFunction(theta_t, X_t, y_t, lambda_t)
print("J= ", cost, "\nGrad= ", gradient)
# Output:
# [-2. -1. 1. 2.] [ 0. -1. 1. 2.]
# J= 2.534819396109744
# Grad= [ 0.14656137 -0.54855841 0.72472227 1.39800296]
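As a side note (not part of the assignment), a quick way to convince yourself a gradient like this is right is a finite-difference check: perturb each component of theta and compare the resulting cost change with the analytic gradient. This sketch is self-contained, with a standard sigmoid assumed:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def lrCostFunction(theta, X, y, lambda_):
    m = y.size
    tempt = np.copy(theta)   # copy so theta is not modified
    tempt[0] = 0             # bias term is not regularized
    h = sigmoid(X.dot(theta))
    J = (1/m) * np.sum(-y.dot(np.log(h)) - (1-y).dot(np.log(1-h)))
    J += (lambda_/(2*m)) * np.sum(np.square(tempt))
    grad = (1/m) * (h - y).dot(X) + (lambda_/m) * tempt
    return J, grad

theta_t = np.array([-2, -1, 1, 2], dtype=float)
X_t = np.concatenate([np.ones((5, 1)),
                      np.arange(1, 16).reshape(5, 3, order='F')/10.0], axis=1)
y_t = np.array([1, 0, 1, 0, 1])

cost, grad = lrCostFunction(theta_t, X_t, y_t, 3)

# Central-difference approximation of each gradient component
eps = 1e-5
num_grad = np.zeros_like(theta_t)
for i in range(theta_t.size):
    e = np.zeros_like(theta_t)
    e[i] = eps
    Jp, _ = lrCostFunction(theta_t + e, X_t, y_t, 3)
    Jm, _ = lrCostFunction(theta_t - e, X_t, y_t, 3)
    num_grad[i] = (Jp - Jm) / (2 * eps)

print(np.max(np.abs(grad - num_grad)))  # should be very small (near machine precision)
```

If the maximum difference is tiny, the analytic gradient matches the cost function, which is a useful sanity check whenever the grader disagrees with your output.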