
Logistic Regression not able to find value of theta

I have a hundred entries in a CSV file:

Physics,Maths,Status_class0or1
30,40,0
90,70,1

Using the above data, I am trying to build a logistic (binary) classifier. Please advise me where I am going wrong. Why am I getting the answer as a 3*3 matrix (9 values of theta, whereas it should be only 3)?

Here is the code. Importing the libraries:

import numpy as np
import pandas as pd
from sklearn import preprocessing

Reading data from the CSV file:

df = pd.read_csv("LogisticRegressionFirstBinaryClassifier.csv", header=None)
df.columns = ["Maths", "Physics", "AdmissionStatus"]
X = np.array(df[["Maths", "Physics"]])
y = np.array(df[["AdmissionStatus"]])
X = preprocessing.normalize(X)
X = np.c_[np.ones(X.shape[0]), X]  # prepend a column of ones for the intercept term
theta = np.ones((X.shape[1], 1))   # initialise theta as a (3, 1) column vector

print(X.shape) # (100, 3)
print(y.shape) # (100, 1)
print(theta.shape) # (3, 1)

calc_z to calculate the dot product of X and theta:

def calc_z(X,theta):
    return np.dot(X,theta)

Sigmoid function:

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

Cost function:

def cost_function(X, y, theta):
    z = calc_z(X,theta)
    h = sigmoid(z)    
    return (-y * np.log(h) - (1 - y) * np.log(1 - h)).mean()
print("cost_function =" , cost_function(X, y, theta))

def derivativeofcostfunction(X, y, theta):
    z = calc_z(X,theta)
    h = sigmoid(z)
    calculation = np.dot((h - y).T,X)
    return calculation
print("derivativeofcostfunction=", derivativeofcostfunction(X, y, theta))

def grad_desc(X, y, theta, lr=.001, converge_change=.001): 
    cost = cost_function(X, y, theta) 
    change_cost = 1
    num_iter = 1

    while(change_cost > converge_change): 
        old_cost = cost
        print(theta)
        print (derivativeofcostfunction(X, y, theta))
        theta = theta - lr*(derivativeofcostfunction(X, y, theta))
        cost = cost_function(X, y, theta)
        change_cost = old_cost - cost
        num_iter += 1

    return theta, num_iter 
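For completeness, the output below was presumably produced by a call along these lines (the exact invocation is not shown in the post, so this is a minimal sketch):

print(grad_desc(X, y, theta))  # prints the (theta, num_iter) tuple shown as the last entry below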

Here is the output:

[[ 0.4185146  -0.56877556  0.63999433]
 [15.39722864  9.73995197 11.07882445]
 [12.77277463  7.93485324  9.24909626]]
[[0.33944777 0.58199037 0.52493407]
 [0.02106587 0.36300629 0.30297278]
 [0.07040604 0.3969297  0.33737757]]
[[-0.05856159 -0.89826735  0.30849185]
 [15.18035041  9.59004868 10.92827046]
 [12.4804775   7.73302024  9.04599788]]
[[0.33950634 0.58288863 0.52462558]
 [0.00588552 0.35341624 0.29204451]
 [0.05792556 0.38919668 0.32833157]]
[[-5.17526527e-01 -1.21534937e+00 -1.03387571e-02]
 [ 1.49729502e+01  9.44663458e+00  1.07843504e+01]
 [ 1.21978140e+01  7.53778010e+00  8.84964495e+00]]
(array([[ 0.34002386,  0.58410398,  0.52463592],
       [-0.00908743,  0.34396961,  0.28126016],
       [ 0.04572775,  0.3816589 ,  0.31948193]]), 46)

I changed this code, just adding a transpose when returning the matrix, and it fixed my issue.

def derivativeofcostfunction(X, y, theta):
    z = calc_z(X,theta)
    h = sigmoid(z)
    calculation = np.dot((h - y).T,X)
    return calculation.T
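The transpose fixes it because of NumPy broadcasting: np.dot((h - y).T, X) has shape (1, 3), while theta has shape (3, 1). Subtracting a (1, 3) array from a (3, 1) array broadcasts both operands to (3, 3), which is exactly where the 9 values of theta came from. Returning calculation.T makes the gradient (3, 1), so theta stays a (3, 1) column vector throughout gradient descent. A minimal demonstration of the broadcast, independent of the data:

import numpy as np

theta = np.ones((3, 1))        # column vector, like theta in the question
grad_row = np.ones((1, 3))     # row vector, like np.dot((h - y).T, X)

print((theta - grad_row).shape)    # (3, 3) -- broadcasting turns the update into a matrix
print((theta - grad_row.T).shape)  # (3, 1) -- transposing keeps the shapes aligned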
