Kaggle Titanic with tflearn neural network
I have solved the Titanic problem with logistic regression; now I want to solve it with a neural network. But my model always returns 1, meaning survived, for every test input. Maybe there is a problem in my model. How can I solve this?
train_data = pd.read_csv('data/train.csv')
test_data = pd.read_csv('data/test.csv')
#Some data cleaning process
#......
X_train = train_data.drop("Survived",axis=1).as_matrix()
Y_train = train_data["Survived"].as_matrix().reshape((891,1))
X_test = test_data.drop("PassengerId",axis=1).as_matrix()
net = tflearn.input_data(shape=[None, 6])
net = tflearn.fully_connected(net, 32)
net = tflearn.fully_connected(net, 32)
net = tflearn.fully_connected(net, 1, activation='softmax')
net = tflearn.regression(net)
model = tflearn.DNN(net)
model.fit(X_train, Y_train, n_epoch=10, batch_size=16, show_metric=True)
pred = model.predict(X_test)
print pred
Using softmax as the activation in the output layer ensures that the sum of the outputs across all nodes in that layer is 1. Since you only have a single node, and the outputs have to sum to 1, it will always output 1 by definition.
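You can see this directly by evaluating softmax on a one-element vector (a minimal NumPy sketch, independent of tflearn):

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability, then normalize so outputs sum to 1.
    e = np.exp(z - np.max(z))
    return e / e.sum()

# With a single output node, softmax always returns 1.0,
# no matter what the logit is:
print(softmax(np.array([-5.0])))  # [1.]
print(softmax(np.array([42.0])))  # [1.]

# With two or more nodes, the outputs still sum to 1 but carry information:
print(softmax(np.array([1.0, 3.0])))
```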
You should never use softmax as your activation for a binary classification task. A better option is the logistic function, which I think TensorFlow calls sigmoid.
So instead of

net = tflearn.fully_connected(net, 1, activation='softmax')

try

net = tflearn.fully_connected(net, 1, activation='sigmoid')
Your problem is a binary classification problem, i.e. there are two possible outcomes: 0 or 1. In the context of the Titanic problem: Not Survived or Survived.
The output layer of the neural net should produce an output between 0 and 1; no other values make sense in the context of binary classification. Normally a cutoff such as 0.50 is applied: if the predicted output of the net is greater than the cutoff, it is regarded as 1, else 0.
For things to work like this, as said before, the net should produce an output in the range [0, 1]. For this, the activation function of the output layer must be sigmoid, which produces output in that range. To learn more about sigmoid and other activation functions, I recommend you follow this link.
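As a quick illustration of the logistic function itself (plain Python, not the tflearn API), note that it maps any real input into (0, 1):

```python
import math

def sigmoid(x):
    # Logistic function: 1 / (1 + e^(-x)), output strictly between 0 and 1.
    return 1.0 / (1.0 + math.exp(-x))

for x in (-10.0, 0.0, 10.0):
    print(x, sigmoid(x))
# sigmoid(0) is exactly 0.5; large negative inputs approach 0,
# large positive inputs approach 1.
```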
In your code you can do it like this:

net = tflearn.fully_connected(net, 1, activation='sigmoid')