
AI - Learn the best combination

Total params: 48K

Input X:

array([[ 1964,    12, 32772, ...,     0,  6176,     0],
       [ 1964,    12, 32772, ...,     0,  6841,     0],
       [ 1964,    28, 32772, ...,     0,  6176,     0],
       ...,
       [ 7400,    20, 41565, ...,     0,  8149,     0],
       [ 7400,    20, 41565, ...,     0,  8151,     0],
       [ 7400,    20, 41565, ...,     0,  8150,     0]], dtype=int32)

Output y:

array([0., 0., 0., ..., 1., 0., 0.])

Model structure:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import BatchNormalization, Dense

model = Sequential()
model.add(BatchNormalization(input_shape=(7,)))
model.add(Dense(32, activation="relu"))
model.add(Dense(32, activation="relu"))
model.add(Dense(64, activation="relu"))
model.add(Dense(32, activation="relu"))
model.add(Dense(32, activation="relu"))
model.add(Dense(16, activation="relu"))
model.add(Dense(1, activation=None))

In the first input layer I use batch normalization, and my accuracy increased from 50 to 73, which I think is a good solution.

Model compile:

model.compile(loss='binary_crossentropy', optimizer='rmsprop', metrics=['accuracy'])

I also tried 'adam' and got the same result.

Model fit:

history = model.fit(x_train, y_train, batch_size=2048, validation_data=(x_test, y_test), epochs=1000)

I also tried more combinations; with epochs=30000 and batch size 1024 I get an accuracy of 78.51.

I also tried doubling every layer (nodes × 2):

I have 16k samples with output 1, so with this solution I get 6k correct predictions (with epochs=30000, batch size 1024, optimizer adam).

Model structure:

model = Sequential()
model.add(BatchNormalization(input_shape=(7,)))
model.add(Dense(64, activation="relu"))
model.add(Dense(64, activation="relu"))
model.add(Dense(128, activation="relu"))
model.add(Dense(32, activation="relu"))
model.add(Dense(1, activation='sigmoid'))

[Image: training history plot]

My simple question: how can I increase accuracy to get more correct predictions?

If orange is your validation loss/accuracy, you're overfitting. The accuracy hardly improves with the number of epochs, while the validation loss increases. Think about:

  • balancing the labels; if they are imbalanced, accuracy alone doesn't tell you that much
  • adding early stopping
  • adapting the batch size
  • adapting the activation function
  • adding Dropout
  • testing other optimizers, e.g. Adam
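Several of these suggestions can be combined in one place. Below is a minimal sketch (the data is synthetic; substitute your real `x_train`/`y_train`) showing class weights for imbalanced labels, a `Dropout` layer, the Adam optimizer, and `EarlyStopping` instead of a fixed 30000 epochs. The layer sizes are illustrative, not a recommendation:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, BatchNormalization, Dense, Dropout
from tensorflow.keras.callbacks import EarlyStopping

# Synthetic stand-in for the question's data: 7 integer features, imbalanced labels
rng = np.random.default_rng(0)
x_train = rng.integers(0, 40000, size=(1000, 7)).astype("float32")
y_train = (rng.random(1000) < 0.3).astype("float32")

# Inverse-frequency class weights so the minority class is not ignored
n_pos = y_train.sum()
n_neg = len(y_train) - n_pos
class_weight = {0: len(y_train) / (2 * n_neg), 1: len(y_train) / (2 * n_pos)}

model = Sequential([
    Input(shape=(7,)),
    BatchNormalization(),
    Dense(32, activation="relu"),
    Dropout(0.3),                    # regularization against overfitting
    Dense(16, activation="relu"),
    Dense(1, activation="sigmoid"),  # sigmoid output to match binary_crossentropy
])
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])

# Stop when validation loss stops improving, keeping the best weights
early_stop = EarlyStopping(monitor="val_loss", patience=10, restore_best_weights=True)
history = model.fit(x_train, y_train, batch_size=256, epochs=100,
                    validation_split=0.2, class_weight=class_weight,
                    callbacks=[early_stop], verbose=0)
```

With `restore_best_weights=True`, the model you end up with is the one from the epoch with the lowest validation loss, not the last epoch.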

You used a lot of hidden layers; reduce them in the beginning. I prefer to start with a small network, even logistic regression or a simple linear model, and then check whether a neural network actually improves the performance.
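Such a baseline takes only a few lines. A sketch with scikit-learn's `LogisticRegression` on synthetic data shaped like the question's (7 features, binary label); the scaling step plays the role of the `BatchNormalization` input layer:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Toy data with a linear signal; substitute the real X / y from the question
rng = np.random.default_rng(0)
X = rng.integers(0, 40000, size=(2000, 7)).astype(float)
y = (X[:, 0] + X[:, 2] > X[:, 0].mean() + X[:, 2].mean()).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

baseline = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
baseline.fit(X_tr, y_tr)
print(f"baseline accuracy: {baseline.score(X_te, y_te):.3f}")
```

If the deep network cannot clearly beat this number on the real data, the extra layers are not earning their keep.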

Think about using methods other than neural networks; for example, CART-based methods (e.g. XGBoost) have been shown to outperform neural networks on problems with a small number of features (here, 7).

I hope that helps you explore the problem further!

