neural network: Why is my code not reproducible?
I thought my neural network was reproducible, but it isn't. The results don't differ dramatically, but the loss, for example, varies by about 0.1 from one run to the next. So here is my code!
# Make the code reproducible
from numpy.random import seed
seed(0)
from tensorflow import set_random_seed
set_random_seed(0)
# Import the datasets (training and test)
import pandas as pd
poker_train = pd.read_csv("C:/Users/elihe/Documents/Studium Master/WS 19 und 20/Softwareprojekt/poker-hand-training-true.data")
poker_test = pd.read_csv("C:/Users/elihe/Documents/Studium Master/WS 19 und 20/Softwareprojekt/poker-hand-testing.data")
from sklearn.preprocessing import OneHotEncoder
# Split the training and test sets into inputs and outputs
X_tr = poker_train.iloc[:, 0:10].values
y_tr = poker_train.iloc[:, 10:11].values
X_te = poker_test.iloc[:, 0:10].values
y_te = poker_test.iloc[:, 10:11].values
# Turn the outputs into one-hot (0/1) vectors
encode = OneHotEncoder(categories = 'auto')
y_train = encode.fit_transform(y_tr).toarray()
y_test = encode.fit_transform(y_te).toarray()
from sklearn.preprocessing import StandardScaler
sc = StandardScaler()
X_train = sc.fit_transform(X_tr)
X_test = sc.transform(X_te)
# Build the NN with Keras
import keras
from keras.models import Sequential
from keras.layers import Dense
nen = Sequential()
nen.add(Dense(400, input_dim = 10, activation = 'sigmoid'))
nen.add(Dense(400, activation = 'sigmoid'))
nen.add(Dense(10, activation = 'softmax'))
from keras.optimizers import RMSprop
nen.compile(loss='binary_crossentropy', optimizer=RMSprop(0.001), metrics=['accuracy'])
nen_fit = nen.fit(X_train, y_train,epochs=30, batch_size=15, verbose=1, validation_split = 0.2, shuffle = False)
I thought the first few lines would make it reproducible... Can someone help? I've googled a lot, but nothing helped. Is a small difference like this normal? I'd like it to be completely reproducible.
By the way, please don't mind the comments in my code... I'm German :) And you should know I'm new to neural networks!
I suggest the following:
import numpy as np
import random as rn
import os
import tensorflow as tf
import keras
from keras import backend as K
#-----------------------------Keras reproducible------------------#
SEED = 1234
tf.set_random_seed(SEED)
os.environ['PYTHONHASHSEED'] = str(SEED)
np.random.seed(SEED)
rn.seed(SEED)
# Limit TF to a single thread: parallel op scheduling is a source of nondeterminism
session_conf = tf.ConfigProto(
    intra_op_parallelism_threads=1,
    inter_op_parallelism_threads=1
)
sess = tf.Session(
    graph=tf.get_default_graph(),
    config=session_conf
)
K.set_session(sess)
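The snippet above uses the TensorFlow 1.x API (`tf.set_random_seed`, `ConfigProto`, `Session`). On TensorFlow 2.x the same idea can be sketched roughly like this (my adaptation, not the original answer's code; it assumes `tf.random.set_seed` and the `tf.config.threading` setters available in TF 2.x):

```python
import os
import random
import numpy as np

SEED = 1234
os.environ['PYTHONHASHSEED'] = str(SEED)  # fix Python's hash randomization
random.seed(SEED)                         # Python's built-in RNG
np.random.seed(SEED)                      # NumPy's RNG

import tensorflow as tf
tf.random.set_seed(SEED)  # TF 2.x replacement for tf.set_random_seed
# Single-threaded execution removes nondeterminism from thread scheduling
tf.config.threading.set_intra_op_parallelism_threads(1)
tf.config.threading.set_inter_op_parallelism_threads(1)
```

Note that the threading setters must be called before any op has executed, and that `PYTHONHASHSEED` really only takes effect if it is set before the interpreter starts.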
I used a version of Outcast's answer from here: Why can't I get reproducible results in Keras even though I set the random seeds?
import os
import random
import numpy as np
seed_value = 1
# 1. Set `PYTHONHASHSEED` environment variable at a fixed value
os.environ['PYTHONHASHSEED'] = str(seed_value)
# 2. Set `python` built-in pseudo-random generator at a fixed value
random.seed(seed_value)
# 3. Set `numpy` pseudo-random generator at a fixed value
np.random.seed(seed_value)
If that doesn't work, try setting the scikit-learn global seed: https://github.com/scikit-learn/scikit-learn/issues/10237
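To illustrate what pinning the scikit-learn seed means (a hypothetical example, not from the thread): any estimator or helper that shuffles or samples takes a `random_state` argument, and fixing it makes the output identical across calls:

```python
import numpy as np
from sklearn.model_selection import train_test_split

X = np.arange(20).reshape(10, 2)
y = np.arange(10)

# Same random_state -> the shuffle is identical on every call
split1 = train_test_split(X, y, test_size=0.3, random_state=0)
split2 = train_test_split(X, y, test_size=0.3, random_state=0)
assert all((p == q).all() for p, q in zip(split1, split2))
```

In this question's pipeline only `OneHotEncoder` and `StandardScaler` are used, which are deterministic, so scikit-learn is unlikely to be the culprit here; the tip matters once components like `train_test_split` or `KFold(shuffle=True)` enter the picture.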