Does tensorflow re-initialize weights when training in a for loop?
I am training a model inside a for loop, because... I can. I know there are alternatives such as the tf.Dataset API and generators for streaming data from disk, but my question is specifically about the loop.
Does TF re-initialize the model's weights at the start of each loop iteration, or does initialization only happen once, when the model is first instantiated?
Edit:
for msn in LIST:
    data = pd.read_parquet(
        "03 - Data",
        engine='pyarrow')
    data = data[column_order]
    data.rename(columns={"Flight_Id_Int": "Flight_Id"}, inplace=True)

    """ DATA PREPARATION AND FORMATTING """
    data_clean = clean_and_prepare(data, SEQ_LEN, input_type=model_type, smooth=True)

    # To keep the chronological order of flights we don't randomly shuffle
    train_idx = np.arange(0, int(len(data_clean)*0.9))
    test_idx = np.arange(int(len(data_clean)*0.9), len(data_clean))

    train_df = tf.data.Dataset.from_tensor_slices(
        (data_clean[train_idx], data_clean[train_idx])
    ).batch(BATCH_SIZE)
    test_df = tf.data.Dataset.from_tensor_slices(
        (data_clean[test_idx], data_clean[test_idx])
    ).batch(BATCH_SIZE)

    """ MODEL TRAINING """
    history = model.fit(train_df,
                        epochs=EPOCHS,
                        validation_data=test_df,
                        callbacks=[tf.keras.callbacks.EarlyStopping(
                            monitor="val_loss",
                            patience=15,
                            mode="min",
                            restore_best_weights=True)])
    plot_train_history(history, "Autoencoder {0} - MSN: {1}".format(model_type, msn))
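The behaviour being asked about can be checked directly. Below is a minimal, self-contained sketch (toy random data and a tiny model, both hypothetical stand-ins for the code above) that builds a Keras model once, calls `fit()` several times in a loop, and compares the weights before and after. Since the model is only instantiated once, its variables are initialized once; repeated `fit()` calls keep training from the current weights rather than resetting them.

```python
import numpy as np
import tensorflow as tf

# Build the model ONCE, outside the loop -- this is when weights
# are initialized. (Toy architecture, stand-in for the real model.)
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(4, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Snapshot the initial weights for comparison.
initial = [w.copy() for w in model.get_weights()]

# Stand-in for `for msn in LIST:` -- each iteration trains on new data.
for _ in range(3):
    x = np.random.rand(32, 8).astype("float32")
    y = np.random.rand(32, 1).astype("float32")
    model.fit(x, y, epochs=1, verbose=0)

after = model.get_weights()
# If TF re-initialized per iteration we would be back at `initial`;
# instead the weights have moved, i.e. training accumulated.
changed = any(not np.allclose(a, b) for a, b in zip(initial, after))
print(changed)
```

If you instead wanted a fresh model per iteration (e.g. one independent autoencoder per MSN), you would have to rebuild and recompile the model inside the loop, or reset its weights explicitly with `model.set_weights(initial)`.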