Does tensorflow re-initialize weights when training in a for loop?
I am training a model in a for loop because... I can. I know there are alternatives such as the tf.Dataset API and generators that stream data from disk, but my question is specifically about the loop itself.
Does TF re-initialize the model's weights at the start of each loop iteration? Or does initialization happen only once, when the model is first instantiated?
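To make the question concrete, here is a toy numpy sketch (plain gradient descent, not TF; make_model and fit are hypothetical stand-ins, not Keras calls) contrasting the two behaviors I'm asking about: weights persisting across iterations versus being re-initialized on every pass:

```python
import numpy as np

def make_model(seed=0):
    # "Initialization" happens once, when the model object is created.
    rng = np.random.default_rng(seed)
    return {"w": rng.normal()}

def fit(model, steps=5, lr=0.1, target=3.0):
    # Toy gradient descent on (w - target)**2; mutates the model in place.
    for _ in range(steps):
        model["w"] -= lr * 2 * (model["w"] - target)
    return model

# Case 1: model built once, outside the loop -> weights carry over,
# and each fit call resumes from the previous iteration's weights.
model = make_model()
carried = [fit(model)["w"] for _ in range(3)]
assert carried[0] < carried[1] < carried[2]

# Case 2: model rebuilt inside the loop -> every iteration restarts
# from the same freshly initialized weights.
restarted = [fit(make_model())["w"] for _ in range(3)]
assert restarted[0] == restarted[1] == restarted[2]
```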
Edit:
for msn in LIST:
    data = pd.read_parquet(
        "03 - Data",
        engine='pyarrow')
    data = data[column_order]
    data.rename(columns={"Flight_Id_Int": "Flight_Id"}, inplace=True)

    """ DATA PREPARATION AND FORMATTING """
    data_clean = clean_and_prepare(data, SEQ_LEN, input_type=model_type, smooth=True)

    # To keep the chronological order of flights we don't randomly shuffle
    train_idx = np.arange(0, int(len(data_clean) * 0.9))
    test_idx = np.arange(int(len(data_clean) * 0.9), len(data_clean))

    train_df = tf.data.Dataset.from_tensor_slices(
        (data_clean[train_idx], data_clean[train_idx])
    ).batch(BATCH_SIZE)
    test_df = tf.data.Dataset.from_tensor_slices(
        (data_clean[test_idx], data_clean[test_idx])
    ).batch(BATCH_SIZE)

    """ MODEL TRAINING """
    history = model.fit(train_df,
                        epochs=EPOCHS,
                        validation_data=test_df,
                        callbacks=[tf.keras.callbacks.EarlyStopping(
                            monitor="val_loss",
                            patience=15,
                            mode="min",
                            restore_best_weights=True)])

    plot_train_history(history, "Autoencoder {0} - MSN: {1}".format(model_type, msn))