
Can't use tf.keras.optimizers.Adam with tf.keras.models.Sequential

I am using Python 3 with conda and TensorFlow. With the following code I am trying to create a tf.keras.models.Sequential model and optimize it with tf.keras.optimizers.Adam, but I get the error shown below:

from tensorflow.python.keras.models import Sequential
from tensorflow.python.keras.datasets import mnist
from tensorflow.python.keras.optimizers import Adam
from tensorflow.python.keras.layers import Dense, Dropout
from siamese import triplet_loss  # custom triplet-loss module

model = Sequential()
model.add(Dense(units=100, input_shape=(784,), activation="relu"))
model.compile(loss=triplet_loss.TripletLoss.semihard, optimizer=Adam())  # this call raises the ValueError below

(train_x, train_y), (test_x, test_y) = mnist.load_data()
train_x = train_x.reshape((-1, 784)) / 255.0
print(train_x)

ValueError: optimizer must be an instance of tf.train.Optimizer, not a

I tried importing an optimizer from tf.train, but it does not seem to find anything to import...

My TensorFlow version is 1.12.

Thanks

It worked after changing the code to:

import tensorflow as tf  # needed for tf.train.AdamOptimizer

model = Sequential()
model.add(Dense(units=100, input_shape=(784,), activation="relu"))
model.compile(loss=triplet_loss.TripletLoss.semihard, optimizer=tf.train.AdamOptimizer(learning_rate=0.005))
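
For reference, here is a minimal self-contained sketch of the working TF 1.12 setup. The built-in sparse_categorical_crossentropy loss and the extra 10-way softmax layer are stand-ins, since the custom siamese.triplet_loss module is not shown in the question:

import tensorflow as tf
from tensorflow.python.keras.models import Sequential
from tensorflow.python.keras.layers import Dense
from tensorflow.python.keras.datasets import mnist

# Same hidden layer as above, plus a 10-way softmax output so the
# placeholder loss matches the MNIST labels.
model = Sequential()
model.add(Dense(units=100, input_shape=(784,), activation="relu"))
model.add(Dense(units=10, activation="softmax"))

# tf.train.AdamOptimizer is an instance of tf.train.Optimizer, so it
# passes the check that rejected the Keras Adam class.
model.compile(loss="sparse_categorical_crossentropy",
              optimizer=tf.train.AdamOptimizer(learning_rate=0.005),
              metrics=["accuracy"])

(train_x, train_y), (test_x, test_y) = mnist.load_data()
train_x = train_x.reshape((-1, 784)) / 255.0

model.fit(train_x, train_y, epochs=1, batch_size=128)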

For TensorFlow 2.x, this will work:

import tensorflow as tf

model = Sequential()
model.add(Dense(units=100, input_shape=(784,), activation="relu"))
model.compile(loss=triplet_loss.TripletLoss.semihard, optimizer=tf.compat.v1.train.AdamOptimizer(learning_rate=0.005))
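
In TF 2.x the Keras-native optimizer class can also be passed to compile directly; a minimal sketch, assuming the same single-layer model and using a built-in loss as a stand-in for the custom triplet loss:

import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential()
model.add(Dense(units=100, input_shape=(784,), activation="relu"))
# tf.keras.optimizers.Adam works directly with model.compile in TF 2.x.
model.compile(loss="mse",  # stand-in for triplet_loss.TripletLoss.semihard
              optimizer=tf.keras.optimizers.Adam(learning_rate=0.005))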
