I have the following dataset
username,itemname,value
"carl","socks",12.50
"john doe","shirts",30.00
...
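For reference, each CSV row parses into the ((username, itemname), value) structure used later in the input function. A minimal pure-Python sketch (illustrative only, not part of the TensorFlow pipeline):

```python
import csv
import io

# Illustrative only: parse rows of the CSV above into the
# ((username, itemname), value) structure used for training.
sample = '''username,itemname,value
"carl","socks",12.50
"john doe","shirts",30.00
'''

def parse_rows(text):
    reader = csv.DictReader(io.StringIO(text))
    return [((row["username"], row["itemname"]), float(row["value"]))
            for row in reader]

examples = parse_rows(sample)
```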
I also have the following vocabulary lookup files
usernames.txt
carl
john doe
bob smith
...
itemnames.txt
socks
shirts
shoes
...
I will be receiving the raw strings at prediction time; there is no way around that. To make training consistent with prediction, I am using tf.contrib.lookup:
import tensorflow as tf
user_lookup = tf.contrib.lookup.index_table_from_file(
vocabulary_file='usernames.txt'
)
item_lookup = tf.contrib.lookup.index_table_from_file(
vocabulary_file='itemnames.txt'
)
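index_table_from_file maps each vocabulary string to its zero-based line number in the file, and out-of-vocabulary strings to a default value (-1 unless num_oov_buckets is set). A pure-Python sketch of that mapping, assuming the usernames.txt contents shown above:

```python
# Illustrative pure-Python model of index_table_from_file's behavior:
# each vocabulary entry maps to its zero-based line number, and
# out-of-vocabulary strings map to a default of -1.
def build_index_table(lines, default=-1):
    table = {name: idx for idx, name in enumerate(lines)}
    return lambda key: table.get(key, default)

user_lookup_py = build_index_table(["carl", "john doe", "bob smith"])
```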
Now I have the following model defined using the Keras API:
import tensorflow as tf
user_input = tf.keras.layers.Input(shape=(1,), dtype=tf.int32)
item_input = tf.keras.layers.Input(shape=(1,), dtype=tf.int32)
user_embedding = tf.keras.layers.Embedding(input_dim=num_users, output_dim=10)(user_input)
item_embedding = tf.keras.layers.Embedding(input_dim=num_items, output_dim=10)(item_input)
...
output = ...
model = tf.keras.Model([user_input, item_input], output)
model.compile(...)
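num_users and num_items are not defined in the snippet; one common convention (an assumption here, not stated in the post) is to derive them from the vocabulary file line counts, so the embedding input_dim matches the lookup table size:

```python
# Hypothetical helper (not from the original post): derive the embedding
# input_dim values by counting non-empty lines in the vocabulary files.
def vocab_size(path):
    with open(path, encoding="utf-8") as f:
        return sum(1 for line in f if line.strip())

# num_users = vocab_size("usernames.txt")
# num_items = vocab_size("itemnames.txt")
```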
I am using tf.estimator for training and prediction. So my first instinct is to do the following:
my_estimator = tf.keras.estimator.model_to_estimator(keras_model=model)
tf.tables_initializer()  # note: this only creates the initializer op; nothing here ever runs it
def train_fn(dataset_iterator):
    (username, itemname), value = dataset_iterator.get_next()
    userid = user_lookup.lookup(username)
    itemid = item_lookup.lookup(itemname)
    return (userid, itemid), value
my_train_spec = tf.estimator.TrainSpec(
    input_fn=lambda: train_fn(train_data)  # input_fn must be a callable, not the result of calling it
)
my_eval_spec = tf.estimator.EvalSpec(
    input_fn=lambda: train_fn(validation_data)
)
tf.estimator.train_and_evaluate(
estimator=my_estimator,
train_spec=my_train_spec,
eval_spec=my_eval_spec
)
When I run this I get the following error:
ValueError: Tensor("Cast_2:0", shape=(), dtype=int32) must be from the same graph as Tensor("Item-Embedding-LMF/embeddings/Read/ReadVariableOp:0", shape=(429099, 10), dtype=float32, device=/job:ps/task:1).
Can anyone recommend a solution to this problem? Or maybe even a different approach to handling this lookup?
The lookup itself is mostly fine; the problem is that all of the variables must belong to the same graph. The Estimator builds its own graph when it runs, so tables and tensors created outside of it (as above) end up in a different graph. Create the lookup tables inside the input_fn (or model_fn), and consider building the model inside a variable scope, e.g.:
def model_fn(features, labels, mode):
    with tf.variable_scope('my_model', reuse=tf.AUTO_REUSE):
        # build the model here so every variable lives in this graph/scope
        ...
    return tf.estimator.EstimatorSpec(...)