Getting error while using my custom embedding layer in Tensorflow 2.0
I've made my custom embedding layer, but I'm getting an error when testing it. Below is my custom embedding layer:
import tensorflow as tf
from tensorflow.keras.layers import Layer

class EndTokenLayer(Layer):
    def __init__(self, embedding_dim=128, **kwargs):
        super(EndTokenLayer, self).__init__(**kwargs)
        self.end_token_embedding = tf.Variable(
            initial_value=tf.random.uniform(shape=(embedding_dim,)),
            trainable=True)

    def call(self, inputs):
        # Broadcast the learned end-token vector to (batch_size, 1, embedding_dim)
        end_token = tf.tile(
            tf.reshape(self.end_token_embedding,
                       shape=(1, 1, self.end_token_embedding.shape[0])),
            [tf.shape(inputs)[0], 1, 1])
        # Append it along the sequence axis
        return tf.keras.layers.concatenate([inputs, end_token], axis=1)
But when I test it on my train_dataset (built from tensor slices), where one batch has shapes x = (16, 13, 128) and y = (16, 14):
temp = EndTokenLayer()
print(temp(inputs = train.take(1)))
Error log:

ValueError: Attempt to convert a value () with an unsupported type () to a Tensor.
train.take(1) will give you a new dataset with a single element, not the first element of the dataset. Maybe you want something like this:
temp = EndTokenLayer()
for elem in train.take(1):
    print(temp(inputs=elem))
Or just:
temp = EndTokenLayer()
print(temp(inputs=next(iter(train))))