
Tensorflow-Keras vector normalise output layer

My model outputs a direction vector in 3d space, so I do not care about the magnitude of the vector. How can I vector normalise the output layer so that the loss function doesn't care about the magnitude either?

Model:

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential(
    [
        keras.Input(shape=(17,), dtype=np.float64),
        layers.Dense(9, activation="relu"),
        layers.Dense(9, activation="relu"),
        layers.Dense(9, activation="relu"),
        layers.Dense(9, activation="relu"),
        layers.Dense(3),  # output layer I want to vector normalise
    ]
)
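One way to normalise the output directly (a minimal sketch, not from the original post) is to append a `Lambda` layer that applies `tf.math.l2_normalize`, so every predicted vector is rescaled to unit length before it reaches the loss:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential(
    [
        keras.Input(shape=(17,)),
        layers.Dense(9, activation="relu"),
        layers.Dense(3),
        # rescale each output vector to unit L2 norm along the last axis
        layers.Lambda(lambda x: tf.math.l2_normalize(x, axis=-1)),
    ]
)

out = model.predict(np.random.rand(4, 17))
# each row of `out` now has (approximately) unit L2 norm
print(np.linalg.norm(out, axis=-1))
```

Note that if the raw output happens to be exactly the zero vector, `l2_normalize` returns zeros rather than dividing by zero.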

Alternatively, would it be possible to specify that the loss function should only consider the angle between the vectors as loss?

Thank you.

I found what I was looking for:

tf.keras.losses.CosineSimilarity
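This loss depends only on the angle between prediction and target, not on their magnitudes. A minimal NumPy sketch of the per-sample value it computes (the helper name here is illustrative, not part of Keras): both vectors are L2-normalised, then the loss is the negative dot product, so -1 means identical direction, 0 orthogonal, and 1 opposite:

```python
import numpy as np

def cosine_similarity_loss(y_true, y_pred, eps=1e-12):
    # normalise both vectors, then return -cos(angle) per sample
    y_true = y_true / (np.linalg.norm(y_true, axis=-1, keepdims=True) + eps)
    y_pred = y_pred / (np.linalg.norm(y_pred, axis=-1, keepdims=True) + eps)
    return -np.sum(y_true * y_pred, axis=-1)

# a vector and a scaled copy point the same way, so the magnitude
# difference is ignored and the loss is -1
print(cosine_similarity_loss(np.array([[1.0, 0.0, 0.0]]),
                             np.array([[5.0, 0.0, 0.0]])))
```

In Keras you would simply pass the built-in loss when compiling, e.g. `model.compile(optimizer="adam", loss=tf.keras.losses.CosineSimilarity(axis=-1))`.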
