
ValueError building a neural network with 2 outputs in Keras

I tried to build a network with a single input X (a 2-dimensional matrix of size Xa*Xb) and two outputs Y1 and Y2 (both 1-dimensional). Even though it isn't the case in the code posted below, Y1 is supposed to be a classifier that outputs a one-hot vector and Y2 is supposed to be used for regression (the original code raised the same error).

When training the network I get the following error:

ValueError: Shapes (None, None) and (None, 17, 29) are incompatible

Obviously, (None, 17, 29) translates to (None, size_Xa, size_Y1), and I don't understand why Xa and Y1 should be related (independently of Xb) in the first place.
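As far as I can tell, a Dense layer applied to a 3-D input only transforms the last axis, so the extra axis of size size_Xa is carried through to the output. Here is a standalone snippet (separate from my script) that reproduces the shape from the error message:

from keras.layers import Dense, Input
from keras.models import Model

# Illustrative only: a (17, 13) sample goes through the same Dense stack as in my model
inp = Input(shape=(17, 13))
head = Dense(29, activation='softmax')(Dense(128, activation='relu')(inp))
print(Model(inp, head).output_shape)   # (None, 17, 29) rather than (None, 29)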

Here is my code. I tried to reduce it to the minimum in order to make it easier to understand.

import numpy as np
from keras.layers import Dense, LSTM, Input
from keras.models import Model

def dataGenerator():
    while True:
        yield makeBatch()
def makeBatch():
    """generates a batch of artificial training data"""
    x_batch, y_batch = [], {}
    x_batch = np.random.rand(batch_size, size_Xa, size_Xb)
    #x_batch = np.random.rand(batch_size, size_Xa)
    y_batch['output1'] = np.random.rand(batch_size, size_Y1)
    y_batch['output2'] = np.random.rand(batch_size, size_Y2)
    return x_batch, y_batch

def generate_model():
    input_layer = Input(shape=(size_Xa, size_Xb))
    #input_layer = Input(shape=(size_Xa))
    common_branch = Dense(128, activation='relu')(input_layer)
    branch_1  = Dense(size_Y1, activation='softmax', name='output1')(common_branch)
    branch_2  = Dense(size_Y2, activation='relu',    name='output2')(common_branch)
    model = Model(inputs=input_layer,outputs=[branch_1,branch_2])

    losses = {"output1":"categorical_crossentropy", "output2":"mean_absolute_error"}
    model.compile(optimizer="adam",
                        loss=losses,
                        metrics=['accuracy'])
    return model

batch_size=5
size_Xa = 17
size_Xb = 13
size_Y2 = 100 
size_Y1 = 29

model = generate_model()

model.fit(  x=dataGenerator(),
            steps_per_epoch=50,
            epochs=15,
            validation_data=dataGenerator(), validation_steps=50, verbose=1)

If I uncomment the two commented lines in makeBatch and generate_model, the error disappears. So the model trains when the input X is 1-dimensional, but the error appears as soon as I change it to 2 dimensions (keeping everything else the same).
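Checking the shapes directly makes the mismatch visible before fit() is even called. This diagnostic sketch assumes the script above has already been run:

# Diagnostic sketch: compare the model's output shapes with what the generator yields
model = generate_model()
x, y = makeBatch()
print(model.output_shape)                      # [(None, 17, 29), (None, 17, 100)]
print(y['output1'].shape, y['output2'].shape)  # (5, 29) (5, 100)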

Is this related to the architecture with 2 outputs? I think there is something I'm missing here, any help is welcome.

I add the full error log for reference:

Epoch 1/15
Traceback (most recent call last):
  File "neuralnet_minimal.py", line 41, in <module>
    model.fit(  x=dataGenerator(),
  File "/path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/keras/engine/training.py", line 66, in _method_wrapper
    return method(self, *args, **kwargs)
  File "/path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/keras/engine/training.py", line 848, in fit
    tmp_logs = train_function(iterator)
  File "/path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/eager/def_function.py", line 580, in __call__
    result = self._call(*args, **kwds)
  File "/path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/eager/def_function.py", line 627, in _call
    self._initialize(args, kwds, add_initializers_to=initializers)
  File "/path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/eager/def_function.py", line 505, in _initialize
    self._stateful_fn._get_concrete_function_internal_garbage_collected(  # pylint: disable=protected-access
  File "/path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/eager/function.py", line 2446, in _get_concrete_function_internal_garbage_collected
    graph_function, _, _ = self._maybe_define_function(args, kwargs)
  File "/path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/eager/function.py", line 2777, in _maybe_define_function
    graph_function = self._create_graph_function(args, kwargs)
  File "/path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/eager/function.py", line 2657, in _create_graph_function
    func_graph_module.func_graph_from_py_func(
  File "/path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/framework/func_graph.py", line 981, in func_graph_from_py_func
    func_outputs = python_func(*func_args, **func_kwargs)
  File "/path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/eager/def_function.py", line 441, in wrapped_fn
    return weak_wrapped_fn().__wrapped__(*args, **kwds)
  File "/path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/framework/func_graph.py", line 968, in wrapper
    raise e.ag_error_metadata.to_exception(e)
ValueError: in user code:

    /path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/keras/engine/training.py:571 train_function  *
        outputs = self.distribute_strategy.run(
    /path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/distribute/distribute_lib.py:951 run  **
        return self._extended.call_for_each_replica(fn, args=args, kwargs=kwargs)
    /path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/distribute/distribute_lib.py:2290 call_for_each_replica
        return self._call_for_each_replica(fn, args, kwargs)
    /path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/distribute/distribute_lib.py:2649 _call_for_each_replica
        return fn(*args, **kwargs)
    /path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/keras/engine/training.py:532 train_step  **
        loss = self.compiled_loss(
    /path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/keras/engine/compile_utils.py:205 __call__
        loss_value = loss_obj(y_t, y_p, sample_weight=sw)
    /path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/keras/losses.py:143 __call__
        losses = self.call(y_true, y_pred)
    /path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/keras/losses.py:246 call
        return self.fn(y_true, y_pred, **self._fn_kwargs)
    /path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/keras/losses.py:1527 categorical_crossentropy
        return K.categorical_crossentropy(y_true, y_pred, from_logits=from_logits)
    /path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/keras/backend.py:4561 categorical_crossentropy
        target.shape.assert_is_compatible_with(output.shape)
    /path/of/my/project/venv/lib/python3.8/site-packages/tensorflow/python/framework/tensor_shape.py:1117 assert_is_compatible_with
        raise ValueError("Shapes %s and %s are incompatible" % (self, other))

    ValueError: Shapes (None, None) and (None, 17, 29) are incompatible

Strangely enough, the error disappears when I add a Flatten() layer before the network splits... It has to do with the shape of the network, but I still don't get the real reason behind all of this.
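For reference, this is roughly what the working version looks like (the name generate_model_flat is just for illustration; everything else matches the script above):

from keras.layers import Dense, Input, Flatten
from keras.models import Model

def generate_model_flat():
    input_layer = Input(shape=(size_Xa, size_Xb))
    # Flatten collapses (size_Xa, size_Xb) into a single axis, so the extra axis is gone
    flat = Flatten()(input_layer)
    common_branch = Dense(128, activation='relu')(flat)
    branch_1 = Dense(size_Y1, activation='softmax', name='output1')(common_branch)
    branch_2 = Dense(size_Y2, activation='relu',    name='output2')(common_branch)
    model = Model(inputs=input_layer, outputs=[branch_1, branch_2])
    model.compile(optimizer="adam",
                  loss={"output1": "categorical_crossentropy",
                        "output2": "mean_absolute_error"},
                  metrics=['accuracy'])
    return model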

I will mark this as the correct answer since it solves the problem, unless someone else posts something. Please tell me if this is not the right way to do it.
