
On Keras with TensorFlow backend, fitting an LSTM and some dense layers in parallel on different fractions of the input

I am working on a regression forecast where I have some complex 3D sequences and some features explaining key characteristics of the sequences. They are held in two matrices of these shapes:

X1.shape, X2.shape
((9000, 300, 3), (9000, 106))
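
For concreteness, here is a minimal sketch of arrays with those shapes, using purely synthetic NumPy data; the target y (one regression value per sample) is an assumption, since the question does not show it:

import numpy as np

X1 = np.random.rand(9000, 300, 3)   # 9000 sequences, 300 timesteps, 3 channels
X2 = np.random.rand(9000, 106)      # 9000 feature vectors with 106 entries each
y = np.random.rand(9000)            # one regression target per sample (assumed)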

I want to feed them to a Model instance where the X1 matrix is handled by an LSTM and the X2 matrix by a couple of dense layers. My plan is to merge them before the output layer.

I was planning to train with:

model.fit(zip(X1, X2), y, batch_size=BATCH, epochs=EPOCHS, validation_split=0.2)

How do I build the model so that it receives the two matrices and deals with them separately?

At the moment I just have my standard LSTM-only model:

from keras.layers import Input, LSTM, GlobalMaxPool1D, Dense
from keras.models import Model

def model(sample_size=300, axis=3):
    inp = Input(shape=(sample_size, axis))
    x = LSTM(50, return_sequences=True)(inp)
    x = GlobalMaxPool1D()(x)  # instantiate the pooling layer, then call it
    x = Dense(1)(x)
    model = Model(inputs=inp, outputs=x)
    model.compile(loss='mean_squared_error', optimizer='adam',
                  metrics=['mae'])
    return model
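
As a quick sanity check, the model above can be instantiated and inspected (a minimal sketch; m is just a throwaway local name):

m = model()
m.summary()  # Input(300, 3) -> LSTM(50) -> GlobalMaxPool1D -> Dense(1)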

I think this should work:

from keras.layers import Input, LSTM, GlobalMaxPool1D, Dense, Concatenate
from keras.models import Model

# First input: the 3D sequences, handled by the LSTM branch
input1 = Input(shape=(300, 3))
x = LSTM(50, return_sequences=True)(input1)
x = GlobalMaxPool1D()(x)
x = Dense(n)(x)  # n: chosen width of each branch's output

# Second input: the feature vectors, handled by a dense layer
input2 = Input(shape=(106,))
y = Dense(n)(input2)

# Merge the two branches before the output layer
merged = Concatenate()([x, y])
merged = Dense(1)(merged)

# Define the model with two inputs
model = Model(inputs=[input1, input2], outputs=merged)

Giving both branches the same output width n keeps them balanced before merging; strictly speaking, Concatenate does not require equal widths, though an element-wise merge such as Add would. Then you can pass a list of inputs and Keras will route each array to the corresponding input:

model.fit([X1,X2],....)
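
For completeness, the merged model still needs to be compiled before fitting. A minimal sketch reusing the loss, optimizer, and fit settings from the question above (BATCH and EPOCHS are the question's own placeholders):

model.compile(loss='mean_squared_error', optimizer='adam', metrics=['mae'])
model.fit([X1, X2], y, batch_size=BATCH, epochs=EPOCHS, validation_split=0.2)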
