Input 0 is incompatible with layer gru_13: expected ndim=3, found ndim=2

Problem description

I want to use 3 CNN layers with 3 GRU layers. Here is the architecture:

layer_a = Input(shape=(120,), dtype='float32', name='main')
layer_b = Input(shape=(9,), dtype='float32', name='site')
layer_c = Input(shape=(4,), dtype='float32', name='access')
# ... (the CNN/GRU layers between the inputs and layer_f, and the custom
# smape_error loss, were defined elsewhere in the original post)
model = Model(inputs=[layer_a, layer_b, layer_c], outputs=[layer_f])
model.compile(optimizer='adam', loss=smape_error)

But when I tried to fit it to my data, it produced the error from the title (Input 0 is incompatible with layer gru_13: expected ndim=3, found ndim=2). I'm not sure what went wrong.

Recommended answer

GRU layers expect input of shape (batch_size, seq_len, dim_per_seq) and return (batch_size, number_of_neurons), so in order to stack two GRU layers, the first GRU layer needs to set the parameter return_sequences=True.
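A minimal sketch of the stacking rule above (the layer sizes and sequence length here are assumptions for illustration, not from the post):

```python
from tensorflow.keras.layers import Input, GRU
from tensorflow.keras.models import Model

# Input must be 3-D: (batch_size, seq_len, dim_per_seq)
inp = Input(shape=(120, 1))

# return_sequences=True keeps the full sequence: (batch, 120, 32),
# which is the 3-D input the next GRU requires.
x = GRU(32, return_sequences=True)(inp)

# The second GRU returns only its last state: (batch, 16).
out = GRU(16)(x)

model = Model(inputs=inp, outputs=out)
```

Without return_sequences=True on the first layer, its 2-D output (batch, 32) would trigger exactly the "expected ndim=3, found ndim=2" error from the question.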

Also, when building Keras models it's always a good idea to debug with model.summary() (build just the part of the model before the error appears). Often the problem lies in an unexpected shape.

Your architecture is not suited for using GRU layers at all. First, you can't flatten the tensors, because that would destroy the sequence-like structure and make concatenating the layers impossible. Instead, you could conv and pool your three layers layer_t, layer_tt and layer_ttt down to the same second dimension (which should be bigger than 1). That way you can concatenate along the last dimension and get a tensor with a sequence-like shape to feed into a GRU layer.
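The suggested fix could be sketched as follows. This is one possible reading of the answer, using the three input shapes from the question; the filter counts, pool sizes, and the final Dense layer are assumptions, and the branch helper is a hypothetical name:

```python
from tensorflow.keras.layers import (Input, Reshape, Conv1D, MaxPooling1D,
                                     concatenate, GRU, Dense)
from tensorflow.keras.models import Model

layer_a = Input(shape=(120,), dtype='float32', name='main')
layer_b = Input(shape=(9,), dtype='float32', name='site')
layer_c = Input(shape=(4,), dtype='float32', name='access')

def branch(x, length, pool):
    """Reshape a 1-D input to (length, 1), then conv and pool it."""
    x = Reshape((length, 1))(x)                            # (batch, length, 1)
    x = Conv1D(16, 3, padding='same', activation='relu')(x)
    return MaxPooling1D(pool)(x)                           # (batch, length // pool, 16)

# Pool each branch down to the SAME second dimension (here 4, which is > 1).
a = branch(layer_a, 120, 30)   # (batch, 4, 16)
b = branch(layer_b, 9, 2)      # (batch, 4, 16)
c = branch(layer_c, 4, 1)      # (batch, 4, 16)

# Concatenate along the last dimension -> a sequence-like (batch, 4, 48)
merged = concatenate([a, b, c], axis=-1)

x = GRU(32)(merged)            # 3-D input, as the GRU requires
layer_f = Dense(1)(x)

model = Model(inputs=[layer_a, layer_b, layer_c], outputs=[layer_f])
model.summary()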

