ValueError: A `Concatenate` layer requires inputs with matching shapes except for the concat axis. Got inputs shapes: [(

小小草 (LV1), posted 2023-09-23 23:40
This error occurs because the shapes of `gru1out` and `gru2out` do not match, so they cannot be joined by the `concatenate` layer. You can convert `gru2out` to the same shape as `gru1out`, for example:

```python
from tensorflow.keras.layers import (Input, Conv1D, Dropout, Dense,
                                     RepeatVector, LayerNormalization,
                                     MultiHeadAttention, concatenate)
from tensorflow.keras.models import Model
# CuDNNGRU ships with TF 1.x / standalone Keras; on TF 2.x replace it
# with keras.layers.GRU, which selects the cuDNN kernel automatically.
from tensorflow.compat.v1.keras.layers import CuDNNGRU

def LSTNet(trainX1, trainX2, trainY, config):
    input1 = Input(shape=(trainX1.shape[1], trainX1.shape[2]))
    input2 = Input(shape=(trainX2.shape[1], trainX2.shape[2]))

    # It's a problem that I can't find any way to use the same Conv1D layer
    # to train the two inputs, so use two layers and copy the weights.
    conv1 = Conv1D(filters=48, kernel_size=6, strides=1, activation='relu')  # for input1
    conv2 = Conv1D(filters=48, kernel_size=6, strides=1, activation='relu')  # for input2

    conv1out = conv1(input1)
    conv2out = conv2(input2)
    # Copy weights only after both layers are built (calling an unbuilt
    # layer's get_weights() returns an empty list).
    conv2.set_weights(conv1.get_weights())  # at least use the same weights

    gru1out = CuDNNGRU(64, return_sequences=True)(conv1out)
    attention1_output = MultiHeadAttention(num_heads=8, key_dim=64)(gru1out, gru1out)
    attention1_output = LayerNormalization()(attention1_output + gru1out)
    gru1out = Dropout(config.dropout)(attention1_output)

    attention2_output = MultiHeadAttention(num_heads=8, key_dim=64)(conv2out, conv2out)
    attention2_output = LayerNormalization()(attention2_output + conv2out)
    gru2out = CuDNNGRU(64)(attention2_output)
    gru2out = RepeatVector(gru1out.shape[1])(gru2out)  # repeat gru2out to match the shape of gru1out
    gru2out = Dropout(config.dropout)(gru2out)

    gru_out = concatenate([gru1out, gru2out])
    # highway: use a Dense layer to simulate the AR autoregressive step
    output = Dense(trainY.shape[1])(gru_out)

    return Model(inputs=[input1, input2], outputs=output)
```
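To see why the original code fails, here is a minimal NumPy analogue of the shape fix (the shapes below, batch=5, timesteps=10, units=64, are illustrative assumptions, not taken from the question): `CuDNNGRU(64, return_sequences=True)` emits a 3-D tensor, while `CuDNNGRU(64)` emits a 2-D one, and joining them on the last axis only works after the 2-D output is repeated across the time axis, which is exactly what `RepeatVector` does.

```python
import numpy as np

# Shapes analogous to the Keras tensors (batch=5, timesteps=10, units=64).
gru1out = np.zeros((5, 10, 64))   # return_sequences=True -> 3-D
gru2out = np.zeros((5, 64))       # return_sequences=False -> 2-D

# Concatenating the raw tensors would fail: they differ in rank.
# RepeatVector's effect: tile the 2-D output along a new time axis.
gru2rep = np.repeat(gru2out[:, np.newaxis, :], gru1out.shape[1], axis=1)

merged = np.concatenate([gru1out, gru2rep], axis=-1)
print(merged.shape)  # (5, 10, 128)
```

All dimensions except the concatenation axis now agree, which is the condition the `Concatenate` layer checks.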
0 replies