This article describes how to handle multiple outputs in Keras. It should serve as a useful reference for solving this kind of problem; readers who need it can follow along below.

Problem Description

I have a problem which deals with predicting two outputs when given a vector of predictors. Assume that a predictor vector looks like x1, y1, att1, att2, ..., attn, where x1, y1 are coordinates and the att's are other attributes attached to the occurrence of the x1, y1 coordinates. Based on this predictor set I want to predict x2, y2. This is a time series problem, which I am trying to solve using multiple regression. My question is: how do I set up Keras so that it gives me 2 outputs in the final layer? I have already solved a simple regression problem in Keras, and the code is available in my github.
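
In other words, suppose there are 8 additional attributes, i.e. 10 predictor values per sample (the same size assumed in the answer below). The training data could then be laid out roughly like this (a hypothetical sketch with made-up array names and random values, only to make the shapes concrete; it is not part of the original question):

import numpy as np

# each row: x1, y1, att1, ..., att8 -> 10 predictor values per sample
X = np.random.rand(1000, 10)

# targets: the next coordinates x2 and y2, kept as two separate arrays
# so they can later be fed to a model with two output layers
y_x2 = np.random.rand(1000, 1)
y_y2 = np.random.rand(1000, 1)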

Recommended Answer

from keras.models import Model
from keras.layers import *

#inp is a "tensor", that can be passed when calling other layers to produce an output
inp = Input((10,)) #supposing you have ten numeric values as input


#here, SomeLayer() is defining a layer,
#and calling it with (inp) produces the output tensor x
x = SomeLayer(blablabla)(inp)
x = SomeOtherLayer(blablabla)(x) #here, I just replace x, because this intermediate output is not interesting to keep


#here, I want to keep the two different outputs for defining the model
#notice that both left and right are called with the same input x, creating a fork
out1 = LeftSideLastLayer(blablabla)(x)
out2 = RightSideLastLayer(blablabla)(x)


#here, you define which path you will follow in the graph you've drawn with layers
#notice the two outputs passed in a list, telling the model I want it to have two outputs.
model = Model(inp, [out1,out2])
model.compile(optimizer = ...., loss = ....) #loss can be one for both sides or a list with different loss functions for out1 and out2

model.fit(inputData,[outputYLeft, outputYRight], epochs=..., batch_size=...)
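
The layer names above are placeholders, so the snippet is not runnable as written. A minimal concrete sketch might look like the following, assuming Dense layers, a mean-squared-error loss on each output, and the dummy X, y_x2, y_y2 arrays from the sketch in the question section; these choices are assumptions and not part of the original answer:

from keras.models import Model
from keras.layers import Input, Dense

inp = Input((10,))                          # ten numeric predictors per sample
x = Dense(64, activation='relu')(inp)       # shared hidden layers
x = Dense(32, activation='relu')(x)

out1 = Dense(1, name='x2')(x)               # first output head, predicts x2
out2 = Dense(1, name='y2')(x)               # second output head, predicts y2

model = Model(inp, [out1, out2])
model.compile(optimizer='adam', loss=['mse', 'mse'])   # one loss per output

model.fit(X, [y_x2, y_y2], epochs=10, batch_size=32)

If the two outputs should contribute unequally to training, compile also accepts a loss_weights list alongside the per-output losses.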

This concludes the article on multiple outputs in Keras. We hope the recommended answer is helpful, and we appreciate your continued support!
