Question
I am working on implementing Hinton's Knowledge Distillation paper. The first step is to store the soft targets of a "cumbersome model" with a higher temperature (i.e. I don't need to train the network, just do a forward pass per image and store the soft targets at a temperature T).
Is there a way I can get the output of AlexNet or GoogLeNet as soft targets, but with a different temperature? I need to modify the softmax to pi = exp(zi/T) / sum_j(exp(zj/T)), i.e. divide the outputs of the final fully connected layer by the temperature T. I only need this for the forward pass (not for training).
Answer
I believe there are four options to solve this problem:
1. Implement your own Softmax layer with a temperature parameter. It should be quite straightforward to modify the code of softmax_layer.cpp to take the "temperature" T into account. You may also need to tweak caffe.proto to allow parsing a Softmax layer with the extra parameter.
2. Implement the layer as a python layer; a forward-only sketch follows.
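Here is a minimal sketch of such a python layer (forward pass only), assuming Caffe was built with WITH_PYTHON_LAYER enabled; the class name TemperatureSoftmax and passing T through param_str are my own choices for illustration, not prescribed by the answer:
import caffe
import numpy as np

class TemperatureSoftmax(caffe.Layer):
    """Softmax with a fixed temperature T, read from param_str, e.g. in the prototxt:
    python_param { module: "temp_softmax" layer: "TemperatureSoftmax" param_str: "5.0" }"""

    def setup(self, bottom, top):
        self.T = float(self.param_str)

    def reshape(self, bottom, top):
        top[0].reshape(*bottom[0].data.shape)

    def forward(self, bottom, top):
        z = bottom[0].data / self.T            # divide the logits by the temperature
        z = z - z.max(axis=1, keepdims=True)   # subtract the max for numerical stability
        e = np.exp(z)
        top[0].data[...] = e / e.sum(axis=1, keepdims=True)

    def backward(self, top, propagate_down, bottom):
        pass  # forward pass only, as the question requires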
3. If you only need a forward pass, i.e. "extracting features", then you can simply output as features the "top" of the layer just before the softmax layer and do the softmax with temperature outside caffe altogether, as in the sketch below.
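For example, a minimal numpy sketch of this approach; the file names, the blob name "fc8" (the last fully connected layer in the BVLC AlexNet) and T = 5.0 are placeholder assumptions to adapt to your own network:
import caffe
import numpy as np

def softmax_with_temperature(z, T):
    # pi = exp(zi/T) / sum_j(exp(zj/T)), computed row-wise over the batch
    z = z / T
    z = z - z.max(axis=1, keepdims=True)  # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Run the pretrained net forward and grab the logits zi of the last fully
# connected layer, then apply the temperature softmax outside caffe.
# (Fill net.blobs["data"] with a preprocessed image batch before forward().)
net = caffe.Net("deploy.prototxt", "model.caffemodel", caffe.TEST)
net.forward()
z = net.blobs["fc8"].data.copy()  # logits, shape (batch_size, num_classes)
soft_targets = softmax_with_temperature(z, T=5.0)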
4. You can add a Scale layer just before the top Softmax layer:
layer {
  type: "Scale"
  name: "temperature"
  bottom: "zi"
  top: "zi/T"
  scale_param {
    filler { type: "constant" value: 0.5 }  # set this to 1/T, e.g. 0.5 for T = 2
  }
  param { lr_mult: 0 decay_mult: 0 }  # make sure the temperature stays fixed
}
layer {
  type: "Softmax"
  name: "prob"
  bottom: "zi/T"
  top: "pi"
}
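The Scale layer multiplies the logits zi by the constant 1/T, so the plain Softmax on top computes exactly pi = exp(zi/T) / sum_j(exp(zj/T)); lr_mult: 0 and decay_mult: 0 ensure the temperature is not altered if the model is ever fine-tuned.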