Today I came across a new activation function that Google Brain proposed in 2017.
It is called Swish; paper: https://arxiv.org/abs/1710.05941v1
In PyTorch it can be written like this:

import torch

def relu_fn(x):
    """Swish activation function"""
    return x * torch.sigmoid(x)
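A quick sanity check of the function above (a minimal sketch; the tensor values are arbitrary and chosen only for illustration):

x = torch.tensor([-2.0, 0.0, 2.0])
print(relu_fn(x))  # element-wise x * sigmoid(x), e.g. 2 * sigmoid(2) ≈ 1.76, -2 * sigmoid(-2) ≈ -0.24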
From the paper's abstract: Swish, which is simply f(x) = x · sigmoid(x). Our experiments show that Swish tends to work better than ReLU on deeper models across a number of challenging datasets.
For example, simply replacing ReLUs with Swish units improves top-1 classification accuracy on ImageNet by 0.9% for Mobile NASNet-A and 0.6% for Inception-ResNet-v2.
The simplicity of Swish and its similarity to ReLU make it easy for practitioners to replace ReLUs with Swish units in any neural network.
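Since the paper presents Swish as a drop-in replacement for ReLU, here is a minimal sketch of what that swap might look like in a small PyTorch model (the Swish module, the toy MLP, and the layer sizes are my own illustrative assumptions, not code from the paper):

import torch
import torch.nn as nn

class Swish(nn.Module):
    """Swish activation: x * sigmoid(x) (illustrative wrapper module)."""
    def forward(self, x):
        return x * torch.sigmoid(x)

# A toy MLP where nn.ReLU() has simply been replaced by Swish(); sizes are arbitrary.
model = nn.Sequential(
    nn.Linear(784, 128),
    Swish(),          # was: nn.ReLU()
    nn.Linear(128, 10),
)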
Someone else's write-up:
https://blog.csdn.net/wydbyxr/article/details/84615522