This article covers how to implement the Heaviside step function in TensorFlow; hopefully it is a useful reference for anyone facing the same problem.

Problem Description

I want to create a Heaviside step function in TensorFlow. Since the Heaviside function is not differentiable, I also need to choose a derivative approximation and define a custom gradient, so the full implementation looks like this:

import tensorflow as tf


@tf.RegisterGradient("HeavisideGrad")
def _heaviside_grad(unused_op: tf.Operation, grad: tf.Tensor):
    x = unused_op.inputs[0]
    # During backpropagation heaviside behaves like sigmoid
    return tf.sigmoid(x) * (1 - tf.sigmoid(x)) * grad


def heaviside(x: tf.Tensor, g: tf.Graph = None):
    if g is None:
        # Look up the default graph at call time; a default argument of
        # tf.get_default_graph() would be evaluated once, at definition time.
        g = tf.get_default_graph()
    custom_grads = {
        "Sign": "HeavisideGrad"
    }
    with g.gradient_override_map(custom_grads):
        # TODO: heaviside(0) currently returns 0. We need heaviside(0) = 1
        sign = tf.sign(x)
        # tf.stop_gradient is needed to exclude tf.maximum from derivative
        step_func = sign + tf.stop_gradient(tf.maximum(0.0, sign) - sign)
        return step_func

There is one caveat in my implementation: tf.sign(0) returns zero, so heaviside(0) also returns zero, but I want heaviside(0) to return 1. How can I achieve that behavior?

Recommended Answer

A very hacky way would be to use

1 - max(0.0, sign(-x)) 

as your step function instead of

max(0.0, sign(x))
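The flip handles x = 0 because sign(0) is 0, so max(0.0, sign(-0)) is 0 and 1 - 0 = 1. Here is a quick check of both formulations in NumPy, whose sign and maximum agree with tf.sign and tf.maximum on these inputs; this sketches only the forward pass, not the gradient override:

```python
import numpy as np

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])

# Original formulation: sign(0) == 0, so heaviside(0) comes out as 0.
original = np.maximum(0.0, np.sign(x))

# Flipped formulation: at x == 0, sign(-x) == 0, so 1 - max(0, 0) == 1.
flipped = 1.0 - np.maximum(0.0, np.sign(-x))

print(original)  # [0. 0. 0. 1. 1.]
print(flipped)   # [0. 0. 1. 1. 1.]
```

The two arrays differ only at x = 0, which is exactly the case the question is about.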

Another option would be to use greater_equal and cast the result to your desired type, and override its gradient with the sigmoid override you already have.
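A forward-pass sketch of that option, again in NumPy for illustration (np.greater_equal mirrors tf.greater_equal; wiring the sigmoid gradient override onto the TF op is left as in the answer's suggestion):

```python
import numpy as np

def heaviside_forward(x):
    # x >= 0 yields booleans; the cast turns them into 1.0 where x >= 0
    # and 0.0 elsewhere, so heaviside(0) == 1 by construction.
    return np.greater_equal(x, 0.0).astype(np.float64)

print(heaviside_forward(np.array([-1.0, 0.0, 3.0])))  # [0. 1. 1.]
```

In the TF1 graph this would be tf.cast(tf.greater_equal(x, 0.0), x.dtype) computed inside the gradient_override_map context from the question.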
