Question
In the following TensorFlow function, we must feed in the activations of the artificial neurons in the final layer. That much I understand. But I don't understand why it is called logits. Isn't a logit a mathematical function?
loss_function = tf.nn.softmax_cross_entropy_with_logits(
    logits=last_layer,
    labels=target_output
)
Answer
Logits is an overloaded term which can mean many different things:
In math, logit is a function that maps probabilities ([0, 1]) to R ((-inf, inf)).
A probability of 0.5 corresponds to a logit of 0. Negative logits correspond to probabilities less than 0.5, positive logits to probabilities greater than 0.5.
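As a quick check of that mapping, here is a minimal Python sketch (the logit helper and the sample values are my own, purely for illustration):

import math

def logit(p):
    # log-odds: maps a probability in (0, 1) to (-inf, inf)
    return math.log(p / (1.0 - p))

print(logit(0.5))   # 0.0
print(logit(0.25))  # about -1.10 (negative, since 0.25 < 0.5)
print(logit(0.75))  # about +1.10 (positive, since 0.75 > 0.5)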
In ML, logits also sometimes refer to the element-wise inverse of the sigmoid function.
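In that sense, the raw (un-normalized) outputs of the last layer are treated as logits, and softmax/sigmoid turns them back into probabilities. A minimal sketch, assuming TensorFlow 2.x; the last_layer and target_output values here are made up for illustration:

import tensorflow as tf

# Hypothetical raw outputs of a final dense layer: the "logits".
last_layer = tf.constant([[2.0, -1.0, 0.5]])
target_output = tf.constant([[1.0, 0.0, 0.0]])

# The op applies softmax to the logits internally, then computes cross-entropy,
# so last_layer must NOT already be passed through softmax.
loss = tf.nn.softmax_cross_entropy_with_logits(labels=target_output, logits=last_layer)

# Sigmoid maps a logit back to a probability, and the logit (log-odds)
# formula inverts it element-wise.
p = tf.sigmoid(last_layer)                  # logits -> probabilities
recovered = tf.math.log(p / (1.0 - p))      # probabilities -> logits again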