Problem Description
I've been looking at the tfjs examples and trying to learn about seq2seq models. In the process, I stumbled upon the date-conversion-attention example.
It's a great example, but what kind of attention mechanism is being used? There is no information in the README file. Can somebody point me to the paper that describes the attention used here?
Recommended Answer
I believe I found the answer. The attention model used in date-conversion-attention uses the dot-product alignment score, and it's described in Effective Approaches to Attention-based Neural Machine Translation. Link: https://arxiv.org/pdf/1508.04025.pdf