What attention model is used in tfjs-examples/date-conversion-attention?

Problem description

I've been looking at tfjs examples and trying to learn about seq2seq models. During the process, I've stumbled upon the date-conversion-attention example.

It's a great example, but what kind of attention mechanism is used in it? There is no information in the README file. Can somebody point me to the paper that describes the attention being used here?

Link to the attention section: https://github.com/tensorflow/tfjs-examples/blob/908ee32750ba750a14d15caeb53115e2d3dda2b3/date-conversion-attention/model.js#L102-L119

Recommended answer

I believe I found the answer. The attention model used in date-conversion-attention uses the dot-product alignment score and is described in Effective Approaches to Attention-based Neural Machine Translation. Link: https://arxiv.org/pdf/1508.04025.pdf
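
In that formulation (Luong et al.), the alignment score between a decoder hidden state h_t and an encoder hidden state h̄_s is simply their dot product, score(h_t, h̄_s) = h_t · h̄_s; the scores are normalized with a softmax over the encoder time steps and used to form a context vector. As a rough illustration only (not the example's exact code), here is a minimal sketch of that dot-product attention using @tensorflow/tfjs ops; the tensor names and shapes are assumptions:

```js
// Minimal sketch of Luong-style dot-product attention in @tensorflow/tfjs.
// Assumed shapes: decoderOutputs [batch, Td, units], encoderOutputs [batch, Te, units].
const tf = require('@tensorflow/tfjs');

function dotProductAttention(decoderOutputs, encoderOutputs) {
  return tf.tidy(() => {
    // Alignment scores: score(h_t, hBar_s) = h_t . hBar_s  ->  [batch, Td, Te]
    const scores = tf.matMul(decoderOutputs, encoderOutputs, false, true);
    // Attention weights: softmax over the encoder time steps.
    const weights = tf.softmax(scores, -1);
    // Context vectors: attention-weighted sum of encoder states  ->  [batch, Td, units]
    const context = tf.matMul(weights, encoderOutputs);
    return {context, weights};
  });
}

// Toy usage with random tensors.
const decoderOutputs = tf.randomNormal([1, 10, 64]);
const encoderOutputs = tf.randomNormal([1, 12, 64]);
const {context, weights} = dotProductAttention(decoderOutputs, encoderOutputs);
console.log(context.shape);  // [1, 10, 64]
console.log(weights.shape);  // [1, 10, 12]
```

In the paper, the context vector is then combined with the decoder output (concatenation followed by a dense layer) to produce the final attentional hidden state; the linked model.js section follows a similar pattern using tfjs layers.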
