PyTorch Notes - Seq2Seq + Attention Algorithm
Posted by SpikeKing
Seq2Seq is autoregressive sequence modeling in which the source and target sequences belong to different spaces (e.g., sentences in two different languages); a minimal sketch follows.
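A minimal sketch of such an autoregressive encoder-decoder in PyTorch; the module names, the GRU choice, the vocabulary sizes, and the `<sos>` token id are illustrative assumptions, not from the original note:

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, src_vocab, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(src_vocab, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):                       # src: (B, T_src)
        outputs, hidden = self.rnn(self.embed(src))
        return outputs, hidden                    # outputs: (B, T_src, H)

class Decoder(nn.Module):
    def __init__(self, tgt_vocab, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(tgt_vocab, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, tgt_vocab)

    def forward(self, tgt, hidden):               # one decoding step at a time
        output, hidden = self.rnn(self.embed(tgt), hidden)
        return self.out(output), hidden           # logits: (B, 1, tgt_vocab)

# Autoregressive decoding: each step conditions on the previously emitted token.
enc, dec = Encoder(src_vocab=1000), Decoder(tgt_vocab=1200)
src = torch.randint(0, 1000, (2, 7))              # toy source batch
_, hidden = enc(src)
token = torch.zeros(2, 1, dtype=torch.long)       # assumed <sos> id = 0
for _ in range(5):
    logits, hidden = dec(token, hidden)
    token = logits.argmax(-1)                     # greedy choice of next token
```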
Two key papers (sketches of their attention mechanisms follow below):
- Seq2Seq & Attention - Neural Machine Translation by Jointly Learning to Align and Translate
  - learns translation and alignment jointly
- Seq2Seq & Local Attention - Effective Approaches to Attention-based Neural Machine Translation
  - content-based score function: uses the content of both the encoder and decoder hidden states
  - location-based score function: considers position only, computed from the decoder state alone
  - local attention alignment: Monotonic (local-m) or Predictive (local-p), with a Gaussian distribution centered on the predicted position
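A sketch of the score functions from Effective Approaches to Attention-based Neural Machine Translation: the "general" content-based score, the location-based score, and the Gaussian weighting of predictive (local-p) attention. Tensor shapes, the window size `D`, and all variable names are assumptions for illustration:

```python
import torch
import torch.nn as nn

hid_dim, T_src = 128, 10
W_a = nn.Linear(hid_dim, hid_dim, bias=False)     # "general" content-based score
W_loc = nn.Linear(hid_dim, T_src, bias=False)     # location-based score per position

h_t = torch.randn(2, hid_dim)                     # current decoder state (B, H)
h_s = torch.randn(2, T_src, hid_dim)              # encoder states (B, T_src, H)

# content-based: score(h_t, h_s) = h_s^T (W_a h_t), uses both states' content
content_score = torch.bmm(h_s, W_a(h_t).unsqueeze(-1)).squeeze(-1)   # (B, T_src)
align = torch.softmax(content_score, dim=-1)

# location-based: a_t = softmax(W_loc h_t), decoder state only
location_weights = torch.softmax(W_loc(h_t), dim=-1)                 # (B, T_src)

# local-p (predictive): p_t = S * sigmoid(v_p^T tanh(W_p h_t)), then a
# Gaussian with sigma = D/2 centered at p_t reweights the alignment
W_p = nn.Linear(hid_dim, hid_dim, bias=False)
v_p = nn.Linear(hid_dim, 1, bias=False)
D = 2.0                                           # assumed window half-width
p_t = T_src * torch.sigmoid(v_p(torch.tanh(W_p(h_t))))               # (B, 1)
positions = torch.arange(T_src, dtype=torch.float).unsqueeze(0)      # (1, T_src)
gauss = torch.exp(-((positions - p_t) ** 2) / (2 * (D / 2) ** 2))
local_align = align * gauss                       # favor positions near p_t

context = torch.bmm(align.unsqueeze(1), h_s).squeeze(1)              # (B, H)
```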
Neural Machine Translation by Jointly Learning to Align and Translate
Machine translation (MT): mapping a source-language sentence to a target-language sentence.
NMT: Neural Machine Translation, i.e., machine translation with a single neural network.
Encoder - Decoder architecture; decoding proceeds autoregressively until a stop token is emitted.
SOTA on English-to-French translation: the paper replaces the fixed-length context vector of plain Seq2Seq with a soft alignment over the source positions.
The model learns to align and translate jointly; a sketch of the additive attention it uses is given below.
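A minimal sketch of the additive attention from this paper, e_ij = v_a^T tanh(W_a s_{i-1} + U_a h_j); dimensions and names are assumed for illustration:

```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """Bahdanau-style attention: e_ij = v^T tanh(W s_{i-1} + U h_j)."""
    def __init__(self, dec_dim, enc_dim, attn_dim=64):
        super().__init__()
        self.W = nn.Linear(dec_dim, attn_dim, bias=False)   # projects decoder state
        self.U = nn.Linear(enc_dim, attn_dim, bias=False)   # projects encoder states
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, s_prev, enc_out):
        # s_prev: (B, dec_dim), enc_out: (B, T_src, enc_dim)
        e = self.v(torch.tanh(self.W(s_prev).unsqueeze(1) + self.U(enc_out)))
        alpha = torch.softmax(e.squeeze(-1), dim=-1)        # soft alignment (B, T_src)
        context = torch.bmm(alpha.unsqueeze(1), enc_out).squeeze(1)  # (B, enc_dim)
        return context, alpha

attn = AdditiveAttention(dec_dim=128, enc_dim=128)
context, alpha = attn(torch.randn(2, 128), torch.randn(2, 7, 128))
```

The soft alignment `alpha` is recomputed at every decoding step, so each target word can attend to a different region of the source sentence instead of a single fixed-length vector.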
Related: PyTorch Notes - Seq2Seq + Attention source code