---
license: mit
language:
- de
- en
tags:
- translation
- pytorch
datasets:
- multi30k
metrics:
- bleu
model-index:
- name: multi30k
results:
- task:
type: translation
dataset:
type: multi30k
name: multi30k-de-en
metrics:
- type: bleu
value: 33.468
name: Test BLEU
args: n_gram=4
---
# Seq2seq + Attention
[![colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/msarmi9/dlxp/blob/master/colab/seq2seq.ipynb)
A PyTorch implementation of [Neural Machine Translation by Jointly Learning to Align and Translate](https://arxiv.org/abs/1409.0473), trained on the [Multi30k-de-en](http://www.statmt.org/wmt16/multimodal-task.html#task1) dataset with SentencePiece for tokenization.
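
The core of the cited paper is additive (Bahdanau-style) attention, where the decoder scores each encoder position before producing a context vector. The following is a minimal sketch of that mechanism, not the exact module used in this repo; all names and dimensions are illustrative assumptions.

```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """Bahdanau-style additive attention (illustrative sketch only)."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.W_query = nn.Linear(hidden_dim, hidden_dim, bias=False)  # projects the decoder state
        self.W_key = nn.Linear(hidden_dim, hidden_dim, bias=False)    # projects the encoder outputs
        self.v = nn.Linear(hidden_dim, 1, bias=False)                 # scores each source position

    def forward(self, decoder_state, encoder_outputs):
        # decoder_state: (batch, hidden_dim); encoder_outputs: (batch, src_len, hidden_dim)
        scores = self.v(torch.tanh(
            self.W_query(decoder_state).unsqueeze(1) + self.W_key(encoder_outputs)
        )).squeeze(-1)                                                # (batch, src_len)
        weights = torch.softmax(scores, dim=-1)                       # attention over source tokens
        context = torch.bmm(weights.unsqueeze(1), encoder_outputs)    # (batch, 1, hidden_dim)
        return context.squeeze(1), weights
```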
Here's the attention heatmap of a random sample from the test set:
![attention-heatmap](attention-heatmap.png)
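
A heatmap like the one above can be produced from the per-step attention weights returned by the decoder. The helper below is a hypothetical sketch using matplotlib; the function name, arguments, and output path are assumptions, not part of this repo.

```python
import matplotlib.pyplot as plt

def plot_attention(weights, src_tokens, tgt_tokens, path="attention-heatmap.png"):
    """Plot a (tgt_len, src_len) attention matrix as a heatmap (illustrative only)."""
    fig, ax = plt.subplots()
    ax.imshow(weights, cmap="viridis")
    ax.set_xticks(range(len(src_tokens)))
    ax.set_xticklabels(src_tokens, rotation=90)   # source subwords on the x-axis
    ax.set_yticks(range(len(tgt_tokens)))
    ax.set_yticklabels(tgt_tokens)                # generated subwords on the y-axis
    ax.set_xlabel("source (de)")
    ax.set_ylabel("target (en)")
    fig.tight_layout()
    fig.savefig(path)
```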