Attention-based Neural Machine Translation

This is an implementation of the attention mechanism used in "Effective Approaches to Attention-based Neural Machine Translation" by Minh-Thang Luong, Hieu Pham, and Christopher D. Manning. The paper can be found here.
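
The core computation from the paper is global attention over the encoder states. The following is a minimal NumPy sketch of the "dot" scoring variant, shown only to illustrate the mechanism; it is not the code used in this repository, and all names in it are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def global_dot_attention(h_t, encoder_states, W_c):
    """Luong-style global attention with the 'dot' score.

    h_t:            (d,)    current decoder hidden state
    encoder_states: (S, d)  source-side hidden states h_bar_s
    W_c:            (d, 2d) projection for the attentional hidden state
    """
    # score(h_t, h_bar_s) = h_t . h_bar_s for every source position s
    scores = encoder_states @ h_t                  # (S,)
    # alignment weights a_t(s) = softmax(scores)
    a_t = softmax(scores)                          # (S,)
    # context vector c_t = sum_s a_t(s) * h_bar_s
    c_t = a_t @ encoder_states                     # (d,)
    # attentional hidden state h_tilde_t = tanh(W_c [c_t; h_t])
    h_tilde = np.tanh(W_c @ np.concatenate([c_t, h_t]))
    return h_tilde, a_t

# Toy usage with random values.
d, S = 4, 6
rng = np.random.default_rng(0)
h_tilde, a_t = global_dot_attention(
    rng.standard_normal(d),
    rng.standard_normal((S, d)),
    rng.standard_normal((d, 2 * d)))
print(a_t.sum())  # alignment weights sum to 1
```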

The datasets can be downloaded from here. To run the models as-is, rename the dataset files to match the names used in main.py. You will also need to add the "<pad>" token to the vocab file.
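
For reference, adding the token can be done with a short script like the sketch below. The filename (`vocab.en`) and the position of the token are assumptions; check main.py for the actual vocabulary path and whether padding is expected at a specific index.

```python
# Hypothetical helper: add "<pad>" to a vocabulary file if it is missing.
# The filename and the token position are assumptions, not taken from this
# repository; main.py may expect "<pad>" at a particular index (often 0).
vocab_path = "vocab.en"

with open(vocab_path, "r", encoding="utf-8") as f:
    tokens = f.read().splitlines()

if "<pad>" not in tokens:
    tokens.append("<pad>")  # or tokens.insert(0, "<pad>") if index 0 is required
    with open(vocab_path, "w", encoding="utf-8") as f:
        f.write("\n".join(tokens) + "\n")
```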
