Resources¶
Neural Machine Translation¶
If you want to learn more about neural machine translation, check out the following resources.
Tutorials¶
- The Annotated Transformer by Alexander Rush
- The Annotated Encoder-Decoder by Jasmijn Bastings
- Graham Neubig: Neural Machine Translation and Sequence-to-sequence Models: A Tutorial.
- Philipp Koehn: Neural Machine Translation.
- Video recording of Chris Manning’s lecture on “NMT and Models with Attention” at Stanford (2017)
Publications¶
- NMT papers in the ACL anthology
- statmt.org survey of NMT publications
- THUNLP-MT MT reading list
Data¶
- WMT: The shared tasks of the yearly Conference on Machine Translation (WMT) provide large amounts of parallel data.
- OPUS: The OPUS project collects publicly available parallel data and provides it to everyone on their website.
PyTorch¶
Here’s a collection of links that should help you get started with PyTorch or improve your existing coding skills:
- Intro to PyTorch from Udacity’s Deep Learning course (Jupyter notebooks)
- 60 min Blitz tutorial by Soumith Chintala
- Fast AI’s MOOC “Practical Deep Learning for Coders” for a practical introduction to Deep Learning and PyTorch
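Before diving into the tutorials above, it can help to verify that your PyTorch installation works. A minimal sketch of the tensor and autograd basics those tutorials cover (the tensor values are arbitrary examples):

```python
import torch

# Create a tensor and track gradients through it.
x = torch.tensor([2.0, 3.0], requires_grad=True)

# A simple scalar function: y = x1^2 + x2^2
y = (x ** 2).sum()

# Backpropagate to compute dy/dx = 2*x.
y.backward()

print(x.grad)  # tensor([4., 6.])
```

If this runs and prints the expected gradient, your environment is ready for the courses linked above.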
Git Versioning¶
Never worked with Git before? For the basics, check out the tutorial by Roger Dudler; for more advanced usage, see the one by Lars Vogel.
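The core workflow those tutorials teach boils down to a handful of commands. A minimal sketch, assuming `git` is installed (the directory name, file, and commit message are arbitrary placeholders):

```shell
# Create a fresh repository in a temporary directory.
tmp=$(mktemp -d)
cd "$tmp"
git init -q

# Stage and commit a file (identity passed inline so the
# example works without global git configuration).
echo "hello" > README.md
git add README.md
git -c user.name="Demo" -c user.email="demo@example.com" \
    commit -q -m "Initial commit"

# Inspect the history.
git log --oneline
```

From there, the linked tutorials cover branching, merging, and working with remotes.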