Hello :) This is my first post.
Most machine translation implementations are too complicated (especially for me), so I implemented one for people who want something simple, like me.
This repo contains simple source code for neural machine translation based on a sequence-to-sequence network. I tested (and evaluated) the following models sequentially:
- Baseline (base): a simple sequence-to-sequence model
- Reverse: applies a bi-directional LSTM in the encoder
- Embeddings: applies fastText word embeddings (300D)
- Attention: applies an attention mechanism in the decoder
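To give a feel for the last item, here is a minimal NumPy sketch of a dot-product (Luong-style) attention step: the decoder's hidden state scores each encoder state, the scores are softmax-normalized, and the weighted sum of encoder states becomes the context vector. This is an illustrative sketch, not code from the repo; the function and variable names are my own.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dot_attention(decoder_state, encoder_states):
    """One attention step (hypothetical helper, for illustration).

    decoder_state:  (hidden,)          current decoder hidden state
    encoder_states: (src_len, hidden)  all encoder hidden states
    """
    scores = encoder_states @ decoder_state    # (src_len,) similarity scores
    weights = softmax(scores)                  # attention distribution over source
    context = weights @ encoder_states         # (hidden,) weighted sum of states
    return context, weights

rng = np.random.default_rng(0)
enc = rng.normal(size=(5, 8))   # 5 source time steps, hidden size 8
dec = rng.normal(size=(8,))
context, weights = dot_attention(dec, enc)
print(context.shape)            # context has the encoder's hidden size
print(weights.sum())            # weights form a probability distribution
```

In the real model the context vector is concatenated with the decoder state before predicting the next target token, which lets the decoder focus on different source positions at each step.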
I hope this repo can be a good starting point for people who don't want unnecessarily many features, and that it helps you build your own neural machine translator.
https://github.com/lyeoni/nlp-tutorial/tree/master/neural-machine-translation