nmt
TensorFlow Neural Machine Translation Tutorial
When I curl the links in the website, the response is: ```html 301 Moved Permanently The document has moved here. ``` So this script should be updated!
I have a problem when running my code that trains a 2D CNN using pickled data; I am using a CPU, not a GPU. The error occurs in the first epoch and the program stops....
Hi, When I tried to modify the source code to support distributed training, the following error happened: ``` E tensorflow/core/framework/variant.cc:102] Could not decode variant with type_name: "tensorflow::DatasetVariantWrapper". Perhaps you forgot...
The [Neural Machine Translation (seq2seq) Tutorial](https://github.com/tensorflow/nmt#background-on-the-attention-mechanism) contains a dead link under the **Background on the Attention Mechanism** section. The text reads: > Various implementations of attention mechanisms can...
Dead link at line 568 replaced with working link
What is the name of the loss function that is used? Can anyone please help?
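If I read the tutorial correctly, the training objective is per-token softmax cross-entropy over the target vocabulary, with padded positions masked out and the summed loss divided by the batch size. A minimal NumPy sketch of that masked cross-entropy (the function and variable names here are my own, not from the repo):

```python
import numpy as np

def masked_cross_entropy(logits, targets, mask):
    """Per-token softmax cross-entropy, zeroed at padded positions.

    logits:  (batch, time, vocab) unnormalized scores
    targets: (batch, time) integer token ids
    mask:    (batch, time) 1.0 for real tokens, 0.0 for padding
    """
    # numerically stable log-softmax over the vocabulary axis
    shifted = logits - logits.max(axis=-1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    # pick the log-probability assigned to each target token
    b, t = np.indices(targets.shape)
    token_loss = -log_probs[b, t, targets]
    # mask out padding, then divide by batch size (as the tutorial does)
    return (token_loss * mask).sum() / targets.shape[0]

# toy example: batch of 1, two real tokens, one padded position
logits = np.zeros((1, 3, 4))           # uniform predictions over 4 tokens
targets = np.array([[1, 2, 0]])
mask = np.array([[1.0, 1.0, 0.0]])
loss = masked_cross_entropy(logits, targets, mask)
# uniform over 4 classes, so each of the 2 real tokens costs log(4)
```

In the TensorFlow code this corresponds to `tf.nn.sparse_softmax_cross_entropy_with_logits` multiplied by target weights, but the sketch above shows the arithmetic.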
Hi, the current implementation of smooth-BLEU contains a bug: it smoothes unigrams as well. Consequently, even when the reference and translation consist of entirely different tokens, it still returns a...
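For context, the smoothing described by Lin and Och (2004), which smooth-BLEU implementations typically follow, adds 1 to the numerator and denominator of the modified n-gram precision only for n > 1; unigram precision stays exact, so two sentences with no tokens in common still score 0. A small sketch of the corrected behavior (my own simplified implementation, not the repo's code):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def smooth_bleu(reference, translation, max_n=4):
    """Sentence-level BLEU with add-one smoothing applied only for n > 1."""
    log_prec_sum = 0.0
    for n in range(1, max_n + 1):
        ref, hyp = ngrams(reference, n), ngrams(translation, n)
        overlap = sum((ref & hyp).values())      # clipped n-gram matches
        total = max(sum(hyp.values()), 1)
        if n == 1:
            # unigrams are NOT smoothed: zero overlap means BLEU == 0
            if overlap == 0:
                return 0.0
            p = overlap / total
        else:
            p = (overlap + 1) / (total + 1)
        log_prec_sum += math.log(p)
    # brevity penalty for translations shorter than the reference
    bp = min(1.0, math.exp(1 - len(reference) / max(len(translation), 1)))
    return bp * math.exp(log_prec_sum / max_n)

# disjoint sentences correctly score 0 instead of a spurious positive value
score = smooth_bleu("a b c d".split(), "w x y z".split())
```

With unigram smoothing included (the reported bug), the disjoint pair above would receive a small positive score.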
Code does not work with the latest TensorFlow library: ``` Traceback (most recent call last): File "/Users/XXXXX/opt/anaconda2/envs/XXXXXX/lib/python3.6/runpy.py", line 193, in _run_module_as_main "__main__", mod_spec) File "/Users/XXXXX/opt/anaconda2/envs/XXXXXX/lib/python3.6/runpy.py", line 85, in _run_code exec(code,...
Is it possible to use BERT word embeddings along with this NMT implementation? The goal is to use a pre-trained BERT language model so the contextualized embedding could be leveraged....
Is it possible to use BERT-based contextualized word embeddings along with the nmt implementation? I want to take advantage of the pretrained BERT language model so the NMT weights...