nmt_with_attention no longer works on Colab
Hello,
I got stuck halfway through the nmt_with_attention Colab:
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-34-f0271e0d820d> in <cell line: 1>()
----> 1 logits = decoder(ex_context, ex_tar_in)
      2
      3 print(f'encoder output shape: (batch, s, units) {ex_context.shape}')
      4 print(f'input target tokens shape: (batch, t) {ex_tar_in.shape}')
      5 print(f'logits shape shape: (batch, target_vocabulary_size) {logits.shape}')

<ipython-input-30-dcee1b38b45f> in call(self, context, x, state, return_state)
     13
     14     # 2. Process the target sequence.
---> 15     x, state = self.rnn(x, initial_state=state)
     16     shape_checker(x, 'batch t units')
     17

ValueError: Exception encountered when calling Decoder.call().

too many values to unpack (expected 2)

Arguments received by Decoder.call():
  • context=tf.Tensor(shape=(64, 18, 256), dtype=float32)
  • x=tf.Tensor(shape=(64, 16), dtype=int64)
  • state=None
  • return_state=False
The notebook did work when I first tried it a while ago, though.
Regards.
I also experienced the same problem on May 1, 2024. I wonder why these tutorials are not verified and, if they no longer work, removed or updated.
This error was originally reported on April 9. I decided to raise a 'new' issue on May 1 because I got the same error when I ran the code both on Google Colab and locally in a notebook running TF 2.16.1. The code that produces the error is posted on the TF Tutorials page, so people trying to learn TF will be misled and disappointed, as I am. Message to the TF team: please fix the code or remove it from the tutorials page. Date: May 18, 2024
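To confirm which TensorFlow/Keras combination a runtime is actually using (TF 2.16 made Keras 3 the default, which is my guess for why the decoder call breaks), a quick check in a fresh cell is the sketch below; the comments are only my interpretation of the output.

import tensorflow as tf
import keras

print(tf.__version__)     # e.g. 2.16.1 on current Colab runtimes
print(keras.__version__)  # a 3.x version means Keras 3 is active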
Try changing:
!pip install "tensorflow-text>=2.11"
to:
!pip install "tensorflow-text==v2.15.0"