Abheesht
Resolves https://github.com/keras-team/keras-nlp/issues/259 and https://github.com/keras-team/keras-nlp/issues/190 - BLEU - Use KerasNLP's WordPiece Tokeniser
Is there a way to make the ">>>" prompt unselectable on [keras.io](http://keras.io/)? Even a copy button would suffice. **keras.io Docs**  **TensorFlow Docs**  Since ">>>" is unselectable in the TensorFlow docs, the user...
Resolves #241
- [x] Greedy Search
- [ ] Beam Search (will probably open a separate PR for this)
- [x] Top-p Search
- [x] Top-k Search
- [x] Random...
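For reference, a minimal sketch of the greedy case, assuming a `token_probability_fn` that maps a batch of token-id sequences to a distribution over the next token (the names and signature here are illustrative, not necessarily the final API):

```python
import tensorflow as tf

def greedy_search(token_probability_fn, prompt, max_length):
    # Illustrative signature: `prompt` is a [batch_size, prompt_length]
    # int tensor of token ids; `token_probability_fn` returns a
    # [batch_size, vocab_size] distribution over the next token.
    sequence = tf.identity(prompt)
    for _ in range(max_length - prompt.shape[1]):
        probs = token_probability_fn(sequence)
        # Greedy search: always pick the most probable token.
        next_token = tf.argmax(probs, axis=-1, output_type=prompt.dtype)
        sequence = tf.concat([sequence, next_token[:, tf.newaxis]], axis=-1)
    return sequence
```

Top-k, top-p, and random search all share this loop and differ only in how `next_token` is drawn from `probs`.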
When `jit_compile` is set to `True`, the decoding functions do not work. We plan to add support for this in the future. Error: https://p.ip.fi/2TNt
Currently, we use `tf.py_function` to implement the BLEU score, because the graph-mode implementation we tried was not very efficient. This notebook compares the two approaches: https://colab.research.google.com/drive/1TZ8XnrmMcU8ZE2J-3amb44p-U-hxER53?usp=sharing. We...
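As a rough sketch of the wrapping pattern (with a trivial unigram precision standing in for the actual BLEU computation):

```python
import numpy as np
import tensorflow as tf

def _bleu_py(references, translations):
    # Runs eagerly inside py_function, so arbitrary Python is fine here.
    # A trivial unigram precision stands in for the real BLEU logic.
    scores = []
    for ref, hyp in zip(references.numpy(), translations.numpy()):
        ref_tokens = ref.decode("utf-8").split()
        hyp_tokens = hyp.decode("utf-8").split()
        matches = sum(1 for token in hyp_tokens if token in ref_tokens)
        scores.append(matches / max(len(hyp_tokens), 1))
    return np.float32(np.mean(scores))

@tf.function  # the py_function node keeps this traceable in graph mode
def bleu(references, translations):
    return tf.py_function(
        _bleu_py, inp=[references, translations], Tout=tf.float32
    )
```

The trade-off is that the Python body runs outside the graph, so it is not serialisable with the model and gets no graph optimisations, which is why an efficient graph-mode implementation would still be preferable.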
Made an attempt to do the above here: https://colab.research.google.com/drive/1PBMzeBd-HyFE0o4VXwk19-kqHIhOZM49?usp=sharing. Ran into an issue:
```
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
in ()
      1 inputs = keras.Input(shape=(), dtype="string")
----> 2 ...
```
Currently, the [`rouge`](https://github.com/google-research/google-research/tree/master/rouge) package does not support passing a custom tokeniser. This commit adds that functionality: https://github.com/google-research/google-research/commit/61ce9f0ca76025dac5b671c0631e443a9975a8a3. However, it is not part of a release yet. Waiting...
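Once it lands in a release, usage might look roughly like this (hypothetical, based on the commit above; the `tokenizer` argument is not available in the released package yet):

```python
from rouge_score import rouge_scorer

class WhitespaceTokenizer:
    # Custom tokeniser: the commit expects an object exposing a
    # `tokenize(text)` method that returns a list of tokens.
    def tokenize(self, text):
        return text.split()

scorer = rouge_scorer.RougeScorer(
    ["rouge1", "rougeL"], tokenizer=WhitespaceTokenizer()
)
scores = scorer.score("the cat sat", "the cat sat down")
```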
We could add notebooks (or share Colab notebooks) for the existing examples in the library, with instructive text and explanations.
Building on https://github.com/keras-team/keras-nlp/blob/master/keras_nlp/layers/preprocessing/mlm_mask_generator.py, which dynamically masks tokens, I was wondering if we could implement a layer that generates permutation masks for its inputs the way XLNet does (Permutation Language Modelling). This is...
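As a rough sketch of the core idea: sample a factorisation order and let a position attend only to positions that precede it in that order (this simplifies away XLNet's two-stream attention and target-selection details; the names below are illustrative):

```python
import tensorflow as tf

def permutation_mask(seq_length, seed=None):
    # Sample a random factorisation order over the positions.
    order = tf.random.shuffle(tf.range(seq_length), seed=seed)
    # rank[p] = index of position p within the sampled order.
    rank = tf.scatter_nd(
        order[:, tf.newaxis], tf.range(seq_length), [seq_length]
    )
    # mask[i, j] is True when token i may attend to token j, i.e.
    # when j comes before i in the factorisation order.
    return rank[:, tf.newaxis] > rank[tf.newaxis, :]
```

A layer version could batch this and combine it with padding masks, mirroring how `MLMMaskGenerator` is structured.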