text2text
Length of input for tokenization
Could you please tell me the maximum input size for tokenization?
There is no limit on the length of inputs for tokenization other than your machine's resources (memory, CPU, etc.).
The model's inputs, however, are usually capped at a maximum sequence length that depends on the model.
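A minimal sketch of the distinction, assuming the Hugging Face `transformers` library and the `t5-small` checkpoint (chosen here only for illustration): the tokenizer happily encodes arbitrarily long text, while `tokenizer.model_max_length` records the model's own limit, which you can enforce with `truncation`.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")

# Tokenization itself has no length cap; only machine resources limit it.
long_text = "hello world " * 5000  # far longer than the model can accept
ids = tokenizer(long_text)["input_ids"]
print(len(ids))  # succeeds, even though it exceeds the model's limit

# The model's input limit is a separate property of the checkpoint:
print(tokenizer.model_max_length)

# Truncate when preparing actual model inputs:
enc = tokenizer(long_text, truncation=True,
                max_length=tokenizer.model_max_length)
print(len(enc["input_ids"]))  # capped at model_max_length
```

So tokenizing a very long document is fine; the truncation (or chunking) step is only needed when the tokens are fed to the model.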