semantic-kernel

Python: Add support for using other tokenizers

Open TheMrguiller opened this issue 1 year ago • 0 comments

Hi,

As part of developing a service-oriented prompt, I find it nearly mandatory to have a way to count tokens before sending them to the model. This lets us trim the input so that, given the combined length of the input and the prompt, we stay within the API's or local model's maximum token limit, and it lets us anticipate and handle errors that would otherwise be out of our control. I know that in Python a Hugging Face tokenizer can be used for this, but a similar solution seems to be lacking in C#.
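
To illustrate the kind of pre-flight check I mean, here is a minimal sketch. It assumes a Hugging Face tokenizer from the `transformers` package and a made-up `MAX_MODEL_TOKENS` limit; it is not Semantic Kernel's API, just the pattern I would like to be able to apply with any tokenizer:

```python
from transformers import AutoTokenizer

# Illustrative only: count tokens with a Hugging Face tokenizer and trim the
# user input so that prompt + input stays within an assumed model limit.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

MAX_MODEL_TOKENS = 1024  # assumed context window for this example
prompt = "Summarize the following text:\n"
user_input = "... some arbitrarily long text ..."

# Tokens already consumed by the fixed prompt.
prompt_tokens = len(tokenizer.encode(prompt))
budget = MAX_MODEL_TOKENS - prompt_tokens

# Truncate the input to the remaining token budget before calling the model.
input_ids = tokenizer.encode(user_input)
if len(input_ids) > budget:
    user_input = tokenizer.decode(input_ids[:budget])

final_prompt = prompt + user_input
```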

Thanks for your time.

TheMrguiller · May 13 '24 13:05