Dong ZHOU

2 issues by Dong ZHOU

**Is your feature request related to a problem? Please describe.** I need to calculate the number of tokens, but TokenizerGpt3 miscounts tokens for GPT-3.5 and newer models....

Label: bug

Tokenizer supports multiple encodings (r50k_base, p50k_base, cl100k_base) and provides encode and decode methods.

```C#
// Alternative ways to construct a tokenizer:
Tokenizer tokenizer = new Tokenizer("cl100k_base");
Tokenizer tokenizer = new Tokenizer().FromModelName("gpt-3.5-turbo-0301");
Tokenizer tokenizer = new Tokenizer().FromModel(Models.Model.TextDavinciV3);
string str...
```
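For context, GPT-3.5-era models use the cl100k_base encoding rather than the r50k/p50k encodings covered by the GPT-3 tokenizer, which appears to be why the counts in the first issue diverge. The sketch below ties the two issues together, counting tokens and decoding them back, assuming the Tokenizer API shown in this excerpt with Encode returning a list of token ids and Decode reversing it; those method names and return types are assumptions, not confirmed by the excerpt.

```C#
// Sketch only: assumes the issue's proposed Tokenizer API; real names may differ.
using System;
using System.Collections.Generic;

Tokenizer tokenizer = new Tokenizer("cl100k_base");   // encoding used by GPT-3.5+ models

// Token counting (the need raised in the first issue).
List<int> ids = tokenizer.Encode("How many tokens is this prompt?");
Console.WriteLine($"Token count: {ids.Count}");

// Round trip back to text.
string text = tokenizer.Decode(ids);
Console.WriteLine(text);
```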