
Converting generated tokens back to the original, untokenized data

Open erenarkangil opened this issue 3 years ago • 0 comments

Hi there,

Thank you for sharing this repo. My problem is that I am training on sequential data, where each word is a unique code in my txt file. GPT-2 seems to have no trouble learning and generating this data, but because it tokenizes the dataset before training, the generated output consists of tokens rather than my original codes. I now need to map the generated tokens back to my real values in order to use the output properly. I did not have this problem with textgenrnn, since it is character-based, but I could not run it on Colab due to dependency issues. How can I map real values to the generated tokens?
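For context, a minimal sketch of the post-processing I have in mind (the codes, values, and the `code_to_value` dictionary below are hypothetical placeholders, assuming the codes are space-separated and a code-to-value mapping was saved during preprocessing):

```python
# Hypothetical mapping saved at preprocessing time: each unique code
# in the training txt file corresponds to one original value.
code_to_value = {"C001": "login", "C002": "purchase", "C003": "logout"}

def decode_sequence(generated_text):
    """Split the generated text on whitespace and map each code back
    to its original value, leaving unrecognized codes unchanged."""
    return [code_to_value.get(tok, tok) for tok in generated_text.split()]

print(decode_sequence("C001 C002 C003"))
# -> ['login', 'purchase', 'logout']
```

This only works if the model emits the codes verbatim; my actual issue is that the output I get back is the tokenizer's tokens, not my codes.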

Thanks a lot.

erenarkangil avatar Aug 04 '22 04:08 erenarkangil