gpt-2-simple
Converting generated data back to the original (non-tokenized) version
Hi there,
Thank you for sharing this repo. My problem is that I am training on sequential data, where each word in my txt file is a unique code. GPT seems to have no trouble learning and generating this data, but since the dataset is tokenized before training, the generated output consists only of tokens. So I now have tokens as output, and I need to convert them back to my original codes in order to use the data properly. I did not have this problem with textgenrnn because it works at the character level, but I could not run it on Colab due to dependency issues. How can I map the real values back to the generated tokens?
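To illustrate what I mean, here is a rough sketch of the kind of round-trip mapping I have in mind (this assumes the codes are whitespace-separated in the txt file; the helper names are hypothetical and not part of gpt-2-simple):

```python
# Sketch of a reversible code <-> placeholder-word mapping, assuming the
# training txt file contains whitespace-separated unique codes.
# Helper names (build_vocab, encode_codes, decode_text) are hypothetical.

def build_vocab(path):
    """Collect every unique code that appears in the raw txt file."""
    with open(path, encoding="utf-8") as f:
        codes = f.read().split()
    return sorted(set(codes))

def encode_codes(path, out_path, vocab):
    """Replace each code with a stable placeholder word (e.g. 'c17')."""
    code_to_word = {code: f"c{i}" for i, code in enumerate(vocab)}
    with open(path, encoding="utf-8") as f, open(out_path, "w", encoding="utf-8") as out:
        out.write(" ".join(code_to_word[c] for c in f.read().split()))
    return code_to_word

def decode_text(generated, code_to_word):
    """Map generated placeholder words back to the original codes."""
    word_to_code = {w: c for c, w in code_to_word.items()}
    # Words the model invents that are not in the mapping are kept as-is.
    return " ".join(word_to_code.get(w, w) for w in generated.split())

vocab = build_vocab("data.txt")
mapping = encode_codes("data.txt", "data_encoded.txt", vocab)
# ... train gpt-2-simple on data_encoded.txt and generate a sample ...
# original = decode_text(sample, mapping)
```

Is something along these lines the intended way to recover the original values, or does gpt-2-simple provide a built-in way to decode its output back to the raw text?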
Thanks a lot.