Chengxu Zhuang
Hi, I find that the tokenizers for OPT models have a possibly wrong "special_tokens_map":

```
>>> from transformers import GPT2Tokenizer
>>> tokenizer = GPT2Tokenizer.from_pretrained("facebook/opt-350m")
>>> tokenizer.special_tokens_map
{'bos_token': '</s>', 'eos_token': '</s>', 'unk_token':...
```
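If the map really is wrong for a given checkpoint, one workaround (a sketch, not an official fix; the intended token values are assumptions here) is to pass the special tokens explicitly when loading, since `from_pretrained` forwards these keyword arguments to the tokenizer's constructor:

```python
from transformers import GPT2Tokenizer

# Hypothetical override: explicitly set the special tokens we expect
# for OPT. Keyword arguments to from_pretrained are forwarded to the
# tokenizer's constructor.
tokenizer = GPT2Tokenizer.from_pretrained(
    "facebook/opt-350m",
    bos_token="</s>",
    eos_token="</s>",
    unk_token="</s>",
    pad_token="<pad>",
)
print(tokenizer.special_tokens_map)
```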
The map object in the GitHub information cannot be JSON-serialized.
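For context, `map` returns a lazy iterator that the `json` module cannot encode; a minimal sketch of the implied fix (the `git_info` structure here is an assumption) is to materialize the iterator into a list before serializing:

```python
import json

# Hypothetical stand-in for the GitHub info dict containing a map object.
git_info = {"changed_files": map(str.strip, ["train.py \n", "model.py\n"])}

# json.dumps(git_info) would raise TypeError: map objects are not
# JSON serializable. Materializing the iterator fixes it.
git_info["changed_files"] = list(git_info["changed_files"])
print(json.dumps(git_info))
```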
Finish the AlexNet example
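For reference, a minimal AlexNet-style forward pass (a sketch in TF 1.x `tf.layers`, with layer sizes following the original paper; not the repo's actual example code) could look like:

```python
import tensorflow as tf

def alexnet(images, num_classes=1000):
    # images: float tensor of shape [batch, 224, 224, 3] (assumed).
    x = tf.layers.conv2d(images, 64, 11, strides=4, padding="valid",
                         activation=tf.nn.relu)
    x = tf.layers.max_pooling2d(x, 3, 2)
    x = tf.layers.conv2d(x, 192, 5, padding="same", activation=tf.nn.relu)
    x = tf.layers.max_pooling2d(x, 3, 2)
    x = tf.layers.conv2d(x, 384, 3, padding="same", activation=tf.nn.relu)
    x = tf.layers.conv2d(x, 256, 3, padding="same", activation=tf.nn.relu)
    x = tf.layers.conv2d(x, 256, 3, padding="same", activation=tf.nn.relu)
    x = tf.layers.max_pooling2d(x, 3, 2)
    x = tf.layers.flatten(x)
    x = tf.layers.dense(x, 4096, activation=tf.nn.relu)
    x = tf.layers.dense(x, 4096, activation=tf.nn.relu)
    return tf.layers.dense(x, num_classes)  # unscaled logits
```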
Git check problem, which will throw an error for now.
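To illustrate the kind of check involved (a hypothetical helper, not the repo's actual code), a git cleanliness check that raises on a dirty working tree might look like:

```python
import subprocess

def assert_repo_clean(path="."):
    # `git status --porcelain` prints one line per uncommitted change;
    # empty output means the working tree is clean.
    status = subprocess.check_output(
        ["git", "status", "--porcelain"], cwd=path).decode()
    if status.strip():
        raise RuntimeError("Git check failed: uncommitted changes present.")
```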
@nhaber
* Make sure the optimizer is able to handle multiple losses (see the sketch after this list)
* DBinterface: when things will be saved in the database
* Dataset creation, graph instantiation can be done...
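A minimal sketch (TF 1.x style; the helper name and aggregation policy are assumptions) of one way to let an optimizer handle multiple losses, by summing them into a single objective:

```python
import tensorflow as tf

def minimize_multiple_losses(optimizer, losses):
    # Simplest policy: reduce the list of scalar losses to one total
    # and minimize that. Weighted sums or per-loss train ops are
    # alternative designs.
    total_loss = tf.add_n(losses)
    return optimizer.minimize(total_loss)

# Usage sketch:
# train_op = minimize_multiple_losses(
#     tf.train.MomentumOptimizer(0.01, 0.9), [loss_a, loss_b])
```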
Slightly modify the copy.deepcopy usage so it supports functions from class instances that have TensorFlow tensors as attributes.
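One way this could be done (a sketch under the assumption that tensor attributes should be shared by reference rather than copied): pre-seed the deepcopy memo so `copy.deepcopy` passes tensors through untouched:

```python
import copy
import tensorflow as tf

def deepcopy_sharing_tensors(obj):
    # tf.Tensor objects generally cannot be deep-copied. Seeding the
    # memo with id(tensor) -> tensor makes copy.deepcopy return the
    # original tensor by reference. Only top-level attributes of obj
    # are scanned here; nested tensors would need a recursive walk.
    memo = {}
    for value in vars(obj).values():
        if isinstance(value, tf.Tensor):
            memo[id(value)] = value
    return copy.deepcopy(obj, memo)
```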
Get Josh and Aran to contribute the loss agg_func they have implemented. @jbmelander @anayebi
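For reference, an aggregation function of the kind mentioned might look like this sketch (the exact signature Josh and Aran use is an assumption):

```python
import tensorflow as tf

def agg_func(losses):
    # Hypothetical aggregator: stack per-example (or per-tower) losses
    # and reduce them to a single scalar by averaging.
    return tf.reduce_mean(tf.stack(losses))
```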
In the cells about adding XML tags, no tags were actually added: the output was identical to the previous cell's, which is incorrect.
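For illustration, a cell that actually adds the tags might look like this minimal sketch (the tag and variable names are assumptions):

```python
# Hypothetical fix: wrap the content in XML tags so the output differs
# from the untagged version in the previous cell.
text = "The quick brown fox jumps over the lazy dog."
tagged = f"<document>{text}</document>"
print(tagged)
```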