world2vec
Yes, it is really annoying. In my case, I just want to hide all output from the SC2 process and keep the output of my own program. I don't know how to do this!
@flyboyer Is your score on the validation or the test dataset? How about Model2?
The online test data from the first season, as well as the data from the second season, were not released by the organizers at the time. I suggest asking the organizers directly.
Thanks for your quick response and guidance. Looking forward to your paper to learn the gains from these 2 components. The first run of my implementation only reaches about 90.x...
Dear authors, could you publish details of the model hyperparameters for PTB word-level? Thanks
Any update? Lots of people want it!
For my training code (I did not use the huggingface trainer):
```
import torch
from unsloth import FastLanguageModel

# Load the 4-bit base model
model, tokenizer = FastLanguageModel.from_pretrained(
    "xxxx",
    dtype=getattr(torch, 'bfloat16'),
    max_seq_length=768,
    load_in_4bit=True,
)

# Attach LoRA adapters
model = FastLanguageModel.get_peft_model(
    model,
    r=64,
    lora_alpha=16,
    lora_dropout=0,
    bias="none",
    random_state=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj", ...
```
@danielhanchen In my case it is on an RTX 4090. torch AMP with float16 works well; it does not work with bfloat16.
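Roughly, my float16 AMP step looks like the sketch below (minimal example only; the dummy model and random batch just stand in for the real LoRA model and data loader from the snippet above). Swapping the autocast dtype to bfloat16 is where it stops working for me.

```
import torch
import torch.nn as nn

# Minimal sketch: dummy model and batch stand in for the real LoRA model and data.
model = nn.Linear(16, 2).cuda()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler()  # only needed on the float16 path

for _ in range(3):
    x = torch.randn(8, 16, device="cuda")
    y = torch.randint(0, 2, (8,), device="cuda")
    optimizer.zero_grad()
    # float16 autocast works; changing dtype to torch.bfloat16 here is what fails for me
    with torch.autocast(device_type="cuda", dtype=torch.float16):
        loss = nn.functional.cross_entropy(model(x), y)
    scaler.scale(loss).backward()  # scale the loss so fp16 gradients don't underflow
    scaler.step(optimizer)
    scaler.update()
```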