Charley Xiao

Results 9 comments of Charley Xiao

@lmsunny Same problem here; it only appeared after installing vue-router. Did you manage to solve it, and if so, how? (By the way, has the author stopped maintaining this project?)

PS: The training part:

```python
training_args = GRPOConfig(
    output_dir=str(args.output_dir),
    # use_vllm = True,
    learning_rate = 5e-6,
    adam_beta1 = 0.9,
    adam_beta2 = 0.99,
    weight_decay = 0.1,
    warmup_ratio = 0.1,
    lr_scheduler_type =...
```

> We updated unsloth and unsloth-zoo. Could you update and try now? Hi, I reinstalled unsloth and unsloth-zoo from their latest commits and am still getting the same error `AttributeError:...

I set `fast_inference=True` in `model, tokenizer = FastLanguageModel.from_pretrained(...)` and the problem's gone lol.
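For reference, the fix above amounts to enabling unsloth's vLLM-backed inference when loading the model. A minimal sketch, assuming unsloth's `FastLanguageModel` API; the model name here is a placeholder, and actually running this requires a GPU with unsloth installed, so treat it as a config-style illustration rather than a tested recipe:

```python
from unsloth import FastLanguageModel

# fast_inference=True turns on the vLLM generation backend inside unsloth,
# which is the flag that made the error above go away for me.
# "unsloth/Llama-3.2-1B-Instruct" is a placeholder model name, not from the thread.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/Llama-3.2-1B-Instruct",
    max_seq_length=2048,
    load_in_4bit=True,
    fast_inference=True,  # the relevant flag
)
```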

> Hello
>
> @ai-nikolai @Charley-xiao
>
> Glad to see this has been resolved.
> I will mark this as closed/completed. Please feel free to reopen and comment if...

> Look at the error stack trace you are getting. It's the peft library telling you there is no attribute called load_lora. I think you meant and you should use...

Same here. Using Llama3-8B-Instruct with SIMPLE_CHAT_TEMPLATE from trl:

![Image](https://github.com/user-attachments/assets/afb2c24b-44f1-4172-8657-18b60cfdba41)

Here's my code:

```python
model, tokenizer = FastModel.from_pretrained(
    model_name=args.orchestrator,
    load_in_4bit = True,
    device_map="auto",
    full_finetuning=False,
)
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    target_modules=["q_proj",...
```

Ah, I solved it by changing `model, tokenizer = FastModel.from_pretrained(...)` to `model, tokenizer = FastLanguageModel.from_pretrained(...)`.

> I experienced the same issue. Did you manage to solve it? I downgraded nerfstudio to version 1.0 (`pip install nerfstudio==1.0`) and COLMAP to version 3.8 (`conda install colmap==3.8`). After...
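The downgrade described in the quote above comes down to two commands (versions copied from the comment; depending on your conda channels you may need to specify one explicitly, which is not covered here):

```shell
pip install nerfstudio==1.0
conda install colmap==3.8
```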