晨星
I'm seeing this too! The warning reads: "Explicitly passing a `revision` is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision. Explicitly passing a..."
I tried it myself: adding the bold part fixed one of the errors: `tokenizer = AutoTokenizer.from_pretrained(******, trust_remote_code=True**, revision="main"**)`
Solution (worked for me): add `, revision="main"`, i.e. `tokenizer = AutoTokenizer.from_pretrained(******, trust_remote_code=True, revision="main")`, and reinstall an older torch: `pip install torch==1.12.0`
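A minimal sketch of the fix described above. The model id is a placeholder (not from this thread), and the actual download line is left commented out because it needs `transformers` and network access:

```python
# Pin a revision whenever trust_remote_code=True, so that a newer
# commit to the model repo cannot silently ship different custom code.
LOAD_KWARGS = {"trust_remote_code": True, "revision": "main"}

# Usage sketch ("your-org/your-model" is a placeholder):
# from transformers import AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained("your-org/your-model", **LOAD_KWARGS)
```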
It seems llama3.1 isn't supported.
Maybe you can run `ollama run llama3` and then use `interpreter --model ollama/llama3` instead.