mu1ReX

6 comments by mu1ReX

> > I'm using basic_demo/cli_demo_multi_gpus.py and still get the same error.
>
> I ran into the same error. Has it been resolved?

Same here. You can refer to:

- https://github.com/THUDM/CogVLM/issues/256
- https://huggingface.co/THUDM/cogagent-chat-hf/tree/main

> Same exception with `ValueError: The model's max seq len (2048) is larger than the maximum number of tokens that can be stored in KV cache (176). Try increasing` `gpu_memory_utilization` `or...`
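The constraint behind that `ValueError` can be sketched as follows. This is an illustration of the check, not vLLM's actual code; the numbers come from the traceback above, and the suggested argument values are assumptions to adjust for your GPU:

```python
# vLLM pre-allocates a KV cache at startup and refuses to run if it
# cannot hold even one sequence of max_model_len tokens.

def kv_cache_fits(max_model_len: int, kv_cache_tokens: int) -> bool:
    """True if the pre-allocated KV cache can hold one max-length sequence."""
    return kv_cache_tokens >= max_model_len

# The failing case from the traceback: 2048-token model, 176-token cache.
print(kv_cache_fits(2048, 176))  # False -> vLLM raises the ValueError

# Two ways out, both passed to LLM(...) or the server CLI:
#   1. give the cache more GPU memory, e.g. gpu_memory_utilization=0.95
#      (the default is 0.9), or
#   2. shrink the sequence budget, e.g. max_model_len=176 or less.
print(kv_cache_fits(176, 176))   # True -> engine starts
```

Either knob works; raising `gpu_memory_utilization` keeps the full context length but risks OOM if other processes share the GPU, while lowering `max_model_len` is safer but caps prompt+output length.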

I recommend this plugin: [Diagrams.Net](https://github.com/jensmtg/obsidian-diagrams-net)

> Hi, same issue for me. I am trying vllm with facebook/opt-125m using the openai template. Can someone help? ValueError: As of transformers v4.44, default chat template is no longer...
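Since transformers v4.44 dropped the built-in default (ChatML) fallback, one workaround is to hand vLLM's OpenAI server an explicit chat template file instead of relying on the tokenizer's default. A sketch, assuming a Jinja template file at `./template_chatml.jinja` (the path is a placeholder; vLLM's repo ships example templates under `examples/`):

```shell
# Start the OpenAI-compatible server with an explicit chat template,
# so no default template lookup is needed.
python -m vllm.entrypoints.openai.api_server \
  --model facebook/opt-125m \
  --chat-template ./template_chatml.jinja
```

Note that facebook/opt-125m is a base model with no chat tokenizer config, so some template must be supplied this way for the `/v1/chat/completions` endpoint to work at all.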