codingl2k1

Results 86 comments of codingl2k1

> [@codingl2k1](https://github.com/codingl2k1) Could you look at this?

Let me take a look.

Tool calls are currently not supported by the vLLM backend. You can try the transformers backend to see whether it supports tool calls.

Are there any more error logs from the server?

Have you tried using the xinference client? If you want to implement it yourself with requests, please refer to: https://github.com/xorbitsai/inference/blob/main/xinference/client/restful/restful_client.py#L764
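As a rough sketch of the requests-based approach (the endpoint path, port, and model UID below are assumptions based on Xinference's OpenAI-compatible REST interface, not taken from this thread):

```python
import json

def build_chat_payload(model_uid: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completion payload.

    Field names follow the OpenAI-compatible schema that the
    Xinference RESTful API is assumed to accept.
    """
    return {
        "model": model_uid,
        "messages": [{"role": "user", "content": prompt}],
    }

# "my-model-uid" is a placeholder; use the UID of your launched model.
payload = build_chat_payload("my-model-uid", "Hello!")
print(json.dumps(payload, ensure_ascii=False))

# To actually send it (requires a running Xinference server;
# the URL is a placeholder for your deployment):
# import requests
# resp = requests.post("http://127.0.0.1:9997/v1/chat/completions", json=payload)
# print(resp.json())
```

Using the official client avoids hand-building this payload and stays in sync with server-side schema changes.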

> Same question.

Do you have the xinference server-side logs?

Same issue, I am using the CPU build: CPU : SSE3 = 1 | SSSE3 = 1 | AVX = 1 | AVX2 = 1 | F16C = 1 |...

> Re-download Model. Try it. work for me.
>
> ```
> # Model download via SDK
> from modelscope import snapshot_download
> snapshot_download('iic/CosyVoice-300M', local_dir='pretrained_models/CosyVoice-300M')
> snapshot_download('iic/CosyVoice-300M-SFT', local_dir='pretrained_models/CosyVoice-300M-SFT')
> snapshot_download('iic/CosyVoice-300M-Instruct', local_dir='pretrained_models/CosyVoice-300M-Instruct')
> snapshot_download('iic/CosyVoice-ttsfrd',...
> ```

The models on Hugging Face are outdated. ![image](https://github.com/user-attachments/assets/9806231c-1c1a-4dba-b47d-44230296111a)