aabbccddwasd
I managed to solve this problem: after creating the kernel with `ipython kernel install --name chatglm3-demo --user`, edit the kernel.json under `.local/share/jupyter/kernels/` and add `"--matplotlib=inline"` to the `argv` array. After that, images display correctly.
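For reference, the edited kernel.json might look like the sketch below. The Python path shown is illustrative and will differ per machine; the key part is the extra `"--matplotlib=inline"` entry appended to `argv`:

```json
{
  "argv": [
    "/home/user/.local/bin/python",
    "-m",
    "ipykernel_launcher",
    "-f",
    "{connection_file}",
    "--matplotlib=inline"
  ],
  "display_name": "chatglm3-demo",
  "language": "python"
}
```

Restart Jupyter (or reselect the kernel) after saving so the new launch arguments take effect.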
https://github.com/QwenLM/Qwen2.5-VL/issues/719#issuecomment-2635751358
> The quantized models for Qwen2.5-VL are coming soon.

Thanks!! How soon, one week or one month?
> add vllm support Qwen2.5-VL too, please

Well, I think the vLLM developers are responsible for that. They said v0.7.2 will support Qwen2.5-VL.
> great, the only missing part is a 4-bit AWQ or GGUF version of 72B for 48 GB VRAM devices, and a 30B–40B AWQ version for 24 GB VRAM devices; also GPTQ, on a 2080 Ti...
No problem in my case.
Hmm, now that we have the 4090 48GB, there should be enough VRAM for FA3, shouldn't there?