coder4nlp
+1
Please!
pre-processed data +1
> > > Hi, @coder4nlp, the training code is scheduled for release at the end of this month. If you urgently need to finetune our model, you can refer...
Where is the release schedule?
@HAWLYQ Thank you very much!
@kzjeef dashinfer==2.0.0rc3 dashinfer-vlm==2.3.0 transformers==4.51.3

**When using qwen2-vl, the startup failed.**

```
dashinfer_vlm_serve --model /models/Qwen/Qwen2-VL-2B --host 127.0.0.1
```

```
Start converting ONNX model!
Loading safetensors checkpoint shards: 100%|██████████| 2/2 [00:01
```
Hi, @kzjeef. Thanks for your response. When using qwen2.5-vl, the service can start normally, but it is extremely slow. Here are my startup commands. **However, when the input consists of multiple...
@kzjeef, I have fixed the qwen2-vl issue, but the inference time is still very slow.
> > [@kzjeef](https://github.com/kzjeef), I have fixed the qwen2-vl issue, but the inference time is still very slow.
>
> What is your --vision_engine parameter in the qwen2-vl test?

The --vision_engine parameter has not...
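For anyone following the exchange above, the question concerns the `--vision_engine` flag of `dashinfer_vlm_serve`. A minimal launch sketch is below, reusing the model path and host from the earlier comment; the value passed to `--vision_engine` (`transformers`) is an assumption, since this thread never shows the accepted engine names, and should be checked against the dashinfer-vlm documentation:

```shell
# Hypothetical sketch: start the server with an explicit vision engine.
# Model path and host are taken from the earlier comment in this thread;
# the engine value "transformers" is an unverified assumption.
dashinfer_vlm_serve \
  --model /models/Qwen/Qwen2-VL-2B \
  --host 127.0.0.1 \
  --vision_engine transformers
```

If the flag is left unset, whatever default the installed dashinfer-vlm build chooses will be used, which may explain the slow inference reported above.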