Lanture1064

Results: 4 issues by Lanture1064

## Why are these changes needed? When using langchaingo with fastchat, it generates requests in formats like `"messages":["text":"given text", "type": "text"]` by default, which will be recognized and processed as...

Currently `--max-model-len` can be passed to vllm through `kwarg`, but could it be added as a default parameter like `gpu-utilization-limit`? It is often needed when using models that can...

### Is there an existing issue / discussion for this?

- [X] I have searched the existing issues / discussions

### Is this question answered in the FAQ? ...

### Reminder

- [X] I have read the README and searched the existing issues.

### System Info

- `llamafactory` version: 0.9.1.dev0
- Platform: Linux-5.15.0-97-generic-x86_64-with-glibc2.35
- Python version: 3.12.3
- PyTorch...

pending