20191864218

Results: 30 comments by 20191864218

Hi, I have the same issue. Have you solved it?

> Hi, you can use [this script](https://github.com/haotian-liu/LLaVA/blob/main/scripts/merge_lora_weights.py) to merge LoRA weights. We'll update the instructions as well. Thanks.

What is the...
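For anyone landing here: a minimal sketch of the same merge step using plain `peft`, assuming a standard LoRA checkpoint (all paths below are placeholders; the linked `merge_lora_weights.py` does roughly the same thing through LLaVA's own model loader):

```python
# Sketch: fold LoRA adapter weights into the base model and save the result.
# Paths are placeholders, not real checkpoints.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_path = "path/to/base-model"
adapter_path = "path/to/lora-checkpoint"
output_path = "path/to/merged-model"

base = AutoModelForCausalLM.from_pretrained(base_path, torch_dtype="auto")
model = PeftModel.from_pretrained(base, adapter_path)

# merge_and_unload() adds the LoRA deltas to the base weights and drops the adapter wrappers.
merged = model.merge_and_unload()
merged.save_pretrained(output_path)

# Save the tokenizer alongside so the merged directory is self-contained.
AutoTokenizer.from_pretrained(base_path).save_pretrained(output_path)
```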

> I found that the reason for this problem is a difference in tokenizer rules: the `bos_token` is null and the `eos_token` is set to "" in the Qwen tokenizer configuration....
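A quick way to check what those special tokens actually resolve to (the checkpoint name is only an example):

```python
# Sketch: inspect the special tokens of a Qwen tokenizer before training.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("Qwen/Qwen1.5-7B-Chat")  # example checkpoint
print("bos:", tok.bos_token, "eos:", tok.eos_token, "pad:", tok.pad_token)

# Qwen tokenizers typically ship without a bos_token; if the training code
# assumes one (or a pad token), set it explicitly to match your template.
if tok.pad_token is None:
    tok.pad_token = tok.eos_token
```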

> @20191864218 Maybe you need to set some parameters for Qwen1.5. #1146

Thank you, but I've encountered some issues after making the changes. Could you help me with them? I left...

> @20191864218 Maybe you need to set some parameters for Qwen1.5. #1146

Hello, do you have a link for replacing the visual encoder?

> @yiyexy Using the LLaVA template on a Qwen chat model might introduce unwanted output when chatting. This is a common issue: Qwen uses the ChatML format, which uses `<|im_start|>`/`<|im_end|>` as separators.

Hello, if...
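For context, a minimal sketch of the ChatML layout Qwen chat models expect (the `to_chatml` helper is only for illustration; recent `transformers` versions produce the same layout for Qwen chat checkpoints via `tokenizer.apply_chat_template`):

```python
# Sketch: render a conversation in ChatML, the template Qwen chat models are trained on.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Describe the image."},
]

def to_chatml(msgs):
    # Each turn is wrapped in <|im_start|>{role} ... <|im_end|> separators.
    parts = [f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in msgs]
    return "".join(parts) + "<|im_start|>assistant\n"

print(to_chatml(messages))
```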

> I think the base model cannot be used in a VLM; it doesn't have chat abilities.

I want to create a model solely for generating reports, without requiring strong conversational abilities....

> > I think the base model cannot be used in a VLM; it doesn't have chat abilities.
>
> I want to create a model solely...

> @20191864218 This error appears to be due to a corrupted weight file. Please ensure that your weight file has been saved correctly.

Thank you for your response. I merged...
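A cheap sanity check on a freshly merged checkpoint directory, assuming it loads with `transformers` (the path is a placeholder): simply reloading the weights will surface truncated or corrupted shard files:

```python
# Sketch: reload the merged checkpoint to verify the weight shards are readable.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("path/to/merged-model", torch_dtype="auto")
print(sum(p.numel() for p in model.parameters()), "parameters loaded")
```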

Hi, I have the same issue. Have you solved it?