Can't run with multiple GPUs
Hello, thanks for the great work!
I followed example_code/example_chat.py to run the newest InternLM-XComposer-2.5 model on 4 NVIDIA 4090 GPUs, but I still hit an OOM error. Although the weights are split across the GPUs successfully, the first GPU always runs out of memory when model.chat is called.
Any response will be greatly appreciated!
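For context, the loading code follows the example with an auto device map. A minimal sketch of the setup (the `build_max_memory` helper and its GiB caps are illustrative assumptions, not values from the repo):

```python
# Sketch: shard InternLM-XComposer-2.5 across 4 GPUs. The max_memory caps
# (and the smaller cap on GPU 0) are assumptions meant to leave headroom for
# the inference-time activations that model.chat places on the first device.
def build_max_memory(num_gpus: int, cap_gib: int = 20, gpu0_cap_gib: int = 12) -> dict:
    """Per-device caps for the auto device map; GPU 0 gets a smaller cap
    so activations created during generation still fit on it."""
    mem = {i: f"{cap_gib}GiB" for i in range(num_gpus)}
    mem[0] = f"{gpu0_cap_gib}GiB"
    return mem

def load_model_sharded(path: str = "internlm/internlm-xcomposer2d5-7b",
                       num_gpus: int = 4):
    # Heavy imports kept local so the helper above stays importable anywhere.
    import torch
    from transformers import AutoModelForCausalLM
    return AutoModelForCausalLM.from_pretrained(
        path,
        torch_dtype=torch.bfloat16,
        trust_remote_code=True,
        device_map="auto",                      # split weights across GPUs
        max_memory=build_max_memory(num_gpus),  # per-GPU caps
    ).eval()
```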
Same issue here.
I followed example_code/example_chat.py to run the newest InternLM-XComposer-2.5 model on 4 A800 GPUs, but I also still hit the OOM problem.
Same question here.
Please try installing transformers==4.33.1 with the following command and try again:

```shell
pip install transformers==4.33.1
```
> Hello, thanks for the great work!
> I followed example_code/example_chat.py to run the newest InternLM-XComposer-2.5 model on 4 NVIDIA 4090 GPUs, but I still hit an OOM error. Although the weights are split across the GPUs successfully, the first GPU always runs out of memory when model.chat is called. Any response will be greatly appreciated!
I found that the model cannot take multiple images as input, nor a list of images, so the fix is:
- Change the image input to a singleton (https://github.com/InternLM/InternLM-XComposer/blob/99a56be441c05337eeed5aacbcb88da447ae2d49/example_code/example_chat.py#L33):

  ```python
  image = './examples/dubai.png'
  ```
- Add the `<ImageHere>` flag (https://github.com/InternLM/InternLM-XComposer/blob/99a56be441c05337eeed5aacbcb88da447ae2d49/example_code/example_chat.py#L32):

  ```python
  query = '<ImageHere>Please describe this image'
  ```
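Putting the two changes together, the call site then looks roughly like this. The `make_query` helper is mine, and the `model.chat` argument names are assumptions based on the repo's example, so treat the commented call as a sketch:

```python
# Sketch of the corrected example_chat.py call site: a single image path
# (not a list) and an explicit <ImageHere> placeholder in the query.
def make_query(prompt: str, num_images: int = 1) -> str:
    """Prefix one <ImageHere> tag per image the query refers to."""
    return "<ImageHere>" * num_images + prompt

query = make_query("Please describe this image")
image = "./examples/dubai.png"  # singleton path, not ['./examples/dubai.png']

# The actual chat call needs a GPU plus the downloaded weights, so it is
# only sketched here (argument names assumed from the repo example):
# response, _ = model.chat(tokenizer, query, image,
#                          do_sample=False, num_beams=3)
```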
> Please try installing transformers==4.33.1 with the following command and try again:
> `pip install transformers==4.33.1`
I still hit the same problem with transformers 4.33.1. I'm running the video understanding example from the Hugging Face model card. Any response would be greatly appreciated.
I still hit the OOM problem with transformers 4.33.1 as well.