
The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers, 8-bit multiplication, and GPU quantization are unavailable.


Describe the bug

Running the sample code:

model = AutoModelForCausalLM.from_pretrained(
    model_path,
    config=config,
    device_map="auto",
    quantization_config=quantization_config,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)

===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please run

python -m bitsandbytes

and submit this information together with your error trace to:
https://github.com/TimDettmers/bitsandbytes/issues

bin D:\ProgramData\Anaconda3\envs\deepke-llm\lib\site-packages\bitsandbytes\libbitsandbytes_cpu.so
D:\ProgramData\Anaconda3\envs\deepke-llm\lib\site-packages\bitsandbytes\cextension.py:34: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers, 8-bit multiplication, and GPU quantization are unavailable.
  warn("The installed version of bitsandbytes was compiled without GPU support. "
'NoneType' object has no attribute 'cadam32bit_grad_fp32'
CUDA SETUP: Loading binary D:\ProgramData\Anaconda3\envs\deepke-llm\lib\site-packages\bitsandbytes\libbitsandbytes_cpu.so...
argument of type 'WindowsPath' is not iterable
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "D:\ProgramData\Anaconda3\envs\deepke-llm\lib\site-packages\transformers\models\auto\auto_factory.py", line 563, in from_pretrained
    return model_class.from_pretrained(
  File "D:\ProgramData\Anaconda3\envs\deepke-llm\lib\site-packages\transformers\modeling_utils.py", line 2482, in from_pretrained
    raise ImportError(
ImportError: Using `load_in_8bit=True` requires Accelerate: `pip install accelerate` and the latest version of bitsandbytes `pip install -i https://test.pypi.org/simple/ bitsandbytes` or `pip install bitsandbytes`
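The final ImportError means transformers refused `load_in_8bit=True` because it could not find both accelerate and a GPU-enabled bitsandbytes build. A minimal pre-flight check, run before retrying the load, can confirm what the environment actually provides (this is an illustrative sketch, not DeepKE code; the function names are my own):

```python
# Sketch: verify the prerequisites for load_in_8bit before loading a model.
import importlib.util


def has_module(name):
    """Return True if `name` is importable in the current environment."""
    return importlib.util.find_spec(name) is not None


def check_8bit_prereqs():
    """Report which prerequisites for 8-bit loading are satisfied."""
    report = {
        "accelerate": has_module("accelerate"),
        "bitsandbytes": has_module("bitsandbytes"),
    }
    try:
        import torch
        # A CPU-only bitsandbytes (the warning above) still fails even if
        # the package imports, so check for a visible CUDA device too.
        report["cuda"] = bool(torch.cuda.is_available())
    except ImportError:
        report["cuda"] = False
    return report


if __name__ == "__main__":
    print(check_8bit_prereqs())
```

If any entry is False, installing the missing package (or a GPU-enabled bitsandbytes build, which older releases did not ship for Windows) is the first thing to fix.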

Environment (please complete the following information):

  • OS: Windows
  • Python Version: 3.9


jiangweiatgithub avatar May 13 '24 10:05 jiangweiatgithub

Hello, the environment versions are listed below. If 8-bit quantization is causing problems, you can try 4-bit quantization instead.

accelerate==0.21.0
transformers==4.33.0
bitsandbytes==0.39.1

guihonghao avatar May 13 '24 11:05 guihonghao

Thanks for the reply. How should I modify the code for 4-bit quantization?

jiangweiatgithub avatar May 13 '24 12:05 jiangweiatgithub

import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

quantization_config = BitsAndBytesConfig(
    load_in_4bit=True,
    llm_int8_threshold=6.0,
    llm_int8_has_fp16_weight=False,
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_quant_type="nf4",
)

model = AutoModelForCausalLM.from_pretrained(
    model_path,
    config=config,
    device_map="auto",  
    quantization_config=quantization_config,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)

Set up the quantization_config as above and pass it to from_pretrained.

guihonghao avatar May 13 '24 12:05 guihonghao
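For rough context on why dropping to 4-bit helps (an illustration added here, not part of the thread; the 7B parameter count is an assumed example), weight-only memory scales linearly with bits per parameter:

```python
def weight_memory_gib(n_params, bits_per_param):
    """Approximate weight-only footprint in GiB: params * bits / 8 bytes.

    Excludes activations, the KV cache, and quantization metadata, so real
    usage is somewhat higher.
    """
    return n_params * bits_per_param / 8 / 1024**3


# Assumed example: a 7B-parameter model.
n = 7_000_000_000
print(f"fp16: {weight_memory_gib(n, 16):.1f} GiB")  # ~13.0 GiB
print(f"int8: {weight_memory_gib(n, 8):.1f} GiB")   # ~6.5 GiB
print(f"nf4:  {weight_memory_gib(n, 4):.1f} GiB")   # ~3.3 GiB
```

So nf4 cuts the weight footprint to roughly a quarter of fp16, which is why it can succeed on hardware where 8-bit loading fails.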

Do you have any other questions?

zxlzr avatar May 14 '24 08:05 zxlzr

I made the modification, but I still get the same problem.

ElectorShx avatar Oct 10 '24 10:10 ElectorShx