
[bug]: Can't use AMD GPU

Open · ghost opened this issue 1 year ago · 5 comments

Is there an existing issue for this problem?

  • [X] I have searched the existing issues

Operating system

Windows

GPU vendor

AMD (ROCm)

GPU model

RX 580

GPU VRAM

8GB

Version number

5.5.0

Browser

Chrome Version 131.0.6778.205

Python dependencies

After opening InvokeAI, the startup log looks like this:

Starting up...
Started Invoke process with PID: 15708
The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers, 8-bit multiplication, and GPU quantization are unavailable.
[2024-12-29 22:55:18,869]::[InvokeAI]::INFO --> Patchmatch initialized
[2024-12-29 22:55:19,969]::[InvokeAI]::INFO --> Using torch device: CPU
[2024-12-29 22:55:21,274]::[InvokeAI]::INFO --> InvokeAI version 5.5.0

All image generation runs on the CPU. Please help, and thank you. I have the same problem in stable-diffusion-webui; InvokeAI is better because at least it works well on the CPU.
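For anyone debugging this, the key line in that log is "Using torch device: CPU". Below is a minimal diagnostic sketch (not part of InvokeAI) to confirm what the installed PyTorch build can actually see; on Windows with an RX 580 it will report CPU, because the standard PyTorch wheels for Windows have no ROCm support.

```python
# Minimal diagnostic: which device will PyTorch pick? (not part of InvokeAI)
import torch

print("PyTorch version:", torch.__version__)
print("GPU (CUDA/ROCm) available:", torch.cuda.is_available())

# On Windows with an AMD card and a CPU-only or CUDA wheel, this stays "cpu",
# which matches the "Using torch device: CPU" line in the log above.
device = "cuda" if torch.cuda.is_available() else "cpu"
print("Selected device:", device)
```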

What happened

[Screenshot: Capture d'écran 2024-12-30 003154]

What you expected to happen

I am a newbie in the field and want to discover something new, but problems follow me everywhere. Sorry for my poor English.

How to reproduce the problem

No response

Additional context

No response

Discord username

No response

ghost · Dec 29 '24 23:12

Hello, I have the same issue but with a mobile GPU, an AMD 780M. During installation I chose AMD and it seemed to install the dependencies (ROCm, etc.). Thanks.

mdenuit · Jan 01 '25 09:01

I searched for a solution and all I found were people with the same problem as me without a solution.

ghost · Jan 01 '25 09:01

ROCm is not fully available on Windows yet, only Linux. If you still want to use your AMD GPU on Windows, you might want to look at projects that use either DirectML or ZLUDA (SD.Next or any other A1111 AMD port).

Protxch · Jan 06 '25 16:01
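For readers who want to try the DirectML route mentioned above, here is a rough sketch of what it looks like with the separate torch-directml package. This is not something InvokeAI ships or supports; treat the exact calls as an assumption based on the torch-directml project.

```python
# Sketch only: requires `pip install torch-directml`; InvokeAI does not use this path.
import torch
import torch_directml

if torch_directml.is_available():
    # DirectML exposes AMD GPUs on Windows as a PyTorch device.
    dml = torch_directml.device()
    x = torch.randn(2, 3, device=dml)   # tensor created directly on the DirectML device
    print("Using DirectML device:", x.device)
else:
    print("DirectML not available, falling back to CPU")
```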

> ROCm is not fully available on Windows yet, only Linux. If you still want to use your AMD GPU on Windows, you might want to look at projects that use either DirectML or ZLUDA (SD.Next or any other A1111 AMD port).

Thanks, but I gave up on it.

ghost · Jan 11 '25 18:01

ROCm PyTorch branches for Windows: https://github.com/ROCm/pytorch/branches/all?query=windows&lastTab=overview

PyTorch ROCm support on Windows is coming approximately Q3 2025: https://github.com/ROCm/pytorch/issues/1802#issuecomment-2649202534

LexiconCode · Mar 24 '25 09:03
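Once those Windows wheels land, a simple way to check whether a given PyTorch install was actually built against ROCm is the small sketch below; on today's Windows builds torch.version.hip prints None.

```python
# Sketch: distinguish a ROCm build of PyTorch from a CUDA or CPU-only build.
import torch

# torch.version.hip is set only in ROCm builds; torch.version.cuda only in CUDA builds.
print("HIP (ROCm) runtime:", torch.version.hip)
print("CUDA runtime:", torch.version.cuda)

# In a ROCm build, AMD GPUs are still addressed through the torch.cuda API.
if torch.version.hip is not None and torch.cuda.is_available():
    print("ROCm GPU:", torch.cuda.get_device_name(0))
```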