
Vulkan broken on master as of now

Open Expro opened this issue 1 month ago • 6 comments

llama-cpp vulkan: Throws error on model load:

DEBUG GRPC stderr id="gpt-oss-120b-127.0.0.1:36315" line="/backends/vulkan-llama-cpp/llama-cpp-avx2: /backends/vulkan-llama-cpp/lib/libc.so.6: version `GLIBC_2.38' not found (required by /lib/x86_64-linux-gnu/libvulkan.so.1)" caller={caller.file="/build/pkg/model/process.go" caller.L=146 }

llama-cpp vulkan development: has a memory leak that causes a kernel panic due to OOM
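For anyone triaging this: the versioned-symbol requirements behind the error can be listed without binutils, using only grep and sort. A minimal sketch (in the failing setup the interesting files would be /lib/x86_64-linux-gnu/libvulkan.so.1 and the bundled /backends/vulkan-llama-cpp/lib/libc.so.6; /usr/bin/env is only a stand-in so the snippet runs anywhere):

```shell
# List the newest GLIBC symbol versions a binary references.
# grep -a treats the binary as text; sort -Vu orders version strings.
# /usr/bin/env is a stand-in target -- point this at libvulkan.so.1
# or the bundled libc.so.6 to reproduce the comparison from the error.
grep -aoh 'GLIBC_[0-9.]*' /usr/bin/env | sort -Vu | tail -n 3
```

If the newest version printed for libvulkan.so.1 is higher than anything the bundled libc.so.6 exports, the loader fails exactly as in the log above.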

Expro avatar Jan 07 '26 19:01 Expro

"GRPC stderr id="qwen3-vl-4b-instruct-127.0.0.1:42521" line="/backends/vulkan-llama-cpp/llama-cpp-avx2: /backends/vulkan-llama-cpp/lib/libc.so.6: version `GLIBC_2.38' not found (required by /lib/x86_64-linux-gnu/libvulkan.so.1)" caller={caller.file="/build/pkg/model/process.go" caller.L=146 }"

Can at least confirm I'm getting the same issue in my logs.

hacshacdgacs avatar Jan 09 '26 07:01 hacshacdgacs

I haven't replicated it yet, but I think this is likely caused by #7769: from that point on we consume Vulkan from upstream rather than from Ubuntu, and (reading the error) the glibc versions look mismatched.

mudler avatar Jan 11 '26 20:01 mudler

Created https://github.com/mudler/LocalAI/pull/7980 to confirm this by reverting to the vulkan-sdk from the repositories. However, we currently have issues on the CI and most of the backends aren't being published, so that needs to be fixed first before the workaround can be consumed.

The new images should now bundle the Vulkan libraries used during building as part of the backend package. That could well fix this bug, since the backend would no longer depend on the library in the base image but use the one shipped with the backend images.

However, if it persists, we need to check which pre-compiled libs the upstream vulkan-sdk ships that we should build ourselves instead.
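Enumerating those candidates could look like the sketch below (hypothetical: a fake SDK tree is created so the snippet runs anywhere; with a real download, point SDK_DIR at the unpacked vulkan-sdk directory instead):

```shell
# List the pre-compiled shared objects an unpacked SDK tree ships --
# these are the candidates to rebuild from source against an older glibc.
# The fake tree below is only so this sketch is self-contained.
SDK_DIR=$(mktemp -d)
mkdir -p "$SDK_DIR/x86_64/lib"
touch "$SDK_DIR/x86_64/lib/libvulkan.so.1.4.328"
find "$SDK_DIR" -type f -name '*.so*'
```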

mudler avatar Jan 11 '26 21:01 mudler

did some digging:

# From vulkan-sdk 1.4.328.1

❯ objdump -p x86_64/lib/libvulkan.so.1.4.328 | grep "NEEDED.*libc"
  NEEDED               libc.so.6

mudler@mudler-ubuntu-box ~/1.4.328.1
❯ strings x86_64/lib/libvulkan.so.1.4.328 | grep "^GLIBC_" | sort -V | tail -5
GLIBC_2.3.4
GLIBC_2.4
GLIBC_2.7
GLIBC_2.14
GLIBC_2.17

# And system package instead
❯ strings /usr/lib/x86_64-linux-gnu/libvulkan.so.1 | grep "^GLIBC_" | sort -V | tail -5
GLIBC_2.14
GLIBC_2.17
GLIBC_2.29
GLIBC_2.34
GLIBC_2.38

❯ cat /etc/os-release
PRETTY_NAME="Ubuntu 24.04.3 LTS"
NAME="Ubuntu"
VERSION_ID="24.04"
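So the listings line up with the error: the Ubuntu 24.04 libvulkan.so.1 asks for symbols up to GLIBC_2.38, which the libc bundled with the backend does not export. A portable sketch of that comparison (the "provided" value is a hypothetical example of what an older bundled libc tops out at, not taken from the logs):

```shell
# Compare the newest GLIBC version a library demands against the newest
# one the bundled libc provides; sort -V gives correct version ordering.
required="GLIBC_2.38"   # top requirement of the system libvulkan.so.1 above
provided="GLIBC_2.34"   # hypothetical ceiling of the bundled libc.so.6
newest=$(printf '%s\n%s\n' "$required" "$provided" | sort -V | tail -n 1)
if [ "$newest" != "$provided" ]; then
  echo "mismatch: loader wants $required, bundled libc stops at $provided"
fi
# prints: mismatch: loader wants GLIBC_2.38, bundled libc stops at GLIBC_2.34
```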

mudler avatar Jan 11 '26 21:01 mudler

Ok, let's start by bumping Vulkan before trying to go back to the Ubuntu repos - the new version looks like it's built against GLIBC 2.3*:

❯ strings x86_64/lib/libvulkan.so.1.4.335 | grep "^GLIBC_" | sort -V | tail -5
GLIBC_2.7
GLIBC_2.14
GLIBC_2.17
GLIBC_2.33
GLIBC_2.34

mudler avatar Jan 11 '26 21:01 mudler

May not be helpful, but I've built backend/cpp/llama-cpp using Vulkan on Windows, and that succeeds, so it's not a general code issue.

ericblade avatar Jan 15 '26 20:01 ericblade

New backend images are available on master now. I can't test them at the moment, but if you can check on your side, that would be great!

mudler avatar Jan 17 '26 12:01 mudler