Failed at `setup_env.py` gitstatus_query_p9k_
Hi! I'm using:
- Ubuntu 24.04
- clang version 18.1.8
- python 3.9.19 with pyenv (2.4.1)
- cmake 3.28.3
When I execute the command after downloading the model from Hugging Face:
python setup_env.py -md models/Llama3-8B-1.58-100B-tokens -q i2_s
I got this message:
INFO:root:Compiling the code using CMake.
INFO:root:Loading model from directory models/Llama3-8B-1.58-100B-tokens.
INFO:root:Converting HF model to GGUF format...
[1] 904944 terminated python setup_env.py -md models/Llama3-8B-1.58-100B-tokens -q i2_s
gitstatus_query_p9k_:print:68: write error: broken pipe
After debugging the setup_env.py file, I found that the problem arises when executing this line:
python utils/convert-hf-to-gguf-bitnet.py models/Llama3-8B-1.58-100B-tokens --outtype f32
I get a similar error during setup.
It is probably because the memory is exhausted. Can you provide the device information? You can also try running the 700M model to check whether it works.
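For anyone hitting the same thing, here is a minimal sketch (not part of the repo) for checking whether the machine has enough free RAM before the f32 conversion step. It assumes Linux (/proc/meminfo) and a rough estimate of ~4 bytes per parameter, i.e. about 32 GB for an 8B model:

```python
# Rough pre-flight memory check before running convert-hf-to-gguf-bitnet.py.
# Assumption: converting an 8B model to f32 needs roughly 8e9 * 4 bytes ≈ 32 GB.
NEEDED_GB = 32

def available_ram_gb() -> float:
    """Return MemAvailable from /proc/meminfo in GB (Linux only)."""
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith("MemAvailable:"):
                return int(line.split()[1]) / (1024 ** 2)  # value is in kB
    raise RuntimeError("MemAvailable not found in /proc/meminfo")

if __name__ == "__main__":
    avail = available_ram_gb()
    print(f"Available RAM: {avail:.1f} GB (rough need: {NEEDED_GB} GB for 8B f32)")
    if avail < NEEDED_GB:
        print("Conversion will likely be killed; try the 700M model or add swap.")
```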
I am also facing issues while running this command: python setup_env.py -md models/Llama3-8B-1.58-100B-tokens -q i2_s

The issue I am facing is:

INFO:root:Compiling the code using CMake.
ERROR:root:Error occurred while running command: Command '['cmake', '-B', 'build', '-DBITNET_X86_TL2=ON', '-T', 'ClangCL']' returned non-zero exit status 1., check details in logs\generate_build_files.log
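If it helps debugging, a quick way to see the actual CMake failure is to dump the end of the log file that the error points at (a small sketch; the path is taken from the error message above):

```python
# Print the tail of the CMake log referenced in the error message above.
from pathlib import Path

log = Path("logs") / "generate_build_files.log"
text = log.read_text(encoding="utf-8", errors="replace")
print(text[-4000:])  # the last few KB usually contain the real CMake error
```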
The solution I found is to use a model other than Llama3-8B-1.58-100B-tokens. Details are in my comment at https://github.com/microsoft/BitNet/issues/77#issuecomment-2436022313
Please try with the latest model on HF. https://huggingface.co/microsoft/bitnet-b1.58-2B-4T-gguf
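In case it is useful, a minimal sketch of pulling that model and re-running the setup with the same flags as above (assumes huggingface_hub is installed; the local directory name is just an example):

```python
# Download the newer 2B GGUF model and re-run setup_env.py against it.
import subprocess
from huggingface_hub import snapshot_download

model_dir = "models/bitnet-b1.58-2B-4T-gguf"  # example path, pick any directory
snapshot_download(repo_id="microsoft/bitnet-b1.58-2B-4T-gguf", local_dir=model_dir)

# Same flags as the original command, pointed at the new model directory.
subprocess.run(["python", "setup_env.py", "-md", model_dir, "-q", "i2_s"], check=True)
```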