Mac M2 cannot run inference
I have tried running both inference and batch inference, but it always gets stuck there; however, I can run inference if I don't use the index file. My Python version is 3.9.12 and my PyTorch version is 2.3.0.dev20240222, a nightly build. Should I install the stable one instead?
It doesn't work on PyTorch 2.2.1 either; I can see clearly in Activity Monitor that CPU and GPU usage is 0.
I encountered the same problem as you. I noticed this error when running webUI. I don’t know if it is related.
/Users/qrviom2/Retrieval-based-Voice-Conversion-WebUI1006/.venv/lib/python3.8/site-packages/gradio_client/documentation.py:102: UserWarning: Could not get documentation group for <class 'gradio.mix.Parallel'>: No known documentation group for module 'gradio.mix'
  warnings.warn(f"Could not get documentation group for {cls}: {exc}")
/Users/qrviom2/Retrieval-based-Voice-Conversion-WebUI1006/.venv/lib/python3.8/site-packages/gradio_client/documentation.py:102: UserWarning: Could not get documentation group for <class 'gradio.mix.Series'>: No known documentation group for module 'gradio.mix'
  warnings.warn(f"Could not get documentation group for {cls}: {exc}")
I also encountered related problems; I don't know if it will have any impact. :(
I also encountered the same problem as you. Has it been resolved?
I haven't found a solution yet; if I find one, I will share it :) I've been using it so far and the functions seem normal.
@XXXXRT666 Can you give me more details?
Yes, of course. I ran it on Python 3.9 (Anaconda) and Python 3.11 (with the 3.11 requirements installed).
After I click the convert button, it does run the inference, but there is no output.
After clicking unload:
I think it gets stuck at inference.
Using crepe with the index.
Setting the index ratio to zero, or just not using the index.
Also, memory is not released after unloading the model; I think it is related to PyTorch.
The program gets stuck at score, ix = index.search(npy, k=8) (infer.modules.vc.pipeline.vc).
For training, index.train(big_npy) leads to a segmentation fault.
I've tested run.sh, with the same result.
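For anyone who wants to reproduce this outside the webUI, here is a minimal sketch (random data and a hypothetical feature dimension, nothing taken from the repo) that isolates the two calls named above; on an affected macOS install it hangs or segfaults once torch has been imported first:

import numpy as np
import torch                                 # importing torch first, as the webUI does, triggers the clash
import faiss

dim = 256                                    # hypothetical feature dimension, for the repro only
big_npy = np.random.rand(10000, dim).astype("float32")
npy = np.random.rand(16, dim).astype("float32")

index = faiss.index_factory(dim, "IVF64,Flat")
index.train(big_npy)                         # segfaults here on affected setups (training path)
index.add(big_npy)
score, ix = index.search(npy, k=8)           # or hangs/segfaults here (inference path)
print(score.shape, ix.shape)                 # never reached when the bug is present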
@XXXXRT666 Thanks, I'll fix it.
Just to confirm: does the inference work better if the index ratio is set to zero?
Yes, because the check if (... and index_rate != 0): skips index.search() when the index rate is zero.
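In other words (a hypothetical, simplified helper for illustration; not the repo's actual code), the behaviour is roughly:

def maybe_search(index, big_npy, npy, index_rate, k=8):
    # Only attempt the faiss lookup when an index is loaded and index_rate is non-zero;
    # with an index ratio of 0 the call that hangs on macOS is never reached.
    if index is not None and big_npy is not None and index_rate != 0:
        return index.search(npy, k)   # the problematic call
    return None                       # skipped: inference proceeds without retrieval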
@XXXXRT666 In the meantime, I've fixed the problem of memory not being freed up. #2035
Thank you for your great efforts.
fyi, index.search hangs because faiss segfaults when pytorch is imported. It's a faiss glitch. Faiss suggests that macOS users install via conda, which should in theory fix it.
macOS users can either:
- use conda for faiss instead of pip,
- build faiss from source yourself and install its Python binding, or
- change the code to spawn a separate subprocess for faiss (a rough sketch follows below).
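As an illustration of the subprocess option (a sketch only; the helper names, queue-based hand-off, and paths are all hypothetical), the idea is that faiss only ever runs in a freshly spawned child that has not imported torch:

from multiprocessing import get_context

# NOTE: keep this helper in a module that does NOT import torch at the top level,
# otherwise the spawned child re-imports torch when it loads the module and the
# clash can come back.

def _faiss_search_worker(index_path, queries, k, out_queue):
    import faiss                               # imported only inside the child process
    index = faiss.read_index(index_path)       # load the .index file written at training time
    score, ix = index.search(queries, k)
    out_queue.put((score, ix))

def search_in_subprocess(index_path, queries, k=8):
    ctx = get_context("spawn")                 # fresh interpreter, no inherited torch state
    q = ctx.Queue()
    p = ctx.Process(target=_faiss_search_worker, args=(index_path, queries, k, q))
    p.start()
    score, ix = q.get()                        # results come back pickled through the queue
    p.join()
    return score, ix

# hypothetical usage (float32 query matrix, path to an existing .index file):
# score, ix = search_in_subprocess("logs/mymodel/added.index", npy, k=8)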
Thank you very much, this indeed worked.
@Tps-F
@XXXXRT666 How did you solve this problem?
Use conda to install it, or install it from source.
@XXXXRT666 Thanks for the tip
Hi, I still cannot make it work using conda and pip; maybe I'm doing something wrong? What I did was:
conda create -n env python=3.10
conda activate env
pip install -r requirements.txt
python ./infer-web.py
Any help? TIA
OK, I got it to work. Here's the solution:
conda create -n env python=3.10
conda activate env
pip install -r requirements.txt
pip uninstall numba faiss-cpu
conda install conda-forge::numba pytorch::faiss-cpu
python ./infer-web.py
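If it helps anyone else, a quick sanity check (random data, nothing RVC-specific) that the conda-installed faiss no longer clashes with torch:

import numpy as np
import torch                                 # import torch first, the order that used to trigger the hang
import faiss

xb = np.random.rand(1000, 64).astype("float32")
index = faiss.IndexFlatL2(64)
index.add(xb)
score, ix = index.search(xb[:5], k=8)
print("faiss OK:", score.shape, ix.shape, "| torch", torch.__version__)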