Parler-TTS doesn't work when installed from gallery; documentation unhelpful
LocalAI version: v2.16.0-cublas-cuda12-ffmpeg
Environment, CPU architecture, OS, and Version: Running the v2.16.0-cublas-cuda12-ffmpeg image in Kubernetes.
Describe the bug When I install Parler-TTS through the webui gallery and try to run it, I only get an error that the model could not be found; it doesn't appear to download a model anywhere. The documentation on the LocalAI website says the model can be installed and configured through the gallery, but installing it that way doesn't download a model, I can't find any way to configure it through the webui, and I can't find anything anywhere describing how to configure it manually.
To Reproduce Open the webui, go to the Models tab, find parler-tts-mini-v0.1, and install it. Then go to the TTS tab, load parler-tts-mini-v0.1, and try to prompt it.
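The same failure can presumably be reproduced without the webui by calling LocalAI's TTS endpoint directly. This is a hedged sketch: the server address and port are assumptions (adjust for your deployment), and the request shape follows the generic LocalAI `/tts` endpoint (`model` and `input` fields).

```shell
# Sketch: reproduce the TTS request via the HTTP API instead of the webui.
# Assumes LocalAI is reachable at localhost:8080; adjust as needed.
# On success this should write audio to out.wav; with this bug it returns
# the same "model could not be found" error as the webui.
curl -s http://localhost:8080/tts \
  -H "Content-Type: application/json" \
  -d '{"model": "parler-tts-mini-v0.1", "input": "Hello from Parler."}' \
  -o out.wav
```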
Expected behavior Audio output is produced, as it is with Piper.
Logs
Debug Output
11:41PM INF Loading model with backend parler-tts
11:41PM DBG Stopping all backends except ''
11:41PM DBG Loading model in memory from file: /models
11:41PM DBG Loading Model with gRPC (file: /models) (backend: parler-tts): {backendString:parler-tts model: threads:0 assetDir:/tmp/localai/backend_data context:{emptyCtx:{}} gRPCOptions:0xc0004d6248 externalBackends:map[autogptq:/build/backend/python/autogptq/run.sh bark:/build/backend/python/bark/run.sh coqui:/build/backend/python/coqui/run.sh diffusers:/build/backend/python/diffusers/run.sh exllama:/build/backend/python/exllama/run.sh exllama2:/build/backend/python/exllama2/run.sh huggingface-embeddings:/build/backend/python/sentencetransformers/run.sh mamba:/build/backend/python/mamba/run.sh openvoice:/build/backend/python/openvoice/run.sh parler-tts:/build/backend/python/parler-tts/run.sh petals:/build/backend/python/petals/run.sh rerankers:/build/backend/python/rerankers/run.sh sentencetransformers:/build/backend/python/sentencetransformers/run.sh transformers:/build/backend/python/transformers/run.sh transformers-musicgen:/build/backend/python/transformers-musicgen/run.sh vall-e-x:/build/backend/python/vall-e-x/run.sh vllm:/build/backend/python/vllm/run.sh] grpcAttempts:20 grpcAttemptsDelay:2 singleActiveBackend:true parallelRequests:false}
11:41PM DBG Loading external backend: /build/backend/python/parler-tts/run.sh
11:41PM DBG Loading GRPC Process: /build/backend/python/parler-tts/run.sh
11:41PM DBG GRPC Service for will be running at: '127.0.0.1:36177'
11:41PM DBG GRPC Service state dir: /tmp/go-processmanager804239804
11:41PM DBG GRPC Service Started
11:41PM DBG GRPC(-127.0.0.1:36177): stdout Initializing libbackend for build
11:41PM DBG GRPC(-127.0.0.1:36177): stdout virtualenv activated
11:41PM DBG GRPC(-127.0.0.1:36177): stdout activated virtualenv has been ensured
11:41PM DBG [WatchDog] Watchdog checks for idle connections
11:41PM DBG GRPC(-127.0.0.1:36177): stderr /build/backend/python/parler-tts/venv/lib/python3.10/site-packages/transformers/utils/hub.py:124: FutureWarning: Using TRANSFORMERS_CACHE is deprecated and will be removed in v5 of Transformers. Use HF_HOME instead.
11:41PM DBG GRPC(-127.0.0.1:36177): stderr warnings.warn(
11:41PM DBG GRPC(-127.0.0.1:36177): stderr [parler-tts] startup: Namespace(addr='127.0.0.1:36177')
11:41PM DBG GRPC(-127.0.0.1:36177): stderr [parler-tts] Server started. Listening on: 127.0.0.1:36177
11:41PM DBG GRPC Service Ready
11:41PM DBG GRPC: Loading model with options: {state:{NoUnkeyedLiterals:{} DoNotCompare:[] DoNotCopy:[] atomicMessageInfo:
Additional context For a clean run, I tried the latest AIO image without any of my usual volumes, environment variables, etc. attached, and had the same result.
I have the same error
I have the same issue. Tried model gallery and also the API.
It would help if someone updated the API description for this; I currently can't find any LocalAI API documentation covering it!
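Since the manual-configuration docs are missing, here is a guess at what a hand-written model config might look like, based on the general LocalAI model YAML layout (a file dropped into the models directory). The Hugging Face repo ID and field values are assumptions, not confirmed for this backend:

```yaml
# Hypothetical /models/parler-tts-mini-v0.1.yaml — field values are assumptions.
name: parler-tts-mini-v0.1
backend: parler-tts
parameters:
  # Assumed Hugging Face repo ID for the mini v0.1 checkpoint.
  model: parler-tts/parler_tts_mini_v0.1
```

If the gallery install is supposed to generate a file like this, its absence would explain the "model could not be found" error.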
I'm also getting the same error, with LocalAI installed via the script on Ubuntu instead of Docker like the first reporter.
This issue is stale because it has been open 90 days with no activity. Remove stale label or comment or this will be closed in 5 days.
This issue was closed because it has been stalled for 5 days with no activity.