Sebastian Whincop
The link has a typo in the domain name. The link should be: https://gpt4all.io/installers/gpt4all-0.1.0-Darwin.dmg
Same error for me as well: ERROR: failed to solve: process "/bin/sh -c apt-get update && apt-get install -y curl wget git && rm -rf /var/lib/apt/lists/* && curl -s -L...
> mlx_whisper with --condition-on-previous-text False is helpful for me.
>
> source: https://huggingface.co/mlx-community/whisper-large-v3-mlx/discussions/4#674de7ed37268b0dca069695

This seems to have solved it for me also.
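In case it helps others, a minimal CLI call with that option would look roughly like this, assuming the mlx_whisper CLI mirrors the flag name from the quote (the audio filename is just a placeholder):

```bash
# Transcribe with the large-v3 MLX model, disabling conditioning on previous text
mlx_whisper audio.mp3 \
  --model mlx-community/whisper-large-v3-mlx \
  --condition-on-previous-text False
```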
Just checked this workflow that fails on my Linux ARM standalone n8n instance, and can confirm it works when processed on an x86 instance running 1.98.2.
Getting the same error in n8n version 1.97.1, self-hosted.
> Could you solve the problem that the output of the reasoning model copies the content of the think block?

That's a good catch, I will look into that...
> Could you solve the problem that the output of the reasoning model copies the content of the think block?

Hi @fubaochen, I have tried a few thinking...
> I used vLLM to locally deploy DeepSeek R1 and used --enable-reasoning --reasoning-parser deepseek_r1 to output inference. My openwebui version is 5.20. openwebui can parse the inference content...
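For reference, the launch command that comment describes would look roughly like the sketch below; the model identifier is only an example and not taken from the original comment:

```bash
# Start an OpenAI-compatible vLLM server with DeepSeek-R1-style reasoning parsing
vllm serve deepseek-ai/DeepSeek-R1-Distill-Qwen-32B \
  --enable-reasoning \
  --reasoning-parser deepseek_r1
```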