Sebastian Whincop

Results: 8 comments

The link has a typo in the domain name. The link should be: https://gpt4all.io/installers/gpt4all-0.1.0-Darwin.dmg

Same error for me as well: ERROR: failed to solve: process "/bin/sh -c apt-get update && apt-get install -y curl wget git && rm -rf /var/lib/apt/lists/* && curl -s -L...
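For context, the failing layer in the error above corresponds to a Dockerfile `RUN` instruction roughly like the sketch below. This is a reconstruction from the truncated error message only; the `curl -s -L` target is cut off in the log and is left out here rather than guessed.

```shell
# Hypothetical reconstruction of the failing Dockerfile layer.
# The curl download target is truncated in the error output and omitted.
RUN apt-get update \
    && apt-get install -y curl wget git \
    && rm -rf /var/lib/apt/lists/*
```

Splitting the long chained `RUN` into smaller steps like this can also help narrow down which command in the chain actually fails.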

> > mlx_whisper with --condition-on-previous-text False is helpful for me.
> >
> > source: https://huggingface.co/mlx-community/whisper-large-v3-mlx/discussions/4#674de7ed37268b0dca069695

This seems to have solved it for me also.
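For anyone else landing here, the quoted workaround can be run as a one-liner. This is a sketch, not a verified invocation: the input file name is a placeholder, and it assumes the `mlx_whisper` CLI with the `mlx-community/whisper-large-v3-mlx` model repo mentioned in the linked discussion.

```shell
# Sketch: disable conditioning on previous text to avoid repetition loops.
# "audio.mp3" is a placeholder for your input file.
mlx_whisper audio.mp3 \
  --model mlx-community/whisper-large-v3-mlx \
  --condition-on-previous-text False
```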

![Image](https://github.com/user-attachments/assets/fde550f4-f354-400d-a53f-4b804adf1b20)

Just checked this workflow, which fails on my Linux ARM standalone n8n instance, and can confirm it works when processed on an x86 instance running 1.98.2.

![Image](https://github.com/user-attachments/assets/890cdc61-3781-4293-bae4-366d78a7a23e)

Getting the same error in n8n version 1.97.1, self-hosted.

![Image](https://github.com/user-attachments/assets/8ca92197-520e-4ac4-ad66-5c306ac11653)

> Could you solve the problem that the output of the replication inference model will copy the content of the think

That's a good catch, I will look into that....

> Could you solve the problem that the output of the replication inference model will copy the content of the think

Hi @fubaochen, I have tried a few thinking...

> I used vLLM to locally deploy DeepSeek R1 and use `--enable-reasoning --reasoning-parser deepseek_r1` to output inference. My Open WebUI version is 5.20. Open WebUI can parse the inference content...
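The setup quoted above can be sketched as a vLLM serve command. This is an assumption-laden sketch: the model path is a placeholder, and it only restates the two reasoning flags the quoted comment mentions.

```shell
# Sketch of the vLLM deployment described in the quote.
# "deepseek-ai/DeepSeek-R1" is a placeholder model path.
vllm serve deepseek-ai/DeepSeek-R1 \
  --enable-reasoning \
  --reasoning-parser deepseek_r1
```

With the reasoning parser enabled, vLLM separates the model's chain-of-thought from the final answer in its responses, which is what a client such as Open WebUI then renders as collapsible "thinking" content.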