Lukas
I never, ever thought I would (ever!) write this. Ever in my life :disappointed: but considering how long this issue has been around, plus it looks like it is not top...
Hey, I have started the Local Server in LM Studio, loaded the `lmstudio-community/Meta-Llama-3.1-8B-Instruct-GGUF` model, and exported the variables below:

```sh
export DEFAULT_MODEL="lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF"
export DEFAULT_VENDOR=OpenAI
export OPENAI_APIKEY=not-needed
export OPENAI_BASE_URL=http://localhost:1234/v1
```

I'm on Ubuntu 22.04.5 LTS,...
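To rule out the server itself before blaming fabric, it may help to hit LM Studio's OpenAI-compatible endpoint directly. A minimal check, assuming the port-1234 base URL from the exports above (the model id is just the one loaded in this thread):

```sh
# Ask the local server which models it is currently serving.
curl http://localhost:1234/v1/models

# Smoke test of the chat endpoint; the model id must match
# whatever the /v1/models call above reports.
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "lmstudio-community/Meta-Llama-3.1-8B-Instruct-GGUF",
        "messages": [{"role": "user", "content": "tell me why"}]
      }'
```

If either call fails, the problem is upstream of fabric.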
Additional info:

```sh
which fabric
$HOME/.gvm/pkgsets/go1.23.1/global/bin/fabric
```

I removed the references in the `.bashrc` and `.zshrc` files.
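Just a sketch of how one could double-check that no stale fabric binary is still being picked up after cleaning the rc files (paths are the ones from this thread, nothing else assumed):

```sh
# List every fabric binary on PATH, not just the first match.
type -a fabric

# See whether the gvm pkgset bin directory is still on PATH.
echo "$PATH" | tr ':' '\n' | grep -n 'gvm'

# Re-read the cleaned rc file in the current shell so the edit takes effect.
source ~/.bashrc
```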
Update: after restarting LM Studio I get a different error:

```sh
echo "tell me why" | fabric -p ai
error, status code: 401, message: Incorrect API key provided: not-needed. You...
```
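Just a guess, but that 401 wording looks like the real OpenAI API answering, which would suggest the `OPENAI_BASE_URL` override is not being picked up. A minimal way to narrow it down, assuming the same variables as above:

```sh
# The local LM Studio server should accept any key; a 401 here would be surprising.
curl http://localhost:1234/v1/models -H "Authorization: Bearer not-needed"

# Confirm the variables are actually set in the shell that runs fabric.
env | grep -E 'OPENAI_BASE_URL|OPENAI_APIKEY|DEFAULT_MODEL|DEFAULT_VENDOR'
```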
Another hit I have found. Should fabric have only one model?

```txt
fabric --listmodels
Available vendor models:
DryRun
[1] dry-run-model
```

Should I install more models somehow? Where can I...
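In case it is only an environment issue: a minimal re-check, assuming the exports from the earlier comment, would be to re-export in the same shell and ask fabric again:

```sh
# Re-export the OpenAI-compatible settings, then list models once more.
export OPENAI_BASE_URL=http://localhost:1234/v1
export OPENAI_APIKEY=not-needed
fabric --listmodels
```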
@sosacrazy126 thanks for the info from your side!
Hey @sosacrazy126, may I ask about the status? Is this issue abandoned/on-hold/awaiting/other? BR ;)
> Just happened with me, the log size went up to 30 GB

Same here ✋ log size 30+ GB, some crap on the mobile client wanted to reach a malware address...