Dane Madsen
Using llama.cpp as a submodule after commit ```b12fa0d1c13596869c512f49a526b979c94787cc``` seems to cause builds to fail because Ninja throws the error ```ninja: error: '/home/dane_madsen/Maid/lib/.git/modules/lib/butler/llama.cpp/index', needed by '/home/dane_madsen/Maid/lib/butler/llama.cpp/common/build-info.cpp', missing and no known rule to...
Hi, I was wondering if you could add a way to either search for, or get a list of, the models available to pull from ollama.ai. Currently the https://ollama.ai/library endpoint serves...
Currently I'm using the /api/tags endpoint for automated scanning of the network to find Ollama. This is working fine, but it may be better to have a dedicated ping endpoint...
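The scan described above can be sketched as follows. This is a minimal, hedged example, not Maid's actual implementation: it assumes Ollama's default port 11434 and relies on the documented shape of the `/api/tags` response (a JSON object with a `models` list); the helper names are hypothetical.

```python
import json
from urllib.error import URLError
from urllib.request import urlopen

def looks_like_ollama(body: str) -> bool:
    """Heuristic: Ollama's /api/tags returns a JSON object with a 'models' list."""
    try:
        data = json.loads(body)
    except json.JSONDecodeError:
        return False
    return isinstance(data, dict) and isinstance(data.get("models"), list)

def probe(host: str, port: int = 11434, timeout: float = 0.5) -> bool:
    """Return True if host:port answers /api/tags like an Ollama server.

    Assumes the default Ollama port; a real scanner would iterate over
    the local subnet and probably run probes concurrently.
    """
    try:
        with urlopen(f"http://{host}:{port}/api/tags", timeout=timeout) as resp:
            return looks_like_ollama(resp.read().decode("utf-8", "replace"))
    except (URLError, OSError):
        return False
```

A dedicated ping endpoint would let `looks_like_ollama` be replaced by a simple status-code check, avoiding the cost of serializing the full model list on every probe.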
Podman is a FOSS alternative to Docker and should work out of the box.
Replace with Linux
### Discussed in https://github.com/Mobile-Artificial-Intelligence/maid/discussions/521

Originally posted by **patrickmac110**, April 27, 2024: I'd like to run other models, like Phi-3, but there's no option for custom prompt formats such as \nQuestion \n
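A custom-prompt-format option could be as simple as a user-supplied template string with a placeholder for the message. The sketch below is illustrative only: the `{prompt}` placeholder name and `apply_template` helper are hypothetical, and the Phi-3 template shown is one commonly used instruct format, not necessarily what Maid would ship.

```python
def apply_template(template: str, user_msg: str) -> str:
    """Substitute the user's message into a custom prompt template.

    '{prompt}' is a hypothetical placeholder name chosen for this sketch.
    """
    return template.replace("{prompt}", user_msg)

# One possible template for Phi-3's instruct format:
PHI3_TEMPLATE = "<|user|>\n{prompt}<|end|>\n<|assistant|>\n"
```

Exposing the template string in settings would let users support any model's chat format without code changes.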
### Feature request

Many of the different AI platforms have an API for getting a list of available models. For instance, OpenAI and Mistral have the `/v1/models` endpoint, and...
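A client for the endpoint mentioned above can be sketched briefly. This assumes the OpenAI-style response shape (`{"object": "list", "data": [{"id": ...}, ...]}`), which Mistral's `/v1/models` also follows; the function names here are hypothetical, not part of Maid.

```python
import json
from urllib.request import Request, urlopen

def extract_model_ids(payload: dict) -> list[str]:
    """Pull model ids out of an OpenAI-style /v1/models response."""
    return [m["id"] for m in payload.get("data", [])]

def list_models(base_url: str, api_key: str) -> list[str]:
    """Fetch /v1/models from an OpenAI-compatible API and return model ids."""
    req = Request(
        f"{base_url}/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urlopen(req) as resp:
        return extract_model_ids(json.load(resp))
```

Because several providers share this response shape, one code path could populate the model picker for any OpenAI-compatible backend.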
This function was added to the `translate.h` header in #1811, but the associated function definition was not added to `translate.c`.
When saved, the character card file is empty. The app crashes when the user attempts to save an image.