vividfog

5 issues by vividfog

I appreciate the effort to keep the codebase simple; Ollama is second to none in its elegance. But removing the feature within a week, without much debate, was quick work...

enhancement
feedback wanted

Today I noticed that cmd+m works for selecting code blocks, but it does not work for Jupyter Notebook outputs. I had an error as output and wanted to check what WizardCoder would...

bug

I'm running an OpenAI-compatible llama-cpp-python server; the model is llama-2-13b-chat.Q8_0.gguf. It has been set as the default model via `llm models default` and configured for llm via extra-openai-models.yaml. The model...
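For reference, a minimal extra-openai-models.yaml entry along these lines is what I mean; the model_id and the api_base host/port here are illustrative placeholders rather than my exact values:

```yaml
# Illustrative entry only: model_id and api_base are placeholders for
# wherever the llama-cpp-python server happens to be listening.
- model_id: llama-2-13b-chat
  model_name: llama-2-13b-chat.Q8_0.gguf
  api_base: "http://localhost:8000/v1"
```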

It seems that GPT-3.5 isn't always consistent in its output, and Llama-2-13B has the same issue of extra output. In sequential runs using GPT-3.5, note the usage of mac/ubuntu and how...

It seems that `register_model()` in openai_models.py doesn't currently expect a `stream` variable in the .yaml file, and `can_stream` gets set to True by default (a sketch of what I mean is below). Many organizations use an...

enhancement
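A sketch of what such an entry could look like if `register_model()` read a streaming flag from the .yaml; the `can_stream` key and the endpoint URL are suggestions for discussion, not existing behaviour:

```yaml
# Proposed sketch only: a per-model flag so can_stream does not
# silently default to True for endpoints that cannot stream.
- model_id: my-proxy-gpt-35
  model_name: gpt-3.5-turbo
  api_base: "https://llm-proxy.example.com/v1"
  can_stream: false   # suggested key; not currently read by register_model()
```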