Jake Koenig
I updated the readme.
I just edited the readme to be more descriptive. A simpler solution is to make your config the following:
```
llm_custom_provider: ollama
llm_model: mixtral
```
I updated the readme. Thanks for reporting!
I just double-checked and it works on Windows for me. Feel free to re-open with more information.
`pip install -e .` (you may want to do this in a venv). Then you should be able to run rawdog.
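For reference, the full sequence might look like this (the `.venv` directory name is just a convention, and `rawdog --help` assumes the CLI accepts a `--help` flag):

```shell
# from the root of the cloned rawdog repo:

# create and activate a virtual environment (optional but recommended)
python -m venv .venv
source .venv/bin/activate   # on Windows: .venv\Scripts\activate

# install the package in editable mode so local edits take effect immediately
pip install -e .

# the rawdog CLI should now be on your PATH
rawdog --help
```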
> @mentatbutler The bot currently only reacts to inline comments not issue comments on PRs. I'm hoping to add that feature this week
Thanks for reporting! It's listed in the requirements.txt, but that file isn't actually used by the pyproject.toml. I made a [pr](https://github.com/AbanteAI/rawdog/pull/88) that fixes the issue.