5 comments by Gary Blankenship

Same. _edit_ did nothing and it's working fine anyway.

```
❯ RUST_BACKTRACE=1 FIG_LOG_LEVEL=trace fig source
error:
   0: Could not sync remote dotfiles
   1: Unexpected Server Error: Cannot find module '/node_modules/@fig/plugins/dist/cjs/plugins/powerlevel10k/index.js'...
```

Ollama works now that it has OpenAI API compatibility. Example:

```
ollama:
  base-url: http://localhost:11434/v1
  api-key-env: NA
  models:
    "codellama:7b":
      max-input-chars: 4000
```

then `mods -m codellama:7b` works.

In the Discord forum I added some code that acts as a middle layer between Mods and Gemini Pro.
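That Discord snippet isn't reproduced here, but a minimal sketch of the idea is below: a small local bridge that accepts OpenAI-style `/v1/chat/completions` requests from Mods and forwards the prompt to Google's `gemini-pro` `generateContent` endpoint. The port, the `GEMINI_API_KEY` environment variable, and the way messages are flattened into one prompt are illustrative assumptions, not the posted code.

```
# Hypothetical OpenAI-to-Gemini bridge (a sketch, not the code posted on Discord).
# Mods talks to /v1/chat/completions on localhost; the bridge forwards the prompt
# to Google's generateContent endpoint and translates the reply back.
import json
import os
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

GEMINI_URL = (
    "https://generativelanguage.googleapis.com/v1beta/models/"
    "gemini-pro:generateContent?key=" + os.environ["GEMINI_API_KEY"]
)

class Bridge(BaseHTTPRequestHandler):
    def do_POST(self):
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        # Flatten the OpenAI-style message list into a single Gemini prompt.
        prompt = "\n".join(m["content"] for m in body.get("messages", []))
        req = urllib.request.Request(
            GEMINI_URL,
            data=json.dumps({"contents": [{"parts": [{"text": prompt}]}]}).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            gemini = json.loads(resp.read())
        text = gemini["candidates"][0]["content"]["parts"][0]["text"]
        # Wrap the reply in the chat-completions shape Mods expects.
        out = json.dumps({
            "id": "gemini-bridge",
            "object": "chat.completion",
            "model": body.get("model", "gemini-pro"),
            "choices": [{
                "index": 0,
                "message": {"role": "assistant", "content": text},
                "finish_reason": "stop",
            }],
        }).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(out)))
        self.end_headers()
        self.wfile.write(out)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8099), Bridge).serve_forever()
```

With something like this running, Mods could presumably be pointed at it the same way as the Ollama entry above, e.g. `base-url: http://localhost:8099/v1`.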

I use Aider with the DeepSeek API. It's pretty successful compared to most models except Sonnet 3.5, imo.
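For reference, a typical invocation would look something like the snippet below; the litellm-style `deepseek/...` model name is an assumption about the setup, not a quote from the comment.

```
# Assumes a DeepSeek API key and Aider's litellm-style model naming.
export DEEPSEEK_API_KEY=sk-...
aider --model deepseek/deepseek-coder
```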

Jina.ai looks useful - https://jina.ai/reader - https://s.jina.ai/When%20will%20the%20next%20SpaceX%20launch%20be%3F
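The Reader works by prefixing a URL with `r.jina.ai` (or a search query with `s.jina.ai`, as in the link above) and returns LLM-friendly markdown. A quick illustrative check, with example.com standing in for a real page:

```
# Fetch a page as markdown via the Reader endpoint.
curl https://r.jina.ai/https://example.com

# Run a web search and get the top results back as markdown.
curl "https://s.jina.ai/When%20will%20the%20next%20SpaceX%20launch%20be%3F"
```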