features: proposed behaviours and ideas
An evergreen list of features and requests:
features.
- [x] [[ and ]] for jumping code block @aarnphm
- [x] scrollview on code buffer based on chat window cursor @aarnphm https://github.com/yetone/avante.nvim/issues/23#issuecomment-2293037076
- [x] token streaming #73
- [x] prompt for user input if the env var is not set #54
- [x] add support for different behaviour #55
- [x] support custom vendor #74
- [ ] Workspace context #76
- [x] Tool use
- [x] Prompt refactoring using aider's search queries @yetone
ui.
- [x] change chat window bottom padding for `vim.o.laststatus = 3`
bugs.
- [x] #32 (fixes with #62)
- [x] handle stream on closed buffer accordingly #73
- UI: #61
- refactor stream to `plenary.Job`, and update the view renderer lifecycle accordingly.
qol.
- [ ] #121
cc @yetone for more suggestions to add to this tracking list
Currently, we need to manually select a block using `v` to interact with it. Can we auto-select a block based on the tree-sitter context?
@yuchanns I use my configuration to leverage treesitter and quickly select a node with a simple `.` key.
https://github.com/yetone/cosmos-nvim/blob/b05a44687cd08a3664c084249ce7128df2ea1a6d/lua/layers/editor/configs.lua#L319
> Currently, we need to manually select a block using `v` to interact with it. Can we auto-select a block based on the tree-sitter context?
I think this should be user config. I would prefer it not to be too invasive.
That's great!
> Currently, we need to manually select a block using `v` to interact with it. Can we auto-select a block based on the tree-sitter context?
We can use the incremental_selection feature provided by nvim-treesitter to quickly select a block.
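For reference, a minimal sketch of enabling that module in an nvim-treesitter setup (the keymaps below are the plugin's documented defaults; adjust to taste):

```lua
-- Sketch: enable nvim-treesitter's incremental_selection module.
-- The keymaps shown are the plugin's documented defaults.
require("nvim-treesitter.configs").setup({
  incremental_selection = {
    enable = true,
    keymaps = {
      init_selection = "gnn",    -- start a selection at the node under the cursor
      node_incremental = "grn",  -- grow the selection to the parent node
      scope_incremental = "grc", -- grow the selection to the enclosing scope
      node_decremental = "grm",  -- shrink the selection again
    },
  },
})
```

With this, repeatedly pressing the incremental key expands the visual selection node by node, which is exactly the "auto-select a block" behaviour asked about above.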
New AI editor features dropping in zed. Sharing 🧵 for inspiration
https://x.com/zeddotdev/status/1825967812629631034?s=61
@yetone Hey, Supermaven has this feature in VS Code that lets you attach any open files to the chat message. I'd like the ability to attach files from open buffers to give the AI more context of the setup I'm working on, e.g. in K8s.
Would it be possible to add support for NvChad? My understanding is that lazy.nvim is currently recommended.
> Would it be possible to add support for NvChad? My understanding is that lazy.nvim is currently recommended.
That refers to the plugin manager, but avante is compatible with any of the major plugin managers, and NvChad has a place where you can add custom plugins. Check the avante readme and the NvChad docs on how to add plugins.
any plugin manager listed in the readme is supported.
will do, thanks!
Vertex AI allows using a set of LLMs (including Claude Sonnet); I was wondering if it's possible to integrate Vertex within avante.
> Hey, Supermaven has this feature in VS Code that lets you attach any open files to the chat message. I'd like the ability to attach files from open buffers to give the AI more context of the setup I'm working on, e.g. in K8s.
cursor seems to support this with @'ing a file in the prompt.
> Hey, Supermaven has this feature in VS Code that lets you attach any open files to the chat message. I'd like the ability to attach files from open buffers to give the AI more context of the setup I'm working on, e.g. in K8s.
> cursor seems to support this with @'ing a file in the prompt.
you can drag-and-drop or paste an image at the prompt atm. Feel free to add additional support for writing a path to a file.
Hi, I see that "workspace context" is in the roadmap.
Does that mean if I open nvim with `nvim /path/to/codebase`:
- Avante will have indexed my entire codebase and pass that as context in the API request?
- Or will it pass my Neovim workspace (all open buffers) in the API request?
If it's the first, that's impressive, but can we get the second one first?
Or somehow be able to pick which files' context we want to give?
> Hi, I see that "workspace context" is in the roadmap. Does that mean if I open nvim with `nvim /path/to/codebase`:
Check the issue for this.
I'd suggest adding more LSP features: avante could have an embedded LSP server to assist with diagnostics, generate new code actions based on diagnostics, etc.
I just got another idea: avante could maybe also provide OCR features, which would be super cool! Something like https://github.com/lukas-blecher/LaTeX-OCR, converting what's in the clipboard to text, LaTeX, Typst, etc.
Bear in mind that there's a more fully-featured analogue for VS Code called Cline. You may want to repurpose some of its approaches, maybe even translating parts of its code to Lua through an LLM.
Among its ideas:
- give the LLM the option to run a command (after confirmation)
- give the LLM the option to ask for access to another file
- give you the option to give the LLM the contents of a file
- give you the option to give the LLM the errors; for neovim the equivalent would be the quickfix or location list (because you can populate those from either `:make` or the LSP)
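A minimal sketch of the last idea, assuming all we want is the current quickfix entries serialized as plain text to hand to the LLM as extra context (the helper name is hypothetical, not an existing avante API):

```lua
-- Hypothetical helper: serialize the quickfix list (populated by
-- :make or the LSP) into plain text suitable for an LLM prompt.
local function quickfix_as_text()
  local lines = {}
  for _, item in ipairs(vim.fn.getqflist()) do
    -- entries may lack a buffer; fall back to "?" for the file name
    local name = item.bufnr > 0 and vim.api.nvim_buf_get_name(item.bufnr) or "?"
    table.insert(lines, string.format("%s:%d:%d: %s", name, item.lnum, item.col, item.text))
  end
  return table.concat(lines, "\n")
end
```

The same loop works for the location list by swapping `vim.fn.getqflist()` for `vim.fn.getloclist(0)`.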
I think this might help with the repo map (highlighting ast-grep).
This: https://github.com/yetone/avante.nvim/issues/676
Would like to add a feature to automatically generate git commit messages
> Would like to add a feature to automatically generate git commit messages
I think this is out of scope, but fwiw you can write this yourself pretty easily.
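For instance, a rough one-liner in this spirit, assuming some command-line LLM client (`llm` below is a placeholder for whatever client you actually use, not a tool avante ships):

```shell
# Sketch: feed the staged diff to an LLM CLI and use its reply as the message.
# `llm` is a placeholder for your provider's command-line client.
git diff --staged | llm "Write a concise, imperative git commit message for this diff"
```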
Please add support for tool (function) calling in Gemini models.
I want to have multiple conversations that I can toggle through for a single project.
`:AvanteClear` is a good start, but I want to recover past contexts and continue them.
You could add two commands:
- `AvanteOpen name`, which opens the conversation with the given name in the sidebar, or an empty conversation if no conversation yet exists with that name.
- `AvanteList`, which lists all conversation names for the current project.
And optionally:
- `AvanteCopy name`, which copies the context currently shown in the sidebar to a new conversation named `name`.
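To make the proposal concrete, here is a hedged sketch of how the first two commands could be registered. None of this exists in avante today: the `avante.conversations` module is imagined, and only the command wiring uses real Neovim API:

```lua
-- Hypothetical sketch: registers :AvanteOpen {name} and :AvanteList
-- on top of an imagined per-project conversation store.
local conversations = require("avante.conversations") -- hypothetical module

vim.api.nvim_create_user_command("AvanteOpen", function(opts)
  -- open the named conversation in the sidebar,
  -- creating an empty one if it does not exist yet
  conversations.open(opts.args)
end, { nargs = 1 })

vim.api.nvim_create_user_command("AvanteList", function()
  -- print all stored conversation names for the current project
  for _, name in ipairs(conversations.list()) do
    print(name)
  end
end, { nargs = 0 })
```

The interesting design work is in the store itself (where conversations are persisted per project, and how the sidebar swaps contexts); the command surface above is the easy part.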
I created a Telescope picker for providers: https://github.com/yetone/avante.nvim/discussions/1438 Maybe it would be interesting to add it to the core, or at least to the wiki?
> I created a Telescope picker for providers: #1438 Maybe it would be interesting to add it to the core, or at least to the wiki?
I think it's already possible, no? `<leader>a?` will open a telescope-like picker to change the provider.
wow, didn't know about it, is it documented?
After checking it out, I noticed `<leader>a?` doesn't work properly: it shows a list of all technically possible models. It shouldn't show models I have not set up. For instance, `<leader>a?` will let me choose vertex, but since I have not set it up, it will fail. My script handles that.
Tool use with Claude apparently works now. Although IMO it should be an opt-in feature, due to security and integrity considerations, or so that the user knows Avante will be modifying the filesystem without asking (it only created empty files in my case).
@aarnphm Can you add "Option to display selected model"?
I toggle between Sonnet, o3, and 4.1 quite a bit, and it's not possible to tell which model I'm using until after I've sent the query. While I don't think it's mandatory, the ability to either embed the current model name in a label somewhere, or just have an option to display it in the header (or as ghost text, similar to VSCode), would be super helpful.
I might take a stab at a PR over Victoria Day, but just in case I forget, best to add it to the list.