Cannot use @Codebase: file was not found locally ... models/all-MiniLM-L6-v2/tokenizer.json
Before submitting your bug report
- [X] I believe this is a bug. I'll try to join the Continue Discord for questions
- [X] I'm not able to find an open issue that reports the same bug
- [X] I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS: Windows 11
- Continue: 0.9.158
- IDE: VSCode 1.70.2
Description
This appears to be the same issue as #895, which was fixed in February, but I'm seeing it again with the latest versions.
To reproduce
- Start Continue
- An error occurs during the embedding process
Log output
Error indexing codebase: Error. local_files_only=true or 'env.allowRemoteModels=false' and file was not found locally at "C:\Users\ndurkee\.vscode\extensions\continue-continue-0.9.158-win32-x64\models\all-MiniLM-L6-v2\tokenizer.json".
For @codebase you can work around the issue by using the nomic-embed-text embeddings provider from ollama. However, @docs indexing seems to ignore that setting.
I am getting a very similar error message on macos and IntelliJ with continue v0.0.50, even when downloading the tokenizer from hugging face manually and placing it at that exact path:
Error: `local_files_only=true` or `env.allowRemoteModels=false` and file was not found locally at "/snapshot/continue-deploy/binary/models/all-MiniLM-L6-v2/tokenizer.json".
Is there any progress on this issue?
I am facing the same issue as well!
I'm facing the same issue too.
Guys, if you are using IntelliJ, please use local ollama with nomic-embed-text as the embeddingsProvider.
Revise your config.json to:

```json
"embeddingsProvider": {
  "provider": "ollama",
  "model": "nomic-embed-text",
  "apiBase": "http://localhost:11434"
},
```
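For reference, a minimal complete config.json using this workaround might look like the sketch below (the empty `models` array is just a placeholder for your existing model entries; `apiBase` can usually be omitted since `http://localhost:11434` is ollama's default):

```json
{
  "models": [],
  "embeddingsProvider": {
    "provider": "ollama",
    "model": "nomic-embed-text",
    "apiBase": "http://localhost:11434"
  }
}
```

Make sure the model is available locally (`ollama pull nomic-embed-text`) and that ollama is running before triggering a re-index.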
That works for most things, but not for @docs, unfortunately.
I'm getting the same issue with IntelliJ Ultimate 2024.1 on Windows with Continue 0.0.55. Unfortunately, using ollama as a workaround is not an option for me.
I am on the most current RustRover for Linux (.tar.gz install) and I get the same error with the most current Continue version.
I am using the groq API and not ollama.
What is the actual issue?
Would rebuilding Continue with a change to ./extensions/vscode/models/all-MiniLM-L6-v2/tokenizer.json help? (The path says vscode, but I am on RustRover, so I'm not sure the error message is even accurate.)
This bug makes it impossible for me to use the tool effectively, and it seems to have been around for 9 months now - is there any hope this will get a fix? @RomneyDa @Patrick-Erichsen
@vanhauser-thc - this should be fixed by some recent commits from @RomneyDa. Would you mind pulling down the latest version and seeing if things work for you?
@Patrick-Erichsen yes this works now for me, thanks!
This issue hasn't been updated in 90 days and will be closed after an additional 10 days without activity. If it's still important, please leave a comment and share any new information that would help us address the issue.
This issue was closed because it wasn't updated for 10 days after being marked stale. If it's still important, please reopen + comment and we'll gladly take another look!