Tobias Jung
Hi, it seems like 0.3.3 has vLLM support, but I'm not able to get a model up and running. I use the Docker environment with the following args: `--model LoneStriker/CodeFuse-DeepSeek-33B-4.0bpw-h6-exl2 --gpu-memory-utilization...
We are also facing this problem, on both Windows and Linux, though it is rarer under Linux.
Any progress on this topic?
The WebContextProvider is now also available in the release versions for JetBrains and VSCode. I tried it out and it works absolutely fine. In my case, I have to pay attention that the context window...
Alright, it seems like version 0.8.52 is much better, but there are lots of duplicate function entries for every source file.
Thanks for the answer @Patrick-Erichsen. I will close the issue then, since it isn't a bug. File names are shown for the files.
Same here: using a proxy works in VS Code, but not in JetBrains. I also tried the `Web` context provider, which accesses the network, but it cannot access...
#1376 There seems to be a ticket for using the proxy settings of JetBrains, but it hasn't been started yet.
@sestinj Hello, when is this fix expected to land in the release version? I use v0.8.54 and the problem still occurs there. I do not want to switch to pre-release...
@sestinj Yes it works, thanks a lot!