Larger context size for completion
Describe the need of your request
I'm not getting very useful results with completion, which I think is partly because the default context is too small. An example is initializing fields in the constructor: as the number of fields increases, the model can no longer see which fields exist to initialize. Most models nowadays can handle the entire file, I think, but it would still be a good idea to make the context size configurable to account for latency preferences (see the sketch at the end of this issue).
Proposed solution
No response
Additional context
No response
While you are at it, could you also please increase the maximum response token size, or at least make it configurable? I noticed that the auto-completion response was often correct and on the right track, but it gets truncated, so I have to manually type the rest (the sketch below shows both settings together).
I'm using a local Ollama instance with a 13B CodeLlama model.
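
For what it's worth, here is a minimal sketch of what I mean, assuming the extension ultimately issues a request against Ollama's `/api/generate` endpoint. Ollama already accepts `num_ctx` (context window size) and `num_predict` (generation length cap) as per-request options, so exposing those two numbers in the extension settings would cover both asks. The prompt and values below are only illustrative, not the extension's actual defaults.

```python
import requests

# Illustrative only: Ollama's /api/generate accepts a per-request "options" object.
# "num_ctx" sets the context window (how much surrounding code the model sees);
# "num_predict" caps how many tokens it may generate before the completion is cut off.
response = requests.post(
    "http://localhost:11434/api/generate",  # Ollama's default local endpoint
    json={
        "model": "codellama:13b",
        "prompt": "class Config:\n    def __init__(self, host, port, timeout, retries):\n        ",
        "options": {
            "num_ctx": 8192,     # larger context window (tokens)
            "num_predict": 256,  # higher response token limit
        },
        "stream": False,
    },
    timeout=120,
)
print(response.json()["response"])
```

If the extension already builds its request roughly like this, the feature request boils down to surfacing those two values in the settings rather than hard-coding them.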