Foong Zhi Yu
@sestinj thanks for looking into it. As for the docs fix, I think it's just a typo between streamComplete and streamCompletion. The config file fails to load if using streamComplete and it will...
> > @foongzy @timdahlmanns thanks for sharing all of the details, that made the fix quite straightforward! It looks like we just weren't setting providerName in the CustomLLM class: [c3d3980](https://github.com/continuedev/continue/commit/c3d3980e97af39ef75a8112963a18644ec807a69)...
Hi @sestinj, any updates regarding this bug?
> The error is fixed for me in the latest version.

Ahhh yes, thanks @timdah!
Hi @TyDunn, we built our own backend with logic that does custom processing of queries, adding any relevant context before sending them to the LLM endpoint. We then stream the response...
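For context, the pattern described above (call a custom backend, then stream its response back chunk by chunk) can be sketched as an async generator. This is only an illustrative sketch, not Continue's actual CustomLLM API: the function names are hypothetical, and the backend call is mocked rather than a real fetch to an endpoint.

```typescript
// Hypothetical sketch of streaming a custom backend's completion.
// In a real config.ts setup, fetchFromBackend would call your own
// endpoint (which injects relevant context before hitting the LLM).

// Mocked backend: returns the response as a sequence of chunks.
async function fetchFromBackend(prompt: string): Promise<string[]> {
  return ["Hello", ", ", "world"];
}

// Stream the completion chunk by chunk, as a CustomLLM-style hook might.
async function* streamComplete(prompt: string): AsyncGenerator<string> {
  const chunks = await fetchFromBackend(prompt);
  for (const chunk of chunks) {
    yield chunk;
  }
}

// Consumer that accumulates the streamed chunks into a full response.
async function collect(prompt: string): Promise<string> {
  let out = "";
  for await (const chunk of streamComplete(prompt)) {
    out += chunk;
  }
  return out;
}
```

The key point is that the streaming hook is an async generator, so the editor can render partial output as each chunk arrives instead of waiting for the full response.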
Hi @TyDunn, if you could provide an update, that would be great. It seems many are still relying heavily on the config.ts file for customisation.