Support for Custom/Local Hosted LLMs
First, congratulations on your outstanding work! I would greatly appreciate it if you could consider adding support for custom or locally hosted LLMs, such as those served by GPT4ALL or LMStudio. This enhancement would significantly empower the community by reducing reliance on major AI providers...
That is for the online application. I haven't tried the GitHub version yet, so I don't know if there is any difference.
I tested the current version locally, using the GitHub repo files. Support for local LLMs is missing there as well... any intention to provide it someday?
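For what it's worth, both GPT4All and LM Studio can serve models over an OpenAI-compatible HTTP endpoint, so supporting them may mostly be a matter of letting users override the API base URL. A minimal sketch of what a request to such a local server looks like (assumptions: LM Studio's default local address is `http://localhost:1234/v1`; the model name `local-model` is a placeholder for whatever model the user has loaded):

```python
import json
import urllib.request


def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible /chat/completions request for a local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


if __name__ == "__main__":
    # Assumes a local server (e.g. LM Studio) is already running on port 1234.
    req = build_chat_request("http://localhost:1234/v1", "local-model", "Hello!")
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

Since the wire format matches the OpenAI API, the same code path used for OpenAI could likely be reused with just a configurable base URL (and a dummy API key, since local servers typically don't check it).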