rag-chat
Prototype SDK for RAG development.
Hi there! I want to add Google Gemini as a provider to rag-chat. Would the best place to modify be within model.ts? How would I be able to...
I am trying to create a chatbot using Next.js and JavaScript. I first upload some context to a namespace in my Vector database. Then, when I try to ask questions,...
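The workflow described above (upsert context into a namespace, then retrieve it when answering) can be sketched without any real SDK. Everything here is a self-contained stand-in: the in-memory `db`, the `upsert`/`query` helpers, and the word-overlap "similarity" are illustrative assumptions, not the rag-chat or Upstash Vector API.

```typescript
// Self-contained sketch of the described flow (no real SDK):
// 1) upsert context chunks into a namespace, 2) retrieve the best match
// for a question. A real vector DB would use embeddings; here similarity
// is faked with shared-word counts purely for illustration.
type Namespace = Map<string, string>; // chunk id -> chunk text

const db = new Map<string, Namespace>();

function upsert(ns: string, id: string, chunk: string): void {
  const space = db.get(ns) ?? new Map<string, string>();
  space.set(id, chunk);
  db.set(ns, space);
}

function query(ns: string, question: string): string | undefined {
  const space = db.get(ns);
  if (!space) return undefined;
  const qWords = new Set(question.toLowerCase().split(/\W+/));
  let best: string | undefined;
  let bestScore = 0;
  for (const chunk of space.values()) {
    // Naive score: how many of the question's words appear in the chunk.
    const score = chunk
      .toLowerCase()
      .split(/\W+/)
      .filter((w) => qWords.has(w)).length;
    if (score > bestScore) {
      bestScore = score;
      best = chunk;
    }
  }
  return best;
}

upsert("docs", "1", "RAG combines retrieval with generation.");
upsert("docs", "2", "Next.js supports server components.");

console.log(query("docs", "What is retrieval augmented generation?"));
```

If the question at this step returns no answer in a real app, the usual suspects are an empty or mismatched namespace name, or context that was never successfully upserted.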
```
./node_modules/.pnpm/[email protected]/node_modules/undici/lib/web/fetch/util.js
Module parse failed: Unexpected token (860:57)
|   // 5. If object is not a default iterator object for interface,
|   // then throw a TypeError.
> if (typeof this...
```
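A common workaround for this class of error, sketched here as an assumption rather than a confirmed fix, is to stop webpack from bundling `undici` and keep it as a server-side external; the exact option name depends on the Next.js version.

```javascript
// next.config.js — hypothetical workaround sketch, not a confirmed fix:
// the parse error comes from webpack failing on undici's newer syntax,
// so keep the package external instead of bundling it.
module.exports = {
  experimental: {
    // Option name varies across Next.js releases (an assumption here).
    serverComponentsExternalPackages: ["undici"],
  },
};
```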
PDF-Based FAQ Example #50
There are a few LLM observability tools we can integrate, but we need to check their feasibility first. For now, we can start with:

- [x] LangSmith
- [ ] ...
```
{
  output: "I'm sorry, but I cannot answer that question based on the provided context and chat history.",
  isStream: false,
  metadata: [ undefined, undefined, undefined, undefined, undefined ],
}
```
...
Currently, our examples folder contains only basic use cases. It would be great to include some advanced usage and edge-case examples as well. PRs are welcome.
Currently, the RAG SDK only supports hosted models. If we could enable the use of local models, similar to web-llm, that would be great. The only issue is that while...
I can get the context from the chat method's return value, but when fetching from the history I lose that information. I want to get that in the history for...
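One workaround for the situation above, sketched here with no SDK at all, is to persist the retrieved context alongside each history entry yourself. The `Turn` shape, the in-memory `Map` store, and the `record` helper are all illustrative assumptions, not part of rag-chat's history API.

```typescript
// Hypothetical workaround sketch: keep each turn's retrieved context in
// your own history store, since the SDK's history may not include it.
interface Turn {
  role: "user" | "assistant";
  content: string;
  context?: string[]; // retrieved chunks for this turn (assumption)
}

const history = new Map<string, Turn[]>();

function record(sessionId: string, turn: Turn): void {
  const turns = history.get(sessionId) ?? [];
  turns.push(turn);
  history.set(sessionId, turns);
}

// After a chat() call that returned both an answer and its context,
// save both so a later history fetch can recover the context.
record("session-1", { role: "user", content: "What is RAG?" });
record("session-1", {
  role: "assistant",
  content: "Retrieval-augmented generation ...",
  context: ["chunk A", "chunk B"],
});

console.log(history.get("session-1")?.[1].context); // the saved chunks
```

The same idea works with any persistent store keyed by session id; the point is simply that the context is captured at chat time, before it is lost.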
This pull request updates the README.md file to include the installation of the @upstash/redis package along with @upstash/rag-chat. Additionally, it includes some improvements to the example code by wrapping asynchronous...
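The "wrapping asynchronous" improvement the PR describes is presumably the standard async-IIFE pattern, so `await` is valid in a plain example script and rejections are handled. The `chatExample` helper below is a stand-in assumption, not the SDK's API.

```typescript
// Hypothetical sketch of the pattern: wrap asynchronous example code in
// an async IIFE so `await` is legal outside a module with top-level
// await, and so a rejected promise is caught instead of unhandled.
async function chatExample(question: string): Promise<string> {
  // Stand-in for an SDK call such as `ragChat.chat(question)`.
  return `echo: ${question}`;
}

(async () => {
  try {
    const reply = await chatExample("hello");
    console.log(reply); // "echo: hello"
  } catch (err) {
    console.error("chat failed:", err);
  }
})();
```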