Feature request: ConversationalRetrieval and definition of chain_type
I think ConversationalRetrieval would be a great feature for this awesome project (https://python.langchain.com/docs/modules/chains/popular/chat_vector_db). Unfortunately I am not good enough to make it work myself; I don't know how to pass the "memory" around.
Additionally, I think some users would also benefit from a "threads" option for ctransformers, as well as from exposing the chain_type and the search_type of the retriever.
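For anyone who wants to experiment in the meantime, here is a minimal sketch of how the memory is usually passed around with LangChain's ConversationalRetrievalChain. It is only an illustration: the llm and retriever arguments stand in for whatever the project already builds, and chain_type is hard-coded here rather than read from the config.

from langchain.base_language import BaseLanguageModel
from langchain.chains import ConversationalRetrievalChain
from langchain.memory import ConversationBufferMemory
from langchain.schema import BaseRetriever


def build_chat_chain(llm: BaseLanguageModel, retriever: BaseRetriever) -> ConversationalRetrievalChain:
    # The memory stores the running chat history under "chat_history",
    # which is the key the chain's condense-question prompt expects.
    memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
    return ConversationalRetrievalChain.from_llm(
        llm=llm,
        retriever=retriever,
        memory=memory,
        chain_type="stuff",  # could be exposed as a config option later
    )

Each call like chain({"question": "..."}) then returns a dict with an "answer" key, and the memory carries the history into the next call.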
I will look into this. If the performance is good, I will make it the default; otherwise I will add a config option to enable it. It will also solve #22.
Looking forward to this, @marella. Is this still on your to-do list?
Hi, yes. I was out of station with a slow internet connection for the past few days, so progress has slowed down. I will start looking into the pending issues next week.
Great to have you back @marella
@marella I have been trying to implement this on my own. I think I am almost there and have it functional. The problem is that I can't get it to remember the prompt, only the answer to the prompt so far. Let me know if you want to see my changes.
Great! Please share link to your repo/branch.
I did this, @marella:
Here is my chains.py:
from typing import Any, Callable, Dict, Optional

from langchain.chains import ConversationalRetrievalChain
from langchain.memory import ConversationBufferMemory

from .llms import get_llm
from .vectorstores import get_vectorstore


def get_retrieval_qa(
    config: Dict[str, Any],
    *,
    callback: Optional[Callable[[str], None]] = None,
) -> ConversationalRetrievalChain:
    db = get_vectorstore(config)
    retriever = db.as_retriever(**config["retriever"])
    llm = get_llm(config, callback=callback)
    memory = ConversationBufferMemory(
        memory_key="chat_history",
        return_messages=True,
        input_key="question",
        output_key="answer",
    )
    return ConversationalRetrievalChain.from_llm(
        llm=llm,
        retriever=retriever,
        return_source_documents=True,
        memory=memory,
    )
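For context, this is roughly how the chain gets used afterwards (a sketch only; config is assumed to be the usual chatdocs config dict, and the real call site may differ). Because ConversationalRetrievalChain puts the reply under "answer" instead of RetrievalQA's "result", the UI also has to read the new key, which is what the next two changes do.

chain = get_retrieval_qa(config)  # config: the usual chatdocs config dict (assumed)

res = chain({"question": "What is this project about?"})
print(res["answer"])             # RetrievalQA would have used res["result"]
print(res["source_documents"])   # present because return_source_documents=True

# The memory now holds the first exchange, so the follow-up question is
# condensed together with the chat history before retrieval.
res = chain({"question": "Can you summarize that in one sentence?"})
print(res["answer"])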
In index.html I changed this:
answer.innerText = res.result; --> answer.innerText = res.answer;
In ui.py I changed this:
res["result"] = data["result"] --> res["answer"] = data["answer"]
It currently repeats the question before giving the answer, and then the repeated question is removed again.
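A likely cause (an assumption, I have not checked your branch): ConversationalRetrievalChain first rewrites the question plus the chat history into a standalone question, and if the streaming callback is attached to the LLM used for that condense step, the rewritten question gets streamed to the UI before the answer does. One workaround is to give the condense step its own non-streaming LLM via the condense_question_llm argument of from_llm; a sketch of the end of get_retrieval_qa above, assuming get_llm can be called without a callback:

    llm = get_llm(config, callback=callback)  # streams tokens to the UI
    condense_llm = get_llm(config)            # no callback, so the rewritten question stays silent
    return ConversationalRetrievalChain.from_llm(
        llm=llm,
        retriever=retriever,
        condense_question_llm=condense_llm,
        return_source_documents=True,
        memory=memory,
    )

Note that calling get_llm twice may load the model a second time with ctransformers, so this is only meant to confirm the cause, not as the final fix.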