
User-Specific Memory for same flow

Chunyfong opened this issue 2 years ago

I am attempting to deploy an API for my application that streams the model flow to multiple users. A key requirement is that each user has a unique memory space, but it appears that each response simply continues the same conversation rather than starting a new one for each user. How can I fix this?

I'm using the Basic Chat with Prompt and History example, with a ConversationBufferMemory component.

import requests

BASE_API_URL = "http://127.0.0.1:7860/api/v1/predict"
FLOW_ID = "IDIDIDIDID"

TWEAKS = {
  "ChatOpenAI-5yps2": {},
  "LLMChain-J9wLv": {},
  "PromptTemplate-xpUGT": {},
  "ConversationBufferMemory-enKYk": {}
}

def run_flow(message: str, flow_id: str, tweaks: dict | None = None) -> dict:
    api_url = f"{BASE_API_URL}/{flow_id}"
    payload = {"inputs": {"text": message}}

    if tweaks:
        payload["tweaks"] = tweaks

    response = requests.post(api_url, json=payload)
    return response.json()


response = run_flow("Hello, I'm Damon", flow_id=FLOW_ID, tweaks=TWEAKS)
print(response)

response = run_flow("what is my name ?", flow_id=FLOW_ID, tweaks=TWEAKS)
print(response)

{'result': {'text': 'Your name is Damon.'}}

After the Python script ends, I run it again:

response = run_flow("what is our conversation for past 4 sentences?", flow_id=FLOW_ID, tweaks=TWEAKS)
print(response)

{'result': {'text': 'In the past 4 sentences, we discussed your name. You first mentioned that your name is Damon, then you greeted me again as Damon. Finally, you asked me what your name is twice, to which I responded both times that your name is Damon.'}}

It seems that all conversations end up in the same memory buffer, and I don't know how to separate them.
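One possible client-side workaround, assuming the `/predict` endpoint shares a single ConversationBufferMemory across all callers: keep a separate history per user in the client and prepend it to each message before calling `run_flow`, so user isolation does not depend on the server-side buffer. The names here (`user_histories`, `build_message`) are illustrative, not part of the Langflow API:

```python
from collections import defaultdict

# Hypothetical per-user history store, keyed by a user/session id.
user_histories: dict[str, list[str]] = defaultdict(list)

def build_message(user_id: str, message: str) -> str:
    """Prefix the new message with this user's prior turns.

    Each user's turns stay in their own list, so one user's
    conversation never leaks into another's prompt. You could also
    append the model's replies here to give the flow full context.
    """
    history = "\n".join(user_histories[user_id])
    user_histories[user_id].append(f"User: {message}")
    return f"{history}\n{message}" if history else message
```

You would then call `run_flow(build_message(user_id, message), flow_id=FLOW_ID, tweaks=TWEAKS)` per request. This is only a sketch: whether the server-side ConversationBufferMemory still accumulates across users depends on your Langflow deployment, so you may also need to disable or reset that component.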

Chunyfong, Jul 05 '23 09:07