antoniodagata77

Results 3 comments of antoniodagata77

I have the same issue. If the answer provided by the LLM covers more than one line, the final answer shown by the UI includes only the first one. chainlit version...

This is the code slice of interest (assuming `import chainlit as cl`; `get_retriever` and `get_agent` are app-specific helpers):

```python
retriever = get_retriever()
agent = get_agent(retriever)
cl.HaystackAgentCallbackHandler(agent)

@cl.author_rename
def rename(orig_author: str):
    rename_dict = {"custom-at-query-time": "Agent Step"}
    return rename_dict.get(orig_author, orig_author)

@cl.on_message
async def main(message:...
```
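The rename hook above is just a dictionary lookup that maps internal Haystack step names to friendlier UI labels. A standalone sketch of the same logic, with no Chainlit dependency (the mapping entries are the ones from the snippet):

```python
def rename(orig_author: str) -> str:
    # Map internal author names to display labels; unknown
    # authors fall through unchanged via dict.get's default.
    rename_dict = {"custom-at-query-time": "Agent Step"}
    return rename_dict.get(orig_author, orig_author)

print(rename("custom-at-query-time"))  # Agent Step
print(rename("User"))                  # User
```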

I have the same problem. In my app I have set two variables to modify the prompt template in order to change the language and style of the agent's response....
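A minimal sketch of what "two variables modifying the prompt template" could look like; the template text, variable names (`language`, `style`), and defaults are illustrative assumptions, not the original app's code:

```python
# Hypothetical prompt template with two placeholders that control
# the language and style of the agent's response.
PROMPT_TEMPLATE = (
    "Answer the user's question in {language}, using a {style} tone.\n"
    "Question: {query}\n"
    "Answer:"
)

def build_prompt(query: str, language: str = "Italian", style: str = "formal") -> str:
    # Inject the two variables into the template at query time.
    return PROMPT_TEMPLATE.format(query=query, language=language, style=style)

print(build_prompt("What is Chainlit?", language="English", style="casual"))
```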