[BUG] Bedrock model metrics are not passed to Langfuse
Describe the bug
In the current version of Flowise (1.8.0), if you use any model in the ChatBedrock node and connect to your Langfuse instance, the model metrics won't show up in Langfuse. I tried bumping the langfuse import versions to the latest (3.11.1) and now get Time to first token, but nothing else. If I use the Anthropic or OpenAI models, the correct metrics show up in Langfuse.
I've spent way too many hours on this but was only able to get the model name by modifying
packages/components/nodes/chatmodels/AWSBedrock/FlowiseAWSChatBedrock.ts
and adding the following:
```typescript
invocationParams(options?: this['ParsedCallOptions']): any {
    const params = {
        model: this.model,
        temperature: this.temperature,
        maxTokens: this.maxTokens,
        modelKwargs: this.modelKwargs,
        ...super.invocationParams(options)
    }
    return params
}
```
It's not great, I know. I think there is something to be said for how the callback is done with langchain, but I still can't get it working. I know Langfuse recently updated their handler to deal with the odd way Bedrock decided to return its metadata versus every other provider out there.
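For context on that "odd" metadata shape: Bedrock reports token counts in an `amazon-bedrock-invocationMetrics` block (with `inputTokenCount`/`outputTokenCount`) rather than the `tokenUsage` object most langchain callbacks expect. A minimal sketch of the kind of normalization involved; the helper name and interfaces here are hypothetical, not part of Flowise or langchain:

```typescript
// Hypothetical helper: reshape Bedrock's "amazon-bedrock-invocationMetrics"
// block into the OpenAI-style tokenUsage object trackers typically read.
interface BedrockInvocationMetrics {
    inputTokenCount: number
    outputTokenCount: number
    invocationLatency?: number
    firstByteLatency?: number
}

interface TokenUsage {
    promptTokens: number
    completionTokens: number
    totalTokens: number
}

function normalizeBedrockUsage(metrics: BedrockInvocationMetrics): TokenUsage {
    return {
        promptTokens: metrics.inputTokenCount,
        completionTokens: metrics.outputTokenCount,
        totalTokens: metrics.inputTokenCount + metrics.outputTokenCount
    }
}

// Example input mirroring the block Bedrock returns in its response metadata
const usage = normalizeBedrockUsage({
    inputTokenCount: 12,
    outputTokenCount: 48,
    invocationLatency: 901,
    firstByteLatency: 310
})
console.log(usage) // { promptTokens: 12, completionTokens: 48, totalTokens: 60 }
```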
To Reproduce
Steps to reproduce the behavior:
- Enable Langfuse on chatflow
- Add AWS Bedrock Chat node and pick any model
- Chat with model
- Go to the Langfuse console > Generations > select your trace
- You'll see no metrics shown for that generation
- If you go to StrOutputParser in that trace, you will see the metadata in response_metadata when you use a Claude 3 model; it doesn't show for other models.
Expected behavior
Metrics like model name, temperature, and usage should show in Langfuse when using AWSChatBedrock.
Screenshots
Looks like this with the new langfuse version bump:
Should look like this:
Flow langfuse-test Chatflow.json
Additional context
@HenryHengZJ If we could get help from the Langfuse team (@maxdeichmann), that would be awesome.
We had a look at this with @vinodkiran; it seems like Langfuse is using custom logic on their end to calculate the metrics, since LangSmith is able to track those metrics.
@marcklingen can chime in
@HenryHengZJ Revisiting this and doing some testing, I found that the Bedrock metrics do not pass to LangSmith correctly either. This may be a langchain issue, but just FYI. I'm currently trying to restructure the langchain Bedrock util to output similarly to OpenAI; maybe that will make a difference.
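To illustrate what "output similar to OpenAI" could mean here: Claude 3 models on Bedrock return a `usage` block with `input_tokens`/`output_tokens`, while langchain's OpenAI integration surfaces usage under `llmOutput.tokenUsage`, which is what the trackers pick up. A rough sketch of that reshaping, under those assumptions (the function name and the `model_name` field placement are illustrative, not the actual util's API):

```typescript
// Sketch only: map a Claude 3 (Bedrock Messages API) usage block into an
// llmOutput object shaped like what langchain's OpenAI integration emits.
interface ClaudeUsage {
    input_tokens: number
    output_tokens: number
}

function toOpenAIStyleLlmOutput(usage: ClaudeUsage, model: string) {
    return {
        tokenUsage: {
            promptTokens: usage.input_tokens,
            completionTokens: usage.output_tokens,
            totalTokens: usage.input_tokens + usage.output_tokens
        },
        model_name: model // assumption: trackers read the model id from here
    }
}

const out = toOpenAIStyleLlmOutput(
    { input_tokens: 10, output_tokens: 25 },
    'anthropic.claude-3-sonnet-20240229-v1:0'
)
console.log(out.tokenUsage) // { promptTokens: 10, completionTokens: 25, totalTokens: 35 }
```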
Bedrock metrics in LangSmith
Anthropic metrics in LangSmith
@HenryHengZJ Any updates on this? I am connecting my Flowise with Langfuse and it can't track BedrockChat.
Trace details for BedrockChat:
Same chatflow, but if I use Chat with AzureOpenAI it's fine:
@QuinnGT thank you for creating this thread. Any workaround? Also tagging Langfuse's @marcklingen in case you have a solution for this. Thank you guys!!
I don't think the error is coming from Flowise, as we are passing the same callback to Langfuse.
Just saw this. This should work on the latest Langfuse Python SDK and server version; we've recently made some stability improvements to the callback handler for langchain, as it now exposes a more standardized interface for token counts. Can you try this again @QuinnGT?
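For readers wondering what a "more standardized interface for token counts" might look like: langchain's newer releases attach usage to messages via a `UsageMetadata`-style object with snake_case fields, which providers can target regardless of their native metadata shape. A small illustrative sketch, assuming that shape (the converter itself is hypothetical):

```typescript
// Assumed standardized usage shape, modeled after langchain's UsageMetadata.
interface UsageMetadata {
    input_tokens: number
    output_tokens: number
    total_tokens: number
}

// Hypothetical converter from the legacy camelCase tokenUsage object.
function fromTokenUsage(t: { promptTokens: number; completionTokens: number }): UsageMetadata {
    return {
        input_tokens: t.promptTokens,
        output_tokens: t.completionTokens,
        total_tokens: t.promptTokens + t.completionTokens
    }
}

console.log(fromTokenUsage({ promptTokens: 5, completionTokens: 7 }))
// { input_tokens: 5, output_tokens: 7, total_tokens: 12 }
```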
@marcklingen does Flowise need to update its Langfuse SDK version?