
[BUG] Bedrock model metrics are not passed to Langfuse

Open QuinnGT opened this issue 1 year ago • 6 comments

Describe the bug In the current version of Flowise, 1.8.0, if you use any model in ChatBedrock and connect to your Langfuse instance, those metrics won't show up in Langfuse. I tried bumping the langfuse import versions to the latest, 3.11.1, and I now get Time to first token, but nothing else. If I use the Anthropic or OpenAI models, the correct metrics show up in Langfuse.

I've spent way too many hours on this, but I was only able to get the model name by modifying packages/components/nodes/chatmodels/AWSBedrock/FlowiseAWSChatBedrock.ts

and adding the following:

    invocationParams(options?: this['ParsedCallOptions']): any {
        // Merge the node's own settings into the params LangChain reports to
        // callbacks, so tracing handlers at least see the model name.
        const params = {
            model: this.model,
            temperature: this.temperature,
            maxTokens: this.maxTokens,
            modelKwargs: this.modelKwargs,
            ...super.invocationParams(options)
        }

        return params
    }

It's not great, I know. I think there is something to be said about how the callback is wired up with langchain, but I still can't pin it down. I know Langfuse recently updated their handler to deal with the odd way Bedrock returns its metadata compared to every other provider out there.
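To illustrate the shape mismatch: for Anthropic models, Bedrock returns token counts under an `amazon-bedrock-invocationMetrics` block (with `inputTokenCount`/`outputTokenCount` fields), while tracing tools expect a flat usage object. A minimal sketch of the normalization, assuming those Bedrock field names and a hypothetical `{ input, output, total }` usage shape on the Langfuse side:

```typescript
// Field names as Bedrock returns them for Anthropic (Claude) models;
// other model families on Bedrock may use different keys.
interface BedrockInvocationMetrics {
    inputTokenCount: number
    outputTokenCount: number
    invocationLatency?: number
    firstByteLatency?: number
}

// Hypothetical flat usage shape a tracing generation would display.
interface NormalizedUsage {
    input: number
    output: number
    total: number
}

// Map Bedrock's provider-specific metrics to the flat usage shape.
function toNormalizedUsage(metrics: BedrockInvocationMetrics): NormalizedUsage {
    return {
        input: metrics.inputTokenCount,
        output: metrics.outputTokenCount,
        total: metrics.inputTokenCount + metrics.outputTokenCount
    }
}

// Example: metrics as they appear under response_metadata for a Claude 3 model.
const usage = toNormalizedUsage({ inputTokenCount: 12, outputTokenCount: 34 })
console.log(usage) // { input: 12, output: 34, total: 46 }
```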

To Reproduce Steps to reproduce the behavior:

  1. Enable Langfuse on chatflow
  2. Add AWS Bedrock Chat node and pick any model
  3. Chat with model
  4. Go to langfuse console > Generations > select your trace
  5. You'll see no metrics show for that generation
  6. If you go to StrOutputParser in that trace, you will see the metadata in response_metadata when using a Claude 3 model; it doesn't show for other models.

Expected behavior Metrics like model name, temperature, and usage should show in Langfuse when using AWSChatBedrock.

Screenshots Looks like this, with new langfuse version bump: Screenshot 2024-05-23 at 4 44 03 PM

Should look like this: Screenshot 2024-05-23 at 4 44 27 PM

Flow langfuse-test Chatflow.json

Additional context @HenryHengZJ If we could get help from the Langfuse team (@maxdeichmann), that would be awesome.

QuinnGT avatar May 23 '24 23:05 QuinnGT

We had a look at this with @vinodkiran; it seems like Langfuse is using custom logic on their end to calculate the metrics, since LangSmith is able to track those metrics.

@marcklingen can chime in

HenryHengZJ avatar May 31 '24 22:05 HenryHengZJ

@HenryHengZJ Revisiting this and doing some testing, I found that the Bedrock metrics do not pass to LangSmith correctly either. This may be a langchain issue, but just FYI. I'm currently trying to restructure the langchain Bedrock util to output metadata in the same shape as OpenAI; maybe that will make a difference.
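The restructuring mentioned above could be sketched as a mapper from Bedrock's metrics to LangChain's standardized `usage_metadata` keys (`input_tokens`, `output_tokens`, `total_tokens`), which provider-agnostic tracing callbacks read. This is a hypothetical illustration, not the actual patch; the Bedrock field names are those used for Anthropic models:

```typescript
// Bedrock metrics as seen in response_metadata for Claude models (assumption:
// other Bedrock model families would need their own key mapping).
interface BedrockMetrics {
    inputTokenCount: number
    outputTokenCount: number
}

// LangChain's standardized usage_metadata key names, as emitted by
// OpenAI-style integrations and consumed by tracing callbacks.
interface UsageMetadata {
    input_tokens: number
    output_tokens: number
    total_tokens: number
}

// Hypothetical mapper: reshape Bedrock metrics into the standardized form
// so downstream handlers see the same keys regardless of provider.
function toUsageMetadata(m: BedrockMetrics): UsageMetadata {
    return {
        input_tokens: m.inputTokenCount,
        output_tokens: m.outputTokenCount,
        total_tokens: m.inputTokenCount + m.outputTokenCount
    }
}

console.log(toUsageMetadata({ inputTokenCount: 100, outputTokenCount: 25 }))
// { input_tokens: 100, output_tokens: 25, total_tokens: 125 }
```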

Bedrock metrics in LangSmith Screenshot 2024-06-13 at 12 42 39 AM

Anthropic metrics in LangSmith Screenshot 2024-06-13 at 12 43 01 AM

QuinnGT avatar Jun 13 '24 07:06 QuinnGT

@HenryHengZJ Any updates on this? I am connecting my Flowise with Langfuse and it can't track BedrockChat.

Trace details for BedrockChat: Image

Same chatflow but if I use Chat with AzureOpenAI it's fine: Image

@QuinnGT thank you for creating this thread. Any workaround? Also tagging Langfuse (@marcklingen) in case you have a solution for this. Thank you guys!!

nmhieu-infostatus avatar Mar 27 '25 11:03 nmhieu-infostatus

I don't think the error is coming from Flowise, as we are passing the same callback to Langfuse.

HenryHengZJ avatar May 05 '25 07:05 HenryHengZJ

Just saw this. This should work on the latest Langfuse Python SDK and server version: we've recently made some stability improvements to the CallbackHandler for langchain, as it now exposes a more standardized interface for token counts. Can you try this again, @QuinnGT?

marcklingen avatar May 06 '25 23:05 marcklingen

@marcklingen does Flowise need to update its Langfuse SDK version?

HenryHengZJ avatar May 07 '25 05:05 HenryHengZJ