
feat: implement LLM monitoring with langchainrb integration

Open monotykamary opened this issue 1 year ago • 5 comments

Description

This PR introduces a crude implementation of LLM Monitoring with LangChainrb integration to the Sentry Ruby SDK. The changes include:

  1. Addition of a new monitoring.rb file in the sentry-ruby/lib/sentry/ai/ directory, which implements AI monitoring functionality.
  2. Creation of a langchain.rb file in both sentry-ruby/lib/sentry/ and sentry-ruby/lib/sentry/ai/ directories, providing LangChain integration for the Sentry Ruby SDK.
  3. Potential updates to span.rb and transaction.rb to support these new features.

These changes enhance Sentry's capabilities in monitoring and integrating with AI-related technologies, particularly focusing on LangChain integration.
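For orientation, the Python SDK exposes this pattern as an `ai_track` decorator that wraps an LLM call in a span. A rough Ruby sketch of that shape is below; this is a hedged illustration of the pattern, not the code in this PR, and the names `AIMonitoring`, `track`, and the `"ai.*"` data keys are assumptions:

```ruby
# Hypothetical sketch of an ai_track-style helper, modeled on the
# pattern used by Sentry's Python SDK. In the real integration the
# wrapper would start a child span on the current scope; here a plain
# Struct stands in so the sketch runs without the sentry-ruby gem.
module AIMonitoring
  Span = Struct.new(:op, :description, :data)

  # Wrap an LLM call in a span-like record, capturing inputs and outputs.
  def self.track(description, op: "ai.run")
    span = Span.new(op, description, {})
    result = yield(span)
    span.data["ai.responses"] = [result].flatten
    [result, span]
  end
end

result, span = AIMonitoring.track("chat completion") do |span|
  span.data["ai.input_messages"] = [{ role: "user", content: "hi" }]
  "hello!" # stand-in for an actual LLM response
end

puts span.op # => "ai.run"
```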

Current problems

  • Currently can't get spans to show on the LLM Monitoring page, although most, if not all, of the expected span data is present in the implementation.
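One possible cause, going by how the Python SDK's LLM Monitoring is documented: the dashboard groups spans under an enclosing pipeline span (op `"ai.pipeline*"`) with model-call children (op `"ai.run*"`) and aggregates token counts from measurements. The sketch below shows that shape as plain data; the exact keys (`"ai.pipeline"`, `"ai.run"`, `"ai_total_tokens_used"`) are carried over from the Python integration as assumptions and are not verified against this PR:

```ruby
# Hedged sketch of the span nesting the LLM Monitoring page appears to
# expect: a pipeline span wrapping individual model-call spans, plus a
# token-usage measurement for the dashboard to aggregate.
pipeline_span = {
  op: "ai.pipeline",               # enclosing span for the whole AI operation
  description: "AI Query Execution",
  children: [
    {
      op: "ai.run",                # one span per model call
      description: "chat completion",
      data: {
        "ai.input_messages" => [{ role: "user", content: "testing input" }]
      }
    }
  ],
  measurements: { "ai_total_tokens_used" => { value: 42 } } # assumed key
}

# Quick sanity check of the nesting:
child_ops = pipeline_span[:children].map { |c| c[:op] }
puts child_ops.inspect # => ["ai.run"]
```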

Related Issues/PRs

  • #2406
  • #2405

Refactoring

  • No major refactoring was performed in this PR. All changes are related to new feature additions.

Changelog Entry

Added

  • Introduced AI monitoring capabilities (sentry-ruby/lib/sentry/ai/monitoring.rb)
  • Added LangChain integration (sentry-ruby/lib/sentry/langchain.rb and sentry-ruby/lib/sentry/ai/langchain.rb)
  • Enhanced span and transaction handling to support AI monitoring

Basic Testing:

```ruby
require 'sentry-ruby'
require 'langchain'
require 'sentry/langchain'

puts "Initializing Sentry..."
Sentry.init do |config|
  config.dsn = ENV['SENTRY_DSN']
  config.traces_sample_rate = 1.0
  config.debug = true # enable debug mode for more verbose logging
end

Sentry.with_scope do |scope|
  Sentry.set_tags(ai_operation: "Testing")

  transaction = Sentry.start_transaction(
    op: "ai.query",
    name: "AI Query Execution"
  )

  # Attach the transaction to the current scope so child spans nest under it.
  scope.set_span(transaction)

  begin
    Sentry::AI::Monitoring.ai_track("Testing")
    llm = Langchain::LLM::OpenAI.new(api_key: ENV['OPENAI_API_KEY'])
    result = llm.chat(messages: [{ role: "user", content: "testing input" }]).completion
    puts result
  rescue => e
    Sentry.capture_exception(e)
    raise
  ensure
    transaction.finish
  end
end
```

monotykamary avatar Sep 23 '24 10:09 monotykamary

@sl0thentr0py I'm not too familiar with Ruby or with how to get this completely up and running with LLM Monitoring, but I think I have a good start. Can you have a look?

monotykamary avatar Sep 23 '24 10:09 monotykamary

ty @monotykamary I will review this in a few days and see how best to package the new integration.

sl0thentr0py avatar Sep 23 '24 15:09 sl0thentr0py

This looks like an awesome addition to the sentry ruby ecosystem, please do not let it ~die~ end in a graveyard of forgotten open source pull requests 😅

rwojsznis avatar Feb 18 '25 05:02 rwojsznis

> This looks like an awesome addition to the sentry ruby ecosystem, please do not let it ~die~ end in a graveyard of forgotten open source pull requests 😅

@rwojsznis we're not letting that happen 🙃

@monotykamary thanks again for kick-starting this effort - are you still interested in working on this feature?

solnic avatar Feb 24 '25 11:02 solnic

I'm down for a redemption arc 🤘

monotykamary avatar Feb 24 '25 14:02 monotykamary