
@junmediatek
Owner

Summary

Implements a complete OpenTelemetry monitoring integration that lets users send AI call traces to Jaeger, Grafana Tempo, and other OTLP-compatible backends.

Changes

  • Add OpenTelemetry SDK dependencies
  • Create telemetry module for SDK management
  • Extend the experimental.openTelemetry configuration from a boolean to a full object
  • Integrate tracer into LLM stream and Agent generation
  • Support environment variables (OTEL_EXPORTER_OTLP_ENDPOINT, OTEL_SERVICE_NAME)
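The environment-variable support in the last bullet could be wired up roughly as below. This is a minimal sketch, not the PR's actual code: the `resolveTelemetryConfig` helper and the config shape are hypothetical, and the precedence (standard OTel environment variables overriding file config) is an assumption.

```typescript
// Hypothetical sketch: merge file config with the standard OpenTelemetry
// environment variables. Env-over-file precedence is an assumption.
interface OpenTelemetryConfig {
  enabled: boolean;
  endpoint?: string;
  serviceName?: string;
  sampleRate?: number;
}

function resolveTelemetryConfig(
  fileConfig: OpenTelemetryConfig,
  env: Record<string, string | undefined> = process.env,
): OpenTelemetryConfig {
  return {
    ...fileConfig,
    // OTEL_EXPORTER_OTLP_ENDPOINT and OTEL_SERVICE_NAME are standard
    // OpenTelemetry environment variables.
    endpoint: env.OTEL_EXPORTER_OTLP_ENDPOINT ?? fileConfig.endpoint,
    serviceName: env.OTEL_SERVICE_NAME ?? fileConfig.serviceName,
  };
}
```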

Configuration Example

{
  "experimental": {
    "openTelemetry": {
      "enabled": true,
      "endpoint": "http://localhost:4318",
      "serviceName": "opencode",
      "sampleRate": 1.0
    }
  }
}
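The `sampleRate` field suggests head-based sampling. The sketch below illustrates the idea with a deterministic trace-ID-ratio check; it is hypothetical — a real implementation would more likely delegate to OpenTelemetry's built-in `TraceIdRatioBasedSampler` rather than hand-roll this.

```typescript
// Hypothetical sketch of head sampling driven by sampleRate.
// Illustrative only; the SDK's TraceIdRatioBasedSampler does this for real.
function shouldSample(sampleRate: number, traceIdHex: string): boolean {
  if (sampleRate >= 1.0) return true;  // sample everything
  if (sampleRate <= 0.0) return false; // sample nothing
  // Map the low 8 hex digits of the trace ID onto [0, 1) and compare
  // against the configured ratio, so the decision is deterministic
  // per trace rather than random per call.
  const low = parseInt(traceIdHex.slice(-8), 16);
  return low / 0x100000000 < sampleRate;
}
```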

Tracked Data

  • Model ID, provider, operation type
  • Token usage (prompt/completion)
  • Session ID, agent name, user ID
  • Tool calls with arguments
  • Request duration and spans
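Data like the above typically lands on spans as attributes. A hedged sketch of that flattening step follows; the `gen_ai.*` keys come from OpenTelemetry's (still-evolving) GenAI semantic conventions, but the exact keys this PR emits are not shown, and the `LLMCallInfo` shape and session/agent keys are hypothetical.

```typescript
// Hypothetical sketch: flatten tracked LLM-call fields into span attributes.
// gen_ai.* keys follow the OpenTelemetry GenAI semantic conventions;
// the session/agent keys are app-specific assumptions.
interface LLMCallInfo {
  modelId: string;
  provider: string;
  operation: string; // e.g. "chat"
  promptTokens: number;
  completionTokens: number;
  sessionId: string;
  agentName?: string;
}

function toSpanAttributes(call: LLMCallInfo): Record<string, string | number> {
  const attrs: Record<string, string | number> = {
    "gen_ai.request.model": call.modelId,
    "gen_ai.system": call.provider,
    "gen_ai.operation.name": call.operation,
    "gen_ai.usage.input_tokens": call.promptTokens,
    "gen_ai.usage.output_tokens": call.completionTokens,
    // App-specific key, not part of the semantic conventions.
    "session.id": call.sessionId,
  };
  if (call.agentName !== undefined) attrs["agent.name"] = call.agentName;
  return attrs;
}
```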

@github-actions

Thanks for your contribution!

This PR doesn't have a linked issue. All PRs must reference an existing issue.

Please:

  1. Open an issue describing the bug/feature (if one doesn't exist)
  2. Add Fixes #<number> or Closes #<number> to this PR description

See CONTRIBUTING.md for details.
