Quality monitoring for your LLM applications.
Deploy your LLM apps with confidence; iterate and improve without breaking your product for your users.
Pinned
- openllmetry Public
Open-source observability for your GenAI or LLM application, based on OpenTelemetry
- openllmetry-js Public
Sister project to OpenLLMetry, but in TypeScript. Open-source observability for your LLM application, based on OpenTelemetry
- opentelemetry-mcp-server Public
Unified MCP server for querying OpenTelemetry traces across multiple backends (Jaeger, Tempo, Traceloop, etc.), enabling AI agents to analyze distributed traces for automated debugging and observability.
- go-openllmetry Public
Sister project to OpenLLMetry, but in Go. Open-source observability for your LLM application, based on OpenTelemetry
- openllmetry-ruby Public
Sister project to OpenLLMetry, but in Ruby. Open-source observability for your LLM application, based on OpenTelemetry
Repositories
- openllmetry-fastify-demo Public