OpenTelemetry GenAI observability — standard LLM telemetry without vendor lock-in
Use OpenTelemetry GenAI semantic conventions to send model, token, latency, error, provider, and agent telemetry into Sutrace while keeping an export path open.
OpenTelemetry GenAI observability
LLM observability should not require a proprietary trace shape. OpenTelemetry's GenAI semantic conventions define a shared vocabulary for model calls, token usage, exceptions, and metrics, with span shapes for model and agent operations and coverage across OpenAI, Anthropic, Bedrock, Azure AI, and MCP.
Sutrace treats those conventions as the import/export boundary.
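As a concrete sketch of that vocabulary, here is a hand-instrumented model call using the OTel JS API. The attribute keys below follow the published GenAI semantic conventions, but they have been renamed across semconv releases, so check the version your SDK pins. `callModel` is a hypothetical stand-in for a real provider call.

```ts
import { trace, SpanStatusCode } from "@opentelemetry/api";

const tracer = trace.getTracer("genai-demo");

// Hypothetical stand-in for a real provider SDK call.
async function callModel() {
  return { model: "gpt-4o-mini", usage: { input_tokens: 42, output_tokens: 7 } };
}

// One chat call, annotated with GenAI semantic-convention attributes.
// The span name follows the "{operation} {model}" convention.
async function tracedChatCall() {
  return tracer.startActiveSpan("chat gpt-4o-mini", async (span) => {
    span.setAttributes({
      "gen_ai.operation.name": "chat",
      "gen_ai.system": "openai",             // provider
      "gen_ai.request.model": "gpt-4o-mini", // requested model
    });
    try {
      const response = await callModel();
      span.setAttributes({
        "gen_ai.response.model": response.model,
        "gen_ai.usage.input_tokens": response.usage.input_tokens,
        "gen_ai.usage.output_tokens": response.usage.output_tokens,
      });
      return response;
    } catch (err) {
      span.recordException(err as Error); // exception event per OTel conventions
      span.setStatus({ code: SpanStatusCode.ERROR });
      throw err;
    } finally {
      span.end();
    }
  });
}
```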
Why OTel matters for AI agents
AI-agent stacks are fragmented:
- raw OpenAI and Anthropic SDKs
- LangChain and LangGraph
- LlamaIndex
- CrewAI
- OpenRouter
- Bedrock
- MCP tools
- custom orchestration
Without a common telemetry shape, teams get trapped in one vendor's trace model. OTel reduces that lock-in.
What Sutrace maps
| Sutrace field | OTel-style meaning |
|---|---|
| provider | GenAI system/provider |
| model | requested or response model |
| inputTokens | input token usage |
| outputTokens | output token usage |
| latencyMs | model-call duration |
| status / errorCode | exception and failure state |
| project | service or product context |
| agentId | agent identity |
| route | operation or workflow |
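To make the mapping concrete, here is a sketch of the extraction. Field names match the table and the GenAI attribute keys are the convention names, but the function itself is illustrative, not Sutrace's actual importer.

```ts
// Illustrative extraction from GenAI span attributes to the Sutrace
// fields above -- a sketch, not Sutrace's real import code.
type SutraceRecord = {
  provider?: string;
  model?: string;
  inputTokens?: number;
  outputTokens?: number;
  latencyMs: number;
  status: "ok" | "error";
  errorCode?: string;
};

type Attrs = Record<string, string | number | boolean | undefined>;

function fromGenAiSpan(attrs: Attrs, durationMs: number, hadError: boolean): SutraceRecord {
  return {
    provider: attrs["gen_ai.system"] as string | undefined,
    // Prefer the response model when present; fall back to the requested one.
    model: (attrs["gen_ai.response.model"] ?? attrs["gen_ai.request.model"]) as
      | string
      | undefined,
    inputTokens: attrs["gen_ai.usage.input_tokens"] as number | undefined,
    outputTokens: attrs["gen_ai.usage.output_tokens"] as number | undefined,
    latencyMs: durationMs,
    status: hadError ? "error" : "ok",
    errorCode: hadError ? (attrs["error.type"] as string | undefined) : undefined,
  };
}
```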
First integration
If you already have an OTel collector, keep it. Add Sutrace as a destination for GenAI spans and use the dashboard for cost, latency, errors, and budget alerts.
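A collector destination is configured the usual OTLP way. If you would rather export straight from the application, the same idea at the SDK level looks roughly like this; the endpoint URL and header name are placeholders, so take the real values from your Sutrace workspace settings.

```ts
import { NodeSDK } from "@opentelemetry/sdk-node";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";

// Endpoint and header name are placeholders -- substitute the values
// shown in your Sutrace workspace settings.
const sdk = new NodeSDK({
  traceExporter: new OTLPTraceExporter({
    url: "https://otel.sutrace.example/v1/traces",
    headers: { "x-sutrace-api-key": process.env.SUTRACE_API_KEY ?? "" },
  }),
});

sdk.start();
```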
If you do not have OTel yet, start with @sutrace/llm and export to OTel later:
```ts
import OpenAI from "openai";
import { wrapOpenAI } from "@sutrace/llm";

// Wrap the underlying OpenAI client so every call is traced by Sutrace.
const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

const openai = wrapOpenAI(client, {
  apiKey: process.env.SUTRACE_API_KEY,
  project: "support-agent",
  agent: "refund-router",
});
```
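Once wrapped, the client is used exactly like the underlying SDK, and each call is recorded with provider, model, token usage, latency, and error state. This assumes the wrapper proxies the standard OpenAI client surface.

```ts
// The wrapped client keeps the normal OpenAI API; telemetry is captured
// per call. (Assumes the wrapper proxies the standard client surface.)
const completion = await openai.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Is order #8123 eligible for a refund?" }],
});
```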
Positioning
Sutrace is not trying to replace OpenTelemetry. Sutrace is the backend and workflow layer for teams that want useful defaults: spend breakdowns, budget alerts, model/provider views, web/API checks, and later hardware telemetry in the same workspace.
The standard stays open. The operating view gets simpler.