## Installation

## Configuration
Set environment variables before starting your app:

| Variable | Required | Description |
|---|---|---|
| `CASCADE_API_KEY` | Yes | Organization API key (`csk_live_...`) |
| `CASCADE_ENDPOINT` | Yes | OTLP endpoint |
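For example, in a POSIX shell (the endpoint URL below is illustrative, not a real Cascade endpoint):

```shell
# Organization API key (starts with csk_live_)
export CASCADE_API_KEY="csk_live_..."

# OTLP endpoint spans are exported to (illustrative value)
export CASCADE_ENDPOINT="https://otlp.example.com:4318"
```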
### Initialize tracing
Call `initTracing()` once at the top of your application:
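A minimal sketch of the call shape. The stand-in below only mirrors the documented signature `initTracing({ project, evals, sessionEvals })` so the example is self-contained; in a real app you would import `initTracing` from the Cascade SDK, and the project name here is illustrative:

```typescript
// Stand-in mirroring the documented options; a real app imports this.
interface TracingOptions {
  project: string;
  evals?: string[];        // optional per-trace scorers
  sessionEvals?: string[]; // optional end-of-session scorers
}

let tracingInitialized = false;

function initTracing(options: TracingOptions): void {
  // The real SDK reads CASCADE_API_KEY / CASCADE_ENDPOINT here and
  // starts exporting spans over OTLP; this stub just records the call.
  tracingInitialized = options.project.length > 0;
}

// Call once, at the very top of your application, before any agent code runs.
initTracing({ project: "my-agent" });
```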
## Basic instrumentation
### 1. Wrap your entry point with `traceRun`

Wrap your agent's main execution in `traceRun` so every span is part of the same trace:
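A sketch of the pattern, using a local stand-in that mirrors the documented signature `traceRun(name, metadata, fn)` so it runs on its own; the run name and metadata keys are illustrative, and a real app imports `traceRun` from the SDK:

```typescript
// Stand-in: the real SDK opens a root span named `name` and closes it
// when fn settles, attaching `metadata` to the trace.
async function traceRun<T>(
  name: string,
  metadata: Record<string, string>,
  fn: () => Promise<T>,
): Promise<T> {
  return fn();
}

async function runAgent(userQuery: string): Promise<string> {
  return traceRun("support-agent", { env: "dev" }, async () => {
    // Every LLM call and tool call made in here lands in the same trace.
    return `answered: ${userQuery}`;
  });
}
```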
### 2. Wrap LLM clients

Wrap your OpenAI or Anthropic client with `wrapLlmClient` for automatic LLM span tracing:

### 3. Wrap tools with `tool`
Decorate functions that the LLM calls as tools:
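A sketch of the wrapping, with a local stand-in that mirrors the documented signature `tool(options, fn)`; the tool name, option fields beyond `name`, and the weather logic are all illustrative, and a real app imports `tool` from the SDK:

```typescript
// Stand-in: the real SDK records a tool span with the tool's name,
// arguments, and return value around each invocation.
interface ToolOptions {
  name: string;
  description?: string; // assumed field, for illustration
}

function tool<A extends unknown[], R>(
  options: ToolOptions,
  fn: (...args: A) => R,
): (...args: A) => R {
  return (...args: A) => fn(...args);
}

// Hypothetical tool the LLM can call; the hardcoded result is illustrative.
const getWeather = tool(
  { name: "get_weather", description: "Look up current weather for a city" },
  (city: string) => ({ city, tempC: 21 }),
);
```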
## Multi-turn sessions
For chat-like agents, group traces under a session by running each turn inside `traceSession` as a context:
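A sketch of a per-turn handler, with a stand-in mirroring the documented signature `traceSession(sessionId, fn)`; the handler name and reply logic are illustrative, and a real app imports `traceSession` from the SDK:

```typescript
// Stand-in: the real SDK tags every trace created inside fn with sessionId
// so all turns of one conversation are grouped together.
async function traceSession<T>(
  sessionId: string,
  fn: () => Promise<T>,
): Promise<T> {
  return fn();
}

async function handleTurn(sessionId: string, message: string): Promise<string> {
  return traceSession(sessionId, async () => {
    // In a real app, the turn's traceRun(...) call goes here.
    return `reply to: ${message}`;
  });
}
```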
## Sub-agents with `traceAgent`

Use `traceAgent` to name sub-agents within a run:
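A sketch of a named sub-agent, with a stand-in mirroring the documented signature `traceAgent(name, metadata, fn)`; the sub-agent name, metadata, and research logic are illustrative, and a real app imports `traceAgent` from the SDK:

```typescript
// Stand-in: the real SDK opens a child span named after the sub-agent,
// nested under the surrounding traceRun(...) root span.
async function traceAgent<T>(
  name: string,
  metadata: Record<string, string>,
  fn: () => Promise<T>,
): Promise<T> {
  return fn();
}

// A hypothetical sub-agent invoked from inside a traceRun(...) root span.
async function research(topic: string): Promise<string> {
  return traceAgent("researcher", { role: "web-search" }, async () => {
    return `notes on ${topic}`;
  });
}
```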
## Auto-evaluation
Pass scorer names to `initTracing` via the `evals` option for automatic evaluation on every trace:

Pass `sessionEvals` to run evals when the session ends:
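Both options together might look like this; the stand-in mirrors the documented signature `initTracing({ project, evals, sessionEvals })`, and the scorer names are illustrative, not a list of built-ins:

```typescript
// Stand-in mirroring the documented options; a real app imports initTracing.
interface TracingOptions {
  project: string;
  evals?: string[];        // scorers run on every trace
  sessionEvals?: string[]; // scorers run when a session ends
}

const capturedOptions: TracingOptions[] = [];

function initTracing(options: TracingOptions): void {
  // The real SDK registers the named scorers; this stub records the call.
  capturedOptions.push(options);
}

initTracing({
  project: "my-agent",                        // illustrative project name
  evals: ["answer_relevance"],                // illustrative scorer name
  sessionEvals: ["session_goal_completion"],  // illustrative scorer name
});
```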
## Quick reference
| API | Purpose |
|---|---|
| `initTracing({ project, evals, sessionEvals })` | Initialize once at startup |
| `traceRun(name, metadata, fn)` | Root span for a run |
| `traceAgent(name, metadata, fn)` | Named sub-agent span |
| `traceSession(sessionId, fn)` | Scope traces to a session |
| `setSessionId(id)` / `endSession(id)` | Session context for multi-turn |
| `tool(options, fn)` | Wrap a function as a traced tool |
| `wrapLlmClient(client)` | Auto-trace OpenAI/Anthropic calls |