Cascade detects the OpenAI client, so OpenRouter works out of the box: point the standard OpenAI client at the OpenRouter base URL, then wrap it with Cascade.

Setup

from cascade import init_tracing, wrap_llm_client
from openai import OpenAI

init_tracing(project="my_agent")

client = wrap_llm_client(OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="sk-or-...",
))

Usage

response = client.chat.completions.create(
    model="anthropic/claude-sonnet-4",  # OpenRouter model slug
    messages=[{"role": "user", "content": "Hello!"}],
)
# Automatically traced!
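What "automatically traced" means can be pictured with a toy wrapper that records a span around each call. This is purely illustrative: the span shape, the `traced` helper, and the fake client are assumptions, not Cascade's real API.

```python
import time

spans = []  # toy in-memory trace sink standing in for Cascade's exporter

def traced(fn, name):
    """Return a wrapper that records a (name, duration) span per call."""
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            spans.append({"name": name, "seconds": time.perf_counter() - start})
    return wrapper

def fake_create(model, messages):
    # Stand-in for the real network call.
    return {"model": model, "choices": [{"message": {"content": "Hi!"}}]}

create = traced(fake_create, "chat.completions.create")
response = create(
    model="anthropic/claude-sonnet-4",
    messages=[{"role": "user", "content": "Hello!"}],
)
# spans now holds one entry for the call, alongside the normal response.
```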

Custom OpenAI-compatible APIs

The same approach works for any OpenAI-compatible API:

from cascade import wrap_llm_client
from openai import OpenAI

client = wrap_llm_client(OpenAI(
    base_url="https://my-custom-api.com/v1",
    api_key="...",
))

The wrapped client behaves identically to the original. All existing method calls, attributes, and patterns continue to work. The only difference is that LLM calls now produce spans in your Cascade traces.

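That transparency can be sketched as a thin delegating proxy. This is not Cascade's actual implementation, just a minimal illustration of how a wrapper can pass every attribute lookup through unchanged; `TracingProxy` and `FakeClient` are invented names.

```python
class TracingProxy:
    """Illustrative delegating wrapper: unknown attribute lookups
    fall through to the wrapped client, so existing code is unaffected."""

    def __init__(self, inner):
        self._inner = inner

    def __getattr__(self, name):
        # Invoked only when normal lookup fails on the proxy itself,
        # so the wrapped client's methods and attributes resolve unchanged.
        return getattr(self._inner, name)


class FakeClient:
    api_key = "sk-..."

    def ping(self):
        return "pong"


wrapped = TracingProxy(FakeClient())
# Method calls and attribute reads behave exactly as on the original:
# wrapped.ping() and wrapped.api_key both delegate to FakeClient.
```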
If the SDK does not recognize a client type, it returns the client unwrapped and emits a warning. LLM calls still work but are not traced.
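The fallback behavior can be sketched as follows. The registry contents, the `wrap_if_supported` name, and the warning text are all assumptions for illustration, not Cascade's real internals.

```python
import warnings

_SUPPORTED = {"OpenAI", "AsyncOpenAI"}  # hypothetical registry of known clients

def wrap_if_supported(client):
    """Sketch of the fallback described above: wrap known client types,
    otherwise warn and hand the object back untouched."""
    if type(client).__name__ not in _SUPPORTED:
        warnings.warn(
            f"Unrecognized client type {type(client).__name__!r}; "
            "returning it unwrapped (calls will not be traced)."
        )
        return client
    # ...real wrapping would attach tracing here...
    return client


class SomeOtherSDK:
    pass


original = SomeOtherSDK()
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = wrap_if_supported(original)
# result is the very same object; one warning was recorded.
```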