Overview
OpenRouter provides access to multiple LLM providers through a unified, OpenAI-compatible API. Since OpenRouter uses the same interface as OpenAI, Moda automatically tracks your OpenRouter calls with no extra configuration.
Setup with Moda SDK
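Because the integration rides on the OpenAI client, setup amounts to pointing that client at OpenRouter's endpoint. A minimal sketch in TypeScript — the `moda` package name and `Moda.init()` call are assumptions for illustration; only the `baseURL` value is OpenRouter's documented endpoint:

```typescript
// Assumed setup: the "moda" package name and init() call are illustrative.
// import Moda from "moda";
// import OpenAI from "openai";
// Moda.init({ apiKey: process.env.MODA_API_KEY });

// OpenRouter's OpenAI-compatible endpoint.
const OPENROUTER_BASE_URL = "https://openrouter.ai/api/v1";

// The client config is a standard OpenAI config; only baseURL changes,
// which is why Moda's OpenAI instrumentation picks these calls up as-is.
function openRouterClientConfig(apiKey: string) {
  return { baseURL: OPENROUTER_BASE_URL, apiKey };
}

// const client = new OpenAI(openRouterClientConfig(process.env.OPENROUTER_API_KEY!));
// const res = await client.chat.completions.create({
//   model: "anthropic/claude-3.5-sonnet",
//   messages: [{ role: "user", content: "Hello" }],
// });
```

From here, every chat completion made through the client is traced without further changes.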
Direct API with Manual Tracing
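A raw-HTTP call wrapped in Moda.withLLMCall() might look like the sketch below. The option names and the withLLMCall signature shown are assumptions (the real signature is in the Manual Tracing docs); only the request body follows the OpenAI-compatible chat format:

```typescript
// Builds an OpenAI-compatible chat request body for OpenRouter.
function buildChatRequest(model: string, prompt: string) {
  return {
    model, // e.g. "anthropic/claude-3.5-sonnet" (provider prefix included)
    messages: [{ role: "user", content: prompt }],
  };
}

// Assumed wrapper usage: the option names and withLLMCall signature are
// illustrative only — check the Manual Tracing docs for the real API.
// import Moda from "moda";
// const result = await Moda.withLLMCall(
//   { provider: "openrouter", model: "anthropic/claude-3.5-sonnet" },
//   async () => {
//     const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
//       method: "POST",
//       headers: {
//         Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
//         "Content-Type": "application/json",
//       },
//       body: JSON.stringify(
//         buildChatRequest("anthropic/claude-3.5-sonnet", "Hello")
//       ),
//     });
//     return res.json();
//   }
// );
```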
For cases where you want to call the OpenRouter API directly (without the OpenAI client), use the Node.js SDK’s manual tracing. Moda.withLLMCall() is available in the Node.js SDK; see Manual Tracing for details.
Supported Features
| Feature | Captured |
|---|---|
| Chat completions | Yes |
| Streaming | Yes |
| Token usage | Yes |
| Model name | Yes (includes provider prefix) |
| Tool use | Yes (when the underlying model supports it) |
Troubleshooting
- Model name shows an OpenRouter prefix? This is expected. OpenRouter model names include the provider prefix (e.g., anthropic/claude-3.5-sonnet); Moda captures the exact model name returned by OpenRouter.
- Missing token usage? Some models on OpenRouter may not return token usage data. This depends on the underlying provider.
For full SDK documentation, see the Python SDK or Node.js SDK guides.