Overview
Moda integrates with the Vercel AI SDK via its built-in telemetry support. The AI SDK’s experimental_telemetry option emits telemetry data that Moda parses automatically, giving you conversation tracking, token usage, and analytics across all AI SDK providers.
Installation
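Assuming the SDK is published to npm as moda (the package name here is an assumption), install it alongside the AI SDK and a provider package:

```shell
# Package names other than the AI SDK's own are assumptions;
# substitute the ones your project actually uses.
npm install moda ai @ai-sdk/openai
```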
Quick Start
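A minimal end-to-end sketch, assuming the package exposes a Moda export with the init, flush, and getVercelAITelemetry members referenced elsewhere in this guide (the import path and API key handling are assumptions):

```typescript
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { Moda } from 'moda'; // import path assumed

await Moda.init('your-api-key');

const { text } = await generateText({
  model: openai('gpt-4o'),
  prompt: 'Write a haiku about telemetry.',
  // Emits telemetry spans that Moda parses automatically
  experimental_telemetry: Moda.getVercelAITelemetry(),
});

console.log(text);
await Moda.flush(); // ensure telemetry is delivered before exit
```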
Streaming
streamText works the same way. Telemetry is captured after the stream completes:
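A hedged sketch of the same pattern with streamText (import path for Moda is an assumption):

```typescript
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { Moda } from 'moda'; // import path assumed

const result = streamText({
  model: openai('gpt-4o'),
  prompt: 'Stream a short story.',
  experimental_telemetry: Moda.getVercelAITelemetry(),
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
// Telemetry for this call is recorded once the stream has finished
```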
Structured Output
generateObject responses are captured as JSON in the assistant message:
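For example, a structured-output call might look like this (a sketch; the Moda import path is an assumption, and the schema is illustrative):

```typescript
import { generateObject } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';
import { Moda } from 'moda'; // import path assumed

const { object } = await generateObject({
  model: openai('gpt-4o'),
  schema: z.object({
    title: z.string(),
    tags: z.array(z.string()),
  }),
  prompt: 'Suggest a title and tags for a post about tracing.',
  experimental_telemetry: Moda.getVercelAITelemetry(),
});
// The generated object is captured as JSON in the assistant message
```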
Tool Use
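A sketch of a tool-calling request, assuming the AI SDK's tool helper with a parameters schema (the tool itself and its stubbed result are illustrative, as is the Moda import path):

```typescript
import { generateText, tool } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';
import { Moda } from 'moda'; // import path assumed

const { text } = await generateText({
  model: openai('gpt-4o'),
  tools: {
    getWeather: tool({
      description: 'Get the weather for a city',
      parameters: z.object({ city: z.string() }),
      execute: async ({ city }) => ({ city, tempC: 21 }), // stubbed result
    }),
  },
  prompt: 'What is the weather in Lisbon?',
  experimental_telemetry: Moda.getVercelAITelemetry(),
});
```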
Tool calls made by the model are captured as structured content blocks.
Conversation Threading
When you set Moda.conversationId or Moda.userId before calling an AI SDK function, those values are automatically included in the telemetry metadata:
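A sketch of threading in practice (Moda import path and identifier values are assumptions):

```typescript
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { Moda } from 'moda'; // import path assumed

// Set identifiers before creating the telemetry config
Moda.conversationId = 'conv_123';
Moda.userId = 'user_456';

await generateText({
  model: openai('gpt-4o'),
  prompt: 'Continue our conversation.',
  // Includes moda.conversation_id and moda.user_id in the span metadata
  experimental_telemetry: Moda.getVercelAITelemetry(),
});
```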
The getVercelAITelemetry() helper automatically includes moda.conversation_id and moda.user_id in the telemetry metadata when they are set.
Options Reference
| Option | Type | Default | Description |
|---|---|---|---|
| recordInputs | boolean | true | Whether to record prompt messages in telemetry |
| recordOutputs | boolean | true | Whether to record response content in telemetry |
| functionId | string | - | Identifier for grouping telemetry by function |
| metadata | Record&lt;string, string&gt; | - | Custom metadata attached to spans |
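These options can be combined with the helper's defaults; a sketch assuming getVercelAITelemetry() returns a plain options object that can be spread (the functionId and metadata values are illustrative):

```typescript
const telemetry = {
  ...Moda.getVercelAITelemetry(),
  functionId: 'summarize-article',   // group spans by feature
  recordInputs: false,               // omit prompt messages from telemetry
  metadata: { feature: 'summaries' },
};
```

Pass the resulting object as experimental_telemetry on any AI SDK call.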
Supported Providers
Any provider supported by the Vercel AI SDK works with Moda. The model and provider are automatically captured.

| Provider | Package | Example |
|---|---|---|
| OpenAI | @ai-sdk/openai | openai('gpt-4o') |
| Anthropic | @ai-sdk/anthropic | anthropic('claude-3-5-sonnet-20241022') |
| Google | @ai-sdk/google | google('gemini-1.5-pro') |
| Mistral | @ai-sdk/mistral | mistral('mistral-large-latest') |
| Amazon Bedrock | @ai-sdk/amazon-bedrock | bedrock('anthropic.claude-3-sonnet') |
| Azure OpenAI | @ai-sdk/azure | azure('gpt-4o') |
Troubleshooting
Data not appearing in Moda?

- Ensure Moda.init() is called with await before your first AI SDK call
- Call await Moda.flush() before your program exits
- Verify your API key is correct
- Enable debug mode: Moda.init('key', { debug: true })
- Make sure Moda.conversationId is set before calling Moda.getVercelAITelemetry(); the telemetry config is created at call time, so set the conversation ID first
- If you also use Moda’s native OpenAI/Anthropic instrumentation, both will capture data. The AI SDK telemetry captures the high-level AI SDK call, while native instrumentation captures the underlying provider API call. This is safe but may result in duplicate entries. To avoid this, you can disable native instrumentation for providers used through the AI SDK.
For full SDK documentation, see the Node.js SDK guide.