# What is ingestion?
Ingestion lets you send LLM usage data to Moda from your existing applications. This is useful if you:

- Already have an application making direct calls to LLM providers
- Want to add observability without changing how you call the APIs
- Use OpenLLMetry or other OpenTelemetry-based instrumentation
## Ways to ingest data
### OpenLLMetry SDK
Recommended for most users. Add a few lines of code to automatically capture all LLM calls.
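A minimal setup sketch, assuming OpenLLMetry's Python SDK (`traceloop-sdk`). The endpoint URL and the `init_tracing` helper are illustrative assumptions, not documented Moda values:

```python
def init_tracing(endpoint: str, app_name: str = "my-llm-app") -> bool:
    """Initialize OpenLLMetry if the SDK is installed; return True on success."""
    try:
        from traceloop.sdk import Traceloop
    except ImportError:
        return False  # install first: pip install traceloop-sdk

    # Point the exporter at the ingestion endpoint (placeholder URL).
    Traceloop.init(app_name=app_name, api_endpoint=endpoint)
    return True

configured = init_tracing("https://ingest.moda.example")
```

Once initialized, calls to supported providers (OpenAI, Anthropic, and others) are captured automatically, with no changes to your existing call sites.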
### Direct API
Send LLM conversation data directly to the Moda API.
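A sketch of what a direct ingestion call could look like. The payload shape, field names, and endpoint below are assumptions for illustration, not Moda's documented API:

```python
import json

# Hypothetical conversation record; field names mirror the captured-data
# fields described on this page but are not a confirmed wire format.
payload = {
    "conversation_id": "conv-123",
    "model": "gpt-4o",
    "timestamp": "2024-01-01T12:00:00Z",
    "messages": [
        {"role": "user", "content": "Summarize this ticket."},
        {"role": "assistant", "content": "The customer reports a login error."},
    ],
    "usage": {"prompt_tokens": 42, "completion_tokens": 18},
}

body = json.dumps(payload)
# A real client would POST `body` to the ingestion endpoint (placeholder
# URL) with an API key header, e.g. using `requests`:
# requests.post("https://api.moda.example/v1/ingest", data=body,
#               headers={"Authorization": "Bearer <key>"})
```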
### Multi-Type
Ingest channels, emails, tool calls, and call transcripts.
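Multi-type ingestion implies records tagged by kind. One way such a batch could be structured, sketched here with an assumed `type` discriminator and illustrative field names:

```python
# Hypothetical mixed batch; each record carries a `type` discriminator.
records = [
    {"type": "email", "subject": "Refund request", "body": "Hi, I'd like..."},
    {"type": "tool_call", "name": "lookup_order", "arguments": {"id": "A1"}},
    {"type": "call_transcript", "speaker_turns": [
        {"speaker": "agent", "text": "How can I help?"},
    ]},
]

# Group records by type before sending, e.g. to batch per-channel uploads.
by_type: dict[str, list[dict]] = {}
for rec in records:
    by_type.setdefault(rec["type"], []).append(rec)
```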
## How it works
1. Your app makes calls to LLM providers as normal
2. Telemetry data is sent to Moda in the background
3. Moda processes and stores the data
4. You view insights in the Moda dashboard
## What data is captured?
| Field | Description |
|---|---|
| Conversation ID | Groups related messages together |
| User message | What the user asked |
| Assistant response | What the AI replied |
| Model | Which model was used |
| Timestamp | When the interaction happened |
| Token usage | How many tokens were consumed |
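The fields above can be modeled as a simple record type. This is an illustrative shape, not a schema published by Moda:

```python
from dataclasses import dataclass, field

@dataclass
class CapturedInteraction:
    conversation_id: str     # groups related messages together
    user_message: str        # what the user asked
    assistant_response: str  # what the AI replied
    model: str               # which model was used
    timestamp: str           # when the interaction happened (ISO-8601)
    token_usage: dict = field(default_factory=dict)  # tokens consumed

row = CapturedInteraction(
    conversation_id="conv-123",
    user_message="What's our refund policy?",
    assistant_response="Refunds are available within 30 days.",
    model="gpt-4o",
    timestamp="2024-01-01T12:00:00Z",
    token_usage={"prompt": 20, "completion": 11},
)
```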