What is Moda?

Moda is an analytics platform for LLM applications. Add a few lines of code to your existing OpenAI or Anthropic calls, and Moda automatically tracks every conversation — what users ask, how your AI responds, and which patterns lead to success or failure.

Get started in 5 minutes

Install the SDK, add two lines of code, and see your first conversation in the dashboard.

Why use Moda?

All providers, no code changes

Works with OpenAI, Anthropic, and AWS Bedrock. Switch between models without changing your integration.

Automatic conversation tracking

Every LLM call is logged and grouped into conversations automatically. See what your users are asking and how your AI responds.
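The grouping idea can be sketched in plain Python: each logged call carries a conversation identifier, and calls that share one belong to the same conversation. This is a toy illustration of the concept only; the field names below are assumptions, not Moda's actual event schema.

```python
from collections import defaultdict

# Hypothetical logged events; the field names are illustrative only.
events = [
    {"conversation_id": "c1", "role": "user", "content": "Hi"},
    {"conversation_id": "c1", "role": "assistant", "content": "Hello!"},
    {"conversation_id": "c2", "role": "user", "content": "Can I get a refund?"},
]

# Group individual LLM calls into conversations by their shared id.
conversations = defaultdict(list)
for event in events:
    conversations[event["conversation_id"]].append(event)

print(len(conversations))  # 2 conversations
```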

Python and Node.js SDKs

Official SDKs for Python and Node.js. Add two lines of code to start tracking.

Direct API

Already have your own telemetry pipeline? Send events directly to the Moda API using simple JSON.
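As a rough sketch of what "simple JSON" can look like, the snippet below builds an event payload and prepares an HTTP request with only the standard library. The endpoint URL, header names, and payload schema here are placeholders, not Moda's documented API; consult the API reference for the real format.

```python
import json
import urllib.request

# Hypothetical event payload; this schema is an assumption for illustration.
event = {
    "conversation_id": "c1",
    "model": "gpt-4o",
    "input": "Hello, how are you?",
    "output": "I'm doing well, thanks!",
    "latency_ms": 420,
}

body = json.dumps(event).encode("utf-8")

# The URL and headers below are placeholders, not Moda's real endpoint.
req = urllib.request.Request(
    "https://api.moda.example/v1/events",
    data=body,
    headers={
        "Authorization": "Bearer YOUR_MODA_API_KEY",
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req)  # uncomment with a real endpoint and API key
```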

How it works

  1. Install the Moda SDK in your application
  2. Initialize with your API key — the SDK captures LLM calls in the background
  3. View conversations and analytics in the Moda dashboard
Install the SDK:

```shell
pip install moda-ai
```

Then initialize Moda alongside your existing OpenAI client:

```python
import moda
from openai import OpenAI

# Initialize Moda once at startup; the SDK captures LLM calls in the background.
moda.init("YOUR_MODA_API_KEY")

client = OpenAI(api_key="YOUR_OPENAI_KEY")
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
)

# Flush any buffered events before the process exits.
moda.flush()
```

Next steps