
Prerequisites

Before you start, you need:

- A Moda API key
- An API key for your LLM provider (OpenAI or Anthropic)
- Python installed, with pip available

Step 1: Install the SDK

Install the Moda SDK along with your LLM provider's client library. For OpenAI:
pip install moda-ai openai

If you plan to follow the Anthropic example in Step 3, install that client as well:
pip install moda-ai anthropic

Step 2: Make your first request

Replace YOUR_MODA_API_KEY and YOUR_OPENAI_KEY with your actual keys.
import moda
from openai import OpenAI

# Initialize Moda once at startup
moda.init("YOUR_MODA_API_KEY")

# Use OpenAI as normal -- Moda captures the call automatically
client = OpenAI(api_key="YOUR_OPENAI_KEY")
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello, how are you?"}]
)

print(response.choices[0].message.content)

# Flush before your process exits to ensure all data is sent
moda.flush()
You should see the model’s response printed in your terminal. Behind the scenes, Moda has captured the full request and response.
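Hardcoding keys as above is fine for a first test, but reading them from environment variables keeps them out of source control. The sketch below is illustrative: the variable name MODA_API_KEY and the load_key helper are not part of Moda's API, just a common pattern.

```python
import os

def load_key(name):
    """Fetch an API key from the environment, failing loudly if it is missing."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Set {name} before running this script.")
    return value

# Example: with MODA_API_KEY exported in your shell, this returns the key,
# which you would then pass to moda.init(load_key("MODA_API_KEY")).
```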

Step 3: Try a different provider

Moda works the same way with Anthropic. Initialize Moda, use the provider client as normal, and Moda captures everything automatically.
import moda
from anthropic import Anthropic

moda.init("YOUR_MODA_API_KEY")

client = Anthropic(api_key="YOUR_ANTHROPIC_KEY")
response = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello, how are you?"}]
)

print(response.content[0].text)

moda.flush()
For provider-specific features like streaming, tool use, and extended thinking, see the Anthropic and OpenAI guides.
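To make the streaming case concrete: streamed responses arrive as incremental chunks rather than one object, so you typically accumulate the text deltas yourself. The helper below is a hedged sketch, not part of either SDK; the fake chunks stand in for real output from client.chat.completions.create(..., stream=True), and the OpenAI guide covers how Moda handles streamed calls.

```python
from types import SimpleNamespace

def collect_stream_text(chunks):
    """Concatenate the text deltas from OpenAI-style streaming chunks."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta:  # the final chunk's delta is typically None
            parts.append(delta)
    return "".join(parts)

def fake_chunk(text):
    """Stand-in shaped like an OpenAI streaming chunk, for illustration only."""
    delta = SimpleNamespace(content=text)
    return SimpleNamespace(choices=[SimpleNamespace(delta=delta)])

stream = [fake_chunk("Hello"), fake_chunk(", world"), fake_chunk(None)]
print(collect_stream_text(stream))  # prints "Hello, world"
```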

Step 4: View your conversations

After running the code above, open the Moda dashboard to see your conversations. Each LLM call is automatically logged with the full request, response, model, and token usage.

Troubleshooting

Problem: Data not appearing in the dashboard
Solution: Make sure moda.init() is called before creating the LLM client, and call moda.flush() before your process exits.

Problem: ModuleNotFoundError: No module named 'moda'
Solution: Run pip install moda-ai (the package name is moda-ai, not moda).

Problem: Cannot find module 'moda-ai'
Solution: Run npm install moda-ai in your project directory.
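If data still fails to appear because your script exits before flushing, one pattern is to register the flush with Python's atexit module so it runs on any normal interpreter exit. This is a sketch: the placeholder function below stands in for the real moda.flush() call, and the flushed flag exists only to make the example observable.

```python
import atexit

flushed = {"done": False}

def flush_telemetry():
    # In a real app this body would call moda.flush() to send buffered events.
    flushed["done"] = True

# Registered handlers run when the interpreter exits normally,
# so the flush happens even if you forget to call it at the end of main().
atexit.register(flush_telemetry)
```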

What's next?