What is Moda?

Moda is an LLM API gateway that sits between your application and AI providers like OpenAI, Anthropic, and AWS Bedrock. It gives you a single endpoint to manage all your AI calls while automatically tracking usage and conversations.

Get started in 5 minutes

Set up your first API call through Moda.

Why use Moda?

One endpoint, all providers

Connect to OpenAI, Anthropic, and AWS Bedrock through a single API. Switch between models without changing your code.
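
As an illustration, here is a minimal Python sketch that routes every request through Moda and picks the provider purely by the model string. It assumes the endpoint and API key placeholder shown later on this page, an OpenAI-style response body, and a hypothetical "claude-3-5-sonnet@@anthropic" identifier for the Anthropic model.

# A minimal sketch (Python + requests); the response parsing assumes an
# OpenAI-style chat-completions body, and the Anthropic identifier is hypothetical.
import requests

MODA_URL = "https://api.example.com/v1/chat/completions"
API_KEY = "YOUR_API_KEY"

def chat(model: str, prompt: str) -> str:
    """Send one chat request through Moda; the model string selects the provider."""
    response = requests.post(
        MODA_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": model, "messages": [{"role": "user", "content": prompt}]},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

# Same code path, different providers -- only the model string changes.
print(chat("gpt-4o@@openai", "Write a haiku about API gateways."))
print(chat("claude-3-5-sonnet@@anthropic", "Write a haiku about API gateways."))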

Automatic tracking

Every conversation is logged automatically. See what your users are asking and how your AI is responding.

Simple integration

Works with any language or framework. If you can make an HTTP request, you can use Moda.
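
For example, the same call can be made with nothing but Python's standard library; any language with an HTTP client can do the equivalent. The URL, API key placeholder, and model identifier below follow the curl example further down this page.

# A minimal sketch using only Python's standard library -- no SDK required.
import json
import urllib.request

request = urllib.request.Request(
    "https://api.example.com/v1/chat/completions",
    method="POST",
    headers={
        "Authorization": "Bearer YOUR_API_KEY",
        "Content-Type": "application/json",
    },
    data=json.dumps({
        "model": "gpt-4o@@openai",
        "messages": [{"role": "user", "content": "Hello from Moda!"}],
    }).encode("utf-8"),
)

# Moda responds with the provider's completion; print the raw JSON body.
with urllib.request.urlopen(request) as response:
    print(json.load(response))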

OpenTelemetry support

Already using OpenLLMetry? Send your traces directly to Moda for centralized observability.
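
If you already initialize OpenLLMetry in your application, redirecting its trace export usually comes down to overriding the base URL it sends OTLP data to. The sketch below assumes Moda ingests OpenTelemetry traces at your gateway URL and skips authentication; confirm the actual ingest URL and auth scheme in your Moda settings.

import os

# Point OpenLLMetry's exporter at Moda instead of the default Traceloop backend.
# The ingest URL is an assumption -- replace it with your Moda trace endpoint.
os.environ.setdefault("TRACELOOP_BASE_URL", "https://api.example.com")

from traceloop.sdk import Traceloop

# Instrumented LLM calls are now exported to Moda in the background.
Traceloop.init(app_name="my-llm-app")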

How it works

  1. Send requests to Moda instead of directly to OpenAI or Anthropic
  2. Moda routes your request to the right provider based on the model you specify
  3. Get your response while Moda logs the conversation in the background

# Instead of calling OpenAI directly:
curl https://api.openai.com/v1/chat/completions

# Call Moda with your model choice:
curl https://api.example.com/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o@@openai", "messages": [...]}'

Next steps