What is the Moda Gateway?
The Moda Gateway is a proxy that sits between your application and LLM providers. Instead of making direct calls to OpenAI, Anthropic, or AWS Bedrock, you send requests to Moda and we route them to the right place.
Benefits
Single endpoint
One API endpoint for all providers. No need to manage multiple SDKs or endpoints.
Provider flexibility
Switch between models and providers by changing a single field in your request.
Automatic logging
Every request and response is logged for analysis and debugging.
Conversation tracking
Multi-turn conversations are automatically grouped together.
How it works
- Your application sends a request to the Moda endpoint
- Moda reads the model@@provider field to determine where to route it
- Moda forwards the request to the appropriate provider
- The response is streamed back to your application
- Both request and response are logged for observability
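The routing step above can be sketched in a few lines of Python. This is an illustrative sketch, not Moda's actual implementation: the provider endpoint URLs and the exact parsing of the model field are assumptions based on the description above.

```python
# Sketch of gateway routing on a "model@@provider" field.
# The provider base URLs are hypothetical placeholders.
PROVIDER_ENDPOINTS = {
    "openai": "https://api.openai.com/v1/chat/completions",
    "anthropic": "https://api.anthropic.com/v1/messages",
    "bedrock": "https://bedrock-runtime.us-east-1.amazonaws.com",
}

def route(model_field: str) -> tuple[str, str]:
    """Split 'model@@provider' and look up the provider's endpoint."""
    model, _, provider = model_field.partition("@@")
    if provider not in PROVIDER_ENDPOINTS:
        raise ValueError(f"Unknown provider: {provider!r}")
    return model, PROVIDER_ENDPOINTS[provider]

# The gateway would forward the stripped model name to this endpoint.
model, endpoint = route("gpt-4o@@openai")
```

Keeping the routing table in one mapping is what makes switching providers a one-field change from the caller's side.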
Request format
Moda uses a format similar to OpenAI’s Chat Completions API. The key difference is the model field, which includes the provider name after @@.
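A minimal request body might look like the following. This is a hedged sketch: the gateway URL and the model identifier are hypothetical placeholders, not documented Moda values; only the model@@provider convention comes from the text above.

```python
import json

# Hypothetical gateway endpoint; substitute your actual Moda URL.
MODA_URL = "https://gateway.example.com/v1/chat/completions"

# Chat Completions-style body; the provider is named after "@@".
payload = {
    "model": "claude-3-5-sonnet@@anthropic",
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
}

body = json.dumps(payload)
# You would POST `body` to MODA_URL with your auth header, e.g.:
#   requests.post(MODA_URL, data=body, headers={"Authorization": "..."})
provider = payload["model"].split("@@", 1)[1]
```

Because only the model field changes, the same payload can target a different provider without touching the rest of the request.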