
Overview

Azure OpenAI Service provides access to OpenAI models hosted on Azure infrastructure. Since the Azure OpenAI client extends the standard OpenAI client, Moda automatically tracks your Azure OpenAI calls with the same setup as standard OpenAI.

Setup with Moda SDK

import moda
from openai import AzureOpenAI

moda.init("YOUR_MODA_API_KEY")

client = AzureOpenAI(
    api_key="YOUR_AZURE_API_KEY",
    api_version="2024-02-01",
    azure_endpoint="https://your-resource.openai.azure.com",
)

moda.conversation_id = "session_123"

response = client.chat.completions.create(
    model="gpt-4o",  # This is your deployment name
    messages=[{"role": "user", "content": "Hello!"}]
)

moda.flush()  # send any buffered events before the process exits

The model parameter for Azure OpenAI is your deployment name, not the model name. Moda captures whatever value you pass as the model identifier.

Azure-Specific Configuration

Parameter                    Description
azure_endpoint / endpoint    Your Azure OpenAI resource URL
api_version / apiVersion     Azure API version (e.g., 2024-02-01)
api_key / apiKey             Azure OpenAI API key
model                        Your deployment name (not the model name)
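Rather than hardcoding these parameters, you can load them from environment variables. A minimal sketch, assuming the common `AZURE_OPENAI_*` variable names (the names and the default version here are assumptions, not requirements of Moda or Azure):

```python
import os

# Hypothetical sketch: read Azure OpenAI settings from environment
# variables instead of hardcoding them. Variable names are assumptions.
azure_config = {
    "api_key": os.getenv("AZURE_OPENAI_API_KEY", ""),
    "api_version": os.getenv("AZURE_OPENAI_API_VERSION", "2024-02-01"),
    "azure_endpoint": os.getenv("AZURE_OPENAI_ENDPOINT", ""),
}

# The dict maps directly onto the client's keyword arguments:
# client = AzureOpenAI(**azure_config)
```

Keeping the keys identical to the client's parameter names lets you pass the dict straight through with `**azure_config`.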

Supported Features

Feature                       Captured
Chat completions              Yes
Streaming                     Yes
Function calling / tool use   Yes
Embeddings                    Yes
Token usage                   Yes
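For streamed responses, the SDK yields chunks whose text arrives incrementally in `choices[0].delta.content`; your application assembles the pieces while Moda captures the call. A minimal sketch of the assembly logic, using mock chunk objects in place of a live stream (the mock structure mirrors the OpenAI SDK's chunk shape but is an assumption for illustration):

```python
from types import SimpleNamespace


def collect_stream(chunks):
    """Concatenate delta.content across streamed chat-completion chunks."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta
        if delta.content:  # the final chunk's content is typically None
            parts.append(delta.content)
    return "".join(parts)


# Mock chunks standing in for client.chat.completions.create(..., stream=True)
mock_chunks = [
    SimpleNamespace(choices=[SimpleNamespace(delta=SimpleNamespace(content=c))])
    for c in ["Hel", "lo", None, "!"]
]
print(collect_stream(mock_chunks))  # -> Hello!
```

With a real client, you would iterate the response object returned when `stream=True` in exactly the same way.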

Troubleshooting

Model name shows deployment name instead of actual model?
  • This is expected with Azure OpenAI. The model field will contain your deployment name.
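If your dashboards should display the underlying model rather than the Azure deployment name, one option is a small lookup table you maintain yourself. A sketch under that assumption (the deployment names and mapping below are hypothetical):

```python
# Hypothetical mapping from your Azure deployment names to the
# underlying OpenAI model names; maintained by you, not by Moda.
DEPLOYMENT_TO_MODEL = {
    "my-gpt4o-deployment": "gpt-4o",
    "prod-embeddings": "text-embedding-3-small",
}


def resolve_model(deployment_name):
    """Translate a deployment name; fall back to the name as given."""
    return DEPLOYMENT_TO_MODEL.get(deployment_name, deployment_name)
```

The fallback keeps unknown deployments readable instead of raising on a missing key.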
Authentication errors?
  • Verify your api_key, api_version, and azure_endpoint are correct
  • Ensure the deployment exists in your Azure OpenAI resource
For full SDK documentation, see the Python SDK or Node.js SDK guides.