The Moda Claude Code skill teaches AI assistants how to correctly integrate Moda's LLM observability SDK into any TypeScript or Python project. Instead of guessing at configuration, the AI detects your project stack and applies the right integration pattern automatically.

The skill handles:
- Detecting your framework and LLM provider
- Installing the correct SDK (`moda-ai` on both npm and PyPI)
- Placing `init()` at the right entry point for your framework
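For a Next.js project, the natural entry point is `instrumentation.ts`, which Next.js runs once at server startup. The sketch below shows the shape of that file; `Moda.init()` and the `moda-ai` package name come from this doc, but the exact option names are assumptions — check the Moda docs for the real signature.

```typescript
// instrumentation.ts (Next.js server-startup hook)
// Hypothetical sketch: option names below are assumptions.
import { Moda } from "moda-ai";

export function register() {
  Moda.init({
    apiKey: process.env.MODA_API_KEY,
  });
}
```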
The skill automatically chooses the right conversation ID strategy:

| Pattern | Strategy |
| --- | --- |
| Simple chatbot | Auto-detection (hash-based) |
| Web server | Scoped context per request |
| Agent framework | Explicit ID (auto-detection doesn't work with agents) |
| Multi-agent system | Shared parent task ID |
Agent frameworks like LangChain, CrewAI, and AutoGPT require explicit conversation IDs. The automatic fallback does NOT work because each agent iteration constructs different messages.
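To see why, here is a simplified illustration of a hash-based strategy (not Moda's actual implementation): derive a stable ID from the earliest messages in the thread, so repeated calls in the same conversation map to the same ID — but an agent that rewrites its prompt each iteration gets a different hash every time.

```typescript
import { createHash } from "node:crypto";

type Message = { role: string; content: string };

// Hypothetical sketch of hash-based auto-detection: hash the first
// messages (system prompt + opening user turn), which stay constant
// across turns of a normal chat.
function autoConversationId(messages: Message[]): string {
  const seed = JSON.stringify(messages.slice(0, 2));
  return createHash("sha256").update(seed).digest("hex").slice(0, 16);
}

const thread: Message[] = [
  { role: "system", content: "You are helpful." },
  { role: "user", content: "Hi!" },
];

// Two calls in the same chat thread share an ID...
const a = autoConversationId(thread);
const b = autoConversationId([
  ...thread,
  { role: "assistant", content: "Hello!" },
]);

// ...but an agent loop that constructs fresh messages each iteration
// hashes to a different ID, breaking the threading.
const agentStep = autoConversationId([
  { role: "system", content: "Plan step 2 of the task." },
]);
```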
Here’s what happens when you ask Claude Code to integrate Moda into a Next.js project:
```
You: Add Moda analytics to this project

Claude: I'll integrate Moda into your Next.js + Vercel AI SDK project.

1. Installing the SDK: npm install moda-ai

2. Creating instrumentation.ts for initialization:
   [creates instrumentation.ts with Moda.init()]

3. Updating your route handler:
   [adds Moda.conversationId and experimental_telemetry to your
   existing chat route]

4. Adding MODA_API_KEY to .env.local:
   [adds the environment variable]

Done! Your LLM calls will now be tracked in Moda with automatic
conversation threading. Get your API key from the Moda dashboard
and add it to .env.local.
```
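Step 3 of the transcript might look roughly like the route handler below. `experimental_telemetry` is the Vercel AI SDK's OpenTelemetry hook; how Moda consumes `Moda.conversationId` is an assumption here, so treat the wiring as a sketch rather than the SDK's documented API.

```typescript
// app/api/chat/route.ts — hypothetical sketch, not Moda's documented API.
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";
import { Moda } from "moda-ai";

export async function POST(req: Request) {
  const { messages, threadId } = await req.json();

  const result = streamText({
    model: openai("gpt-4o-mini"),
    messages,
    // Enable the AI SDK's telemetry and tag the call with an explicit
    // conversation ID; the metadata shape is an assumption.
    experimental_telemetry: {
      isEnabled: true,
      metadata: { [Moda.conversationId]: threadId },
    },
  });

  return result.toDataStreamResponse();
}
```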