
What It Does

The Moda Claude Code skill teaches AI assistants how to correctly integrate Moda’s LLM observability SDK into any TypeScript or Python project. Instead of guessing at configuration, the AI detects your project stack and applies the right integration pattern automatically. The skill handles:
  • Detecting your framework and LLM provider
  • Installing the correct SDK (moda-ai, published on both npm and PyPI)
  • Placing init() at the right entry point for your framework
  • Choosing the correct conversation ID strategy
  • Adding flush() in cleanup hooks
  • Configuring Vercel AI SDK telemetry
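
Taken together, the init/flush pattern for a plain script looks roughly like this. This is a Python sketch with hypothetical stand-in functions; the real moda-ai API names may differ:

```python
import atexit
import os

calls = []  # records call order so the pattern is easy to verify

# Hypothetical stand-ins for the moda-ai SDK surface; real names may differ.
def moda_init(api_key):
    calls.append("init")

def moda_flush():
    calls.append("flush")

# Generic-script pattern: init() at the top of the entry point,
# flush() registered so buffered events are sent before the process exits.
moda_init(api_key=os.environ.get("MODA_API_KEY"))
atexit.register(moda_flush)
```

For frameworks with real lifecycle hooks (Next.js, Express, FastAPI), the skill places these two calls in the framework-specific locations listed below instead.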

Prerequisites

Installation

The skill files are available in the Moda GitHub repository. You can install them per-project or globally.
# From your project root
mkdir -p .claude/skills
cp -r path/to/moda-setup/ .claude/skills/moda-setup/

Usage

Once installed, the skill activates automatically when you ask Claude Code to:
  • “Add Moda observability to this project”
  • “Integrate LLM analytics”
  • “Set up conversation tracking for my AI calls”
  • “Add tracing for OpenAI/Anthropic calls”
You can also invoke it directly:
/moda-setup

What It Handles

Framework Detection

The skill inspects your project files to detect:
File                              | Detects
package.json                      | Node.js framework, AI SDK, LLM provider
requirements.txt / pyproject.toml | Python framework, LLM provider
next.config.*                     | Next.js
app/layout.tsx                    | Next.js App Router
instrumentation.ts                | Existing instrumentation setup
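
As an illustration of the detection step (a sketch, not the skill's actual implementation), inferring the Node.js stack from package.json dependencies might look like:

```python
import json

# Illustrative sketch only: infer the stack from package.json dependencies.
def detect_stack(package_json_text):
    pkg = json.loads(package_json_text)
    deps = {**pkg.get("dependencies", {}), **pkg.get("devDependencies", {})}
    return {
        "nextjs": "next" in deps,
        "vercel_ai_sdk": "ai" in deps,           # Vercel AI SDK package name
        "openai": "openai" in deps,
        "anthropic": "@anthropic-ai/sdk" in deps,
    }

stack = detect_stack('{"dependencies": {"next": "14.2.0", "ai": "^3.0.0"}}')
```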

Conversation Threading

The skill automatically chooses the right conversation ID strategy:
Pattern            | Strategy
Simple chatbot     | Auto-detection (hash-based)
Web server         | Scoped context per request
Agent framework    | Explicit ID (auto-detection does not work with agents)
Multi-agent system | Shared parent task ID

Agent frameworks like LangChain, CrewAI, and AutoGPT require explicit conversation IDs. Hash-based auto-detection does NOT work here because each agent iteration constructs a different message list.
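
A minimal sketch of the explicit-ID strategy for an agent loop (`set_conversation_id` is a hypothetical stand-in for the SDK call):

```python
import uuid

def run_agent_task(steps, set_conversation_id):
    # One explicit ID per task: each agent iteration rebuilds its messages,
    # so hash-based auto-detection would see every step as a new conversation.
    conversation_id = str(uuid.uuid4())
    for step in steps:
        set_conversation_id(conversation_id)  # same ID for every iteration
        # ... run the LLM call for this step ...
    return conversation_id

seen = []
task_id = run_agent_task(["plan", "act", "reflect"], seen.append)
```

In a multi-agent system, the same technique applies one level up: generate the ID once for the parent task and pass it to every agent.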

Correct Placement

The skill knows where to place initialization for each framework:
Framework      | init() Location            | flush() Location
Next.js        | instrumentation.ts         | Route handler waitUntil
Express        | Server startup             | SIGTERM handler
FastAPI        | Lifespan event             | Lifespan cleanup
Flask          | App factory / module level | teardown_appcontext
Generic script | Top of file                | Before exit
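
For example, the FastAPI row corresponds to the lifespan pattern below. This sketch uses hypothetical stand-ins for the SDK calls and drives the context manager directly rather than through FastAPI, so it stays self-contained:

```python
import asyncio
from contextlib import asynccontextmanager

events = []

# Hypothetical stand-ins for the Moda SDK calls; real names may differ.
def moda_init():
    events.append("init")

def moda_flush():
    events.append("flush")

@asynccontextmanager
async def lifespan(app):
    moda_init()    # startup phase: initialize once
    yield          # the app serves requests here
    moda_flush()   # cleanup phase: flush buffered traces

# With FastAPI you would pass this as FastAPI(lifespan=lifespan);
# here we enter the context manager directly to show the ordering.
async def main():
    async with lifespan(None):
        events.append("serving")

asyncio.run(main())
```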

Supported Frameworks

TypeScript / Node.js

  • Next.js (App Router + Vercel AI SDK)
  • Express
  • Generic Node.js scripts
  • Vercel AI SDK
  • OpenAI SDK
  • Anthropic SDK

Python

  • FastAPI
  • Flask
  • LangChain
  • CrewAI
  • Generic Python scripts
  • OpenAI SDK
  • Anthropic SDK

Example Session

Here’s what happens when you ask Claude Code to integrate Moda into a Next.js project:
You: Add Moda analytics to this project

Claude: I'll integrate Moda into your Next.js + Vercel AI SDK project.

1. Installing the SDK:
   npm install moda-ai

2. Creating instrumentation.ts for initialization:
   [creates instrumentation.ts with Moda.init()]

3. Updating your route handler:
   [adds Moda.conversationId and experimental_telemetry
    to your existing chat route]

4. Adding MODA_API_KEY to .env.local:
   [adds the environment variable]

Done! Your LLM calls will now be tracked in Moda with automatic
conversation threading. Get your API key from the Moda dashboard
and add it to .env.local.