# Vercel AI SDK
The Vercel AI SDK integration works with any AI provider through the unified Vercel AI SDK interface. Wrap your model with track() to automatically capture all LLM calls.
**Package:** `@databuddy/ai` · **Import:** `@databuddy/ai/vercel`
## Installation

```bash
bun add @databuddy/ai ai @ai-sdk/openai
```

## Quick Start

```tsx
import { createTracker } from "@databuddy/ai/vercel";
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

const { track } = createTracker({
  apiKey: process.env.DATABUDDY_API_KEY
});

const result = await generateText({
  model: track(openai("gpt-4o")),
  prompt: "Explain quantum computing in simple terms"
});
```

## Configuration
### createTracker(options)

Create a tracking instance with your configuration:

```tsx
import { createTracker } from "@databuddy/ai/vercel";

const { track, transport } = createTracker({
  // Required: API key for authentication
  apiKey: process.env.DATABUDDY_API_KEY,

  // Optional: Custom API endpoint
  apiUrl: "https://basket.databuddy.cc/llm",

  // Optional: Compute token costs (default: true)
  computeCosts: true,

  // Optional: Don't capture message content (default: false)
  privacyMode: false,

  // Optional: Max content size in bytes (default: 1MB)
  maxContentSize: 1_048_576,

  // Optional: Success callback
  onSuccess: (call) => console.log("AI call completed:", call.traceId),

  // Optional: Error callback
  onError: (call) => console.error("AI call failed:", call.error)
});
```

### Options Reference
| Option | Type | Default | Description |
|---|---|---|---|
| `apiKey` | `string` | `DATABUDDY_API_KEY` env var | **Required.** API key for authentication |
| `apiUrl` | `string` | `https://basket.databuddy.cc/llm` | API endpoint (or `DATABUDDY_API_URL` env var) |
| `transport` | `Transport` | HTTP transport | Custom transport function |
| `computeCosts` | `boolean` | `true` | Compute token costs using TokenLens |
| `privacyMode` | `boolean` | `false` | Don't capture input/output content |
| `maxContentSize` | `number` | `1_048_576` (1MB) | Max content size in bytes |
| `onSuccess` | `(call) => void` | - | Callback on successful calls |
| `onError` | `(call) => void` | - | Callback on failed calls |
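As an illustration of how these defaults interact (an explicit option wins, then the environment variable, then the built-in default), here is a small self-contained sketch. `resolveOptions` is a hypothetical helper for illustration only, not part of the package:

```typescript
// Hypothetical sketch of option resolution: an explicit option wins,
// then an environment variable, then the built-in default.
interface TrackerDefaults {
  apiKey?: string;
  apiUrl: string;
  computeCosts: boolean;
  privacyMode: boolean;
  maxContentSize: number;
}

function resolveOptions(
  opts: Partial<TrackerDefaults>,
  env: Record<string, string | undefined> = {}
): TrackerDefaults {
  return {
    apiKey: opts.apiKey ?? env.DATABUDDY_API_KEY,
    apiUrl: opts.apiUrl ?? env.DATABUDDY_API_URL ?? "https://basket.databuddy.cc/llm",
    computeCosts: opts.computeCosts ?? true,
    privacyMode: opts.privacyMode ?? false,
    maxContentSize: opts.maxContentSize ?? 1_048_576
  };
}
```

With no options passed and no environment variables set, this resolves to the defaults listed in the table above.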
## Tracking Models

### Basic Usage
```tsx
import { createTracker } from "@databuddy/ai/vercel";
import { openai } from "@ai-sdk/openai";
import { anthropic } from "@ai-sdk/anthropic";
import { generateText, streamText } from "ai";

const { track } = createTracker({
  apiKey: process.env.DATABUDDY_API_KEY
});

// Track OpenAI
const result1 = await generateText({
  model: track(openai("gpt-4o")),
  prompt: "Hello!"
});

// Track Anthropic
const result2 = await generateText({
  model: track(anthropic("claude-sonnet-4-20250514")),
  prompt: "Hello!"
});

// Streaming works too
const stream = await streamText({
  model: track(openai("gpt-4o-mini")),
  prompt: "Write a poem"
});
```

### Per-Call Options
Override options for specific calls:
```tsx
const { track } = createTracker({
  apiKey: process.env.DATABUDDY_API_KEY
});

// Enable privacy mode for sensitive calls
const result = await generateText({
  model: track(openai("gpt-4o"), {
    privacyMode: true,
    traceId: "custom-trace-123"
  }),
  prompt: "Process this sensitive data..."
});
```

### Track Options
| Option | Type | Description |
|---|---|---|
| `traceId` | `string` | Custom trace ID to link related calls |
| `transport` | `Transport` | Override transport for this call |
| `computeCosts` | `boolean` | Override cost computation |
| `privacyMode` | `boolean` | Override privacy mode |
| `onSuccess` | `(call) => void` | Override success callback |
| `onError` | `(call) => void` | Override error callback |
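Per-call options are applied on top of the tracker-level configuration. A minimal sketch of the likely merge behavior (a shallow merge in which per-call values win; `mergeCallOptions` is a hypothetical illustration, not exported by the package):

```typescript
// Hypothetical sketch: per-call options shadow tracker-level defaults,
// while keys left unset fall back to the tracker configuration.
interface SharedOptions {
  traceId?: string;
  computeCosts?: boolean;
  privacyMode?: boolean;
}

function mergeCallOptions(
  trackerOptions: SharedOptions,
  callOptions: SharedOptions = {}
): SharedOptions {
  // Spread order matters: callOptions overwrites trackerOptions.
  return { ...trackerOptions, ...callOptions };
}
```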
## Tool Tracking
Tools are automatically tracked when used:
```tsx
import { createTracker } from "@databuddy/ai/vercel";
import { openai } from "@ai-sdk/openai";
import { generateText, tool } from "ai";
import { z } from "zod";

const { track } = createTracker({
  apiKey: process.env.DATABUDDY_API_KEY
});

const result = await generateText({
  model: track(openai("gpt-4o")),
  prompt: "What's the weather in London?",
  tools: {
    getWeather: tool({
      description: "Get current weather",
      parameters: z.object({
        city: z.string()
      }),
      execute: async ({ city }) => {
        return { temperature: 15, condition: "cloudy" };
      }
    })
  }
});

// Tracked data includes:
// tools: {
//   callCount: 1,
//   resultCount: 1,
//   calledTools: ["getWeather"],
//   availableTools: ["getWeather"]
// }
```

## Streaming
Streaming responses are tracked when the stream completes:
```tsx
import { createTracker } from "@databuddy/ai/vercel";
import { openai } from "@ai-sdk/openai";
import { streamText } from "ai";

const { track } = createTracker({
  apiKey: process.env.DATABUDDY_API_KEY
});

const result = await streamText({
  model: track(openai("gpt-4o")),
  prompt: "Write a story"
});

// Stream the response
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}

// Usage data is tracked when stream completes
```

## Error Tracking
Errors are automatically captured:
```tsx
const { track } = createTracker({
  apiKey: process.env.DATABUDDY_API_KEY,
  onError: (call) => {
    console.error("AI call failed:", {
      model: call.model,
      error: call.error?.message,
      durationMs: call.durationMs
    });
  }
});

try {
  await generateText({
    model: track(openai("gpt-4o")),
    prompt: "..."
  });
} catch (error) {
  // Error is logged automatically, plus your onError callback runs
}
```

## Privacy Mode
Enable privacy mode to track usage without capturing content:
```tsx
const { track } = createTracker({
  apiKey: process.env.DATABUDDY_API_KEY,
  privacyMode: true // Don't capture prompts/responses
});

// Only usage, costs, and metadata are tracked
// input: [] and output: [] in the logged data
```

## Custom Transport
Use a custom transport for logging to other destinations:
```tsx
import { createTracker, httpTransport, type LLMCall } from "@databuddy/ai/vercel";

// Custom transport that logs locally and sends to API
const customTransport = async (call: LLMCall) => {
  console.log("AI call:", {
    model: call.model,
    tokens: call.usage.totalTokens,
    cost: call.cost.totalCostUSD
  });

  // Also send to Databuddy
  await httpTransport("https://basket.databuddy.cc/llm", "your-api-key")(call);
};

const { track } = createTracker({
  transport: customTransport
});
```

## Trace IDs
Link related calls with trace IDs:
```tsx
import { createTracker, createTraceId } from "@databuddy/ai/vercel";
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

const { track } = createTracker({
  apiKey: process.env.DATABUDDY_API_KEY
});

// Generate a trace ID for a conversation
const traceId = createTraceId();

// All calls in this conversation share the trace ID
const result1 = await generateText({
  model: track(openai("gpt-4o"), { traceId }),
  prompt: "What is 2+2?"
});

const result2 = await generateText({
  model: track(openai("gpt-4o"), { traceId }),
  prompt: "And what is that times 3?"
});
```

## Supported Providers
The SDK works with any Vercel AI SDK provider:
| Provider | Package | Example |
|---|---|---|
| OpenAI | `@ai-sdk/openai` | `openai("gpt-4o")` |
| Anthropic | `@ai-sdk/anthropic` | `anthropic("claude-sonnet-4-20250514")` |
| Google | `@ai-sdk/google` | `google("gemini-1.5-pro")` |
| Mistral | `@ai-sdk/mistral` | `mistral("mistral-large")` |
| Cohere | `@ai-sdk/cohere` | `cohere("command-r-plus")` |
| AWS Bedrock | `@ai-sdk/amazon-bedrock` | `bedrock("anthropic.claude-3")` |
| Azure | `@ai-sdk/azure` | `azure("gpt-4")` |
| Groq | `@ai-sdk/groq` | `groq("mixtral-8x7b")` |
## TypeScript Types
```tsx
import {
  createTracker,
  httpTransport,
  createTraceId,
  type CallOptions,
  type Cost,
  type ErrorInfo,
  type LLMCall,
  type Message,
  type TrackerOptions,
  type Transport,
  type Usage
} from "@databuddy/ai/vercel";
```