Setup Guide
Protect against runaway AI costs in under 5 minutes. Pick your environment.
Install the SDK

```shell
npm install caplyr
```

Add your API key to Vercel

Go to Vercel Dashboard → Your Project → Settings → Environment Variables:

Key: CAPLYR_API_KEY
Value: caplyr_sk_...
Wrap your AI client
```typescript
// app/api/chat/route.ts
import Anthropic from "@anthropic-ai/sdk";
import { protect } from "caplyr";

const client = protect(new Anthropic(), {
  apiKey: process.env.CAPLYR_API_KEY,
  budget: { monthly: 500, daily: 50 },
  mode: "cost_protect",
  fallback: "claude-haiku-4-5-20251001",
});

export async function POST(req: Request) {
  const { prompt } = await req.json();
  const message = await client.messages.create({
    model: "claude-sonnet-4-20250514",
    max_tokens: 1024,
    messages: [{ role: "user", content: prompt }],
  });
  // Content blocks are a union type, so narrow before reading .text
  const block = message.content[0];
  return Response.json({ text: block.type === "text" ? block.text : "" });
}
```

Deploy
```shell
git add . && git commit -m 'Add Caplyr cost protection' && git push
```

Vercel auto-deploys. Check your Caplyr dashboard — data flows on the first AI call.
Using other providers?
Same protect() function. The SDK auto-detects any OpenAI-compatible client.
OpenAI
```typescript
import OpenAI from "openai";
import { protect } from "caplyr";

const client = protect(new OpenAI(), {
  apiKey: process.env.CAPLYR_API_KEY,
  budget: { monthly: 500 },
  fallback: "gpt-4o-mini",
});
```

Groq
```typescript
import Groq from "groq-sdk";
import { protect } from "caplyr";

const client = protect(new Groq(), {
  apiKey: process.env.CAPLYR_API_KEY,
  budget: { monthly: 500 },
  fallback: "llama-3.1-8b-instant",
});
```

Together AI
```typescript
import OpenAI from "openai";
import { protect } from "caplyr";

// Together AI uses an OpenAI-compatible client
const client = protect(
  new OpenAI({
    apiKey: process.env.TOGETHER_API_KEY,
    baseURL: "https://api.together.xyz/v1",
  }),
  {
    apiKey: process.env.CAPLYR_API_KEY,
    budget: { monthly: 500 },
  },
);
```

Any SDK with a client.chat.completions.create() interface works automatically — no extra configuration needed.
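The auto-detection described above can be approximated with a structural (duck-typing) check. This is an illustrative sketch only, not Caplyr's actual implementation — the `isOpenAICompatible` helper and `MaybeClient` type are hypothetical names:

```typescript
// Hypothetical sketch: detect an OpenAI-compatible client by its shape,
// i.e. whether it exposes a callable chat.completions.create method.
type MaybeClient = {
  chat?: { completions?: { create?: unknown } };
};

function isOpenAICompatible(client: unknown): boolean {
  const c = client as MaybeClient;
  return typeof c?.chat?.completions?.create === "function";
}

// A stub with the right shape passes; an Anthropic-style client would not.
const stub = { chat: { completions: { create: async () => ({}) } } };
console.log(isOpenAICompatible(stub)); // true
console.log(isOpenAICompatible({}));   // false
```

This is why Together AI needs no special support: its client is literally the OpenAI SDK pointed at a different `baseURL`, so the shape check passes unchanged.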
Common Questions
Does Caplyr proxy my API calls?
No. The SDK runs in your app. Your API calls go directly to the provider. Caplyr only receives logs and heartbeats.
What if Caplyr's backend is unreachable?
The SDK uses the last-known server response. Your app keeps working with the most recent budget status from the heartbeat.
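The "last-known server response" behavior is a common stale-cache fallback pattern. The sketch below is illustrative only — `BudgetStatus`, `heartbeat`, and `lastKnown` are hypothetical names, not Caplyr internals:

```typescript
// Illustrative sketch of the last-known-status pattern: cache the latest
// successful heartbeat result and keep serving it when the backend is down.
type BudgetStatus = { withinBudget: boolean; fetchedAt: number };

let lastKnown: BudgetStatus = { withinBudget: true, fetchedAt: 0 };

async function heartbeat(fetchStatus: () => Promise<boolean>): Promise<BudgetStatus> {
  try {
    const withinBudget = await fetchStatus();
    lastKnown = { withinBudget, fetchedAt: Date.now() };
  } catch {
    // Backend unreachable: leave the cached status untouched.
  }
  return lastKnown;
}
```

If the backend becomes unreachable, every call keeps returning the most recent answer instead of failing the request path.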
Does it add latency?
The budget check is local (microseconds). Log shipping is async and non-blocking. The heartbeat runs in the background every 60 seconds.
Which providers are supported?
Anthropic, OpenAI, Groq, and Together AI. Any SDK with a chat.completions.create() interface (OpenAI-compatible) works automatically.
Can I use multiple providers?
Yes. Wrap each client separately with its own protect() call. They can share the same Caplyr API key or use different projects.