Setup Guide

Protect your AI costs in under 5 minutes. Pick your environment.

1. Install the SDK

npm install caplyr
2. Add your API key to Vercel

Go to Vercel Dashboard → Your Project → Settings → Environment Variables

Key: CAPLYR_API_KEY
Value: caplyr_sk_...
3. Wrap your AI client

// app/api/chat/route.ts
import Anthropic from "@anthropic-ai/sdk";
import { protect } from "caplyr";

const client = protect(new Anthropic(), {
  apiKey: process.env.CAPLYR_API_KEY,
  budget: { monthly: 500, daily: 50 },
  mode: "cost_protect",
  fallback: "claude-haiku-4-5-20251001",
});

export async function POST(req: Request) {
  const { prompt } = await req.json();
  const message = await client.messages.create({
    model: "claude-sonnet-4-20250514",
    max_tokens: 1024,
    messages: [{ role: "user", content: prompt }],
  });
  const block = message.content[0];
  return Response.json({ text: block.type === "text" ? block.text : "" });
}
4. Deploy

git add . && git commit -m 'Add Caplyr cost protection' && git push

Vercel auto-deploys. Check your Caplyr dashboard — data flows on the first AI call.

Using other providers?

Same protect() function. The SDK auto-detects any OpenAI-compatible client.

OpenAI

import OpenAI from "openai";
import { protect } from "caplyr";

const client = protect(new OpenAI(), {
  apiKey: process.env.CAPLYR_API_KEY,
  budget: { monthly: 500 },
  fallback: "gpt-4o-mini",
});

Groq

import Groq from "groq-sdk";
import { protect } from "caplyr";

const client = protect(new Groq(), {
  apiKey: process.env.CAPLYR_API_KEY,
  budget: { monthly: 500 },
  fallback: "llama-3.1-8b-instant",
});

Together AI

import OpenAI from "openai";
import { protect } from "caplyr";

// Together AI uses an OpenAI-compatible client
const client = protect(new OpenAI({
  apiKey: process.env.TOGETHER_API_KEY,
  baseURL: "https://api.together.xyz/v1",
}), {
  apiKey: process.env.CAPLYR_API_KEY,
  budget: { monthly: 500 },
});

Any SDK with a client.chat.completions.create() interface works automatically — no extra configuration needed.
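The auto-detection can be pictured as plain duck typing. This is an illustrative sketch, not Caplyr's actual source; the `detect()` helper and the `ProviderKind` type are assumptions made for this example:

```typescript
// Illustrative sketch of provider auto-detection via duck typing.
// detect() and ProviderKind are hypothetical names, not part of the
// caplyr package.

type ProviderKind = "openai-compatible" | "anthropic" | "unknown";

function detect(client: any): ProviderKind {
  // OpenAI, Groq, and Together AI all expose chat.completions.create().
  if (typeof client?.chat?.completions?.create === "function") {
    return "openai-compatible";
  }
  // The Anthropic SDK exposes messages.create() instead.
  if (typeof client?.messages?.create === "function") {
    return "anthropic";
  }
  return "unknown";
}

// Minimal stand-ins shaped like the real SDK clients:
const openaiLike = { chat: { completions: { create: async () => ({}) } } };
const anthropicLike = { messages: { create: async () => ({}) } };

console.log(detect(openaiLike));    // "openai-compatible"
console.log(detect(anthropicLike)); // "anthropic"
```

Because the check is structural rather than based on package names, any client object with the right method shape passes, which is why OpenAI-compatible SDKs work without extra configuration.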

Common Questions

Does Caplyr proxy my API calls?

No. The SDK runs in your app. Your API calls go directly to the provider. Caplyr only receives logs and heartbeats.

What if Caplyr's backend is unreachable?

The SDK uses the last-known server response. Your app keeps working with the most recent budget status from the heartbeat.
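The fail-open pattern can be sketched as a cached heartbeat. The names below (`BudgetStatus`, `fetchStatus`, `heartbeat`) are hypothetical, chosen for illustration; they are not the SDK's internals:

```typescript
// Illustrative sketch of "use the last-known server response" behavior.
// BudgetStatus, heartbeat, and fetchStatus are hypothetical names, not
// part of the caplyr package.

interface BudgetStatus {
  withinBudget: boolean;
  remainingUsd: number;
}

// Start permissive so the app is never blocked before the first heartbeat.
let lastKnown: BudgetStatus = { withinBudget: true, remainingUsd: Infinity };

async function heartbeat(
  fetchStatus: () => Promise<BudgetStatus>
): Promise<BudgetStatus> {
  try {
    // Backend reachable: refresh the cached status.
    lastKnown = await fetchStatus();
  } catch {
    // Backend unreachable: keep the cached status so the app keeps working.
  }
  return lastKnown;
}
```

If the fetch succeeds, the cache is refreshed; if it throws, the caller still gets the most recent known status instead of an error.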

Does it add latency?

The budget check is local (microseconds). Log shipping is async and non-blocking. The heartbeat runs in the background every 60 seconds.
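The shape of that hot path can be sketched in a few lines. This is a conceptual illustration, not the SDK's code; `spentUsd`, `checkBudget`, `recordCall`, and `shipLog` are assumed names:

```typescript
// Illustrative sketch of a local budget gate with fire-and-forget log
// shipping. All names here are assumptions, not the caplyr internals.

let spentUsd = 0;
const dailyLimitUsd = 50;

function checkBudget(estimatedCostUsd: number): boolean {
  // Purely local arithmetic: no network round trip on the request path.
  return spentUsd + estimatedCostUsd <= dailyLimitUsd;
}

function recordCall(costUsd: number, shipLog: (c: number) => Promise<void>) {
  spentUsd += costUsd;
  // Ship the log asynchronously; the caller never awaits it, so a slow
  // or failed upload cannot block the AI request.
  void shipLog(costUsd).catch(() => { /* swallow upload errors */ });
}
```

The request only ever waits on the synchronous `checkBudget` comparison; everything that touches the network happens off the critical path.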

Which providers are supported?

Anthropic, OpenAI, Groq, and Together AI. Any SDK with a chat.completions.create() interface (OpenAI-compatible) works automatically.

Can I use multiple providers?

Yes. Wrap each client separately with its own protect() call. They can share the same Caplyr API key or use different projects.

Ready to protect your AI costs?

Free tier available. No credit card required.

Get Started Free →