
GateCtr + Vercel AI SDK

Use GateCtr with the Vercel AI SDK by pointing a custom provider at the GateCtr base URL.

1. Install

No additional packages required. Use your existing Vercel AI SDK installation.

2. Configure

Before
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

const { text } = await generateText({
  model: openai("gpt-4o"),
  prompt: "Hello",
});

After GateCtr
import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

const gatectr = createOpenAI({
  baseURL: "https://api.gatectr.com/v1",
  apiKey: process.env.OPENAI_API_KEY,
});

const { text } = await generateText({
  model: gatectr("gpt-4o"),
  prompt: "Hello",
});
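If you issue GateCtr its own key rather than reusing your provider key, a small helper can pick whichever is set. This is a sketch under assumptions: `GATECTR_API_KEY` is a hypothetical variable name (the snippet above uses `OPENAI_API_KEY`), not something the GateCtr docs define.

```typescript
// Hypothetical helper (not part of GateCtr or the AI SDK): choose the key
// passed to createOpenAI, preferring a dedicated GATECTR_API_KEY and falling
// back to the provider key. Throws early so a misconfigured deploy fails fast.
function resolveApiKey(env: Record<string, string | undefined>): string {
  const key = env.GATECTR_API_KEY ?? env.OPENAI_API_KEY;
  if (!key) {
    throw new Error("Set GATECTR_API_KEY or OPENAI_API_KEY");
  }
  return key;
}
```

You would then pass `apiKey: resolveApiKey(process.env)` to `createOpenAI` instead of the hard-coded environment variable.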
3. Test

Make a test call and check the GateCtr dashboard for token savings and cost data.
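Since GateCtr sits behind an OpenAI-style base URL, the test call the SDK issues is an ordinary chat-completions request. The sketch below shows the URL and payload shape, assuming GateCtr follows the OpenAI-compatible `POST {baseURL}/chat/completions` convention; `buildTestRequest` is an illustrative helper, not a GateCtr API.

```typescript
// Sketch of the request the AI SDK sends through GateCtr (endpoint path
// assumed to follow the OpenAI-compatible chat-completions convention).
const baseURL = "https://api.gatectr.com/v1";

function buildTestRequest(model: string, prompt: string) {
  return {
    url: `${baseURL}/chat/completions`,
    body: {
      model,
      messages: [{ role: "user" as const, content: prompt }],
    },
  };
}
```

If the dashboard shows no traffic after a test call, checking that requests hit this URL (rather than the provider's default) is the first thing to verify.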

What GateCtr does under the hood for Vercel AI SDK

When you route Vercel AI SDK calls through GateCtr, every request is automatically compressed (up to 40% fewer tokens), scored for complexity (to select the optimal model), and checked against your budget cap before reaching the LLM provider. You get full observability (tokens, cost, latency) in the GateCtr dashboard.
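At a fixed per-token price, cost falls in proportion to the token count, so the compression claim translates directly into dollars. A back-of-envelope sketch (the 40% ratio is GateCtr's stated upper bound; the price is purely illustrative):

```typescript
// Illustrative arithmetic only: estimate token and cost savings for a given
// compression ratio (0.4 = the "up to 40%" upper bound) and a per-million-token price.
function estimateSavings(
  tokens: number,
  pricePerMTok: number,
  compression: number,
) {
  const compressedTokens = Math.round(tokens * (1 - compression));
  const savedCost = ((tokens - compressedTokens) / 1_000_000) * pricePerMTok;
  return { compressedTokens, savedCost };
}

// e.g. a 10,000-token prompt at $2.50 per million tokens, compressed 40%:
// 6,000 tokens sent, $0.01 saved on that single request.
const { compressedTokens, savedCost } = estimateSavings(10_000, 2.5, 0.4);
```

The dashboard reports the actual per-request ratio, which varies with prompt content; this function only shows how the numbers relate.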

Start saving with Vercel AI SDK, free

No credit card required. Up and running in 5 minutes.

Start free