Model Selection

The Built-in Agent uses the Vercel AI SDK under the hood, giving you access to models from OpenAI, Anthropic, and Google — plus the ability to use any custom AI SDK model.

Supported Models

Specify a model using the "provider:model" format (or "provider/model" — both work).

OpenAI

Model | Specifier
GPT-5 | openai:gpt-5
GPT-5 Mini | openai:gpt-5-mini
GPT-4.1 | openai:gpt-4.1
GPT-4.1 Mini | openai:gpt-4.1-mini
GPT-4.1 Nano | openai:gpt-4.1-nano
GPT-4o | openai:gpt-4o
GPT-4o Mini | openai:gpt-4o-mini
o3 | openai:o3
o3-mini | openai:o3-mini
o4-mini | openai:o4-mini
const agent = new BuiltInAgent({
  model: "openai:gpt-4.1",
});

Anthropic

Model | Specifier
Claude Sonnet 4.5 | anthropic:claude-sonnet-4.5
Claude Sonnet 4 | anthropic:claude-sonnet-4
Claude 3.7 Sonnet | anthropic:claude-3.7-sonnet
Claude Opus 4.1 | anthropic:claude-opus-4.1
Claude Opus 4 | anthropic:claude-opus-4
Claude 3.5 Haiku | anthropic:claude-3.5-haiku
const agent = new BuiltInAgent({
  model: "anthropic:claude-sonnet-4.5",
});

Google

Model | Specifier
Gemini 2.5 Pro | google:gemini-2.5-pro
Gemini 2.5 Flash | google:gemini-2.5-flash
Gemini 2.5 Flash Lite | google:gemini-2.5-flash-lite
const agent = new BuiltInAgent({
  model: "google:gemini-2.5-pro",
});

Environment Variables

Set the API key for your chosen provider:

# OpenAI
OPENAI_API_KEY=sk-...

# Anthropic
ANTHROPIC_API_KEY=sk-ant-...

# Google
GOOGLE_API_KEY=...
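
The provider-to-variable mapping above can be sketched as a simple lookup (the variable names come from the list above; the helper itself is illustrative, not the library's actual internals):

```typescript
// Maps each provider prefix to the environment variable consulted
// for its API key. The apiKeyFor helper is a hypothetical sketch.
const ENV_KEYS: Record<string, string> = {
  openai: "OPENAI_API_KEY",
  anthropic: "ANTHROPIC_API_KEY",
  google: "GOOGLE_API_KEY",
};

function apiKeyFor(provider: string): string | undefined {
  const envVar = ENV_KEYS[provider];
  return envVar ? process.env[envVar] : undefined;
}
```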

Alternatively, pass the API key directly in your configuration:

const agent = new BuiltInAgent({
  model: "openai:gpt-4.1",
  apiKey: process.env.MY_OPENAI_KEY, // [!code highlight]
});

Custom Models (AI SDK)

For models not in the built-in list, you can pass any Vercel AI SDK LanguageModel instance directly:

import { createOpenAI } from "@ai-sdk/openai";

const customProvider = createOpenAI({ // [!code highlight]
  apiKey: process.env.MY_API_KEY, // [!code highlight]
  baseURL: "https://my-proxy.example.com/v1", // [!code highlight]
}); // [!code highlight]

const agent = new BuiltInAgent({
  model: customProvider("my-fine-tuned-model"), // [!code highlight]
});

This works with any AI SDK provider — Azure OpenAI, AWS Bedrock, Ollama, or any OpenAI-compatible endpoint:

import { createAzure } from "@ai-sdk/azure";

const azure = createAzure({
  resourceName: "my-resource",
  apiKey: process.env.AZURE_API_KEY,
});

const agent = new BuiltInAgent({
  model: azure("my-deployment"),
});

How It Works

Under the hood, the Built-in Agent resolves model strings to AI SDK provider instances:

  • "openai:gpt-4.1" → @ai-sdk/openai → openai("gpt-4.1")
  • "anthropic:claude-sonnet-4.5" → @ai-sdk/anthropic → anthropic("claude-sonnet-4.5")
  • "google:gemini-2.5-pro" → @ai-sdk/google → google("gemini-2.5-pro")

Both "provider:model" and "provider/model" separators are supported and work identically.
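
The first step of that resolution, splitting the specifier string into a provider name and a model ID, can be sketched roughly as follows (the helper name is illustrative, not the library's actual internals):

```typescript
// Hypothetical sketch: split a "provider:model" or "provider/model"
// specifier into its provider prefix and model ID.
function parseModelSpecifier(spec: string): { provider: string; model: string } {
  // Split on the FIRST ":" or "/" only, so model IDs that themselves
  // contain separators are preserved intact.
  const match = spec.match(/^([^:/]+)[:/](.+)$/);
  if (!match) {
    throw new Error(`Invalid model specifier: "${spec}"`);
  }
  return { provider: match[1], model: match[2] };
}
```

Because only the first separator is consumed, `parseModelSpecifier("openai:gpt-4.1")` and `parseModelSpecifier("openai/gpt-4.1")` yield the same result.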
