Emerging
CopilotKit integration with LlamaIndex
Copy-paste command to clone the canonical starter
`npx copilotkit@latest init --framework llamaindex`
Polished starter chat with brand-styled CopilotChat surface
Agent uses tools to trigger UI generation (bar / pie chart components)
MCP server-driven UI via activity renderers
Natural conversation with frontend tool execution
Agent invokes client-side handlers registered with `useFrontendTool`
`useFrontendTool` with an async handler — agent awaits a simulated client-side notes DB query and uses the returned result
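A minimal sketch of the async-handler pattern this demo exercises: the agent awaits a client-side function and receives its return value as the tool result. The handler name, the query shape, and the in-memory notes "DB" are all illustrative assumptions, not CopilotKit's `useFrontendTool` API itself.

```typescript
// Hypothetical async handler the agent would await via a frontend tool.
type NotesQuery = { keyword: string };

// Stand-in for a client-side notes database (assumption for the sketch).
const fakeNotesDb: string[] = [
  "Ship the Q3 release notes",
  "Book dentist appointment",
  "Review the CopilotKit PR",
];

// Simulates query latency, then returns matches; the agent would receive
// this array as the tool call's result and can reason over it.
async function queryNotes({ keyword }: NotesQuery): Promise<string[]> {
  await new Promise((resolve) => setTimeout(resolve, 10)); // simulated latency
  const needle = keyword.toLowerCase();
  return fakeNotesDb.filter((note) => note.toLowerCase().includes(needle));
}
```

The key point is simply that the handler returns a promise; the agent's turn pauses until it resolves.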
Interactive approval/decision surface rendered inline in the chat via the high-level `useHumanInTheLoop` hook
Time-picker card rendered inline via `useHumanInTheLoop` for a booking flow
Agent requests approval via `useFrontendTool` with an async handler; the approval UI pops up as an app-level modal OUTSIDE the chat
User approves agent actions before execution
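The approval demos above share one mechanic: the tool handler blocks on a promise that the approval surface (inline card or app-level modal) resolves. A hedged sketch of that gate, with all names (`createApprovalGate`, `deleteFileTool`) invented for illustration and no claim about CopilotKit's actual internals:

```typescript
// A decision the human can hand back to the waiting tool handler.
type Decision = "approved" | "rejected";

// Creates a one-shot gate: the handler awaits `wait()`, the UI's buttons
// call `approve()` / `reject()`.
function createApprovalGate() {
  let settle!: (d: Decision) => void;
  const decision = new Promise<Decision>((resolve) => (settle = resolve));
  return {
    wait: () => decision,
    approve: () => settle("approved"),
    reject: () => settle("rejected"),
  };
}

// Hypothetical destructive tool: it only proceeds once the gate resolves.
async function deleteFileTool(path: string, gate = createApprovalGate()) {
  // In the real demo a modal renders here; for the sketch we auto-approve
  // on the next microtask so the function is self-contained.
  queueMicrotask(() => gate.approve());
  const decision = await gate.wait();
  return decision === "approved" ? `deleted ${path}` : "cancelled";
}
```

Whether the buttons live inline in the chat or in an app-level modal changes only where `approve()`/`reject()` are wired, not the gate itself.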
Backend agent tools rendered as UI components
Out-of-the-box tool rendering — backend defines tools; frontend adds zero custom renderers and uses CopilotKit's built-in default UI
Single branded wildcard renderer via `useDefaultRenderTool` — the same app-designed card paints every tool call
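The wildcard-renderer idea reduces to a single function that accepts any tool call and paints the same card regardless of the tool's name. A sketch with the card reduced to a string so it stays self-contained; the `ToolCall` shape is an assumption, not the library's type:

```typescript
// Minimal stand-in for a streamed tool call (assumed shape).
type ToolCall = {
  name: string;
  args: Record<string, unknown>;
  status: "running" | "done";
};

// One renderer for every tool: a branded "card" (here, a string) listing
// the tool name, its arguments, and a status badge.
function renderToolCard(call: ToolCall): string {
  const badge = call.status === "done" ? "[done]" : "[...]";
  const argLines = Object.entries(call.args)
    .map(([key, value]) => `  ${key}: ${JSON.stringify(value)}`)
    .join("\n");
  return `${badge} ${call.name}\n${argLines}`;
}
```

In the real demo the same dispatch happens with a React component instead of a string, but the contract is identical: one renderer, arbitrary tools.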
Long-running agent tasks with generated UI
Per-token state delta streaming from agent to UI
Bidirectional shared state — UI writes preferences via `agent.setState`; agent writes notes via the `set_notes` tool. Backend reads preferences from state every turn through LlamaIndex's `<state>` prelude
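A hedged sketch of the shared-state mechanic behind the two lines above: the agent streams state deltas and the UI folds them into its copy (and vice versa). Real AG-UI state deltas are, as far as I understand, JSON Patch operations; this simplified version merges shallow patches and is purely illustrative:

```typescript
// Assumed shared-state shape for this demo (preferences + notes).
type AgentState = {
  preferences?: Record<string, string>;
  notes?: string[];
};

// Folds a shallow delta into the current state without mutating it.
// Nested `preferences` keys are merged; `notes` is replaced wholesale,
// mirroring an agent that rewrites the list via a set_notes-style tool.
function applyDelta(state: AgentState, delta: Partial<AgentState>): AgentState {
  return {
    ...state,
    ...delta,
    preferences: { ...state.preferences, ...delta.preferences },
    notes: delta.notes ?? state.notes,
  };
}
```

The same fold runs on both sides, which is what makes the state "bidirectional": either party can emit a delta and both converge on the merged result.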
Supervisor delegates to research / writing / critique sub-agents (each a stand-alone LlamaIndex FunctionAgent). Every delegation appends to a live log in shared agent state
Docked sidebar chat via `<CopilotSidebar />`
Floating popup chat via `<CopilotPopup />`
Customize CopilotChat via its slot system
Default CopilotChat re-themed via `CopilotKitCSSProperties`
Minimal custom chat surface built on `useAgent`
Full chat implementation built from scratch on `useAgent`
Frontend provides read-only context to the agent via `useAgentContext`
Visible reasoning/thinking chain alongside the final answer via a custom `reasoningMessage` slot
Built-in CopilotChatReasoningMessage renders without a custom slot
Sequential tool calls with reasoning tokens rendered side-by-side
Canonical A2UI BYOC — custom catalog (Card/StatusBadge/Metric/InfoRow/PrimaryButton/PieChart/BarChart) wired via `a2ui.catalog` on the provider; agent owns the `generate_a2ui` tool
A2UI rendering against a known client-side schema; the agent streams flight data into a pre-authored component tree
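The catalog-plus-spec demos above all rest on the same mechanic: the agent streams a JSON tree of component names and props, and the client renders it against a known catalog. A sketch with the "render" reduced to strings so it is self-contained; the node shape and catalog entries here are assumptions modeled loosely on the demos' component names:

```typescript
// Assumed shape of a streamed UI spec node.
type UINode = {
  type: string;
  props?: Record<string, unknown>;
  children?: UINode[];
};

// Client-side catalog: only components registered here can be rendered,
// which is what keeps agent-generated UI constrained and safe.
const catalog: Record<
  string,
  (props: Record<string, unknown>, kids: string[]) => string
> = {
  Card: (p, kids) => `<card title="${p.title}">${kids.join("")}</card>`,
  Metric: (p) => `<metric label="${p.label}" value="${p.value}"/>`,
};

// Walks the spec recursively; unknown component types fail soft instead of
// letting the agent inject arbitrary markup.
function renderSpec(node: UINode): string {
  const render = catalog[node.type];
  if (!render) return `<!-- unknown component: ${node.type} -->`;
  const kids = (node.children ?? []).map(renderSpec);
  return render(node.props ?? {}, kids);
}
```

The Zod-validated variants add schema checks on `props` before rendering; the dispatch itself is the same.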
Streaming hierarchical JSON UI spec rendered via @json-render/react, with a Zod-validated catalog (MetricCard + PieChart + BarChart)
Streaming structured output via @hashbrownai/react, rendering a sales dashboard catalog (MetricCard + PieChart + BarChart)
Bearer-token gate via runtime `onRequest` hook with unauthenticated / authenticated states
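The gate itself is ordinary `Authorization` header parsing; only where it runs (a runtime request hook) is CopilotKit-specific. A sketch of the check, with the validation reduced to a stub comparison against a single known token (an assumption; a real gate would verify a signed token or call an auth service):

```typescript
// Returns true only for a well-formed "Bearer <token>" header whose token
// matches. Everything else (missing header, wrong scheme, empty token)
// falls into the unauthenticated state.
function checkBearer(authHeader: string | undefined, validToken: string): boolean {
  if (!authHeader?.startsWith("Bearer ")) return false;
  const token = authHeader.slice("Bearer ".length).trim();
  return token.length > 0 && token === validToken;
}
```

A request hook would run this before the agent is invoked and short-circuit with a 401-style response when it returns false.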
Speech-to-text via @copilotkit/voice with a bundled sample audio button
Forward a typed config object (tone / expertise / response length) from the provider to the agent
Image and PDF uploads via CopilotChat attachments, processed by a vision-capable agent (gpt-4o)
Agent generates UI from an arbitrary component library inside a sandboxed iframe
Agent-authored UI that can invoke frontend sandbox functions from inside the iframe
Interactive component rendered inline in the chat via the lower-level `useInterrupt` primitive — direct control over the interrupt lifecycle
Resolve interrupts from a plain button grid — no chat, no `useInterrupt` render prop
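Resolving an interrupt from arbitrary UI comes down to keeping a resolver per pending interrupt so that any button, anywhere in the app, can settle it. A hedged sketch of that registry; the function names and id-keyed map are illustrative, not the `useInterrupt` primitive's API:

```typescript
// Pending interrupts, keyed by id; each entry is the resolver for the
// promise the agent run is awaiting.
const pending = new Map<string, (value: string) => void>();

// Called when the agent raises an interrupt: parks a promise until some
// piece of UI answers it.
function raiseInterrupt(id: string): Promise<string> {
  return new Promise((resolve) => pending.set(id, resolve));
}

// Called from any button's onClick — no chat surface or render prop needed.
// Returns false if the id is unknown or was already resolved.
function resolveInterrupt(id: string, value: string): boolean {
  const resolve = pending.get(id);
  if (!resolve) return false;
  pending.delete(id); // one-shot: each interrupt settles exactly once
  resolve(value);
  return true;
}
```

Because resolution is just a map lookup plus a promise settle, the answering UI can be a button grid, a keyboard shortcut, or anything else with access to the interrupt id.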