Quick Start
Get a working chat UI running in under 5 minutes.
This page walks through the scaffolded setup. If you already have an existing Next.js app, use Installation or the End-to-End Guide instead.
1. Create your app
Run the create command. This scaffolds a Next.js app with OpenUI Chat already wired to an OpenAI-backed route.
```bash
npx @openuidev/cli@latest create   # or: pnpm dlx / yarn dlx / bunx @openuidev/cli@latest create
cd genui-chat-app
```

2. Add your API key
Create a .env.local file in the project root:
```
OPENAI_API_KEY=sk-your-key-here
```

3. Start the dev server
```bash
npm run dev   # or: pnpm dev / yarn dev / bun dev
```

Open http://localhost:3000 in your browser. You should see the default full-page FullScreen chat with streaming responses enabled. Try sending a message.
What you just built
The scaffold generates both the frontend and backend for you.
You do not need to recreate these files during quick start. This section is here so you know what the scaffold already configured.
The Frontend (app/page.tsx)
The frontend renders FullScreen, sends requests with processMessage, converts messages to the wire format with openAIMessageFormat.toApi(messages), and parses the OpenAI SDK readable stream with openAIReadableStreamAdapter.
```tsx
import { openAIMessageFormat, openAIReadableStreamAdapter } from "@openuidev/react-headless";
import { FullScreen } from "@openuidev/react-ui";
import { openuiLibrary } from "@openuidev/react-ui/genui-lib";

export default function Page() {
  return (
    <FullScreen
      processMessage={async ({ messages, abortController }) => {
        return fetch("/api/chat", {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({
            messages: openAIMessageFormat.toApi(messages),
          }),
          signal: abortController.signal,
        });
      }}
      streamProtocol={openAIReadableStreamAdapter()}
      componentLibrary={openuiLibrary}
      agentName="OpenUI Chat"
    />
  );
}
```

The Backend (app/api/chat/route.ts)
The scaffold also creates a Next.js route handler at app/api/chat/route.ts.
That route:
- loads the system prompt generated by the CLI at build time
- receives OpenAI-format messages
- prepends the system prompt
- calls OpenAI Chat Completions with streaming enabled
- returns response.toReadableStream()
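The scaffold's actual route calls the OpenAI SDK; as a rough sketch of the message-shaping step it performs, under the assumption that ChatMessage, SYSTEM_PROMPT, and buildCompletionMessages are illustrative names rather than scaffold APIs:

```typescript
// Sketch of how the route prepends the server-side system prompt to the
// OpenAI-format messages it receives from the frontend. The names below
// are illustrative, not the scaffold's actual identifiers.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// In the scaffold, the real prompt is generated by `openui generate` at build time.
const SYSTEM_PROMPT = "You are OpenUI Chat. Use the component library when responding.";

// Prepend the system prompt; the result is what gets sent to Chat Completions.
function buildCompletionMessages(incoming: ChatMessage[]): ChatMessage[] {
  return [{ role: "system", content: SYSTEM_PROMPT }, ...incoming];
}
```

The handler then passes this array to the Chat Completions call with streaming enabled and returns response.toReadableStream(), which is what the frontend's openAIReadableStreamAdapter consumes.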
The scaffold includes a prebuild step (openui generate) that creates the system prompt from your component library. This keeps the prompt on the server — it is never sent from the frontend.
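As a rough illustration, that wiring might look like this in package.json (the openui generate command comes from the scaffold; the surrounding script names are assumptions):

```json
{
  "scripts": {
    "prebuild": "openui generate",
    "dev": "next dev",
    "build": "next build"
  }
}
```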
Next steps
Now that the app is running, choose the next path based on what you want to change.