<a href="https://chat.vercel.ai/">
  <img alt="Next.js 14 and App Router-ready AI chatbot." src="app/(chat)/opengraph-image.png">
  <h1 align="center">Chat SDK</h1>
</a>

<p align="center">
  Chat SDK is a free, open-source template built with Next.js and the AI SDK that helps you quickly build powerful chatbot applications.
</p>

<p align="center">
  <a href="https://chat-sdk.dev"><strong>Read Docs</strong></a> ·
  <a href="#features"><strong>Features</strong></a> ·
  <a href="#model-providers"><strong>Model Providers</strong></a> ·
  <a href="#deploy-your-own"><strong>Deploy Your Own</strong></a> ·
  <a href="#running-locally"><strong>Running locally</strong></a>
</p>
<br/>

## Features

- [Next.js](https://nextjs.org) App Router
  - Advanced routing for seamless navigation and performance
  - React Server Components (RSCs) and Server Actions for server-side rendering and increased performance
- [AI SDK](https://ai-sdk.dev/docs/introduction)
  - Unified API for generating text, structured objects, and tool calls with LLMs
  - Hooks for building dynamic chat and generative user interfaces
  - Supports xAI (default), OpenAI, Fireworks, and other model providers
- [shadcn/ui](https://ui.shadcn.com)
  - Styling with [Tailwind CSS](https://tailwindcss.com)
  - Component primitives from [Radix UI](https://radix-ui.com) for accessibility and flexibility
- Data Persistence
  - [Neon Serverless Postgres](https://vercel.com/marketplace/neon) for saving chat history and user data
  - [Vercel Blob](https://vercel.com/storage/blob) for efficient file storage
- [Auth.js](https://authjs.dev)
  - Simple and secure authentication

## Model Providers

This template uses the [Vercel AI Gateway](https://vercel.com/docs/ai-gateway) to access multiple AI models through a unified interface. The default configuration includes [xAI](https://x.ai) models (`grok-2-vision-1212`, `grok-3-mini`) routed through the gateway.
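
As a hedged sketch (assuming AI SDK v5, where a plain `provider/model` string is resolved through the AI Gateway), calling one of the default models might look like this; the prompt is a placeholder:

```typescript
import { generateText } from "ai";

// A "provider/model" string routes through the Vercel AI Gateway,
// authenticated via OIDC on Vercel or AI_GATEWAY_API_KEY elsewhere.
const { text } = await generateText({
  model: "xai/grok-3-mini",
  prompt: "Say hello in one short sentence.",
});

console.log(text);
```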

### AI Gateway Authentication

**For Vercel deployments**: Authentication is handled automatically via OIDC tokens.

**For non-Vercel deployments**: You need to provide an AI Gateway API key by setting the `AI_GATEWAY_API_KEY` environment variable in your `.env.local` file.
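
For example, a minimal `.env.local` entry for a non-Vercel deployment might look like the following (the key value is a placeholder; create a real key in your AI Gateway dashboard):

```shell
# .env.local — required only when deploying outside Vercel
AI_GATEWAY_API_KEY=your-gateway-api-key-here
```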

With the [AI SDK](https://ai-sdk.dev/docs/introduction), you can also switch to direct LLM providers like [OpenAI](https://openai.com), [Anthropic](https://anthropic.com), [Cohere](https://cohere.com/), and [many more](https://ai-sdk.dev/providers/ai-sdk-providers) with just a few lines of code.
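
As an illustrative sketch (not part of this template's code), switching to OpenAI directly might look like this; it assumes the `ai` and `@ai-sdk/openai` packages are installed and `OPENAI_API_KEY` is set, and the model name and prompt are placeholders:

```typescript
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

// Calls OpenAI directly instead of routing through the AI Gateway.
const { text } = await generateText({
  model: openai("gpt-4o"), // swap in any supported provider/model here
  prompt: "Write a one-line greeting for a chatbot landing page.",
});

console.log(text);
```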

## Deploy Your Own

You can deploy your own version of the Next.js AI Chatbot to Vercel with one click:

[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/templates/next.js/nextjs-ai-chatbot)

## Running locally

You will need to use the environment variables [defined in `.env.example`](.env.example) to run Next.js AI Chatbot. It's recommended you use [Vercel Environment Variables](https://vercel.com/docs/projects/environment-variables) for this, but a `.env` file is all that is necessary.

> Note: You should not commit your `.env` file, or you will expose secrets that allow others to control access to your various AI and authentication provider accounts.

1. Install Vercel CLI: `npm i -g vercel`
2. Link local instance with Vercel and GitHub accounts (creates `.vercel` directory): `vercel link`
3. Download your environment variables: `vercel env pull`

```bash
pnpm install
pnpm dev
```

Your app template should now be running on [localhost:3000](http://localhost:3000).