diff --git a/.agent/skills/deploy-to-vercel/Archive.zip b/.agent/skills/deploy-to-vercel/Archive.zip new file mode 100644 index 00000000..2945baff Binary files /dev/null and b/.agent/skills/deploy-to-vercel/Archive.zip differ diff --git a/.agent/skills/deploy-to-vercel/SKILL.md b/.agent/skills/deploy-to-vercel/SKILL.md new file mode 100644 index 00000000..a0251ce8 --- /dev/null +++ b/.agent/skills/deploy-to-vercel/SKILL.md @@ -0,0 +1,296 @@ +--- +name: deploy-to-vercel +description: Deploy applications and websites to Vercel. Use when the user requests deployment actions like "deploy my app", "deploy and give me the link", "push this live", or "create a preview deployment". +metadata: + author: vercel + version: "3.0.0" +--- + +# Deploy to Vercel + +Deploy any project to Vercel. **Always deploy as preview** (not production) unless the user explicitly asks for production. + +The goal is to get the user into the best long-term setup: their project linked to Vercel with git-push deploys. Every method below tries to move the user closer to that state. + +## Step 1: Gather Project State + +Run all four checks before deciding which method to use: + +```bash +# 1. Check for a git remote +git remote get-url origin 2>/dev/null + +# 2. Check if locally linked to a Vercel project (either file means linked) +cat .vercel/project.json 2>/dev/null || cat .vercel/repo.json 2>/dev/null + +# 3. Check if the Vercel CLI is installed and authenticated +vercel whoami 2>/dev/null + +# 4. List available teams (if authenticated) +vercel teams list --format json 2>/dev/null +``` + +### Team selection + +If the user belongs to multiple teams, present all available team slugs as a bulleted list and ask which one to deploy to. Once the user picks a team, proceed immediately to the next step — do not ask for additional confirmation. 
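The step above asks for team slugs from the `vercel teams list --format json` output. A minimal sketch of extracting them with the same `grep`/`cut` approach the bundled deploy scripts use; the JSON shape shown (a `teams` array of objects with a `slug` field) is an illustrative assumption, not the CLI's documented schema:

```shell
# Stand-in for the output of `vercel teams list --format json`.
# The exact shape (a "teams" array whose objects carry a "slug")
# is an assumption for illustration.
teams_json='{"teams":[{"id":"team_1","slug":"acme"},{"id":"team_2","slug":"personal"}]}'

# Extract every "slug" value, one per line, to present as bullets.
slugs=$(echo "$teams_json" | grep -o '"slug":"[^"]*"' | cut -d'"' -f4)
echo "$slugs"
```

In practice, replace the stand-in variable with the real CLI output and render each slug as one bullet in the list shown to the user.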
+ +Pass the team slug via `--scope` on all subsequent CLI commands (`vercel deploy`, `vercel link`, `vercel inspect`, etc.): + +```bash +vercel deploy [path] -y --no-wait --scope <team-slug> +``` + +If the project is already linked (`.vercel/project.json` or `.vercel/repo.json` exists), the `orgId` in those files determines the team — no need to ask again. If there is only one team (or just a personal account), skip the prompt and use it directly. + +**About the `.vercel/` directory:** A linked project has either: +- `.vercel/project.json` — created by `vercel link` (single project linking). Contains `projectId` and `orgId`. +- `.vercel/repo.json` — created by `vercel link --repo` (repo-based linking). Contains `orgId`, `remoteName`, and a `projects` array mapping directories to Vercel project IDs. + +Either file means the project is linked. Check for both. + +**Do NOT** use `vercel project inspect`, `vercel ls`, or `vercel link` to detect state in an unlinked directory — without a `.vercel/` config, they will interactively prompt (or with `--yes`, silently link as a side-effect). Only `vercel whoami` is safe to run anywhere. + +## Step 2: Choose a Deploy Method + +### Linked (`.vercel/` exists) + has git remote → Git Push + +This is the ideal state. The project is linked and has git integration. + +1. **Ask the user before pushing.** Never push without explicit approval: + ``` + This project is connected to Vercel via git. I can commit and push to + trigger a deployment. Want me to proceed? + ``` + +2. **Commit and push:** + ```bash + git add . + git commit -m "deploy: <description>" + git push + ``` + Vercel automatically builds from the push. Non-production branches get preview deployments; the production branch (usually `main`) gets a production deployment. + +3. **Retrieve the preview URL.** If the CLI is authenticated: + ```bash + sleep 5 + vercel ls --format json + ``` + The JSON output has a `deployments` array. Find the latest entry — its `url` field is the preview URL.
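A sketch of pulling that `url` field out without a JSON parser; the sample shape shown (a `deployments` array, newest first) is an assumption for illustration:

```shell
# Stand-in for `vercel ls --format json` output; the shape shown
# (a "deployments" array ordered newest-first) is an assumption.
ls_json='{"deployments":[{"url":"my-app-abc123.vercel.app"},{"url":"my-app-old456.vercel.app"}]}'

# Take the first "url" value as the latest preview deployment.
preview_url=$(echo "$ls_json" | grep -o '"url":"[^"]*"' | head -1 | cut -d'"' -f4)
echo "https://$preview_url"
```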
+ + If the CLI is not authenticated, tell the user to check the Vercel dashboard or the commit status checks on their git provider for the preview URL. + +--- + +### Linked (`.vercel/` exists) + no git remote → `vercel deploy` + +The project is linked but there's no git repo. Deploy directly with the CLI. + +```bash +vercel deploy [path] -y --no-wait +``` + +Use `--no-wait` so the CLI returns immediately with the deployment URL instead of blocking until the build finishes (builds can take a while). Then check on the deployment status with: + +```bash +vercel inspect <deployment-url> +``` + +For production deploys (only if user explicitly asks): +```bash +vercel deploy [path] --prod -y --no-wait +``` + +--- + +### Not linked + CLI is authenticated → Link first, then deploy + +The CLI is working but the project isn't linked yet. This is the opportunity to get the user into the best state. + +1. **Ask the user which team to deploy to.** Present the team slugs from Step 1 as a bulleted list. If there's only one team (or just a personal account), skip this step. + +2. **Once a team is selected, proceed directly to linking.** Tell the user what will happen but do not ask for separate confirmation: + ``` + Linking this project to <team-slug> on Vercel. This will create a Vercel + project to deploy to and enable automatic deployments on future git pushes. + ``` + +3. **If a git remote exists**, use repo-based linking with the selected team scope: + ```bash + vercel link --repo --scope <team-slug> + ``` + This reads the git remote URL and matches it to existing Vercel projects that deploy from that repo. It creates `.vercel/repo.json`. This is much more reliable than `vercel link` (without `--repo`), which tries to match by directory name and often fails when the local folder and Vercel project are named differently. + + **If there is no git remote**, fall back to standard linking: + ```bash + vercel link --scope <team-slug> + ``` + This prompts the user to select or create a project. It creates `.vercel/project.json`. + +4.
**Then deploy using the best available method:** + - If a git remote exists → commit and push (see git push method above) + - If no git remote → `vercel deploy [path] -y --no-wait --scope <team-slug>`, then `vercel inspect <deployment-url>` to check status + +--- + +### Not linked + CLI not authenticated → Install, auth, link, deploy + +The Vercel CLI isn't set up at all. + +1. **Install the CLI (if not already installed):** + ```bash + npm install -g vercel + ``` + +2. **Authenticate:** + ```bash + vercel login + ``` + The user completes auth in their browser. If running in a non-interactive environment where login is not possible, skip to the **no-auth fallback** below. + +3. **Ask which team to deploy to** — present team slugs from `vercel teams list --format json` as a bulleted list. If only one team / personal account, skip. Once selected, proceed immediately. + +4. **Link the project** with the selected team scope (use `--repo` if a git remote exists, plain `vercel link` otherwise): + ```bash + vercel link --repo --scope <team-slug>  # if git remote exists + vercel link --scope <team-slug>         # if no git remote + ``` + +5. **Deploy** using the best available method (git push if remote exists, otherwise `vercel deploy -y --no-wait --scope <team-slug>`, then `vercel inspect <deployment-url>` to check status). + +--- + +### No-Auth Fallback — claude.ai sandbox + +**When to use:** Last resort when the CLI can't be installed or authenticated in the claude.ai sandbox. This requires no authentication — it returns a **Preview URL** (live site) and a **Claim URL** (transfer to your Vercel account).
+ +```bash +bash /mnt/skills/user/deploy-to-vercel/resources/deploy.sh [path] +``` + +**Arguments:** +- `path` - Directory to deploy, or a `.tgz` file (defaults to current directory) + +**Examples:** +```bash +# Deploy current directory +bash /mnt/skills/user/deploy-to-vercel/resources/deploy.sh + +# Deploy specific project +bash /mnt/skills/user/deploy-to-vercel/resources/deploy.sh /path/to/project + +# Deploy existing tarball +bash /mnt/skills/user/deploy-to-vercel/resources/deploy.sh /path/to/project.tgz +``` + +The script auto-detects the framework from `package.json`, packages the project (excluding `node_modules`, `.git`, `.env`), uploads it, and waits for the build to complete. + +**Tell the user:** "Your deployment is ready at [previewUrl]. Claim it at [claimUrl] to manage your deployment." + +--- + +### No-Auth Fallback — Codex sandbox + +**When to use:** In the Codex sandbox where the CLI may not be authenticated. Codex runs in a sandboxed environment by default — try the CLI first, and fall back to the deploy script if auth fails. + +1. **Check whether the Vercel CLI is installed** (no escalation needed for this check): + ```bash + command -v vercel + ``` + +2. **If `vercel` is installed**, try deploying with the CLI: + ```bash + vercel deploy [path] -y --no-wait + ``` + +3. **If `vercel` is not installed, or the CLI fails with "No existing credentials found"**, use the fallback script: + ```bash + skill_dir="<path-to-this-skill>" + + # Deploy current directory + bash "$skill_dir/resources/deploy-codex.sh" + + # Deploy specific project + bash "$skill_dir/resources/deploy-codex.sh" /path/to/project + + # Deploy existing tarball + bash "$skill_dir/resources/deploy-codex.sh" /path/to/project.tgz + ``` + +The script handles framework detection, packaging, and deployment. It waits for the build to complete and returns JSON with `previewUrl` and `claimUrl`. + +**Tell the user:** "Your deployment is ready at [previewUrl]. Claim it at [claimUrl] to manage your deployment."
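A sketch of pulling both URLs out of that JSON result, mirroring the `grep`/`cut` extraction the deploy scripts themselves perform; the sample response is illustrative, not real output:

```shell
# Illustrative stand-in for the JSON the deploy script prints on success.
response='{"previewUrl":"https://my-app-abc123.vercel.app","claimUrl":"https://vercel.com/claim-deployment?code=xyz"}'

# Same extraction pattern used inside the deploy scripts.
preview_url=$(echo "$response" | grep -o '"previewUrl":"[^"]*"' | cut -d'"' -f4)
claim_url=$(echo "$response" | grep -o '"claimUrl":"[^"]*"' | cut -d'"' -f4)

echo "Preview URL: $preview_url"
echo "Claim URL: $claim_url"
```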
+ +**Escalated network access:** Only escalate the actual deploy command if sandboxing blocks the network call (`sandbox_permissions=require_escalated`). Do **not** escalate the `command -v vercel` check. + +--- + +## Agent-Specific Notes + +### Claude Code / terminal-based agents + +You have full shell access. Do NOT use the `/mnt/skills/` path. Follow the decision flow above using the CLI directly. + +For the no-auth fallback, run the deploy script from the skill's installed location: +```bash +bash ~/.claude/skills/deploy-to-vercel/resources/deploy.sh [path] +``` +The path may vary depending on where the user installed the skill. + +### Sandboxed environments (claude.ai) + +You likely cannot run `vercel login` or `git push`. Go directly to the **no-auth fallback — claude.ai sandbox**. + +### Codex + +Codex runs in a sandbox. Check if the CLI is available first, then fall back to the deploy script. Go to the **no-auth fallback — Codex sandbox**. + +--- + +## Output + +Always show the user the deployment URL. + +- **Git push:** Use `vercel ls --format json` to find the preview URL. If the CLI isn't authenticated, tell the user to check the Vercel dashboard or commit status checks. +- **CLI deploy:** Show the URL returned by `vercel deploy --no-wait`. Use `vercel inspect <deployment-url>` to check build status and report it to the user. +- **No-auth fallback:** Show both the preview URL and the claim URL: + ``` + Deployment successful! + + Preview URL: https://my-app-abc123.vercel.app + Claim URL: https://vercel.com/claim-deployment?code=... + + View your site at the Preview URL. + To transfer this deployment to your Vercel account, visit the Claim URL. + ``` + +**Do not** curl or fetch the deployed URL to verify it works. Just return the link. + +--- + +## Troubleshooting + +### Network Egress Error (claude.ai) + +If deployment fails due to network restrictions on claude.ai, tell the user: + +``` +Deployment failed due to network restrictions. To fix this: + +1.
Go to https://claude.ai/settings/capabilities +2. Add *.vercel.com to the allowed domains +3. Try deploying again +``` + +### Escalated Network Access (Codex / sandboxed environments) + +If deployment fails due to network issues (timeouts, DNS errors, connection resets) in a sandboxed environment, rerun with escalated permissions (`sandbox_permissions=require_escalated`). Do not escalate the initial CLI availability check — only escalate the actual deploy command. + +Example guidance to the user: +``` +The deploy needs escalated network access to deploy to Vercel. I can rerun +the command with escalated permissions — want me to proceed? +``` + +### CLI Auth Failure + +If `vercel login` or `vercel deploy` fails with authentication errors, fall back to the no-auth deploy script (claude.ai or Codex variant, depending on the environment). diff --git a/.agent/skills/deploy-to-vercel/resources/deploy-codex.sh b/.agent/skills/deploy-to-vercel/resources/deploy-codex.sh new file mode 100644 index 00000000..af07d0fd --- /dev/null +++ b/.agent/skills/deploy-to-vercel/resources/deploy-codex.sh @@ -0,0 +1,301 @@ +#!/bin/bash + +# Vercel Deployment Script for Codex (via claimable deploy endpoint) +# Usage: ./deploy-codex.sh [project-path] +# Returns: JSON with previewUrl, claimUrl, deploymentId, projectId + +set -euo pipefail + +DEPLOY_ENDPOINT="https://codex-deploy-skills.vercel.sh/api/deploy" + +# Detect framework from package.json +detect_framework() { + local pkg_json="$1" + + if [ ! -f "$pkg_json" ]; then + echo "null" + return + fi + + local content=$(cat "$pkg_json") + + # Helper to check if a package exists in dependencies or devDependencies. + # Use exact matching by default, with a separate prefix matcher for scoped + # package families like "@remix-run/". 
+ has_dep_exact() { + echo "$content" | grep -q "\"$1\"" + } + + has_dep_prefix() { + echo "$content" | grep -q "\"$1" + } + + # Order matters - check more specific frameworks first + + # Blitz + if has_dep_exact "blitz"; then echo "blitzjs"; return; fi + + # Next.js + if has_dep_exact "next"; then echo "nextjs"; return; fi + + # Gatsby + if has_dep_exact "gatsby"; then echo "gatsby"; return; fi + + # Remix + if has_dep_prefix "@remix-run/"; then echo "remix"; return; fi + + # React Router (v7 framework mode) + if has_dep_prefix "@react-router/"; then echo "react-router"; return; fi + + # TanStack Start + if has_dep_exact "@tanstack/start"; then echo "tanstack-start"; return; fi + + # Astro + if has_dep_exact "astro"; then echo "astro"; return; fi + + # Hydrogen (Shopify) + if has_dep_exact "@shopify/hydrogen"; then echo "hydrogen"; return; fi + + # SvelteKit + if has_dep_exact "@sveltejs/kit"; then echo "sveltekit-1"; return; fi + + # Svelte (standalone) + if has_dep_exact "svelte"; then echo "svelte"; return; fi + + # Nuxt + if has_dep_exact "nuxt"; then echo "nuxtjs"; return; fi + + # Vue with Vitepress + if has_dep_exact "vitepress"; then echo "vitepress"; return; fi + + # Vue with Vuepress + if has_dep_exact "vuepress"; then echo "vuepress"; return; fi + + # Gridsome + if has_dep_exact "gridsome"; then echo "gridsome"; return; fi + + # SolidStart + if has_dep_exact "@solidjs/start"; then echo "solidstart-1"; return; fi + + # Docusaurus + if has_dep_exact "@docusaurus/core"; then echo "docusaurus-2"; return; fi + + # RedwoodJS + if has_dep_prefix "@redwoodjs/"; then echo "redwoodjs"; return; fi + + # Hexo + if has_dep_exact "hexo"; then echo "hexo"; return; fi + + # Eleventy + if has_dep_exact "@11ty/eleventy"; then echo "eleventy"; return; fi + + # Angular / Ionic Angular + if has_dep_exact "@ionic/angular"; then echo "ionic-angular"; return; fi + if has_dep_exact "@angular/core"; then echo "angular"; return; fi + + # Ionic React + if has_dep_exact 
"@ionic/react"; then echo "ionic-react"; return; fi + + # Create React App + if has_dep_exact "react-scripts"; then echo "create-react-app"; return; fi + + # Ember + if has_dep_exact "ember-cli" || has_dep_exact "ember-source"; then echo "ember"; return; fi + + # Dojo + if has_dep_exact "@dojo/framework"; then echo "dojo"; return; fi + + # Polymer + if has_dep_prefix "@polymer/"; then echo "polymer"; return; fi + + # Preact + if has_dep_exact "preact"; then echo "preact"; return; fi + + # Stencil + if has_dep_exact "@stencil/core"; then echo "stencil"; return; fi + + # UmiJS + if has_dep_exact "umi"; then echo "umijs"; return; fi + + # Sapper (legacy Svelte) + if has_dep_exact "sapper"; then echo "sapper"; return; fi + + # Saber + if has_dep_exact "saber"; then echo "saber"; return; fi + + # Sanity + if has_dep_exact "sanity"; then echo "sanity-v3"; return; fi + if has_dep_prefix "@sanity/"; then echo "sanity"; return; fi + + # Storybook + if has_dep_prefix "@storybook/"; then echo "storybook"; return; fi + + # NestJS + if has_dep_exact "@nestjs/core"; then echo "nestjs"; return; fi + + # Elysia + if has_dep_exact "elysia"; then echo "elysia"; return; fi + + # Hono + if has_dep_exact "hono"; then echo "hono"; return; fi + + # Fastify + if has_dep_exact "fastify"; then echo "fastify"; return; fi + + # h3 + if has_dep_exact "h3"; then echo "h3"; return; fi + + # Nitro + if has_dep_exact "nitropack"; then echo "nitro"; return; fi + + # Express + if has_dep_exact "express"; then echo "express"; return; fi + + # Vite (generic - check last among JS frameworks) + if has_dep_exact "vite"; then echo "vite"; return; fi + + # Parcel + if has_dep_exact "parcel"; then echo "parcel"; return; fi + + # No framework detected + echo "null" +} + +# Parse arguments +INPUT_PATH="${1:-.}" + +# Create temp directory for packaging +TEMP_DIR=$(mktemp -d) +TARBALL="$TEMP_DIR/project.tgz" +STAGING_DIR="$TEMP_DIR/staging" +CLEANUP_TEMP=true + +cleanup() { + if [ "$CLEANUP_TEMP" = true ]; then 
+ rm -rf "$TEMP_DIR" + fi +} +trap cleanup EXIT + +echo "Preparing deployment..." >&2 + +# Check if input is a .tgz file or a directory +FRAMEWORK="null" + +if [ -f "$INPUT_PATH" ] && [[ "$INPUT_PATH" == *.tgz ]]; then + # Input is already a tarball, use it directly + echo "Using provided tarball..." >&2 + TARBALL="$INPUT_PATH" + CLEANUP_TEMP=false + # Can't detect framework from tarball, leave as null +elif [ -d "$INPUT_PATH" ]; then + # Input is a directory, need to tar it + PROJECT_PATH=$(cd "$INPUT_PATH" && pwd) + + # Detect framework from package.json + FRAMEWORK=$(detect_framework "$PROJECT_PATH/package.json") + + # Stage files into a temporary directory to avoid mutating the source tree. + mkdir -p "$STAGING_DIR" + echo "Staging project files..." >&2 + tar -C "$PROJECT_PATH" \ + --exclude='node_modules' \ + --exclude='.git' \ + --exclude='.env' \ + --exclude='.env.*' \ + -cf - . | tar -C "$STAGING_DIR" -xf - + + # Check if this is a static HTML project (no package.json) + if [ ! -f "$PROJECT_PATH/package.json" ]; then + # Find HTML files in root + HTML_FILES=$(find "$STAGING_DIR" -maxdepth 1 -name "*.html" -type f) + HTML_COUNT=$(printf '%s\n' "$HTML_FILES" | sed '/^$/d' | wc -l | tr -d '[:space:]') + + # If there's exactly one HTML file and it's not index.html, rename it + if [ "$HTML_COUNT" -eq 1 ]; then + HTML_FILE=$(echo "$HTML_FILES" | head -1) + BASENAME=$(basename "$HTML_FILE") + if [ "$BASENAME" != "index.html" ]; then + echo "Renaming $BASENAME to index.html..." >&2 + mv "$HTML_FILE" "$STAGING_DIR/index.html" + fi + fi + fi + + # Create tarball from the staging directory + echo "Creating deployment package..." >&2 + tar -czf "$TARBALL" -C "$STAGING_DIR" . +else + echo "Error: Input must be a directory or a .tgz file" >&2 + exit 1 +fi + +if [ "$FRAMEWORK" != "null" ]; then + echo "Detected framework: $FRAMEWORK" >&2 +fi + +# Deploy +echo "Deploying..." 
>&2 +RESPONSE=$(curl -s -X POST "$DEPLOY_ENDPOINT" -F "file=@$TARBALL" -F "framework=$FRAMEWORK") + +# Check for error in response +if echo "$RESPONSE" | grep -q '"error"'; then + ERROR_MSG=$(echo "$RESPONSE" | grep -o '"error":"[^"]*"' | cut -d'"' -f4) + echo "Error: $ERROR_MSG" >&2 + exit 1 +fi + +# Extract URLs from response +PREVIEW_URL=$(echo "$RESPONSE" | grep -o '"previewUrl":"[^"]*"' | cut -d'"' -f4) +CLAIM_URL=$(echo "$RESPONSE" | grep -o '"claimUrl":"[^"]*"' | cut -d'"' -f4) + +if [ -z "$PREVIEW_URL" ]; then + echo "Error: Could not extract preview URL from response" >&2 + echo "$RESPONSE" >&2 + exit 1 +fi + +echo "Deployment started. Waiting for build to complete..." >&2 +echo "Preview URL: $PREVIEW_URL" >&2 + +# Poll the preview URL until it returns a non-5xx response (5xx = still building) +MAX_ATTEMPTS=60 # 5 minutes max (60 * 5 seconds) +ATTEMPT=0 + +while [ $ATTEMPT -lt $MAX_ATTEMPTS ]; do + HTTP_STATUS=$(curl -s -o /dev/null -w "%{http_code}" "$PREVIEW_URL") + + if [ "$HTTP_STATUS" -eq 200 ]; then + echo "" >&2 + echo "Deployment ready!" >&2 + break + elif [ "$HTTP_STATUS" -ge 500 ]; then + # 5xx means still building/deploying + echo "Building... (attempt $((ATTEMPT + 1))/$MAX_ATTEMPTS)" >&2 + sleep 5 + ATTEMPT=$((ATTEMPT + 1)) + elif [ "$HTTP_STATUS" -ge 400 ] && [ "$HTTP_STATUS" -lt 500 ]; then + # 4xx might be an error or the app itself returns 4xx - it's responding + echo "" >&2 + echo "Deployment ready (returned $HTTP_STATUS)!" >&2 + break + else + # Any other status, assume it's ready + echo "" >&2 + echo "Deployment ready!" >&2 + break + fi +done + +if [ $ATTEMPT -eq $MAX_ATTEMPTS ]; then + echo "" >&2 + echo "Warning: Timed out waiting for deployment, but it may still be building." 
>&2 +fi + +echo "" >&2 +echo "Preview URL: $PREVIEW_URL" >&2 +echo "Claim URL: $CLAIM_URL" >&2 +echo "" >&2 + +# Output JSON for programmatic use +echo "$RESPONSE" diff --git a/.agent/skills/deploy-to-vercel/resources/deploy.sh b/.agent/skills/deploy-to-vercel/resources/deploy.sh new file mode 100644 index 00000000..a458da63 --- /dev/null +++ b/.agent/skills/deploy-to-vercel/resources/deploy.sh @@ -0,0 +1,301 @@ +#!/bin/bash + +# Vercel Deployment Script (via claimable deploy endpoint) +# Usage: ./deploy.sh [project-path] +# Returns: JSON with previewUrl, claimUrl, deploymentId, projectId + +set -euo pipefail + +DEPLOY_ENDPOINT="https://claude-skills-deploy.vercel.com/api/deploy" + +# Detect framework from package.json +detect_framework() { + local pkg_json="$1" + + if [ ! -f "$pkg_json" ]; then + echo "null" + return + fi + + local content=$(cat "$pkg_json") + + # Helper to check if a package exists in dependencies or devDependencies. + # Use exact matching by default, with a separate prefix matcher for scoped + # package families like "@remix-run/". 
+ has_dep_exact() { + echo "$content" | grep -q "\"$1\"" + } + + has_dep_prefix() { + echo "$content" | grep -q "\"$1" + } + + # Order matters - check more specific frameworks first + + # Blitz + if has_dep_exact "blitz"; then echo "blitzjs"; return; fi + + # Next.js + if has_dep_exact "next"; then echo "nextjs"; return; fi + + # Gatsby + if has_dep_exact "gatsby"; then echo "gatsby"; return; fi + + # Remix + if has_dep_prefix "@remix-run/"; then echo "remix"; return; fi + + # React Router (v7 framework mode) + if has_dep_prefix "@react-router/"; then echo "react-router"; return; fi + + # TanStack Start + if has_dep_exact "@tanstack/start"; then echo "tanstack-start"; return; fi + + # Astro + if has_dep_exact "astro"; then echo "astro"; return; fi + + # Hydrogen (Shopify) + if has_dep_exact "@shopify/hydrogen"; then echo "hydrogen"; return; fi + + # SvelteKit + if has_dep_exact "@sveltejs/kit"; then echo "sveltekit-1"; return; fi + + # Svelte (standalone) + if has_dep_exact "svelte"; then echo "svelte"; return; fi + + # Nuxt + if has_dep_exact "nuxt"; then echo "nuxtjs"; return; fi + + # Vue with Vitepress + if has_dep_exact "vitepress"; then echo "vitepress"; return; fi + + # Vue with Vuepress + if has_dep_exact "vuepress"; then echo "vuepress"; return; fi + + # Gridsome + if has_dep_exact "gridsome"; then echo "gridsome"; return; fi + + # SolidStart + if has_dep_exact "@solidjs/start"; then echo "solidstart-1"; return; fi + + # Docusaurus + if has_dep_exact "@docusaurus/core"; then echo "docusaurus-2"; return; fi + + # RedwoodJS + if has_dep_prefix "@redwoodjs/"; then echo "redwoodjs"; return; fi + + # Hexo + if has_dep_exact "hexo"; then echo "hexo"; return; fi + + # Eleventy + if has_dep_exact "@11ty/eleventy"; then echo "eleventy"; return; fi + + # Angular / Ionic Angular + if has_dep_exact "@ionic/angular"; then echo "ionic-angular"; return; fi + if has_dep_exact "@angular/core"; then echo "angular"; return; fi + + # Ionic React + if has_dep_exact 
"@ionic/react"; then echo "ionic-react"; return; fi + + # Create React App + if has_dep_exact "react-scripts"; then echo "create-react-app"; return; fi + + # Ember + if has_dep_exact "ember-cli" || has_dep_exact "ember-source"; then echo "ember"; return; fi + + # Dojo + if has_dep_exact "@dojo/framework"; then echo "dojo"; return; fi + + # Polymer + if has_dep_prefix "@polymer/"; then echo "polymer"; return; fi + + # Preact + if has_dep_exact "preact"; then echo "preact"; return; fi + + # Stencil + if has_dep_exact "@stencil/core"; then echo "stencil"; return; fi + + # UmiJS + if has_dep_exact "umi"; then echo "umijs"; return; fi + + # Sapper (legacy Svelte) + if has_dep_exact "sapper"; then echo "sapper"; return; fi + + # Saber + if has_dep_exact "saber"; then echo "saber"; return; fi + + # Sanity + if has_dep_exact "sanity"; then echo "sanity-v3"; return; fi + if has_dep_prefix "@sanity/"; then echo "sanity"; return; fi + + # Storybook + if has_dep_prefix "@storybook/"; then echo "storybook"; return; fi + + # NestJS + if has_dep_exact "@nestjs/core"; then echo "nestjs"; return; fi + + # Elysia + if has_dep_exact "elysia"; then echo "elysia"; return; fi + + # Hono + if has_dep_exact "hono"; then echo "hono"; return; fi + + # Fastify + if has_dep_exact "fastify"; then echo "fastify"; return; fi + + # h3 + if has_dep_exact "h3"; then echo "h3"; return; fi + + # Nitro + if has_dep_exact "nitropack"; then echo "nitro"; return; fi + + # Express + if has_dep_exact "express"; then echo "express"; return; fi + + # Vite (generic - check last among JS frameworks) + if has_dep_exact "vite"; then echo "vite"; return; fi + + # Parcel + if has_dep_exact "parcel"; then echo "parcel"; return; fi + + # No framework detected + echo "null" +} + +# Parse arguments +INPUT_PATH="${1:-.}" + +# Create temp directory for packaging +TEMP_DIR=$(mktemp -d) +TARBALL="$TEMP_DIR/project.tgz" +STAGING_DIR="$TEMP_DIR/staging" +CLEANUP_TEMP=true + +cleanup() { + if [ "$CLEANUP_TEMP" = true ]; then 
+ rm -rf "$TEMP_DIR" + fi +} +trap cleanup EXIT + +echo "Preparing deployment..." >&2 + +# Check if input is a .tgz file or a directory +FRAMEWORK="null" + +if [ -f "$INPUT_PATH" ] && [[ "$INPUT_PATH" == *.tgz ]]; then + # Input is already a tarball, use it directly + echo "Using provided tarball..." >&2 + TARBALL="$INPUT_PATH" + CLEANUP_TEMP=false + # Can't detect framework from tarball, leave as null +elif [ -d "$INPUT_PATH" ]; then + # Input is a directory, need to tar it + PROJECT_PATH=$(cd "$INPUT_PATH" && pwd) + + # Detect framework from package.json + FRAMEWORK=$(detect_framework "$PROJECT_PATH/package.json") + + # Stage files into a temporary directory to avoid mutating the source tree. + mkdir -p "$STAGING_DIR" + echo "Staging project files..." >&2 + tar -C "$PROJECT_PATH" \ + --exclude='node_modules' \ + --exclude='.git' \ + --exclude='.env' \ + --exclude='.env.*' \ + -cf - . | tar -C "$STAGING_DIR" -xf - + + # Check if this is a static HTML project (no package.json) + if [ ! -f "$PROJECT_PATH/package.json" ]; then + # Find HTML files in root + HTML_FILES=$(find "$STAGING_DIR" -maxdepth 1 -name "*.html" -type f) + HTML_COUNT=$(printf '%s\n' "$HTML_FILES" | sed '/^$/d' | wc -l | tr -d '[:space:]') + + # If there's exactly one HTML file and it's not index.html, rename it + if [ "$HTML_COUNT" -eq 1 ]; then + HTML_FILE=$(echo "$HTML_FILES" | head -1) + BASENAME=$(basename "$HTML_FILE") + if [ "$BASENAME" != "index.html" ]; then + echo "Renaming $BASENAME to index.html..." >&2 + mv "$HTML_FILE" "$STAGING_DIR/index.html" + fi + fi + fi + + # Create tarball from the staging directory + echo "Creating deployment package..." >&2 + tar -czf "$TARBALL" -C "$STAGING_DIR" . +else + echo "Error: Input must be a directory or a .tgz file" >&2 + exit 1 +fi + +if [ "$FRAMEWORK" != "null" ]; then + echo "Detected framework: $FRAMEWORK" >&2 +fi + +# Deploy +echo "Deploying..." 
>&2 +RESPONSE=$(curl -s -X POST "$DEPLOY_ENDPOINT" -F "file=@$TARBALL" -F "framework=$FRAMEWORK") + +# Check for error in response +if echo "$RESPONSE" | grep -q '"error"'; then + ERROR_MSG=$(echo "$RESPONSE" | grep -o '"error":"[^"]*"' | cut -d'"' -f4) + echo "Error: $ERROR_MSG" >&2 + exit 1 +fi + +# Extract URLs from response +PREVIEW_URL=$(echo "$RESPONSE" | grep -o '"previewUrl":"[^"]*"' | cut -d'"' -f4) +CLAIM_URL=$(echo "$RESPONSE" | grep -o '"claimUrl":"[^"]*"' | cut -d'"' -f4) + +if [ -z "$PREVIEW_URL" ]; then + echo "Error: Could not extract preview URL from response" >&2 + echo "$RESPONSE" >&2 + exit 1 +fi + +echo "Deployment started. Waiting for build to complete..." >&2 +echo "Preview URL: $PREVIEW_URL" >&2 + +# Poll the preview URL until it returns a non-5xx response (5xx = still building) +MAX_ATTEMPTS=60 # 5 minutes max (60 * 5 seconds) +ATTEMPT=0 + +while [ $ATTEMPT -lt $MAX_ATTEMPTS ]; do + HTTP_STATUS=$(curl -s -o /dev/null -w "%{http_code}" "$PREVIEW_URL") + + if [ "$HTTP_STATUS" -eq 200 ]; then + echo "" >&2 + echo "Deployment ready!" >&2 + break + elif [ "$HTTP_STATUS" -ge 500 ]; then + # 5xx means still building/deploying + echo "Building... (attempt $((ATTEMPT + 1))/$MAX_ATTEMPTS)" >&2 + sleep 5 + ATTEMPT=$((ATTEMPT + 1)) + elif [ "$HTTP_STATUS" -ge 400 ] && [ "$HTTP_STATUS" -lt 500 ]; then + # 4xx might be an error or the app itself returns 4xx - it's responding + echo "" >&2 + echo "Deployment ready (returned $HTTP_STATUS)!" >&2 + break + else + # Any other status, assume it's ready + echo "" >&2 + echo "Deployment ready!" >&2 + break + fi +done + +if [ $ATTEMPT -eq $MAX_ATTEMPTS ]; then + echo "" >&2 + echo "Warning: Timed out waiting for deployment, but it may still be building." 
>&2 +fi + +echo "" >&2 +echo "Preview URL: $PREVIEW_URL" >&2 +echo "Claim URL: $CLAIM_URL" >&2 +echo "" >&2 + +# Output JSON for programmatic use +echo "$RESPONSE" diff --git a/.agent/skills/next-best-practices/SKILL.md b/.agent/skills/next-best-practices/SKILL.md new file mode 100644 index 00000000..437896b4 --- /dev/null +++ b/.agent/skills/next-best-practices/SKILL.md @@ -0,0 +1,153 @@ +--- +name: next-best-practices +description: Next.js best practices - file conventions, RSC boundaries, data patterns, async APIs, metadata, error handling, route handlers, image/font optimization, bundling +user-invocable: false +--- + +# Next.js Best Practices + +Apply these rules when writing or reviewing Next.js code. + +## File Conventions + +See [file-conventions.md](./file-conventions.md) for: +- Project structure and special files +- Route segments (dynamic, catch-all, groups) +- Parallel and intercepting routes +- Middleware rename in v16 (middleware → proxy) + +## RSC Boundaries + +Detect invalid React Server Component patterns. + +See [rsc-boundaries.md](./rsc-boundaries.md) for: +- Async client component detection (invalid) +- Non-serializable props detection +- Server Action exceptions + +## Async Patterns + +Next.js 15+ async API changes. 
+ +See [async-patterns.md](./async-patterns.md) for: +- Async `params` and `searchParams` +- Async `cookies()` and `headers()` +- Migration codemod + +## Runtime Selection + +See [runtime-selection.md](./runtime-selection.md) for: +- Default to Node.js runtime +- When Edge runtime is appropriate + +## Directives + +See [directives.md](./directives.md) for: +- `'use client'`, `'use server'` (React) +- `'use cache'` (Next.js) + +## Functions + +See [functions.md](./functions.md) for: +- Navigation hooks: `useRouter`, `usePathname`, `useSearchParams`, `useParams` +- Server functions: `cookies`, `headers`, `draftMode`, `after` +- Generate functions: `generateStaticParams`, `generateMetadata` + +## Error Handling + +See [error-handling.md](./error-handling.md) for: +- `error.tsx`, `global-error.tsx`, `not-found.tsx` +- `redirect`, `permanentRedirect`, `notFound` +- `forbidden`, `unauthorized` (auth errors) +- `unstable_rethrow` for catch blocks + +## Data Patterns + +See [data-patterns.md](./data-patterns.md) for: +- Server Components vs Server Actions vs Route Handlers +- Avoiding data waterfalls (`Promise.all`, Suspense, preload) +- Client component data fetching + +## Route Handlers + +See [route-handlers.md](./route-handlers.md) for: +- `route.ts` basics +- GET handler conflicts with `page.tsx` +- Environment behavior (no React DOM) +- When to use vs Server Actions + +## Metadata & OG Images + +See [metadata.md](./metadata.md) for: +- Static and dynamic metadata +- `generateMetadata` function +- OG image generation with `next/og` +- File-based metadata conventions + +## Image Optimization + +See [image.md](./image.md) for: +- Always use `next/image` over `<img>` +- Remote images configuration +- Responsive `sizes` attribute +- Blur placeholders +- Priority loading for LCP + +## Font Optimization + +See [font.md](./font.md) for: +- `next/font` setup +- Google Fonts, local fonts +- Tailwind CSS integration +- Preloading subsets + +## Bundling + +See
[bundling.md](./bundling.md) for: +- Server-incompatible packages +- CSS imports (not link tags) +- Polyfills (already included) +- ESM/CommonJS issues +- Bundle analysis + +## Scripts + +See [scripts.md](./scripts.md) for: +- `next/script` vs native script tags +- Inline scripts need `id` +- Loading strategies +- Google Analytics with `@next/third-parties` + +## Hydration Errors + +See [hydration-error.md](./hydration-error.md) for: +- Common causes (browser APIs, dates, invalid HTML) +- Debugging with error overlay +- Fixes for each cause + +## Suspense Boundaries + +See [suspense-boundaries.md](./suspense-boundaries.md) for: +- CSR bailout with `useSearchParams` and `usePathname` +- Which hooks require Suspense boundaries + +## Parallel & Intercepting Routes + +See [parallel-routes.md](./parallel-routes.md) for: +- Modal patterns with `@slot` and `(.)` interceptors +- `default.tsx` for fallbacks +- Closing modals correctly with `router.back()` + +## Self-Hosting + +See [self-hosting.md](./self-hosting.md) for: +- `output: 'standalone'` for Docker +- Cache handlers for multi-instance ISR +- What works vs needs extra setup + +## Debug Tricks + +See [debug-tricks.md](./debug-tricks.md) for: +- MCP endpoint for AI-assisted debugging +- Rebuild specific routes with `--debug-build-paths` + diff --git a/.agent/skills/next-best-practices/async-patterns.md b/.agent/skills/next-best-practices/async-patterns.md new file mode 100644 index 00000000..dce8d8cc --- /dev/null +++ b/.agent/skills/next-best-practices/async-patterns.md @@ -0,0 +1,87 @@ +# Async Patterns + +In Next.js 15+, `params`, `searchParams`, `cookies()`, and `headers()` are asynchronous. + +## Async Params and SearchParams + +Always type them as `Promise<...>` and await them. 
+ +### Pages and Layouts + +```tsx +type Props = { params: Promise<{ slug: string }> } + +export default async function Page({ params }: Props) { + const { slug } = await params +} +``` + +### Route Handlers + +```tsx +export async function GET( + request: Request, + { params }: { params: Promise<{ id: string }> } +) { + const { id } = await params +} +``` + +### SearchParams + +```tsx +type Props = { + params: Promise<{ slug: string }> + searchParams: Promise<{ query?: string }> +} + +export default async function Page({ params, searchParams }: Props) { + const { slug } = await params + const { query } = await searchParams +} +``` + +### Synchronous Components + +Use `React.use()` for non-async components: + +```tsx +import { use } from 'react' + +type Props = { params: Promise<{ slug: string }> } + +export default function Page({ params }: Props) { + const { slug } = use(params) +} +``` + +### generateMetadata + +```tsx +import type { Metadata } from 'next' + +type Props = { params: Promise<{ slug: string }> } + +export async function generateMetadata({ params }: Props): Promise<Metadata> { + const { slug } = await params + return { title: slug } +} +``` + +## Async Cookies and Headers + +```tsx +import { cookies, headers } from 'next/headers' + +export default async function Page() { + const cookieStore = await cookies() + const headersList = await headers() + + const theme = cookieStore.get('theme') + const userAgent = headersList.get('user-agent') +} +``` + +## Migration Codemod + +```bash +npx @next/codemod@latest next-async-request-api . +``` diff --git a/.agent/skills/next-best-practices/bundling.md b/.agent/skills/next-best-practices/bundling.md new file mode 100644 index 00000000..ac5e814c --- /dev/null +++ b/.agent/skills/next-best-practices/bundling.md @@ -0,0 +1,180 @@ +# Bundling + +Fix common bundling issues with third-party packages. + +## Server-Incompatible Packages + +Some packages use browser APIs (`window`, `document`, `localStorage`) and fail in Server Components.
+ +### Error Signs + +``` +ReferenceError: window is not defined +ReferenceError: document is not defined +ReferenceError: localStorage is not defined +Module not found: Can't resolve 'fs' +``` + +### Solution 1: Mark as Client-Only + +If the package is only needed on the client: + +```tsx +// Bad: Fails - package uses window +import SomeChart from 'some-chart-library' + +export default function Page() { + return <SomeChart /> +} + +// Good: Use dynamic import with ssr: false +import dynamic from 'next/dynamic' + +const SomeChart = dynamic(() => import('some-chart-library'), { + ssr: false, +}) + +export default function Page() { + return <SomeChart /> +} +``` + +### Solution 2: Externalize from Server Bundle + +For packages that should run on the server but have bundling issues: + +```js +// next.config.js +module.exports = { + serverExternalPackages: ['problematic-package'], +} +``` + +Use this for: +- Packages with native bindings (sharp, bcrypt) +- Packages that don't bundle well (some ORMs) +- Packages with circular dependencies + +### Solution 3: Client Component Wrapper + +Wrap the entire usage in a client component: + +```tsx +// components/ChartWrapper.tsx +'use client' + +import { Chart } from 'chart-library' + +export function ChartWrapper(props) { + return <Chart {...props} /> +} + +// app/page.tsx (server component) +import { ChartWrapper } from '@/components/ChartWrapper' + +export default function Page() { + return <ChartWrapper /> +} +``` + +## CSS Imports + +Import CSS files instead of using `<link>` tags. Next.js handles bundling and optimization. + +```tsx +// Bad: Manual link tag +<link rel="stylesheet" href="./styles.css" /> + +// Good: Import CSS +import './styles.css' + +// Good: CSS Modules +import styles from './Button.module.css' +``` + +## Polyfills + +Next.js includes common polyfills automatically. Don't load redundant ones from polyfill.io or similar CDNs. + +Already included: `Array.from`, `Object.assign`, `Promise`, `fetch`, `Map`, `Set`, `Symbol`, `URLSearchParams`, and 50+ others.
+ +```tsx +// Bad: Redundant polyfills loaded from a CDN +<script src="https://polyfill.io/v3/polyfill.min.js"></script> + +// Good: if an external script is genuinely needed, use the Next.js Script component +import Script from 'next/script' + +<Script src="https://example.com/script.js" /> +``` + +## Don't Put Script in Head + +`next/script` should not be placed inside `next/head`. It handles its own positioning. + +```tsx +// Bad: Script inside Head +import Head from 'next/head' +import Script from 'next/script' + +<Head> + <Script src="https://example.com/script.js" /> +</Head> + +// Good: Next.js component +import { GoogleAnalytics } from '@next/third-parties/google' + +export default function Layout({ children }) { + return ( + <html> + <body> + {children} + <GoogleAnalytics gaId="GA_MEASUREMENT_ID" /> + </body> + </html> + ) +} +``` + +## Google Tag Manager + +```tsx +import { GoogleTagManager } from '@next/third-parties/google' + +export default function Layout({ children }) { + return ( + <html> + <GoogleTagManager gtmId="GTM-XXXXXXX" /> + <body>{children}</body> + </html> + ) +} +``` + +## Other Third-Party Scripts + +```tsx +// YouTube embed +import { YouTubeEmbed } from '@next/third-parties/google' + +<YouTubeEmbed videoid="VIDEO_ID" height={400} /> + +// Google Maps +import { GoogleMapsEmbed } from '@next/third-parties/google' + +<GoogleMapsEmbed apiKey="API_KEY" mode="place" q="Brooklyn+Bridge" height={200} width="100%" /> +``` + +## Quick Reference + +| Pattern | Issue | Fix | +|---------|-------|-----| +| `<script>` tag in JSX | Not optimized by Next.js | Use `next/script` | +| `<Script>` inside `next/head` | Breaks script positioning | Place it in the page or layout body | +| Polyfills from polyfill.io | Redundant, already included | Remove them | +| Hand-rolled GA/GTM snippets | Manual setup, easy to misplace | Use `@next/third-parties/google` |
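The bundling topics above include bundle analysis, but no setup example survives in this chunk. A minimal sketch of one common setup with `@next/bundle-analyzer` (assumed to be installed as a dev dependency; the option and variable names below follow that package's documented usage, not this diff):

```javascript
// next.config.js
// Wrap the Next.js config so analysis only runs when ANALYZE=true is set.
const withBundleAnalyzer = require('@next/bundle-analyzer')({
  enabled: process.env.ANALYZE === 'true',
})

module.exports = withBundleAnalyzer({
  // ...existing Next.js config options
})
```

Running `ANALYZE=true next build` then produces interactive treemap reports for the client and server bundles, which helps spot the oversized third-party packages discussed above.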