Lift local document folders into a cloud-hosted MCP server. Loft syncs your files (Markdown, PDF, text) to a Turso database with vector embeddings, then serves them as an MCP endpoint that Claude, Cursor, and other AI tools can search.
Local files → CLI (chunk + embed) → Turso DB → Vercel MCP server → Claude / Cursor
Follow these steps in order. The whole process takes about 10 minutes.
You need the following installed on your machine:
- **Bun** — runtime and package manager

  ```bash
  curl -fsSL https://bun.sh/install | bash
  ```

- **Poppler** — for PDF text extraction (skip if you don't need PDF support)

  ```bash
  brew install poppler
  ```
Set up a Turso database:

- Sign up at turso.tech (free tier: 9 GB storage, 500 databases)
- Install the Turso CLI and log in:

  ```bash
  brew install tursodatabase/tap/turso
  turso auth login
  ```

- Create a database:

  ```bash
  turso db create loft
  ```

- Get your database URL:

  ```bash
  turso db show loft --url
  ```

  This will print something like `libsql://loft-yourname.turso.io` — save it.
- Create an auth token, and save the token that's printed:

  ```bash
  turso db tokens create loft
  ```
Create an OpenAI API key:

- Go to platform.openai.com/api-keys
- Create a new API key
- Save the key (it starts with `sk-`)

This key is used for generating text embeddings (`text-embedding-3-small`). Cost is ~$0.02 per 1M tokens.
An Anthropic key is only needed if you want Vision OCR for images embedded in PDFs.

- Go to console.anthropic.com/settings/keys
- Create a new API key
- Save the key (it starts with `sk-ant-`)
```bash
git clone <repo-url> && cd loft
bun install

# Link the CLI globally so `loft` is available everywhere
cd packages/cli && bun link
```

Then run setup:

```bash
loft setup
```

This saves your credentials to `~/.loft/config.json` and creates the database tables. You'll be prompted for:

- **Turso database URL** — the `libsql://...` URL from Step 2
- **Turso auth token** — the token from Step 2
- **OpenAI API key** — from Step 3
- **Anthropic API key** — from Step 4 (press Enter to skip)
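If setup seems to have gone wrong, it can help to sanity-check the written config file. A minimal TypeScript sketch of such a check — the field names below are assumptions (inspect your own `~/.loft/config.json` for the real ones); only `vercelUrl` is confirmed by the deployment instructions:

```typescript
// Hypothetical shape of ~/.loft/config.json; key names are assumptions,
// except "vercelUrl", which the Vercel deployment step references.
interface LoftConfig {
  tursoUrl: string;         // libsql://... URL from `turso db show`
  tursoAuthToken: string;
  openaiApiKey: string;
  anthropicApiKey?: string; // optional — only needed for Vision OCR
  vercelUrl?: string;
}

// Return a list of problems so a script can report them all at once.
function validateConfig(cfg: Partial<LoftConfig>): string[] {
  const problems: string[] = [];
  if (!cfg.tursoUrl?.startsWith("libsql://")) {
    problems.push("tursoUrl must be a libsql:// URL");
  }
  if (!cfg.tursoAuthToken) {
    problems.push("tursoAuthToken is required");
  }
  if (!cfg.openaiApiKey?.startsWith("sk-")) {
    problems.push("openaiApiKey should start with sk-");
  }
  return problems;
}
```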
```bash
# Create a project pointing at your docs folder(s)
loft init my-project ./docs

# Sync files to the cloud (chunks + embeds)
loft sync my-project

# Generate an API key for accessing via MCP
loft key my-project
```
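Sync is incremental and hash-based: only files whose content changed since the last run are re-chunked and re-embedded. A minimal TypeScript sketch of that detection, assuming a stored content hash per file (the CLI's actual internals may differ):

```typescript
import { createHash } from "node:crypto";

// Previously synced content hashes, keyed by file path.
type HashIndex = Map<string, string>;

function sha256(content: string): string {
  return createHash("sha256").update(content).digest("hex");
}

// Decide which files need re-chunking and re-embedding:
// anything new, or anything whose hash no longer matches.
function filesToSync(
  files: Map<string, string>, // path -> current content
  index: HashIndex,
): string[] {
  const changed: string[] = [];
  for (const [path, content] of files) {
    if (index.get(path) !== sha256(content)) changed.push(path);
  }
  return changed;
}
```

Unchanged files cost nothing on re-sync, which is what keeps repeated `loft sync` runs cheap.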
- Push this repo to your own GitHub account
- Go to vercel.com/new and import the repo
- Set the Root Directory to `packages/server`
- Add these environment variables in the Vercel dashboard:

  | Variable | Value |
  |---|---|
  | `TURSO_URL` | `libsql://loft-yourname.turso.io` (from Step 2) |
  | `TURSO_AUTH_TOKEN` | Your Turso auth token (from Step 2) |
  | `OPENAI_API_KEY` | Your OpenAI API key (from Step 3) |

- Deploy. Your MCP endpoint will be at `https://<your-app>.vercel.app/mcp`.
So that `loft key` can generate complete connection strings, add your Vercel URL to the config:

```bash
# Edit ~/.loft/config.json and add:
#   "vercelUrl": "your-app.vercel.app"
```

For clients that accept remote MCP servers, add the server URL directly:

```
https://your-app.vercel.app/mcp?key=lft_xxx
```
For Claude Desktop, add to your config file (`~/Library/Application Support/Claude/claude_desktop_config.json`):

```json
{
  "mcpServers": {
    "my-project": {
      "command": "npx",
      "args": ["-y", "loft-proxy"],
      "env": {
        "LOFT_URL": "https://your-app.vercel.app/mcp?key=lft_xxx"
      }
    }
  }
}
```

The `loft key` command prints both connection formats when you generate a key.
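Whichever connection format you use, the client talks to the endpoint over MCP's JSON-RPC protocol. A sketch of the `tools/call` request a client would send to the `discover` tool (the method and params shape follow the MCP spec; the `query` argument is an assumption about this server's tool schema):

```typescript
// Build an MCP tools/call request (JSON-RPC 2.0, per the MCP spec).
function buildToolCall(
  id: number,
  tool: string,
  args: Record<string, unknown>,
) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "tools/call",
    params: { name: tool, arguments: args },
  };
}

const req = buildToolCall(1, "discover", { query: "deployment steps" });
// A client would POST this as JSON to
// https://<your-app>.vercel.app/mcp?key=lft_xxx
```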
| Command | Description |
|---|---|
| `loft setup` | Configure Turso + OpenAI credentials and run migrations |
| `loft init <name> <folder> [folders...]` | Create a project linked to local folders |
| `loft init <name> <folder> --add` | Add folders to an existing project |
| `loft sync <name>` | Sync files to the cloud (incremental, hash-based) |
| `loft status <name>` | Show project stats and sync status |
| `loft projects` | List all projects |
| `loft key <name> [--name label]` | Generate an API key |
| `loft keys <name>` | List API keys for a project |
| `loft delete <name>` | Permanently delete a project and all its data |
The server exposes five tools to AI clients:

| Tool | Description |
|---|---|
| `retrieve` | Direct lookup for specific documents or passages |
| `discover` | Explore the knowledge base with hybrid search (vector + BM25) |
| `gather` | Deep research using multiple search strategies |
| `stats` | Get document and chunk counts |
| `expand_document` | Get the full text of a document by ID |
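Hybrid search means `discover` has to merge a vector-similarity ranking with a BM25 keyword ranking. One common way to do that is reciprocal rank fusion; a sketch of the idea (the server's actual fusion method is not documented here):

```typescript
// Reciprocal rank fusion: each doc scores 1/(k + rank) in every ranked
// list it appears in, and lists are merged by the summed score.
// k = 60 is the conventional default from the original RRF paper.
function reciprocalRankFusion(rankings: string[][], k = 60): string[] {
  const scores = new Map<string, number>();
  for (const ranking of rankings) {
    ranking.forEach((id, rank) => {
      scores.set(id, (scores.get(id) ?? 0) + 1 / (k + rank + 1));
    });
  }
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([id]) => id);
}
```

A document that ranks well in both lists beats one that ranks well in only one, which is the behavior you want from hybrid search.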
- Markdown (`.md`)
- PDF (`.pdf`) — requires Poppler for text extraction
- Plain text (`.txt`)
Loft runs almost entirely on free tiers:
- Turso: 9 GB storage, 500 databases (free)
- Vercel: serverless functions (free hobby tier)
- OpenAI: ~$0.02 per 1M tokens for embeddings (text-embedding-3-small)
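At ~$0.02 per 1M tokens, embedding cost stays negligible even for large folders. A quick back-of-the-envelope helper — the 4-characters-per-token ratio is a rough rule of thumb for English text, not an exact count:

```typescript
// Estimate embedding cost for a corpus, assuming ~4 characters per token
// (a common rough average for English) and $0.02 per 1M tokens.
function estimateEmbeddingCostUSD(totalChars: number): number {
  const tokens = totalChars / 4;
  return (tokens / 1_000_000) * 0.02;
}

// e.g. 10 MB of Markdown ≈ 2.5M tokens ≈ $0.05
```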