From f494ffb2d17a3f9c26e6b6604e84e42840febdc5 Mon Sep 17 00:00:00 2001
From: Cenk Tekin
Date: Sat, 28 Mar 2026 15:58:00 +0300
Subject: [PATCH 1/2] docs: add OpenAI-compatible embedding provider
 documentation

Covers provider selection, Gemini free tier setup, OpenAI, and other
compatible APIs (Groq, vLLM, LiteLLM). Updates Config table with new
environment variables and aliases.
---
 README.md | 77 +++++++++++++++++++++++++++++++++++++++++++++++++++----
 1 file changed, 72 insertions(+), 5 deletions(-)

diff --git a/README.md b/README.md
index a00a5ad..e683714 100644
--- a/README.md
+++ b/README.md
@@ -132,6 +132,68 @@
 npm install
 npm run build
 ```
+## Embedding Providers
+
+Context+ supports two embedding backends controlled by `CONTEXTPLUS_EMBED_PROVIDER`:
+
+| Provider | Value | Requires | Best For |
+|----------|-------|----------|----------|
+| **Ollama** (default) | `ollama` | Local Ollama server | Free, offline, private |
+| **OpenAI-compatible** | `openai` | API key | Gemini (free tier), OpenAI, Groq, vLLM |
+
+### Ollama (Default)
+
+No extra configuration needed. Just run Ollama with an embedding model:
+
+```bash
+ollama pull nomic-embed-text
+ollama serve
+```
+
+### Google Gemini (Free Tier)
+
+```json
+{
+  "env": {
+    "CONTEXTPLUS_EMBED_PROVIDER": "openai",
+    "CONTEXTPLUS_OPENAI_API_KEY": "YOUR_GEMINI_API_KEY",
+    "CONTEXTPLUS_OPENAI_BASE_URL": "https://generativelanguage.googleapis.com/v1beta/openai",
+    "CONTEXTPLUS_OPENAI_EMBED_MODEL": "text-embedding-004"
+  }
+}
+```
+
+Get a free API key at [Google AI Studio](https://aistudio.google.com/apikey).
+
+### OpenAI
+
+```json
+{
+  "env": {
+    "CONTEXTPLUS_EMBED_PROVIDER": "openai",
+    "OPENAI_API_KEY": "sk-...",
+    "OPENAI_EMBED_MODEL": "text-embedding-3-small"
+  }
+}
+```
+
+### Other OpenAI-compatible APIs (Groq, vLLM, LiteLLM)
+
+Any endpoint implementing the [OpenAI Embeddings API](https://platform.openai.com/docs/api-reference/embeddings) works:
+
+```json
+{
+  "env": {
+    "CONTEXTPLUS_EMBED_PROVIDER": "openai",
+    "CONTEXTPLUS_OPENAI_API_KEY": "YOUR_KEY",
+    "CONTEXTPLUS_OPENAI_BASE_URL": "https://your-proxy.example.com/v1",
+    "CONTEXTPLUS_OPENAI_EMBED_MODEL": "your-model-name"
+  }
+}
+```
+
+> **Note:** The `semantic_navigate` tool also uses a chat model for cluster labeling. When using the `openai` provider, set `CONTEXTPLUS_OPENAI_CHAT_MODEL` (default: `gpt-4o-mini`).
+
 ## Architecture

 Three layers built with TypeScript over stdio using the Model Context Protocol SDK:
@@ -146,11 +208,16 @@ Three layers built with TypeScript over stdio using the Model Context Protocol S

 ## Config

-| Variable | Type | Default | Description |
-| --------------------------------------- | ------------------------- | ------------------ | ------------------------------------------------------------- |
-| `OLLAMA_EMBED_MODEL` | string | `nomic-embed-text` | Embedding model |
-| `OLLAMA_API_KEY` | string | - | Ollama Cloud API key |
-| `OLLAMA_CHAT_MODEL` | string | `llama3.2` | Chat model for cluster labeling |
+| Variable | Type | Default | Description |
+| --------------------------------------- | ------------------------- | -------------------------------------- | ------------------------------------------------------------- |
+| `CONTEXTPLUS_EMBED_PROVIDER` | string | `ollama` | Embedding backend: `ollama` or `openai` |
+| `OLLAMA_EMBED_MODEL` | string | `nomic-embed-text` | Ollama embedding model |
+| `OLLAMA_API_KEY` | string | - | Ollama Cloud API key |
+| `OLLAMA_CHAT_MODEL` | string | `llama3.2` | Ollama chat model for cluster labeling |
+| `CONTEXTPLUS_OPENAI_API_KEY` | string | - | API key for OpenAI-compatible provider (alias: `OPENAI_API_KEY`) |
+| `CONTEXTPLUS_OPENAI_BASE_URL` | string | `https://api.openai.com/v1` | OpenAI-compatible endpoint URL (alias: `OPENAI_BASE_URL`) |
+| `CONTEXTPLUS_OPENAI_EMBED_MODEL` | string | `text-embedding-3-small` | OpenAI-compatible embedding model (alias: `OPENAI_EMBED_MODEL`) |
+| `CONTEXTPLUS_OPENAI_CHAT_MODEL` | string | `gpt-4o-mini` | OpenAI-compatible chat model for labeling (alias: `OPENAI_CHAT_MODEL`) |
 | `CONTEXTPLUS_EMBED_BATCH_SIZE` | string (parsed as number) | `8` | Embedding batch size per GPU call, clamped to 5-10 |
 | `CONTEXTPLUS_EMBED_CHUNK_CHARS` | string (parsed as number) | `2000` | Per-chunk chars before merge, clamped to 256-8000 |
 | `CONTEXTPLUS_MAX_EMBED_FILE_SIZE` | string (parsed as number) | `51200` | Skip non-code text files larger than this many bytes |

From 78446bce7d4148b469777076310ab1ad31e2c0b9 Mon Sep 17 00:00:00 2001
From: Cenk Tekin
Date: Sat, 28 Mar 2026 16:06:25 +0300
Subject: [PATCH 2/2] docs: use full MCP config examples instead of env-only
 fragments

Addresses Copilot review feedback - JSON snippets now show complete
mcpServers structure for Claude Code, with a note about reusing the env
block in other IDE configs.
---
 README.md | 50 ++++++++++++++++++++++++++++++++++++--------------
 1 file changed, 36 insertions(+), 14 deletions(-)

diff --git a/README.md b/README.md
index e683714..edf8525 100644
--- a/README.md
+++ b/README.md
@@ -152,13 +152,21 @@ ollama serve

 ### Google Gemini (Free Tier)

+Full Claude Code `.mcp.json` example:
+
 ```json
 {
-  "env": {
-    "CONTEXTPLUS_EMBED_PROVIDER": "openai",
-    "CONTEXTPLUS_OPENAI_API_KEY": "YOUR_GEMINI_API_KEY",
-    "CONTEXTPLUS_OPENAI_BASE_URL": "https://generativelanguage.googleapis.com/v1beta/openai",
-    "CONTEXTPLUS_OPENAI_EMBED_MODEL": "text-embedding-004"
+  "mcpServers": {
+    "contextplus": {
+      "command": "npx",
+      "args": ["-y", "contextplus"],
+      "env": {
+        "CONTEXTPLUS_EMBED_PROVIDER": "openai",
+        "CONTEXTPLUS_OPENAI_API_KEY": "YOUR_GEMINI_API_KEY",
+        "CONTEXTPLUS_OPENAI_BASE_URL": "https://generativelanguage.googleapis.com/v1beta/openai",
+        "CONTEXTPLUS_OPENAI_EMBED_MODEL": "text-embedding-004"
+      }
+    }
   }
 }
 ```
@@ -169,10 +177,16 @@ Get a free API key at [Google AI Studio](https://aistudio.google.com/apikey).

 ```json
 {
-  "env": {
-    "CONTEXTPLUS_EMBED_PROVIDER": "openai",
-    "OPENAI_API_KEY": "sk-...",
-    "OPENAI_EMBED_MODEL": "text-embedding-3-small"
+  "mcpServers": {
+    "contextplus": {
+      "command": "npx",
+      "args": ["-y", "contextplus"],
+      "env": {
+        "CONTEXTPLUS_EMBED_PROVIDER": "openai",
+        "OPENAI_API_KEY": "sk-...",
+        "OPENAI_EMBED_MODEL": "text-embedding-3-small"
+      }
+    }
   }
 }
 ```
@@ -183,16 +197,24 @@ Any endpoint implementing the [OpenAI Embeddings API](https://platform.openai.co

 ```json
 {
-  "env": {
-    "CONTEXTPLUS_EMBED_PROVIDER": "openai",
-    "CONTEXTPLUS_OPENAI_API_KEY": "YOUR_KEY",
-    "CONTEXTPLUS_OPENAI_BASE_URL": "https://your-proxy.example.com/v1",
-    "CONTEXTPLUS_OPENAI_EMBED_MODEL": "your-model-name"
+  "mcpServers": {
+    "contextplus": {
+      "command": "npx",
+      "args": ["-y", "contextplus"],
+      "env": {
+        "CONTEXTPLUS_EMBED_PROVIDER": "openai",
+        "CONTEXTPLUS_OPENAI_API_KEY": "YOUR_KEY",
+        "CONTEXTPLUS_OPENAI_BASE_URL": "https://your-proxy.example.com/v1",
+        "CONTEXTPLUS_OPENAI_EMBED_MODEL": "your-model-name"
+      }
+    }
   }
 }
 ```

 > **Note:** The `semantic_navigate` tool also uses a chat model for cluster labeling. When using the `openai` provider, set `CONTEXTPLUS_OPENAI_CHAT_MODEL` (default: `gpt-4o-mini`).
+>
+> For VS Code, Cursor, or OpenCode, use the same `env` block inside your IDE's MCP config format (see [Config file locations](#setup) table above).

 ## Architecture
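
The "Other OpenAI-compatible APIs" docs above only show configuration; a one-off request can confirm an endpoint actually implements the OpenAI Embeddings API before `CONTEXTPLUS_OPENAI_BASE_URL` is pointed at it. A minimal sketch, reusing the placeholder base URL and model name from the example (substitute real values):

```shell
# Smoke-test an OpenAI-compatible /embeddings endpoint.
# BASE_URL and MODEL below are placeholders, not real services.
BASE_URL="https://your-proxy.example.com/v1"
MODEL="your-model-name"

# Same request shape Context+ would send for an embedding call.
curl -s "$BASE_URL/embeddings" \
  -H "Authorization: Bearer $CONTEXTPLUS_OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d "{\"model\": \"$MODEL\", \"input\": \"hello world\"}"

# A compliant endpoint returns JSON of the standard form:
# {"object": "list", "data": [{"object": "embedding", "index": 0, "embedding": [...]}], ...}
```

If the response lacks a `data[0].embedding` array, the endpoint is not Embeddings-API compatible and the `openai` provider will not work against it.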