Upload any document and explore it as an interactive knowledge graph. Concepts are extracted via NLP co-occurrence analysis, clustered using the Louvain community detection algorithm, and ranked by betweenness centrality — then rendered as a force-directed graph you can explore and chat with.
- Ingest — upload a PDF, image (OCR), URL (scraped), or plain text
- Extract — tokenizes content, builds a co-occurrence graph of key concepts
- Analyze — runs Louvain community detection (clusters) + betweenness centrality (importance ranking)
- Visualize — renders an interactive Sigma.js force-directed graph: node size = importance, color = cluster
- Chat — LLM-powered `/insights` and `/chat` endpoints let you query the graph in natural language
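The Extract step above can be sketched roughly as follows. This is a minimal illustration in plain TypeScript, not the actual service code: the real backend builds the graph with graphology, and `cooccurrenceEdges` and the window size here are hypothetical names chosen for the example.

```typescript
// Sketch of co-occurrence extraction: every pair of tokens that appears
// within a small sliding window contributes weight to an undirected edge.
type Edge = { a: string; b: string; weight: number };

function cooccurrenceEdges(tokens: string[], window = 2): Edge[] {
  const weights = new Map<string, number>();
  for (let i = 0; i < tokens.length; i++) {
    // Pair token i with the next `window` tokens.
    for (let j = i + 1; j < Math.min(i + window + 1, tokens.length); j++) {
      if (tokens[i] === tokens[j]) continue; // no self-loops
      const [a, b] = [tokens[i], tokens[j]].sort(); // undirected: canonical order
      const key = `${a}\u0000${b}`;
      weights.set(key, (weights.get(key) ?? 0) + 1);
    }
  }
  return [...weights.entries()].map(([key, weight]) => {
    const [a, b] = key.split('\u0000');
    return { a, b, weight };
  });
}

const edges = cooccurrenceEdges(['graph', 'node', 'graph', 'edge']);
// 3 edges: graph-node (weight 2), edge-node (1), edge-graph (1)
```

The Analyze step then runs on the resulting weighted graph; with graphology that is `louvain(graph)` for communities and `betweennessCentrality(graph)` for the importance ranking.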
| Layer | Tech |
|---|---|
| Frontend | Next.js 14, React, TypeScript, Sigma.js, Framer Motion, Tailwind CSS |
| Backend | Node.js, Express, TypeScript |
| Graph | graphology, graphology-communities-louvain, graphology-metrics |
| Ingestion | pdf-parse, Tesseract.js (OCR), Cheerio (URL scraping) |
| LLM | Anthropic Claude SDK, OpenAI SDK, LM Studio (local), Ollama (local) |
| Storage | SQLite (better-sqlite3) |
git clone https://github.com/And1zle/textgraph-lab
cd textgraph-lab

Backend:

cd backend
cp .env.example .env   # edit to set your LLM provider
npm install
npm run dev            # starts on http://localhost:3001

Frontend:

cd frontend
npm install
npm run dev            # starts on http://localhost:3000

Then open http://localhost:3000.
Copy backend/.env.example to backend/.env and set your preferred LLM provider:
# Choose one: lmstudio | ollama | openai | anthropic
LLM_PROVIDER=lmstudio
LLM_MODEL=qwen/qwen3.5-9b
# LM Studio (local, free)
LMSTUDIO_BASE_URL=http://localhost:1234/v1
LMSTUDIO_API_KEY=lm-studio
# Ollama (local, free)
OLLAMA_BASE_URL=http://localhost:11434/v1
OLLAMA_MODEL=llama3.1:8b
# Cloud providers (require API keys)
OPENAI_API_KEY=
ANTHROPIC_API_KEY=

The tool works fully offline with LM Studio or Ollama — no API key required.
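The provider switch can be pictured as a small factory keyed on `LLM_PROVIDER`. This is a hedged sketch, not the backend's actual factory: the function name `resolveLlmConfig` and the returned shape are illustrative, and the fallback defaults mirror the `.env.example` values above.

```typescript
// Illustrative provider factory: local providers expose an OpenAI-compatible
// /v1 endpoint, cloud providers need a real API key.
type Provider = 'lmstudio' | 'ollama' | 'openai' | 'anthropic';

interface LlmConfig { baseUrl: string; apiKey: string }

function resolveLlmConfig(env: Record<string, string | undefined>): LlmConfig {
  const provider = (env.LLM_PROVIDER ?? 'lmstudio') as Provider;
  switch (provider) {
    case 'lmstudio':
      return { baseUrl: env.LMSTUDIO_BASE_URL ?? 'http://localhost:1234/v1',
               apiKey: env.LMSTUDIO_API_KEY ?? 'lm-studio' };
    case 'ollama':
      return { baseUrl: env.OLLAMA_BASE_URL ?? 'http://localhost:11434/v1',
               apiKey: 'ollama' }; // local Ollama accepts any placeholder key
    case 'openai':
      return { baseUrl: 'https://api.openai.com/v1',
               apiKey: env.OPENAI_API_KEY ?? '' };
    case 'anthropic':
      return { baseUrl: 'https://api.anthropic.com',
               apiKey: env.ANTHROPIC_API_KEY ?? '' };
  }
}
```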
textgraph-lab/
├── backend/
│ ├── src/
│ │ ├── index.ts # Express server entry point
│ │ ├── routes/ # API route handlers
│ │ ├── services/ # LLM factory, graph engine, ingestion
│ │ ├── db/ # SQLite schema and queries
│ │ └── prompts/ # LLM system prompts
│ ├── .env.example
│ └── package.json
├── frontend/
│ ├── app/ # Next.js app router pages
│ ├── components/
│ │ ├── GraphCanvas.tsx # Sigma.js force-directed graph
│ │ ├── ChatPanel.tsx # LLM chat interface
│ │ ├── InsightPanel.tsx # Auto-generated insights
│ │ ├── TopicSidebar.tsx # Community/cluster browser
│ │ └── AddSourcePanel.tsx # Document upload UI
│ └── package.json
└── start.bat # Windows: starts both servers
| Method | Path | Description |
|---|---|---|
| POST | /api/analyses | Create a new analysis session |
| POST | /api/analyses/:id/ingest | Upload a document to an analysis |
| GET | /api/analyses/:id | Get graph data for an analysis |
| POST | /api/analyses/:id/insights | Generate LLM insights from the graph |
| POST | /api/analyses/:id/chat | Chat with the graph |
| POST | /api/analyses/:id/rebuild | Re-run graph extraction |
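A typical session against these endpoints might look like the following. The JSON field names (`title`, `message`) and the multipart field `file` are assumptions for illustration; the source does not document the request schemas.

```shell
# Create an analysis session (assumed to return a JSON body with an id)
curl -X POST http://localhost:3001/api/analyses \
  -H "Content-Type: application/json" \
  -d '{"title": "My document"}'

# Ingest a PDF into it (substitute the id from the previous response for :id)
curl -X POST http://localhost:3001/api/analyses/:id/ingest \
  -F "file=@paper.pdf"

# Chat with the resulting graph
curl -X POST http://localhost:3001/api/analyses/:id/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "What are the most central concepts?"}'
```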
MIT