An Event-Driven, Multi-Agent Cognitive Architecture designed to autonomously plan, research, and synthesize complex technical reports.
Nexus AI represents a shift from "Chatbots" to "Agentic Systems."
While traditional LLMs generate text in a single pass, Nexus employs a Graph-Based State Machine to coordinate specialized autonomous agents. It creates a dynamic loop where a Planner deconstructs goals, a Researcher gathers ground-truth data from the live web, and a Writer synthesizes the findings into professional-grade reports.
The system runs on a Microservices Architecture orchestrated via Docker, ensuring persistent memory and scalable execution separate from the user interface.
Nexus operates on a distributed containerized stack. The Brain (Python) interacts with the Body (Docker Services) via asynchronous protocols.
```mermaid
graph TD
    User[User / Next.js Dashboard] -->|REST API| API[FastAPI Gateway]
    API -->|Trigger| Graph[LangGraph Orchestrator]

    subgraph "The Cognitive Architecture"
        Graph -->|Step 1| Planner[Planner Agent]
        Graph -->|Step 2| Research[Researcher Agent]
        Graph -->|Step 3| Writer[Writer Agent]
    end

    subgraph "Infrastructure Layer (Docker)"
        Research -->|Live Web Search| DDG[DuckDuckGo Tool]
        Graph -->|Persist State| Postgres[(PostgreSQL)]
        Graph -->|Semantic Memory| Qdrant[(Qdrant Vector DB)]
        Graph -->|Relationship Map| Neo4j[(Neo4j Graph DB)]
    end

    Writer -->|Final Markdown| Postgres
    Postgres -->|Fetch History| API
```
Unlike standard chains, Nexus uses LangGraph to define a cyclic workflow. Below is the actual logic governing the agent swarm.
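As a mental model for that cycle, here is a dependency-free sketch of the pattern: each node reads and extends a shared state dict, and the loop repeats until the report passes a (stubbed) quality gate or the revision budget runs out. The stub agents and the `run_graph` helper are illustrative, not the project's LangGraph code.

```python
# Dependency-free sketch of the planner -> researcher -> writer cycle.
# The real project wires the same loop with LangGraph's StateGraph;
# all names and behaviors here are stubs for illustration.
from typing import Callable, Dict

AgentState = Dict[str, object]  # mission, plan, research_data, final_response, ...

def planner_node(state: AgentState) -> AgentState:
    # Stub: a real planner would call the LLM here.
    return {**state, "plan": [f"step for: {state['mission']}"]}

def researcher_node(state: AgentState) -> AgentState:
    # Stub: a real researcher would run a web search per plan step.
    return {**state, "research_data": [f"data for {t}" for t in state["plan"]]}

def writer_node(state: AgentState) -> AgentState:
    # Stub: a real writer would synthesize a report from the research data.
    return {**state,
            "final_response": " ".join(state["research_data"]),
            "revision_number": state.get("revision_number", 0) + 1}

def run_graph(state: AgentState, max_revisions: int = 2) -> AgentState:
    # Cycle through the nodes until the report exists (quality gate stub)
    # or the self-correction budget is spent.
    while True:
        for node in (planner_node, researcher_node, writer_node):
            state = node(state)
        if state["final_response"] or state["revision_number"] >= max_revisions:
            return state

result = run_graph({"mission": "survey agent frameworks"})
```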
Every agent shares a "Brain" (State). This ensures the Researcher knows what the Planner decided, and the Writer sees what the Researcher found.
```python
from typing import List, TypedDict

class AgentState(TypedDict):
    mission: str              # User's initial goal
    plan: List[str]           # Step-by-step execution plan
    research_data: List[str]  # Raw data gathered from the web
    final_response: str       # The synthesized report
    revision_number: int      # Counter for self-correction loops
```

The Planner agent uses Llama-3-70b (via Groq) to function as a Project Manager. It forces a structured JSON output to ensure the plan is machine-readable.
```python
def planner_node(state: AgentState):
    print("--- PLANNER: Analyzing mission... ---")
    prompt = f"You are a Senior Architect. Break this mission into 5 logical steps: {state['mission']}"
    # We use temperature=0 for strict logical reasoning
    response = llm.invoke([HumanMessage(content=prompt)])
    steps = parse_into_list(response.content)
    return {"plan": steps}
```

The Researcher agent has "Hands": it uses the DuckDuckGo Search Tool to step outside the LLM's training data and fetch current, real-time information from the live web.
```python
def researcher_node(state: AgentState):
    plan = state['plan']
    results = []
    for task in plan:
        # The agent autonomously executes web searches
        data = web_search.run(task)
        results.append(data)
    return {"research_data": results}
```

- Next.js 14 (App Router): Chosen for Server-Side Rendering (SSR) to ensure high-performance dashboard loading.
- Tailwind CSS v4: Utilized for a "Glassmorphism" design system with hardware-accelerated animations.
- Sonner & Framer Motion: For reactive UI feedback (Toasts) and state transitions.
- FastAPI: Selected over Flask for its asynchronous capabilities (`async`/`await`), which are critical for handling long-running agent tasks without blocking the server.
- SQLAlchemy (ORM): Manages relational data mapping between Python objects and the PostgreSQL container.
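To see why the async capability matters here, consider this pure-asyncio sketch: while one mission awaits slow I/O (LLM calls, web searches), the event loop keeps serving the others. The project realizes the same idea with FastAPI's `async def` route handlers; the names below are illustrative, not the project's actual code.

```python
# Concurrency sketch: many slow "missions" in flight at once on one thread.
import asyncio

async def run_mission(mission: str) -> str:
    await asyncio.sleep(0.1)  # stands in for a slow LLM / web-search call
    return f"report for {mission}"

async def serve_concurrently(n: int) -> list:
    # n concurrent missions finish in ~0.1 s total, not ~0.1 * n sequentially.
    return await asyncio.gather(*(run_mission(f"m{i}") for i in range(n)))

reports = asyncio.run(serve_concurrently(10))
```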
The system spins up 4 isolated containers via docker-compose:
- PostgreSQL: Stores user history and mission logs (Persistent Volume).
- Qdrant: (Vector DB) For semantic search and long-term memory retrieval.
- Redis: (Message Broker) Handles task queues for the Celery workers.
- Neo4j: (Graph DB) Maps relationships between researched entities.
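A minimal `docker-compose.yml` covering those four services might look like the following. This is a hedged sketch, not the project's actual file: image tags, ports, and credentials are illustrative (the Postgres credentials mirror the `DATABASE_URL` shown in the setup section below).

```yaml
# Illustrative compose file: four isolated services for the Nexus stack.
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_USER: nexus
      POSTGRES_PASSWORD: nexus_password
      POSTGRES_DB: nexus_db
    ports: ["5432:5432"]
    volumes: ["pg_data:/var/lib/postgresql/data"]  # persistent volume
  qdrant:
    image: qdrant/qdrant
    ports: ["6333:6333"]
  redis:
    image: redis:7
    ports: ["6379:6379"]
  neo4j:
    image: neo4j:5
    ports: ["7474:7474", "7687:7687"]

volumes:
  pg_data:
```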
- Core Architecture: LangGraph + FastAPI + Docker
- UI: Next.js Dashboard with Real-time Feedback & Glassmorphism
- Persistence: PostgreSQL History Saver & Mission Archives
- Export: PDF Report Generation
- Self-Correction: "Critic" Agent to grade reports and request re-writes.
- Multi-Modal: Capability to read images and PDFs as source material.
Prerequisite: Ensure Docker Desktop is installed and running.
```bash
git clone https://github.com/Devarshp0511/Nexus-AI.git
cd nexus-ai
```

Spin up the database cluster in detached mode:

```bash
docker-compose up -d
```

Create a `.env` file in the `backend/` directory:
```env
# Database Connection
DATABASE_URL=postgresql://nexus:nexus_password@localhost:5432/nexus_db

# AI Provider Key (Perplexity, Groq, or OpenAI)
PPLX_API_KEY="pplx-..."
# OR
GROQ_API_KEY="gsk_..."
```

Install the backend dependencies and start the API server:

```bash
cd backend
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
python main.py
```

Open a new terminal tab:
```bash
cd frontend
npm install
npm run dev
```

Visit http://localhost:3000 to access the Mission Control Center.
Distributed under the MIT License. See LICENSE for more information.
