Summary
Develop a fully integrated chat interface for the StormCom project that uses Ollama via a cloud endpoint as the backend LLM provider. The interface must leverage the existing authentication system, ensuring only authenticated users can access and use the chat functionality.
Objectives
- Create a modern, responsive chat UI within the Next.js 16 (App Router) framework, accessible only to authenticated users.
- Integrate with Ollama's cloud API (LLM backend) for real-time LLM-powered conversational responses.
- Ensure all messages and context are securely associated with authenticated sessions.
- Design for extensibility to support future multi-tenancy and fine-grained role-based access.
Requirements
1. UI/UX
- Add a new `/chat` protected route (App Router group), only accessible to logged-in users.
- Modern chat interface styled with shadcn-ui primitives; supports Markdown, code blocks, and loading states.
- Display user and assistant messages clearly, with session state persisted per user.
- Loading and error feedback for LLM response delays.
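The per-user message state with loading and error feedback could be sketched as a small reducer; all names here (`ChatState`, `chatReducer`, the action types) are illustrative suggestions, not part of the existing codebase, and the reducer is plain TypeScript that a React `useReducer` hook could wrap:

```typescript
// Illustrative sketch of chat UI state: messages plus a status flag that
// drives the loading/error feedback described above.
type Role = "user" | "assistant";

interface ChatMessage {
  role: Role;
  content: string;
  createdAt: number;
}

interface ChatState {
  messages: ChatMessage[];
  status: "idle" | "loading" | "error";
  error?: string;
}

type ChatAction =
  | { type: "send"; content: string }     // user submits a message
  | { type: "receive"; content: string }  // assistant reply arrives
  | { type: "fail"; error: string };      // LLM call failed or timed out

function chatReducer(state: ChatState, action: ChatAction): ChatState {
  switch (action.type) {
    case "send":
      // Optimistically append the user message and enter the loading state.
      return {
        messages: [
          ...state.messages,
          { role: "user", content: action.content, createdAt: Date.now() },
        ],
        status: "loading",
      };
    case "receive":
      return {
        messages: [
          ...state.messages,
          { role: "assistant", content: action.content, createdAt: Date.now() },
        ],
        status: "idle",
      };
    case "fail":
      return { ...state, status: "error", error: action.error };
  }
}
```

Keeping the reducer pure makes the loading/error transitions easy to unit-test independently of the rendering layer.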
2. Authentication Integration
- Gate the entire chat route using the existing NextAuth middleware and session patterns.
- On the server: validate session tokens/JWT and associate chat histories with `session.user.id`.
- Support sign-in via email (current) and password (future); ensure session/tenant isolation.
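The server-side validation step might be factored into a small guard shared by the chat route and the API handler. The `Session` shape below mirrors the object NextAuth returns from `getServerSession`, and `requireUserId` is a hypothetical helper name:

```typescript
// Hypothetical guard: reject any request without an authenticated session
// before chat history is read or written. The Session shape mirrors the
// object returned by NextAuth's getServerSession.
interface Session {
  user?: { id?: string; email?: string };
}

function requireUserId(session: Session | null): string {
  const id = session?.user?.id;
  if (!id) {
    // In a real route handler this would translate to a 401 response.
    throw new Error("Unauthorized: no authenticated session");
  }
  return id;
}
```

An API route would call something like `requireUserId(await getServerSession(authOptions))` and then scope every Prisma query by the returned id, which is what keeps session/tenant isolation enforceable in one place.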
3. Backend Integration (Ollama Cloud)
- Add a secure backend API route `/api/chat/ollama` that:
- Authenticates the user session on each request.
- Accepts the user message and sends it to the configured Ollama cloud API.
- Streams or returns LLM responses as appropriate.
- API keys for Ollama (and any secrets) must be set via `.env.local` and never exposed in frontend code.
- Add README documentation for required configuration, environment variables, and deployment steps.
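The server-side call could look like the sketch below. It assumes the cloud endpoint follows Ollama's standard `/api/chat` request shape (`model`, `messages`, `stream`); the environment variable names (`OLLAMA_BASE_URL`, `OLLAMA_API_KEY`) and the model name are placeholders, not decided configuration:

```typescript
// Sketch of the server-only Ollama call. Nothing here runs in the browser;
// credentials come from environment variables (names are assumptions).
interface OllamaMessage {
  role: "user" | "assistant" | "system";
  content: string;
}

// Pure helper: build the body for Ollama's /api/chat endpoint.
function buildOllamaRequest(
  model: string,
  messages: OllamaMessage[],
  stream = true
) {
  return { model, messages, stream };
}

async function callOllama(messages: OllamaMessage[]): Promise<Response> {
  const base = process.env.OLLAMA_BASE_URL; // assumed variable name
  const key = process.env.OLLAMA_API_KEY;   // assumed variable name
  if (!base || !key) throw new Error("Ollama credentials not configured");
  return fetch(`${base}/api/chat`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${key}`,
    },
    // Model name is a placeholder; with stream: true, Ollama returns
    // newline-delimited JSON chunks that the route can forward to the client.
    body: JSON.stringify(buildOllamaRequest("llama3.1", messages)),
  });
}
```

Separating the pure request builder from the network call keeps the payload shape testable without hitting the endpoint.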
4. Data Persistence
- Store chat history per user (PostgreSQL via Prisma 7):
- Schema: User ID, message content (user/assistant), timestamps.
- Model extensible to support organizations (multi-tenant) in future.
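The schema above might be modeled in Prisma as follows; the model and field names are suggestions, and `organizationId` is nullable now so tenant scoping can be added later without a breaking migration (the `User` relation assumes the existing auth model):

```prisma
// Suggested Prisma model for per-user chat history (illustrative names).
model ChatMessage {
  id             String   @id @default(cuid())
  userId         String
  role           String   // "user" | "assistant"
  content        String
  organizationId String?  // reserved for future multi-tenancy
  createdAt      DateTime @default(now())

  user User @relation(fields: [userId], references: [id])

  @@index([userId, createdAt])
}
```

The composite index on `(userId, createdAt)` supports the common query of loading one user's history in chronological order.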
5. Security & Access
- Ensure only the authenticated user (or tenant org) may access their message history.
- Sanitize and validate all message inputs/outputs for safety (prevent prompt/HTML injection).
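For the HTML-injection half of this requirement, a minimal pass could escape markup before rendering and cap input size before it reaches the LLM or the database; prompt-injection defenses are a separate, model-side concern. The function names and the 4000-character limit below are assumptions:

```typescript
// Escape HTML-significant characters so user/assistant text can be
// rendered safely outside of a trusted Markdown pipeline.
function escapeHtml(input: string): string {
  return input
    .replace(/&/g, "&amp;") // must run first, before other entities
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}

// Basic input validation: trim and enforce a length cap (limit is an
// assumption) before the message is stored or forwarded to the LLM.
function validateMessage(raw: string, maxLength = 4000): string {
  const trimmed = raw.trim();
  if (trimmed.length === 0) throw new Error("Empty message");
  if (trimmed.length > maxLength) throw new Error("Message too long");
  return trimmed;
}
```

Note the ampersand replacement runs first so already-escaped entities are not double-mangled in the wrong order.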
6. Dev Experience
- Include robust TypeScript typings throughout the chat, backend handler, and Prisma models.
- Ensure all new code passes `npm run type-check` and `npm run lint` (warnings OK, no errors).
- Add detailed documentation to the repository: usage instructions, config, and extending the chat.
Out of Scope
- No open anonymous usage; chat is for signed-in users only.
- No public endpoints to Ollama or chat API.
- No file uploads or image generation (text-only chat).
- Testing suite not required unless requested.
Acceptance Criteria
- Authenticated users can access `/chat`, send messages, and receive streaming responses from Ollama (cloud backend).
- All chat data persists securely in the database, scoped to the user session.
- No sensitive credentials are leaked to the client side.
- Codebase passes build/type-check/lint workflows.
- Documentation updated for setup, environment config, and feature usage.
References:
- Ollama cloud API docs
- NextAuth.js Documentation
- Prisma Docs
- See `/src/app/(auth)` for authentication patterns.
Please ask in comments if further clarification is needed, or if there are specific product requirements for multi-tenancy and permissions.