Establish the open source foundation: contribution guidelines, code of conduct, spec structure, and an updated README that reflects the project's full capabilities.
- `CONTRIBUTING.md` — conventional commits + PR process
- `CODE_OF_CONDUCT.md` — Contributor Covenant
- `docs/specs/` — directory structure + spec template
- `README.md` — document the Spring AI abstraction and supported providers
Give the agent memory. Persist conversation history per `conversationId` using Spring Data + H2 (in-memory by default). The `conversationId` is optional — if omitted, the interaction remains stateless.
- `ConversationMessage` JPA entity
- `ConversationRepository` via Spring Data
- H2 as the default in-memory store (zero external dependencies)
- `conversationId` as optional field in the chat request
- `agent.context.limit` — configurable context window parameter
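The trimming behavior behind a setting like `agent.context.limit` can be sketched in plain Java. This is a minimal in-memory stand-in, not the template's actual Spring Data + H2 implementation; the class and method names are illustrative:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of per-conversation history with a context window limit.
// In the template this would be a JPA entity plus a Spring Data repository
// backed by H2; here a Map and Deque stand in for the persistence layer.
public class ConversationStore {
    private final Map<String, Deque<String>> history = new HashMap<>();
    private final int contextLimit; // mirrors agent.context.limit

    public ConversationStore(int contextLimit) {
        this.contextLimit = contextLimit;
    }

    public void append(String conversationId, String message) {
        Deque<String> messages =
                history.computeIfAbsent(conversationId, id -> new ArrayDeque<>());
        messages.addLast(message);
        // Drop the oldest messages so the prompt never exceeds the window.
        while (messages.size() > contextLimit) {
            messages.removeFirst();
        }
    }

    public List<String> context(String conversationId) {
        // A missing conversationId means the interaction is stateless.
        if (conversationId == null || !history.containsKey(conversationId)) {
            return List.of();
        }
        return List.copyOf(history.get(conversationId));
    }

    public static void main(String[] args) {
        ConversationStore store = new ConversationStore(2);
        store.append("c1", "hello");
        store.append("c1", "how are you?");
        store.append("c1", "tell me a joke");
        System.out.println(store.context("c1")); // oldest message trimmed
        System.out.println(store.context(null)); // stateless: empty
    }
}
```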
Improve the quality and control of the conversation loop — streaming responses, configurable model parameters, and better prompt composition.
- Streaming responses (`/api/chat/stream`)
- Configurable model parameters (temperature, top-p) via `application.properties`
- Docker Compose with Ollama — single `docker compose up` to run everything
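With the Ollama starter, these parameters map onto Spring AI's chat-options properties. A minimal `application.properties` sketch (the values shown are examples, not recommendations):

```properties
# Spring AI Ollama chat options
spring.ai.ollama.chat.options.model=llama3
spring.ai.ollama.chat.options.temperature=0.7
spring.ai.ollama.chat.options.top-p=0.9
```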
Enable the agent to respond with reliable, schema-bound JSON — essential for agents that feed downstream systems or other agents.
- Structured JSON responses via Spring AI
- Output schema configurable per deployment
- Validation and error handling for malformed outputs
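One failure mode worth handling: models often wrap JSON in markdown fences or emit surrounding prose. A plain-Java sketch of the cleanup-and-sanity-check step — the hand-rolled checks here are deliberately naive placeholders for real parsing and schema validation:

```java
import java.util.List;

// Illustrative pre-validation of model output before JSON parsing.
// Real code would parse with a JSON library and validate against a schema.
public class OutputValidator {
    // Strip a surrounding ```json ... ``` fence if the model added one.
    static String stripFences(String raw) {
        String s = raw.strip();
        if (s.startsWith("```")) {
            int firstNewline = s.indexOf('\n');
            int closingFence = s.lastIndexOf("```");
            if (firstNewline >= 0 && closingFence > firstNewline) {
                s = s.substring(firstNewline + 1, closingFence).strip();
            }
        }
        return s;
    }

    // Naive structural check: looks like a JSON object and names each required key.
    static boolean looksValid(String json, List<String> requiredKeys) {
        if (!json.startsWith("{") || !json.endsWith("}")) {
            return false;
        }
        return requiredKeys.stream().allMatch(k -> json.contains("\"" + k + "\""));
    }

    public static void main(String[] args) {
        String raw = "```json\n{\"answer\": \"42\", \"confidence\": 0.9}\n```";
        String cleaned = stripFences(raw);
        System.out.println(cleaned);
        System.out.println(looksValid(cleaned, List.of("answer", "confidence"))); // true
    }
}
```

A caller that gets `false` back can either return an error to the client or re-prompt the model with the validation failure appended.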
Make the template ready for real deployments: metrics, structured logging, rate limiting, and basic authentication.
- Structured logging (JSON)
- Custom Actuator metrics (token usage, model latency)
- Rate limiting
- API Key authentication (header-based)
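Rate limiting can start as a simple in-memory token bucket keyed by API key before reaching for a dedicated library. A self-contained sketch (names are illustrative; a servlet filter would call `tryAcquire` and answer HTTP 429 on `false`):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Minimal per-key token bucket: `capacity` burst requests, refilled
// continuously at `refillPerSecond` tokens per second.
public class TokenBucketLimiter {
    private static final class Bucket {
        double tokens;
        long lastRefillNanos;
    }

    private final Map<String, Bucket> buckets = new ConcurrentHashMap<>();
    private final int capacity;
    private final double refillPerSecond;

    public TokenBucketLimiter(int capacity, double refillPerSecond) {
        this.capacity = capacity;
        this.refillPerSecond = refillPerSecond;
    }

    public synchronized boolean tryAcquire(String apiKey) {
        long now = System.nanoTime();
        Bucket b = buckets.computeIfAbsent(apiKey, k -> {
            Bucket fresh = new Bucket();
            fresh.tokens = capacity;
            fresh.lastRefillNanos = now;
            return fresh;
        });
        // Refill proportionally to elapsed time, capped at capacity.
        double elapsedSeconds = (now - b.lastRefillNanos) / 1_000_000_000.0;
        b.tokens = Math.min(capacity, b.tokens + elapsedSeconds * refillPerSecond);
        b.lastRefillNanos = now;
        if (b.tokens >= 1.0) {
            b.tokens -= 1.0;
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        TokenBucketLimiter limiter = new TokenBucketLimiter(2, 1.0);
        System.out.println(limiter.tryAcquire("key-1")); // true
        System.out.println(limiter.tryAcquire("key-1")); // true
        System.out.println(limiter.tryAcquire("key-1")); // false (bucket drained)
    }
}
```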