🚧 Heads Up — Under Active Development: I am currently building the engine. Expect rapid updates, shifting internal APIs, and the occasional spark as I tighten the bolts.
Let the background subroutines handle your complex code and system tasks while Buddy keeps you updated in the foreground.
- Bring Your Own Model: Works out of the box with Groq, Ollama, Llama, and Gemini.
- Native System Access: Autonomously reads, writes, edits files, and executes shell commands.
- Sandboxed Execution: Strict directory isolation prevents the AI from modifying files outside your project root.
- Buddy Mode: A secondary, highly sophisticated conversational agent that chats with you while the primary agent works.
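Conceptually, the directory isolation described above boils down to a path-containment check. The sketch below is a hypothetical illustration, not noob-cli's actual implementation (`isInsideRoot` is an invented helper name):

```typescript
import path from "node:path";

// Hypothetical helper illustrating directory isolation: resolve the
// requested path against the project root, then confirm the result
// still lives under that root before any read/write is allowed.
function isInsideRoot(projectRoot: string, requested: string): boolean {
  const root = path.resolve(projectRoot);
  const target = path.resolve(root, requested);
  const rel = path.relative(root, target);
  // Inside the root: the relative path is empty, does not escape
  // via "..", and is not absolute.
  return rel === "" || (!rel.startsWith("..") && !path.isAbsolute(rel));
}

console.log(isInsideRoot("/home/user/project", "src/index.ts"));     // true
console.log(isInsideRoot("/home/user/project", "../../etc/passwd")); // false
```

Resolving before comparing is what defeats `../` traversal: a request like `../../etc/passwd` normalizes to a path outside the root and is rejected.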
Prerequisites: Node.js >= 20, pnpm >= 10.15.1
```shell
# Clone the repository
git clone https://github.com/yourusername/noob-cli.git
cd noob-cli

# Install dependencies
pnpm install

# Configure your environment
cp .env.example .env
# Edit .env to add your API keys or local LLM URLs
```
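Your `.env` holds the provider credentials and local LLM URLs. A hypothetical example of what it might contain (the variable names here are illustrative assumptions; use the actual names from `.env.example`):

```env
# API keys for hosted providers (names are illustrative)
GROQ_API_KEY=your-groq-key
GEMINI_API_KEY=your-gemini-key

# Base URL for a local Ollama instance (11434 is Ollama's default port)
OLLAMA_BASE_URL=http://localhost:11434
```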
```shell
# Build the project
pnpm build
```

You can fire off quick tasks directly from your terminal or drop into an interactive session.
Interactive Session (with Buddy Mode)
```shell
pnpm dev:cli -b
```

One-Shot Command
```shell
pnpm dev:cli "List all the TypeScript files in the src directory"
```

- `-p, --provider <name>`: Override the default AI provider (`groq`, `ollama`, `llama`, `gemini`)
- `-m, --model <name>`: Specify the model to use
- `-b, --buddy`: Enable the background conversational sidekick
- `-c, --context <pattern>`: Load specific files into context using glob patterns
This project is built as a monorepo. For deeper technical details, check out the documentation for the individual packages:
- @noob-cli/core: The headless engine handling LLM orchestration, secure native tool execution, and the WebSocket server.
- @noob-cli/cli: The interactive terminal frontend, parsing commands and managing the dual-stream UI.