A command-line tool to practice system design interviews with AI-powered feedback using LangChain and Ollama.
**macOS:**

```bash
brew install ollama
```

**Linux:**

```bash
curl -fsSL https://ollama.ai/install.sh | sh
```

**Windows:** Download the installer from https://ollama.ai
```bash
# Start the Ollama service (keep this terminal open)
ollama serve

# In a new terminal, download the recommended model
ollama pull gemma3

# Verify installation
ollama list
```

```bash
# Clone and set up the project
git clone git@github.com:yltw27/system-sifu.git
cd system-sifu
npm install
npm run build

# Configure settings (first time only)
npm start config

# Start your first interview!
npm start interview
```

- 🆓 Local AI with Ollama - No API keys required, runs offline
- 🤖 Multiple Providers - Supports Ollama, OpenAI, and Anthropic
- 🎯 Experience-based Questions - Tailored for Junior to Staff+ levels
- 📊 Structured Interview Flow - Mirrors real FAANG interview process
- 💬 Detailed AI Feedback - Comprehensive performance analysis
- ⚙️ Persistent Configuration - Saves your preferences
The CLI follows the structured interview format used by major tech companies:

1. **Requirements Clarification**
   - Understand functional requirements
   - Identify non-functional requirements
   - Clarify scope and constraints
2. **Capacity Estimation**
   - Calculate system scale (users, requests/sec)
   - Estimate storage requirements
   - Determine bandwidth needs
3. **High-Level Design**
   - Create an architecture diagram using Mermaid syntax
   - Identify major components
   - Show data flow
4. **Deep Dive**
   - Discuss technology choices
   - Address scalability concerns
   - Explore trade-offs and alternatives
5. **Feedback**
   - Detailed performance analysis
   - Specific improvement suggestions
   - Scores across multiple dimensions
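The high-level design phase expects diagrams in Mermaid syntax. A minimal sketch of a generic web service (the component names here are illustrative, not required by the tool):

```mermaid
graph LR
    Client --> LB[Load Balancer]
    LB --> App[App Server]
    App --> Cache[(Cache)]
    App --> DB[(Database)]
```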
Run `npm start config` to customize:
- Experience Level: Junior, Mid, Senior, or Staff+
- Interview Duration: 15-90 minutes (under 30 minutes runs a speed round that skips the diagram phase)
- AI Provider: Ollama (local), OpenAI, or Anthropic
- Model Selection: Choose from the models available for your provider
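Because configuration is persistent, your choices are reused on the next run. A saved configuration might look roughly like the sketch below; the exact file location and field names are assumptions for illustration, not the tool's documented schema:

```json
{
  "experienceLevel": "senior",
  "durationMinutes": 45,
  "provider": "ollama",
  "model": "gemma3"
}
```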
```bash
# Start a practice interview
npm start interview

# Update your configuration
npm start config

# Show help
npm start --help

# Development mode
npm run dev interview
```

Check if Ollama is running:

```bash
curl http://localhost:11434/api/tags
```

Common solutions:

```bash
# Restart the Ollama service
ollama serve

# List available models
ollama list

# Pull a missing model
ollama pull gemma3

# Check running models and resource usage
ollama ps
```

- Practice realistic system design interviews
- Get instant, detailed feedback
- Build confidence before real interviews
- Track improvement over time
- Everything runs locally with Ollama
- No data sent to external APIs
- Practice with proprietary system designs safely
- Completely free with Ollama
- No API usage limits or charges
- Practice as much as you want
Contributions welcome! This project uses:
- TypeScript for type safety
- LangChain for AI integration
- Commander.js for CLI interface
- Inquirer for interactive prompts
MIT License - see LICENSE for details.