
Wingman

Ralph 'Wingman' Wiggum flying with Dedalus wings
AI-powered TUI coding assistant for the terminal. Your copilot for code.

Features

  • Multi-model support: OpenAI, Anthropic, Google, xAI, Mistral, DeepSeek
  • Coding tools: File read/write, shell commands, and grep, with diff previews
  • MCP integration: Connect to Model Context Protocol servers
  • Split panels: Work on multiple conversations simultaneously
  • Checkpoints: Automatic file snapshots with rollback support
  • Project memory: Persistent context per directory
  • Image support: Attach and analyze images in conversations
  • Context management: Auto-compaction when context runs low

Philosophy

Wingman ships only the features people actually use. No scope creep, no feature flags for hypothetical workflows, no slash commands that exist "just in case." If a feature doesn't earn its keep in daily use, it gets cut. Simplicity is the product.

Installation

Using uv (recommended)

uv tool install wingman-cli

Installing uv
# macOS / Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# Windows
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

# Or via pip/pipx
pip install uv

Using pip

pip install wingman-cli

Using pipx

pipx install wingman-cli
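Whichever method you use, you can confirm the binary landed on your PATH afterwards. The `on_path` helper below is an illustrative sketch, not part of Wingman:

```shell
# Check whether a command is available on PATH.
# `on_path` is an illustrative helper, not something Wingman ships.
on_path() {
  command -v "$1" >/dev/null 2>&1 && echo "yes" || echo "no"
}

on_path sh        # every POSIX system ships sh, so this prints "yes"
on_path wingman   # "yes" once installation succeeded
```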

Quick Start

  1. Run Wingman:

    wingman
  2. Enter your Dedalus API key when prompted

  3. Start chatting: type your message and press Enter

Commands

Command          Description
/new             Start new chat
/rename <name>   Rename session
/delete          Delete session
/split           Split panel
/close           Close panel
/model           Switch model
/code            Toggle coding mode
/cd <path>       Change directory
/ls              List files
/ps              List processes
/kill <id>       Stop process
/history         View checkpoints
/rollback <id>   Restore checkpoint
/diff            Show changes
/compact         Compact context
/context         Show context usage
/mcp             Manage MCP servers
/memory          View project memory
/export          Export session
/import <file>   Import file
/key             Set API key
/clear           Clear chat
/help            Show help
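A typical checkpoint workflow chains several of these commands. The session below is an illustrative sketch (the snapshot id is hypothetical; actual output will differ):

```
/code            # enable coding mode
...ask the assistant to edit files...
/history         # list automatic snapshots
/rollback 3      # restore snapshot 3 (hypothetical id)
/diff            # verify the restored state
```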

Configuration

Wingman stores configuration in ~/.wingman/:

~/.wingman/
├── config.json      # API key and settings
├── sessions/        # Chat history
├── checkpoints/     # File snapshots
└── memory/          # Project memory files
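Since ~/.wingman holds your sessions, checkpoints, and project memory, it is worth snapshotting before upgrades. A minimal sketch, using a scratch directory as a stand-in for ~/.wingman so it runs anywhere:

```shell
# Snapshot a Wingman-style config directory into a tarball.
# A scratch directory stands in for ~/.wingman so the sketch is runnable.
src=$(mktemp -d)
mkdir -p "$src/sessions" "$src/checkpoints" "$src/memory"
printf '{}' > "$src/config.json"   # placeholder config

backup="$src-backup.tar.gz"
tar -czf "$backup" -C "$src" .
echo "backed up to $backup"
```

To restore, extract the archive back into the directory: `tar -xzf "$backup" -C ~/.wingman`.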

Supported Models

  • OpenAI: GPT-4.1, GPT-4o, o1, o3, o4-mini
  • Anthropic: Claude Opus 4.5, Sonnet 4.5, Haiku 4.5, Sonnet 4
  • Google: Gemini 2.5 Pro, Gemini 2.5 Flash, Gemini 2.0 Flash
  • xAI: Grok 4, Grok 3
  • DeepSeek: DeepSeek Chat, DeepSeek Reasoner
  • Mistral: Mistral Large, Mistral Small, Codestral

Requirements

Optional: Faster Search (Recommended)

Install fd and ripgrep for significantly faster file operations:

# macOS
brew install fd ripgrep

# Ubuntu/Debian (the fd binary installs as `fdfind`)
sudo apt install fd-find ripgrep

# Arch
sudo pacman -S fd ripgrep

Wingman automatically detects and uses these tools when available, falling back to find/grep otherwise.
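The fallback behavior amounts to a simple PATH check. `pick_tool` below is our illustrative helper, not Wingman's actual detection code:

```shell
# Prefer a fast tool when present, otherwise fall back to the classic one.
# Illustrative sketch; Wingman's real detection logic is not published here.
pick_tool() {
  preferred=$1 fallback=$2
  if command -v "$preferred" >/dev/null 2>&1; then
    echo "$preferred"
  else
    echo "$fallback"
  fi
}

pick_tool rg grep   # "rg" if ripgrep is installed, else "grep"
pick_tool fd find   # "fd" if fd is installed, else "find"
```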

License

MIT. See LICENSE for details.

Dedalus Labs © 2026.
