
# Ollama Code Agent

A local AI coding assistant that works like the Claude Code CLI: it reads your files, edits code, and generates complete files. 100% free, runs locally.

## Setup

```bash
pip install -r requirements.txt
python app.py
```

Then open http://localhost:5000 in your browser.

## Requirements

- Python 3.8+
- Ollama running locally
- At least one coding model pulled (recommendations below)

## Cloud Models (Ollama Cloud, no local GPU needed)

Sign in first:

```bash
ollama signin
```

Then pull a model:

```bash
# Primary - best for agentic coding
ollama run qwen3-coder:480b-cloud

# All-purpose heavy reasoning
ollama run gpt-oss:120b-cloud

# Faster / lighter cloud option
ollama run gpt-oss:20b-cloud

# Complex long-horizon engineering tasks
ollama run deepseek-v3.1:671b-cloud
```

## Features

- Upload any files (drag & drop, zip archives)
- Full file tree explorer
- Add files to the AI context (double-click)
- Streaming chat responses
- AI generates complete files in `<file path="...">` format
- Click "Save" or "Open" on generated files
- Built-in code editor
- Multiple file tabs
- Automatic syntax highlighting
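The `<file path="...">` format above is this project's own convention for generated files. A minimal sketch of how output in that shape could be parsed (this is an illustration, not code from the repo; it assumes each block is terminated by a `</file>` tag):

```python
import re

# Matches <file path="..."> ... </file> blocks in a model response.
# The closing </file> tag is an assumption about the format.
FILE_BLOCK = re.compile(
    r'<file path="(?P<path>[^"]+)">\n?(?P<body>.*?)</file>',
    re.DOTALL,
)

def extract_files(response_text):
    """Return a {path: contents} dict for every generated file block."""
    return {
        m.group("path"): m.group("body")
        for m in FILE_BLOCK.finditer(response_text)
    }

sample = '<file path="hello.py">\nprint("hi")\n</file>'
files = extract_files(sample)  # {"hello.py": 'print("hi")\n'}
```

With blocks extracted this way, a "Save" action only needs to write each `body` to its `path`.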

## Usage

1. Start Ollama: `ollama serve`
2. Pull a model: `ollama pull qwen3-coder:480b-cloud`
3. Run the app: `python app.py`
4. Upload your project (zip or individual files)
5. Double-click files to add them to the context
6. Chat with the AI about your code
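Under the hood, streaming chat responses come from Ollama's HTTP API, which emits one JSON object per line. A minimal sketch of that flow (the endpoint and response shape follow Ollama's documented `/api/chat` API; `stream_chat` is a hypothetical helper for illustration, not code from this repo):

```python
import json
import urllib.request

def build_chat_payload(model, messages, stream=True):
    """Build the JSON body for Ollama's /api/chat endpoint."""
    return {"model": model, "messages": messages, "stream": stream}

def stream_chat(model, prompt, host="http://localhost:11434"):
    """Yield response text chunks from a running Ollama server."""
    payload = build_chat_payload(model, [{"role": "user", "content": prompt}])
    req = urllib.request.Request(
        f"{host}/api/chat",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        for line in resp:  # Ollama streams one JSON object per line
            chunk = json.loads(line)
            yield chunk.get("message", {}).get("content", "")
```

Calling `stream_chat("qwen3-coder:480b-cloud", "Explain app.py")` and printing each yielded chunk reproduces the incremental output you see in the web UI.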

## FOR PROMPTING - MARKDOWN

### MAIN BRANCH (GITHUB)

```bash
repomix --remote "https://github.com/Amaan9136/ollama-code-agent" -o ollama-code-agent-repomix.md --style markdown --ignore "node_modules,.next,dist,build,.git,.turbo,coverage"
```

### FULL CODEBASE (LOCAL)

```bash
repomix "D:\0 AMAAN MAIN\My NextJS\OLLAMA-CODE-AGENT" -o ollama-code-agent-repomix.md --style markdown --ignore "node_modules,.next,dist,build,.git,.turbo,coverage"
```

## About

Ollama Code Agent is a local AI coding assistant that mimics the Claude Code CLI, enabling file-aware code generation, editing, and project-level understanding. It runs locally with Ollama (fully offline when using local models), supports multi-file workflows, and delivers fast, streaming responses for efficient development.
