# Avossistant

A local AI assistant built on top of Ollama, enabled via MCP.

## Local Setup

1. Clone the repository
   - `git clone <repo-url>`
2. Install `uv`
   - Use your package manager of choice, or install it directly
3. Set up your virtual environment (venv)
   - Run `uv venv` in the root directory
4. Sync dependencies
   - Run `uv sync` to download and sync the dependencies
5. Install Ollama on your computer (on the command line, using your package manager)
6. Pull the model we are using (5.2 GB)
   - Run `ollama pull qwen3` (this can take up to an hour)
7. Run the CLI
   - `cd` into `src`
   - Run `uv run main.py server.py` to initialize the CLI
   - Type `ping` to test the tool, or `quit` to end the chat
   - Run `uv run main.py server.py --gui` to use the assistant with a GUI
   - Click Disconnect, then press Ctrl+C in the terminal to stop the GUI
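The `ping` command above is served by a tool on the MCP server. As a rough, self-contained sketch of the register-and-dispatch idea behind it (the names and helpers here are illustrative, not the actual `server.py` or MCP SDK API):

```python
# Illustrative sketch only: a tiny tool registry and dispatcher, standing in
# for what the MCP server does when the CLI sends a tool call like `ping`.
TOOLS = {}

def tool(fn):
    """Register a function as a callable tool under its own name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def ping() -> str:
    """Health check: typing `ping` in the CLI should come back with `pong`."""
    return "pong"

def dispatch(command: str) -> str:
    """Route a CLI command to its registered tool, if one exists."""
    if command in TOOLS:
        return TOOLS[command]()
    return f"unknown tool: {command}"

print(dispatch("ping"))  # pong
```

In the real assistant, registration happens through the MCP server rather than a local dict, which is what lets the Ollama-backed model discover and invoke tools like `ping` over the protocol.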

## About

Local AI assistant powered by Ollama and qwen3, with a custom MCP server for Gmail integration. Read, search, and send emails through natural language.
