okay so I built this thing because I was getting like 200 promo emails a day and I was NOT about to manually unsubscribe from all of them like some kind of animal. so yeah, this is a full-stack Gmail automation service with a cyberpunk dashboard, encrypted backups, and a cron job that silently deletes your digital trash every night while you sleep. you're welcome.
built with: Next.js 19 + React 19 (yes, the new one) · Express 5 · PostgreSQL · Docker · Gmail API · AES-256-GCM encryption · and a concerning amount of caffeine
| Dashboard | Rules Management |
|---|---|
| ![]() | ![]() |

| Reports | Auth Flow |
|---|---|
| ![]() | ![]() |
- Rule Engine – define rules by keyword, sender, or email age. set it and forget it.
- Cyberpunk Dashboard – drag-and-drop rule management, live metrics, the whole vibe
- Encrypted Reports – every deletion run saves an XLSX report encrypted with AES-256-GCM. your data, your keys.
- Daily Cron – runs at midnight by default. your inbox wakes up clean every morning.
- Secure by default – helmet, rate limiting, session hardening, encrypted token storage. not just vibes.
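To make the rule engine concrete, here's a minimal sketch of how a deletion rule might be evaluated against an email. The field names (`keyword`, `sender`, `maxAgeDays`) are my illustrative assumptions, not the project's actual schema:

```javascript
// Hypothetical sketch of rule matching -- field names are illustrative,
// not the real model. A rule matches when every condition it sets passes.
function matchesRule(email, rule) {
  const subject = (email.subject || "").toLowerCase();
  if (rule.keyword && !subject.includes(rule.keyword.toLowerCase())) return false;
  if (rule.sender && email.from !== rule.sender) return false;
  if (rule.maxAgeDays != null) {
    const ageDays = (Date.now() - email.receivedAt) / 86_400_000; // ms per day
    if (ageDays < rule.maxAgeDays) return false; // only delete mail OLDER than the cutoff
  }
  return true;
}

// Example: a week-old promo email from a noisy sender
const email = {
  subject: "MEGA SALE ends tonight!",
  from: "promo@example.com",
  receivedAt: Date.now() - 10 * 86_400_000, // 10 days old
};
const rule = { keyword: "sale", sender: "promo@example.com", maxAgeDays: 7 };
console.log(matchesRule(email, rule)); // true
```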
This project is built with a security-first mindset. Because it handles your Gmail access, I've implemented multiple layers of protection:
- AES-256-GCM Encryption: All generated reports are encrypted before they hit the disk. I use PBKDF2 with 100,000 iterations for key derivation.
- Encrypted Local Backups: Your data, your keys. Even if someone gains access to your server files, your email backups are unreadable without your `ENCRYPTION_KEY`.
- OAuth 2.0: I use Google's official OAuth flow. Your Google password never touches my servers.
- Hardened API: Protected by `helmet`, `express-rate-limit`, and CSRF protection (where applicable).
you need a Google Cloud project with the Gmail API enabled. takes like 5 minutes.
- go to console.cloud.google.com and create a new project
- navigate to APIs & Services → Library, search for Gmail API, and enable it
- go to APIs & Services → OAuth consent screen
- choose External, fill in the app name + your email
- add the scope: `https://www.googleapis.com/auth/gmail.modify`
- add your own Google account as a Test User (required while in testing mode)
- go to APIs & Services → Credentials → Create Credentials → OAuth 2.0 Client ID
- Application type: Web application
- Authorized redirect URIs: `http://localhost:5555/api/auth/google/callback`
- copy your Client ID and Client Secret – you'll need them in the next step
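For the curious: the consent screen users land on is just a URL built from these credentials. The backend presumably constructs it via Google's official client library; this stdlib-only sketch (with a hypothetical `buildAuthUrl` helper) shows which parameters matter:

```javascript
// Illustrative sketch of the Google OAuth consent URL -- the real backend
// presumably uses Google's client library rather than building it by hand.
function buildAuthUrl(clientId, redirectUri) {
  const params = new URLSearchParams({
    client_id: clientId,
    redirect_uri: redirectUri,
    response_type: "code",
    scope: "https://www.googleapis.com/auth/gmail.modify",
    access_type: "offline", // ask for a refresh token so the cron can run unattended
    prompt: "consent",
  });
  return `https://accounts.google.com/o/oauth2/v2/auth?${params}`;
}

const url = buildAuthUrl(
  "your_client_id_here",
  "http://localhost:5555/api/auth/google/callback"
);
console.log(url.startsWith("https://accounts.google.com/o/oauth2/v2/auth?")); // true
```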
run `cp .env.example .env`, then fill in your `.env` – here's what everything means:

```env
# Google OAuth – from Step 0 above
GOOGLE_CLIENT_ID=your_client_id_here
GOOGLE_CLIENT_SECRET=your_client_secret_here
GOOGLE_REDIRECT_URI=http://localhost:5555/api/auth/google/callback

# Session – just a random string, make it weird
SESSION_SECRET=some_long_random_string_here

# Database
DB_HOST=localhost   # use "db" if running via Docker
DB_PORT=5432
DB_NAME=email_deleter
DB_USER=postgres
DB_PASSWORD=your_password_here

# Encryption Key – MUST be exactly 64 hex characters
# generate one: node -e "console.log(require('crypto').randomBytes(32).toString('hex'))"
ENCRYPTION_KEY=your_64_char_hex_key_here

# URLs
FRONTEND_URL=http://localhost:3067
NEXT_PUBLIC_API_URL=http://localhost:5555/api

# Cron (default: midnight daily – change if you're a night owl)
CRON_SCHEDULE="0 0 * * *"
```
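If midnight doesn't suit you, the five cron fields are minute, hour, day-of-month, month, day-of-week. A few alternative schedules (the non-default values are just examples):

```env
CRON_SCHEDULE="0 0 * * *"   # midnight every day (the default)
CRON_SCHEDULE="0 3 * * *"   # 3:00 AM every day
CRON_SCHEDULE="0 0 * * 0"   # midnight on Sundays only
```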
β οΈ lose yourENCRYPTION_KEYand all stored tokens become unreadable. back it up somewhere safe. no, a sticky note doesn't count.
make sure you have Docker + Docker Compose installed, then:
```bash
# 1. build and start everything (db + backend + frontend)
make deploy

# 2. seed the database with default rules
make seed
```

that's it. seriously. Docker handles postgres, migrations, everything.
| Service | URL |
|---|---|
| Frontend UI | http://localhost:3067 |
| Backend API | http://localhost:5555 |
```bash
make up        # start containers (after first build)
make down      # stop containers
make restart   # restart all containers
make logs      # tail logs in real-time
make migrate   # run DB migrations manually
make clean     # nuclear option – wipes containers + volumes
```

or if you're allergic to Makefiles and prefer raw Docker:

```bash
docker-compose build --no-cache
docker-compose up -d
docker-compose logs -f
docker-compose down
docker-compose down -v   # also removes the database volume
```

prerequisites: Node.js 20+, PostgreSQL 15+, npm
```bash
cd backend
npm install
# make sure your .env has DB_HOST=localhost
npm run db:migrate
npm run db:seed
npm run dev   # starts with nodemon on port 5555
```

```bash
cd frontend
npm install
npm run dev   # starts Next.js on port 3067
```

```bash
cd backend
npm test             # jest with coverage
npm run test:watch   # watch mode for when you're in the zone
```

```bash
npm run lint       # check
npm run lint:fix   # fix (the coward's way, but also the smart way)
```
```
├── backend/               # Express 5 API – rules, cron, Gmail integration
│   └── src/
│       ├── controllers/
│       ├── services/      # the actual brains
│       ├── models/        # Sequelize + PostgreSQL
│       └── routes/
├── frontend/              # Next.js 19 + React 19 dashboard
│   └── src/
│       ├── app/
│       ├── components/
│       └── store/         # Zustand (because Redux is a lot)
├── docs/                  # deeper dives if you're into that
├── docker-compose.yml
├── Makefile
└── .env.example
```
PRs welcome! if you find a bug, open an issue. if you fix a bug, you're a legend.
please don't commit your .env file. I'm begging you.
This project is licensed under the MIT License. You are free to use, modify, and fork this code, but please give credit to Kashif Raza by keeping the copyright notice intact in all copies or substantial portions of the software.
built because I was tired of my inbox. now it's someone else's problem too (yours, if you clone this).



