This Flask web application allows users to compare prices of products detected in images using the Google Cloud Vision API.
Users upload an image, and the app extracts related e-commerce URLs and their corresponding prices.
🚧 Note: The app works reliably on localhost. The Render deployment is experimental and may have limitations with session handling and file uploads.
👉 Deployed Application on Render
(If it doesn’t load correctly, please run locally using the steps below.)
- 📸 Image Upload – Upload a product image.
- 🔎 Google Vision API – Detects web entities and product matches.
- 💰 Price Scraping – Fetches product prices from Amazon, Flipkart, Myntra, Ajio, Croma, TataCliq, etc.
- 📊 Comparison View – Displays multiple store links with their prices.
- ⚡ API-Based Scraping – Uses ScraperAPI instead of local Selenium/Chrome.
- Upload Image – User uploads a product photo.
- Vision Detection – Google Vision API finds best-guess labels + matching e-commerce pages.
- Scraper API – Scrapes those URLs to extract prices.
- Comparison Display – Results are shown side-by-side for easy comparison.
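The flow above can be sketched in a few lines. This is a minimal, self-contained illustration, not the app's actual code: `get_matching_urls` stands in for the Google Vision web-detection call, `fetch_html` stands in for a ScraperAPI fetch, and both return canned data so the sketch runs on its own.

```python
import re

# Stand-in for the Google Vision web-detection step, which in the real app
# returns pages_with_matching_images for the uploaded product photo.
def get_matching_urls(image_bytes):
    return ["https://www.amazon.in/dp/EXAMPLE",
            "https://www.flipkart.com/item/EXAMPLE"]

# Stand-in for fetching a product page via ScraperAPI.
def fetch_html(url):
    return '<span class="price">₹1,299</span>'

# Very loose price pattern (₹ or $, digits, optional thousands commas).
PRICE_RE = re.compile(r"[₹$]\s*[\d,]+")

def extract_price(html):
    match = PRICE_RE.search(html)
    return match.group(0) if match else None

def compare(image_bytes):
    """Detect matching store pages, scrape each, and pair URL with price."""
    results = []
    for url in get_matching_urls(image_bytes):
        price = extract_price(fetch_html(url))
        if price:
            results.append((url, price))
    return results
```

The real app renders `results` into the side-by-side comparison view instead of returning a list.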
git clone https://github.com/your-username/your-repository.git
cd your-repository
python -m venv venv
source venv/bin/activate   # On macOS/Linux
venv\Scripts\activate      # On Windows
pip install -r requirements.txt

Create a .env file in the root directory:
FLASK_SECRET_KEY=your_secret_key
SCRAPER_API_KEY=your_scraperapi_key
GOOGLE_APPLICATION_CREDENTIALS=google-credentials.json
google-credentials.json → Download from Google Cloud Console and place in the project root.
SCRAPER_API_KEY → Get a free ScraperAPI key.
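At startup, the app can read these values from the environment. The sketch below assumes plain `os.environ` access with a development-only fallback for the secret key; the actual variable-loading code may differ (locally, a library such as python-dotenv can populate the environment from the .env file first):

```python
import os

# Falls back to a development-only key if the variable is unset.
SECRET_KEY = os.environ.get("FLASK_SECRET_KEY", "dev-only-key")
SCRAPER_API_KEY = os.environ.get("SCRAPER_API_KEY")
# The Google Cloud client libraries read this path automatically.
CREDS_PATH = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS",
                            "google-credentials.json")
```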
python app.py

Then open:
👉 http://127.0.0.1:5000
If you want to deploy on Render:
- Push your code to GitHub.
- Create a new Render Web Service.
- In Settings → Environment Variables, add:
FLASK_SECRET_KEY, SCRAPER_API_KEY, and GOOGLE_APPLICATION_CREDENTIALS (use Render's Secret File feature for the credentials JSON).
- Set Build Command:
pip install -r requirements.txt
- Set Start Command:
gunicorn webS:app
- Flask – Python backend
- Google Cloud Vision API – Product & entity detection
- ScraperAPI + BeautifulSoup – Price scraping
- Bootstrap / HTML – Frontend UI
Currently, the application does not use Selenium (headless browser automation).
Instead, it relies on ScraperAPI + BeautifulSoup for lightweight scraping.
- Faster and more lightweight.
- Easier to deploy on cloud platforms (Render, Vercel).
- No need for Chrome/Chromedriver binaries.
- Some prices may be missed if they are rendered dynamically with JavaScript.
- Certain sites with aggressive anti-bot measures (like dynamic classes, hidden prices) may block scraping.
- Selenium would give more accurate scraping since it renders the page like a real browser.
👉 In short: ScraperAPI = lightweight but limited, Selenium = heavier but more accurate.
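The JavaScript limitation can be seen in a small sketch. The `.price` selector here is invented for illustration (real store markup varies): a server-rendered price is found in the raw HTML, while a placeholder that JavaScript would fill in later yields nothing.

```python
from bs4 import BeautifulSoup

def find_price(html):
    """Return the text of the first .price element, or None if empty/absent."""
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.select_one(".price")       # illustrative selector only
    return (tag.get_text(strip=True) or None) if tag else None

# Price baked into the HTML the server sends: BeautifulSoup finds it.
server_rendered = '<div class="price">₹2,499</div>'

# Price filled in by client-side JavaScript: the raw HTML is empty.
js_rendered = '<div class="price"></div><script>renderPrice()</script>'
```

A headless browser like Selenium would execute `renderPrice()` before reading the DOM, which is why it catches prices that this approach misses.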
- Improve price scraping accuracy
- Add more e-commerce sites
- Enhance frontend UI
- Optional Selenium mode for more reliable scraping
- Optimize for cloud deployment (Render/Vercel)
For any questions, reach me at:
📧 aman.tshekar@gmail.com