MabudAlam/NomAI-App
# NomAI – AI Nutrition & Meal Tracking

## ⚡ Overview

NomAI is an AI agent that brings nutrition and food intelligence to life. Whether you're analyzing meals through images, chatting with an AI nutrition assistant, or generating personalized weekly diet plans, NomAI handles the heavy lifting with a multi-step LLM pipeline backed by real-time web research.


## ✨ Features

| Feature | Description |
| --- | --- |
| 🧠 AI Nutrition Analysis | Analyze food from images or text descriptions with a 3-step pipeline: food extraction → web search → LLM synthesis |
| 💬 Conversational AI Chatbot | LangChain-powered agent that understands dietary preferences, allergies, and health goals |
| 🍽️ Weekly Diet Planner | Generate 7-day personalized meal plans with carb cycling, variety tracking, and macro targets |
| 🔄 Meal Alternatives | Get 5 AI-suggested alternative meals that respect your dietary profile |
| 📊 Nutrition Tracking | Mark meals as eaten, update plans on the fly, and track diet history |
| 🔗 Dual LLM Support | Seamlessly switch between Google Gemini and OpenRouter (Claude) providers |
| 🌐 Web-Grounded Analysis | Nutrition data enriched with web search results from Exa or DuckDuckGo |
| 🛢️ Firestore Persistence | Chat history and diet plans stored in Google Firestore |

## 🚀 Quick Backend Deployment

Get your AI gateway running in seconds:

Deploy on Railway

Backend Repository: https://github.com/Pavel401/NomAI


## Screenshots

Below is a gallery of the current screenshots in static/screenshots/.


πŸ—οΈ System Architecture

NomAI is split into two parts: a cross-platform Flutter client and a backend that orchestrates the AI pipelines.

πŸ—ΊοΈ Full-Stack Interaction

The following diagram illustrates the flow from the client through the FastAPI gateway to the AI engines and persistence layers.

```mermaid
graph TD
    Client["📱 Client (Mobile / Web)"]
    Main["main.py — FastAPI App"]

    Client --> Main

    Main --> NutritionRouter["/api/v1/nutrition"]
    Main --> ChatRouter["/api/v1/users"]
    Main --> AgentRouter["/api/v1/chat"]
    Main --> DietRouter["/api/v1/diet"]

    NutritionRouter --> NutritionServiceV2
    AgentRouter --> LangChainAgent["🤖 LangChain Agent"]
    LangChainAgent --> AgentTools["Tools: analyse_image\nanalyse_food_description"]
    AgentTools --> NutritionServiceV2
    ChatRouter --> ChatFirestore
    DietRouter --> DietService

    NutritionServiceV2 --> FoodExtractor["FoodExtractorService"]
    NutritionServiceV2 --> SearchService
    NutritionServiceV2 --> LLMProvider["LLM Provider\n(Gemini / OpenRouter)"]
    DietService --> LLMProvider
    DietService --> DietFirestoreDB["DietFirestore"]

    FoodExtractor --> LLMProvider
    SearchService --> ExaAPI["🔍 Exa / DuckDuckGo"]

    ChatFirestore --> Firestore["🔥 Firestore DB"]
    DietFirestoreDB --> Firestore
```
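The router fan-out in the diagram can be sketched as a simple dispatch table. The handlers below are illustrative stand-ins for the real FastAPI routers and services, not the backend's actual code:

```python
# Illustrative sketch of the gateway fan-out: each route prefix maps to the
# service shown in the diagram. The real backend mounts FastAPI routers on
# main.py; here each service is stubbed as a plain callable.

def nutrition_service_v2(payload):
    # Entry point of the 3-step pipeline (extraction -> search -> synthesis)
    return {"service": "NutritionServiceV2", "payload": payload}

def chat_firestore(payload):
    # Chat history persistence backed by Firestore
    return {"service": "ChatFirestore", "payload": payload}

def langchain_agent(payload):
    # The agent can itself call nutrition_service_v2 through its tools
    return {"service": "LangChainAgent", "payload": payload}

def diet_service(payload):
    # Weekly diet plan generation
    return {"service": "DietService", "payload": payload}

# Route prefix -> handler, mirroring the four routers in the diagram
ROUTES = {
    "/api/v1/nutrition": nutrition_service_v2,
    "/api/v1/users": chat_firestore,
    "/api/v1/chat": langchain_agent,
    "/api/v1/diet": diet_service,
}

def dispatch(path, payload):
    # First matching prefix wins, as with mounted routers
    for prefix, handler in ROUTES.items():
        if path.startswith(prefix):
            return handler(payload)
    raise KeyError(path)
```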

## 🧠 AI Intelligence & Decision Logic

### 1. ReAct Agent Decision Flow

The backend operates as a Reasoning + Acting (ReAct) agent. It doesn't just respond; it evaluates user intent, selects specialized tools, and iterates to find the most accurate facts.

```mermaid
graph TD
    User["👤 User Input\n(Chat/Image)"] --> Context["📋 Context Builder\n(Preferences + Allergies + Goals)"]
    Context --> Brain["🧠 LLM Controller\n(ReAct State Graph)"]

    Brain --> Decision{"Is this food-related?"}

    Decision -- "No / Simple Q&A" --> Direct["Direct Friendly Answer"]
    Decision -- "Yes / Needs Analysis" --> ToolSelection["🛠️ Tool Selection"]

    ToolSelection -- "Image Provided" --> ToolA["📸 analyse_image"]
    ToolSelection -- "Text Description" --> ToolB["📝 analyse_food_description"]

    ToolA --> Pipe["🧪 Nutrition Pipeline"]
    ToolB --> Pipe

    Pipe --> Observation["🔍 Tool Observation\n(Structured Data)"]
    Observation --> Brain

    Brain --> Final["🎁 Final Personalized Response"]
```
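A minimal sketch of this loop, with the LLM controller and both tools stubbed out (all names, rules, and return values here are illustrative, not the backend's actual API):

```python
# Sketch of the ReAct flow: decide between a direct answer and a tool call,
# feed the tool's observation back, then produce the final response.

def looks_food_related(text):
    # Stands in for the LLM controller's "Is this food-related?" decision
    return any(w in text for w in ("eat", "meal", "food", "calorie"))

def analyse_image(image):
    # Stub for the image-analysis tool
    return {"items": ["grilled chicken", "rice"], "source": "image"}

def analyse_food_description(text):
    # Stub for the text-analysis tool
    return {"items": text.split(" and "), "source": "text"}

def react_agent(user_input, max_steps=3):
    observations = []
    for _ in range(max_steps):
        # Tool selection, stubbed with simple rules instead of an LLM
        if "image" in user_input:
            observations.append(analyse_image(user_input["image"]))
        elif looks_food_related(user_input["text"]):
            observations.append(analyse_food_description(user_input["text"]))
        else:
            return {"answer": "direct friendly answer", "observations": []}
        # One observation suffices for this sketch; a real agent lets the
        # LLM decide whether to iterate with another tool call
        return {"answer": "personalized response", "observations": observations}
```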

### 2. 🧪 3-Step Nutrition Analysis Pipeline

To minimize hallucinated nutrition data, NomAI uses a web-grounded pipeline:

  1. Identification: Detection of food items & generation of enriched search queries.
  2. Web Grounding: Targeted searches (Exa/DuckDuckGo) for authoritative USDA/FDA or brand data.
  3. Multimodal Synthesis: Synthesis of Actual Image + Web Facts + User Prompt into structured nutritional data.
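The three steps compose roughly as follows. The stub functions stand in for FoodExtractorService, the search provider (Exa/DuckDuckGo), and the LLM synthesis call; all bodies and return values are illustrative:

```python
def extract_foods(image_or_text):
    # Step 1: identification -- the real FoodExtractorService calls an LLM
    # to detect food items and build enriched search queries
    return ["oatmeal", "banana"]

def web_search(query):
    # Step 2: web grounding -- stands in for Exa / DuckDuckGo results
    # targeting authoritative USDA/FDA or brand data
    return [{"query": query, "snippet": f"USDA data for {query}"}]

def synthesize(foods, evidence, user_prompt):
    # Step 3: multimodal synthesis -- stands in for the LLM call that merges
    # the image, web facts, and user prompt into structured nutrition data
    return {
        "foods": foods,
        "sources": [e["snippet"] for e in evidence],
        "prompt": user_prompt,
    }

def nutrition_pipeline(image_or_text, user_prompt=""):
    foods = extract_foods(image_or_text)
    evidence = []
    for food in foods:
        evidence.extend(web_search(f"{food} nutrition facts"))
    return synthesize(foods, evidence, user_prompt)
```

Grounding the synthesis step in fetched evidence, rather than asking the LLM for nutrition numbers directly, is what keeps the output anchored to real data.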

### 3. 📅 Diet Plan Generation (Carb Cycling)

Rather than using static daily targets, the planner varies macro targets across the week for metabolic variety.

```mermaid
graph TD
    Input["📥 DietInput Payload"] --> Calc["⚖️ Target Calculator"]
    Calc --> Patterns["🔄 Carb Cycling Logic\n(Cyclical Macro Variation)"]
    Patterns --> Loop["🔁 7-Day Generation Loop"]
    Loop --> DayPrompt["📝 Prompt + Used Foods Tracking"]
    DayPrompt --> LLMCall["🤖 LLM Provider"]
    LLMCall --> Variety["🥗 Update Diversity Score"]
    Variety -- "Next Day" --> Loop
    Variety -- "End" --> Aggregator["📊 Weekly Aggregator"]
```
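The carb-cycling step can be sketched as cyclical variation around a weekly baseline: high, medium, and low carb days whose multipliers average out to the baseline, while protein stays fixed. The pattern and multipliers below are illustrative assumptions, not the backend's actual formula:

```python
# Illustrative carb-cycling targets: the 7-day multiplier pattern averages
# to 1.0, so the weekly carb total still matches the baseline.
CARB_PATTERN = ["high", "medium", "low", "medium", "high", "low", "medium"]
CARB_MULTIPLIER = {"high": 1.25, "medium": 1.0, "low": 0.75}

def weekly_targets(baseline_carbs_g, protein_g):
    days = []
    for day, level in enumerate(CARB_PATTERN, start=1):
        days.append({
            "day": day,
            "carb_level": level,
            "carbs_g": round(baseline_carbs_g * CARB_MULTIPLIER[level]),
            "protein_g": protein_g,  # protein held constant across the cycle
        })
    return days
```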

## 🚀 Setup & Deployment

### 1. Backend Configuration

The backend acts as the AI Gateway for the app.

- **Source:** https://github.com/Pavel401/NomAI
- **Deployment:** We recommend Railway or GCP Cloud Run.
- **Environment variables:**
  - `PROVIDER_TYPE`: `gemini` or `openrouter`.
  - `GOOGLE_API_KEY`: for Gemini Vision analysis.
  - `SEARCH_PROVIDER`: `exa` or `duckduckgo` for web grounding.
  - `FIRESTORE_DATABASE_ID`: set to `mealai`.
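For reference, the backend might read these variables along these lines (a minimal sketch; the defaults and function name here are illustrative assumptions, not the backend's actual settings module):

```python
import os

def load_settings():
    # Read the environment variables listed above, with fallback defaults
    return {
        "provider_type": os.environ.get("PROVIDER_TYPE", "gemini"),
        "google_api_key": os.environ.get("GOOGLE_API_KEY", ""),
        "search_provider": os.environ.get("SEARCH_PROVIDER", "duckduckgo"),
        "firestore_database_id": os.environ.get("FIRESTORE_DATABASE_ID", "mealai"),
    }
```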

### 2. Firebase Core Services

NomAI relies on Firebase for real-time sync and security.

- **Authentication:** Enable Email and Google providers.
- **Firestore:** Initialize in production mode.
- **Remote Config:** Add the `base_url` key pointing to your deployed backend.

### 3. Client Execution (FVM)

NomAI works on iOS, Android, and Web.

```shell
# 1. SDK isolation
fvm use 3.35.0

# 2. Platform configs
# - Android: google-services.json
# - iOS: GoogleService-Info.plist
# - Web: firebase-config script

# 3. Compile & run
fvm flutter pub get
fvm flutter run           # Mobile
fvm flutter run -d chrome # Web
```

## 📦 Build & Release

```shell
fvm flutter build apk --release    # Android
fvm flutter build ios --release    # iOS
fvm flutter build web --release    # Web
```

## 📂 Folder Structure

```
lib/
├── app/
│   ├── components/         # Reusable UI components (Buttons, Modals, Inputs)
│   ├── constants/          # Application theme, colors, and API endpoints
│   ├── models/             # Base data models and JSON serialization
│   ├── modules/            # Feature-centric modular architecture
│   │   ├── Analytics/      # Data visualization and dietary metrics
│   │   ├── Auth/           # Firebase Authentication flows
│   │   ├── Chat/           # Conversational AI Assistant
│   │   ├── DashBoard/      # Core metrics and daily logging summary
│   │   ├── Diet/           # Weekly plan generation and alternates
│   │   ├── Onboarding/     # User profiling and goal setting
│   │   └── Scanner/        # Real-time food recognition using Vision AI
│   ├── providers/          # Infrastructure services (RemoteConfig, BLoC)
│   ├── repo/               # Data layer: Firebase SDKs and FastAPI integrations
│   ├── services/           # State monitoring and global domain logic
│   └── utility/            # Helper utilities (Registry, Haptics, Formatting)
├── firebase_options.dart   # Platform-specific Firebase settings
└── main.dart               # App entry point
assets/
├── lottie/                 # High-performance micro-animations
├── png/                    # Branding assets
└── svg/                    # Resolution-independent iconography
```