

Darjusch edited this page May 16, 2021 · 4 revisions

Welcome to the Goodbuy Wiki!

Goodbuy Backend Repository

Introduction

This is the repository of the Goodbuy project (CODE University, SS 2021). Detailed information can be found here.

Motivation

Consume for a better world! Project Goodbuy is a web application that guides customers who want information to support sustainable consumption patterns.

Many products are made by corporations with questionable practices, and it is often hard to tell which company a product actually comes from. These corporations also own so many subsidiaries that tracing a product's origin is difficult.

Our goal was to make this information more accessible. We came up with a user story in which a user scans a product while shopping and our service returns information about that exact product based on its barcode, i.e. its EAN number.

Product Basic Usage

Our product enables the user to do the following:

  1. Scan the EAN of the product
  2. Send the request with EAN
  3. Check the database to see if the product already exists
  4. If yes, send back the product's information
  5. If no, save the product and trigger the scraper so the result is available for later searches
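The steps above can be sketched as a small lookup function. This is illustrative only: the interface and function names (`ProductStore`, `lookupOrQueue`, `triggerScraper`) are assumptions, not the project's actual code.

```typescript
interface Product {
  ean: string;
  name?: string;
  scraped: boolean;
}

// Minimal storage interface standing in for the MongoDB collection.
interface ProductStore {
  findByEan(ean: string): Product | undefined;
  save(p: Product): void;
}

// Returns the product if we already know it; otherwise stores a stub
// and signals that the scraper should be triggered for this EAN.
function lookupOrQueue(
  ean: string,
  store: ProductStore,
  triggerScraper: (ean: string) => void,
): { found: boolean; product: Product } {
  const existing = store.findByEan(ean);
  if (existing) {
    return { found: true, product: existing }; // step 4: product known, return it
  }
  const stub: Product = { ean, scraped: false };
  store.save(stub);      // step 5: save the product...
  triggerScraper(ean);   // ...and trigger the scraper for a later search
  return { found: false, product: stub };
}
```

A second scan of the same EAN would then hit the stored (or meanwhile scraped) entry instead of re-triggering the scraper.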

What is it?

The backend infrastructure of this project mainly consists of four microservices: the REST API, the RabbitMQ scraper, the monitoring server, and the MongoDB database. We also have an image recognition service, though it will be connected later on.

Build Status

[Badges: Build Status · Maintenance · Issues]

Built with

[Badges: TypeScript · Node.js · Express · MongoDB · AWS]

…and RabbitMQ and Docker too! (couldn't find the badges :D)

Setup

Prerequisite:

  • Env variables - available in PassCamp, or contact one of our contributors via email or the Discord channel!
  • Node.js ^14.16.1
  • MongoDB Community ^4.4.1

Installation

You can run the REST API server locally.

  1. Clone the source code
git clone https://github.com/code-goodbuy/goodbuy-nodejs && cd goodbuy-nodejs
  2. Check out the dev branch
git checkout dev
  3. Add the global ENV variables (list shared via PassCamp) - mandatory
  4. Install Node.js (this also installs npm)
  5. Install MongoDB as a local database and run the service daemon
  • Linux
sudo service mongodb start
  • macOS
brew services start mongodb-community@4.4
  6. Install the project dependencies
npm i
  7. To compile the TypeScript, move to the /src folder and run
npx tsc -w -p .
  8. Run the server locally
npm run local

Overview of the Structure

[Diagram: overview of the backend structure]

NGINX Load Balancer

  • It is currently connected to the REST API server and the scraper, but it can be connected to more API servers later if necessary. It is supposed to handle all requests from the client and distribute the load across multiple servers under heavy traffic. For the current iteration we don't strictly need the load balancer, but we placed it there with scalability in mind.
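A minimal sketch of such an nginx setup is shown below. The upstream name, addresses, and certificate paths are assumptions for illustration, not the project's actual configuration.

```nginx
# Illustrative load-balancer sketch -- names, ports, and paths are assumptions.
upstream api_servers {
    server 10.0.1.10:3000;   # REST API instance
    # server 10.0.1.11:3000; # further instances can be added here for scaling
}

server {
    listen 443 ssl;
    server_name gb-be.de;

    # Placeholder certificate paths.
    ssl_certificate     /etc/letsencrypt/live/gb-be.de/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/gb-be.de/privkey.pem;

    location / {
        proxy_pass http://api_servers;        # distribute requests to the upstream
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

With a single `server` entry in the upstream this behaves as a plain reverse proxy; adding entries enables round-robin load balancing without touching the rest of the config.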

REST API Server

  • As the diagram above shows, it is the center of the backend system. It serves purely as an API server: it receives REST API requests from the client and responds with data from MongoDB accordingly. Every API call is potentially saved in /log and transferred to a centralized monitoring server for traffic analysis.

  • Detailed API documentation can be found here on SwaggerHub.

RabbitMQ Message Broker

  • The RabbitMQ scraper server fetches data from the OpenFoodFacts database and patches it into the matching product collection in MongoDB. It is only triggered when a user requests a product that does not yet exist in our database.
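The worker side of this can be sketched with two small helpers. This is illustrative only: the message shape, the EAN validation rule, and the OpenFoodFacts field names (`product_name`, `brands`) are assumptions; in the real service these would sit inside an amqplib `consume()` callback.

```typescript
// Minimal shape of an OpenFoodFacts record, as assumed here.
interface OffRecord {
  product_name?: string;
  brands?: string;
}

// Pure helper: map an OpenFoodFacts record onto the update we would
// patch into the matching product document in MongoDB.
function toProductUpdate(ean: string, off: OffRecord) {
  return {
    ean,
    name: off.product_name ?? "unknown",
    brand: off.brands ?? "unknown",
    scraped: true, // mark the product as no longer pending
  };
}

// Parse a raw queue message ("scrape this EAN") defensively:
// accept only 8-14 digit barcodes, reject anything else.
function parseScrapeMessage(raw: string): string | null {
  const ean = raw.trim();
  return /^\d{8,14}$/.test(ean) ? ean : null;
}
```

Keeping the mapping and parsing pure makes them unit-testable without a running broker; only the thin consume/ack wiring would need RabbitMQ.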

Prometheus/Grafana Monitoring Server

  • The name is self-explanatory: it gathers hardware metrics of every AWS instance from the node-exporter agent, along with API calls, into one place for observability purposes. You can set up notifications on certain metrics to be alerted in a timely manner.
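A scrape configuration for this setup might look roughly like the fragment below. The job names and target hostnames are assumptions for illustration; only the node-exporter default port (9100) is standard.

```yaml
# prometheus.yml sketch -- job names and targets are assumptions.
scrape_configs:
  - job_name: "node-exporter"        # hardware metrics from every instance
    static_configs:
      - targets:
          - "api-server:9100"
          - "scraper:9100"
          - "monitoring:9100"
  - job_name: "rest-api"             # application-level metrics, if exposed
    metrics_path: /metrics
    static_configs:
      - targets: ["api-server:3000"]
```

Grafana then uses Prometheus as a data source, and alert rules on these metrics drive the notifications mentioned above.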

MongoDB Atlas

  • Currently, we host our database on MongoDB Atlas as a platform. It serves mainly two instances: the team Goodbuy database and the OpenFoodFacts database, which we cloned from here.

Tests

To run the tests:

$ npm run test

Security Measures

OWASP Threat Dragon Model

[Diagram: threat model overview]

This is a list of the cybersecurity measures we have implemented in our project. It addresses the major threats we assessed through our threat modeling.

  • Passwords hashed in the database
  • Email verification
  • Input type validation
  • Input size validation
  • JWT authentication token
  • JWT refresh token
  • VPN for ssh access to our VPC / EC2 instances
  • Inbound restrictions on every EC2 instance for every service running
  • NGINX reverse proxy
  • HTTPS (TLS)
  • Changed default passwords
  • Restricted users for MongoDB & RabbitMQ
  • Secured secrets and credentials in environment variables
  • Configured headers:
    • Content-Security-Policy: "default-src 'none'; frame-ancestors 'none'; base-uri 'self'; form-action 'self'; style-src-elem gb-be.de;"
    • Strict-Transport-Security: "max-age=31536000; includeSubDomains" always
    • X-Frame-Options: DENY
    • X-Content-Type-Options: nosniff
    • X-XSS-Protection: "1; mode=block"

That last point gives us an A+ grade in the Mozilla Observatory (link).
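In nginx, header configuration of this kind would look roughly like the fragment below (the header values follow the list above; the surrounding server block is assumed).

```nginx
# Sketch of the security-header configuration -- placement is illustrative.
add_header Content-Security-Policy "default-src 'none'; frame-ancestors 'none'; base-uri 'self'; form-action 'self'; style-src-elem gb-be.de;" always;
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
add_header X-Frame-Options "DENY" always;
add_header X-Content-Type-Options "nosniff" always;
add_header X-XSS-Protection "1; mode=block" always;
```

The `always` flag makes nginx send the headers on error responses too, not just on 2xx/3xx.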

MongoDB NoSQL Database Model

[Diagram: MongoDB NoSQL database model]

Notes

  • The database is centralized around the product collection
  • The EAN (barcode number) is globally unique, so it is used as the product ID and index
  • Brand/corporation is embedded inside the product collection
  • One user can have many products; this may change in an upcoming iteration
  • The product collection has a scraper_info field for enumerating products that have not been scraped yet
  • The product ID and state have a compound index to optimize query speed
  • The email field of the user collection is indexed to optimize query speed
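The document shapes implied by these notes can be sketched as TypeScript interfaces. Field names not stated in the notes (`passwordHash`, `state`, `products`, the embedded `Brand` fields) are assumptions for illustration.

```typescript
// Embedded in the product document, per the notes above.
interface Brand {
  name: string;
  corporation?: string;
}

interface ProductDoc {
  _id: string;                        // the EAN doubles as the product ID
  name: string;
  brand: Brand;                       // embedded, not referenced
  scraper_info: {
    state: "pending" | "scraped";     // compound-indexed together with the ID
  };
}

interface UserDoc {
  email: string;                      // indexed for query speed
  passwordHash: string;               // hashed, never the plain password
  products: string[];                 // EANs: one user, many products
}
```

Embedding the brand inside the product trades some duplication for single-read lookups by EAN, which matches the scan-centric usage pattern.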

Backup & Rollback Strategy

MongoDB Atlas Backup Status

  • A snapshot of the OpenFoodFacts cluster is saved into an AWS S3 bucket every 6 hours; the data can be restored via the MongoDB Atlas console if necessary.

What did we do:

Darjusch

Setting up the backend architecture with Anthony.

REST API: JWT authentication and authorisation; everything related to the product and authentication/authorisation routes, including validators, controllers, tests, and middleware; error handler.

Documenting the api: ( for my routes ) https://app.swaggerhub.com/apis/Goodbuy-node/Goodbuy/1.0.0#free

Architectural design of api: ( a bit outdated ) https://cacoo.com/diagrams/rnbFT6a4SGyJymzM/4FB51

Monitoring: Grafana + Prometheus

Logging: ( With Jongwoo )
Grafana + Loki + Promtail

Reverse Proxy: ( With Anthony ) 
Nginx

Credits

@5h3rr1ll - Anthony Sherrill
@Darjusch - Darjusch Schrand
@jwdotpark - Jongwoo Park
@d-pettersson - David Pettersson

Licence

GitHub license