Wiki Home
This is the repository of the Goodbuy Project (CODE University, SS 2021). Detailed information can be found here.
Consume for a better world! Project Goodbuy is about providing guidance to customers who need information for sustainable consumption patterns as a web application.
Many products are owned by questionable corporations, and because these corporations run so many subsidiaries, it is hard for a customer to notice which company a product actually comes from.
Our goal was to make this information more accessible. We came up with a user story in which users scan a product while shopping and receive the exact product information back, based on its barcode, i.e. its EAN number.
Our product enables the user to do the following:
- Scan the EAN of the product
- Send the request with EAN
- Check whether the product already exists in the database
- If yes, send back the product information
- If no, save the product and trigger the scraper for a later search result
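The flow above can be sketched in TypeScript roughly as follows. The names (`ProductRepo`, `ScrapeQueue`, `lookupProduct`) and the data shapes are illustrative, not the actual project API:

```typescript
// Illustrative sketch of the lookup flow; not the project's real interfaces.
interface Product { ean: string; name?: string; scraped: boolean }

interface ProductRepo {
  findByEan(ean: string): Promise<Product | null>;
  save(p: Product): Promise<void>;
}

interface ScrapeQueue {
  enqueue(ean: string): Promise<void>;
}

// Core of the last three steps: check the DB, return the product if present,
// otherwise persist a stub and trigger the scraper for later.
async function lookupProduct(
  ean: string, repo: ProductRepo, queue: ScrapeQueue
): Promise<{ found: boolean; product: Product }> {
  const existing = await repo.findByEan(ean);
  if (existing) return { found: true, product: existing };
  const stub: Product = { ean, scraped: false };
  await repo.save(stub);     // save for a later search result
  await queue.enqueue(ean);  // trigger the RabbitMQ scraper
  return { found: false, product: stub };
}
```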
The backend infrastructure of this project mainly consists of four microservices - REST API, RabbitMQ scraper, monitoring server, and MongoDB database. We also have an image recognition service server, though it will be connected later on.
..and RabbitMQ, Docker too! (can't find the badge :D)
Prerequisite:
- Env variables - PassCamp, or contact one of our contributors via email or our Discord channel!
- Node.js 14.16.1^
- MongoDB community 4.4.1^
You can run the REST API server locally.
- Clone the source code
git clone https://github.com/code-goodbuy/goodbuy-nodejs && cd goodbuy-nodejs
- Checkout the dev branch
git checkout dev
- Add global ENV variables (list shared via PassCamp) - mandatory
- Install Node.js (will also install NPM)
- Install MongoDB as a local database and run the service daemon
sudo service mongodb start
- macOS
brew services start mongodb-community@4.4
- Install project dependencies
npm i
- To compile TypeScript, move to the /src folder
npx tsc -w -p .
- Run the server locally
npm run local
- It is currently connected to the REST API server and the scraper, but it can be connected to more API servers later if necessary. It is supposed to handle all requests from the client and distribute the load across multiple servers under heavy traffic. The current iteration does not strictly need a load balancer, but we put it in place with scalability in mind.
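As a rough illustration, such a load balancer could be declared with an NGINX upstream block like the following. The server addresses, port, and certificate paths are placeholders, not our actual deployment:

```nginx
upstream api_servers {
    # Distribute requests across API instances (round-robin by default).
    server 10.0.1.10:3000;
    server 10.0.1.11:3000;
}

server {
    listen 443 ssl;
    # Forward every request to one of the upstream API servers.
    location / {
        proxy_pass http://api_servers;
    }
}
```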
As the diagram above shows, it is the center of the backend system. It serves purely as an API server: it receives REST API requests from the client and responds with data from MongoDB accordingly. Potentially, every API call is saved in /log and transferred to a centralized monitoring server for traffic analysis.
Detailed API documentation can be found here on SwaggerHub.
- The RabbitMQ scraper server fetches data from the OpenFoodFacts database into the matching product collection in MongoDB. It is only triggered when a user requests a product that does not yet exist in our database.
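A minimal sketch of how the API could trigger such a scrape job over RabbitMQ. The queue name and message shape are illustrative, and the channel interface is reduced to the one method the sketch needs (the real server would use an amqplib channel here):

```typescript
// Minimal subset of an AMQP channel, so the sketch stays self-contained;
// in the real project this would be an amqplib Channel.
interface AmqpChannelLike {
  sendToQueue(queue: string, content: Buffer): boolean;
}

const SCRAPE_QUEUE = "scrape-jobs"; // illustrative queue name

// Build the job message for a product that is missing from MongoDB.
function buildScrapeJob(ean: string): Buffer {
  return Buffer.from(JSON.stringify({ ean, requestedAt: Date.now() }));
}

// Called when a requested EAN is not found in the database.
function triggerScraper(ch: AmqpChannelLike, ean: string): void {
  ch.sendToQueue(SCRAPE_QUEUE, buildScrapeJob(ean));
}
```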
- The name is self-explanatory: it gathers hardware metrics of every AWS instance from the node-exporter agent, together with API calls, into one place for observability purposes. You can set up notifications on certain metrics to be alerted in a timely manner.
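For example, an alert on node-exporter metrics can be declared in a Prometheus rule file like this. The rule name, threshold, and duration are placeholders, not our actual alerting setup:

```yaml
groups:
  - name: instance-health
    rules:
      - alert: HighCpuUsage
        # Fires when a node reports over 90% CPU usage for 5 minutes,
        # derived from node-exporter's idle-CPU counter.
        expr: 100 - avg by (instance) (rate(node_cpu_seconds_total{mode="idle"}[5m])) * 100 > 90
        for: 5m
        labels:
          severity: warning
        annotations:
          summary: "High CPU on {{ $labels.instance }}"
```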
- Currently, we host our database on MongoDB Atlas. It serves mainly two databases: the team Goodbuy database and the OpenFoodFacts database that we cloned from here.
To run the tests:
$ npm run test
This is a list of the cybersecurity measures we have implemented in our project. It addresses the major threats we identified through our threat modeling.
- Passwords hashed in the database
- Email verification
- Input type validation
- Input size validation
- JWT authentication token
- JWT refresh token
- VPN for ssh access to our VPC / EC2 instances
- Inbound restrictions on every EC2 instance for every service running
- NGINX reverse proxy
- HTTPS (TLS)
- Changed default passwords
- Restricted users for MongoDB & RabbitMQ
- Secured secrets and credentials in environment variables
- Configured headers:
- Content-Security-Policy "default-src 'none'; frame-ancestors 'none'; base-uri 'self'; form-action 'self'; style-src-elem gb-be.de;"
Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
- X-Frame-Options: DENY
- X-Content-Type-Options nosniff
X-XSS-Protection "1; mode=block"
That last point gives us an A+ grade in the Mozilla Observatory (link).
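The first measure in the list, password hashing, can be sketched with Node's built-in crypto module. This is only an illustration of the idea (salted hash stored alongside the salt, constant-time comparison); the actual implementation may use a dedicated library such as bcrypt:

```typescript
import * as crypto from "crypto";

// Hash a password with a random per-user salt;
// the "salt:hash" string is what would be stored in the user document.
function hashPassword(password: string): string {
  const salt = crypto.randomBytes(16).toString("hex");
  const hash = crypto.scryptSync(password, salt, 64).toString("hex");
  return `${salt}:${hash}`;
}

// Recompute the hash with the stored salt and compare in constant time.
function verifyPassword(password: string, stored: string): boolean {
  const [salt, hash] = stored.split(":");
  const candidate = crypto.scryptSync(password, salt, 64);
  return crypto.timingSafeEqual(candidate, Buffer.from(hash, "hex"));
}
```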
- The database is centered around the product collection
- The EAN (barcode number) is unique across products; it is assigned as the product ID and index
- Brand/corporation is embedded inside the product collection
- One user can have many products; this may change in an upcoming iteration
- The product collection has a scraper_info field for enumerating unscraped product features
- Product ID and state have a compound index to optimize query speed
- The email field of the user collection is indexed to optimize query speed
- An OpenFoodFacts cluster snapshot is saved to an AWS S3 bucket every 6 hours; the data can be restored via the MongoDB Atlas console if necessary.
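The indexes described above could be created roughly like this in the mongo shell. The collection and field names are inferred from the description, not verified against the actual schema:

```javascript
// Compound index on product ID and scraper state to speed up queries.
db.products.createIndex({ _id: 1, "scraper_info.state": 1 });

// Index the user email field for fast lookups.
db.users.createIndex({ email: 1 }, { unique: true });
```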
Setting up the backend architecture with Anthony.
REST API: JWT authentication and authorisation. Everything related to Product and Authentication + Authorisation routes, including validators, controllers, tests, and middleware. Error handler.
Documenting the api: ( for my routes ) https://app.swaggerhub.com/apis/Goodbuy-node/Goodbuy/1.0.0#free
Architectural design of api: ( a bit outdated ) https://cacoo.com/diagrams/rnbFT6a4SGyJymzM/4FB51
Monitoring: Grafana + Prometheus
Logging: ( With Jongwoo ) Grafana + Loki + Promtail
Reverse Proxy: ( With Anthony ) Nginx
@5h3rr1ll - Anthony Sherrill
@Darjusch - Darjusch Schrand
@jwdotpark - Jongwoo Park
@d-pettersson - David Pettersson



