sbendimerad/mlflow_demo

REX: MONITORING AND DEPLOYING MODELS WITH MLFLOW - 2025/01/21

This project demonstrates an MLflow setup with two components:

  1. Deployment Folder: Contains a Dockerfile and launch.sh for deploying the MLflow server.
  2. Experimentation Folder: Contains a Jupyter Notebook to showcase experiments and MLflow tracking.

Project Structure

mlflow-demo/
├── deployment/
│   ├── Dockerfile
│   ├── launch.sh
├── experimentation/
│   ├── mlflow_experiment_notebook.ipynb
├── README.md

Instructions for Local Deployment

1. Build the Docker Image

Navigate to the deployment/ folder and build the Docker image:

docker build . -t mlflowserverlocal
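The repository's actual Dockerfile lives in deployment/ and is not reproduced here; as a rough sketch, a minimal MLflow server image might look like the following (the base image, the unpinned MLflow install, and the launch.sh entrypoint are assumptions, not the repo's file):

```dockerfile
# Sketch of a minimal MLflow tracking-server image (assumed contents).
FROM python:3.11-slim

# Install the MLflow server and its dependencies.
RUN pip install --no-cache-dir mlflow

WORKDIR /app

# launch.sh is expected to start the MLflow server.
COPY launch.sh /app/launch.sh
RUN chmod +x /app/launch.sh

EXPOSE 5000
ENTRYPOINT ["/app/launch.sh"]
```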

2. Run the Docker Container

Run the container locally with the following command:

docker run \
-p 5000:5000 \
-v $(pwd)/:$(pwd)/ \
mlflowserverlocal
  • Port Mapping: Maps the container's internal port 5000 to port 5000 on your local machine.
  • Volume Mapping: Mounts the current directory to the container, enabling file sharing.

Once the container is running, you can access the MLflow UI at:
http://localhost:5000
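The repository's launch.sh is likewise not shown here; a minimal sketch of such a script, assuming a local SQLite backend store and a ./mlruns artifact directory (both assumptions for local use), could be:

```sh
#!/bin/sh
# Hypothetical launch.sh: starts the MLflow tracking server on port 5000.
mlflow server \
  --host 0.0.0.0 \
  --port 5000 \
  --backend-store-uri sqlite:///mlflow.db \
  --default-artifact-root ./mlruns
```

Binding to 0.0.0.0 (rather than the default localhost) is what makes the server reachable through the container's mapped port.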


Deploying on the Cloud

To deploy this project on the cloud:

1. Push the Docker Image to a Cloud Container Registry

Tag your image with your registry URL and push it:

docker tag mlflowserverlocal <your-cloud-registry>/mlflowserver
docker push <your-cloud-registry>/mlflowserver

2. Set Up Cloud Resources

  • Cloud Storage: For saving and accessing MLflow artifacts and models. Examples:

    • AWS S3
    • Google Cloud Storage
    • Azure Blob Storage
  • Backend Store: Create a database to serve as the backend store required by MLflow (you can create this database on the same cloud provider used for artifact storage).
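Putting these two resources together, the server is launched against the cloud backend store and artifact store. As an illustrative sketch (the PostgreSQL URI, credentials, and S3 bucket are placeholders, not values from this project):

```sh
# Sketch: MLflow server backed by a cloud database and object storage.
# <db-host>, <password>, and <your-bucket> are hypothetical placeholders.
mlflow server \
  --host 0.0.0.0 \
  --port 5000 \
  --backend-store-uri postgresql://mlflow:<password>@<db-host>:5432/mlflow \
  --default-artifact-root s3://<your-bucket>/mlflow-artifacts
```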

3. Update Environment Variables

Modify the environment variables in the application to point to your cloud resources:

  • MLFLOW_TRACKING_URI: URL of your deployed API or MLflow server.
  • ARTIFACT_STORE: Cloud storage path for artifacts.
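For example, in a shell session (the server URL and bucket below are placeholders):

```sh
# Point MLflow clients at the deployed server and artifact store.
export MLFLOW_TRACKING_URI=https://<your-mlflow-server>
export ARTIFACT_STORE=s3://<your-bucket>/mlflow-artifacts
```

MLFLOW_TRACKING_URI is read automatically by the MLflow client; ARTIFACT_STORE is this project's own variable.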

4. Run the Application on Cloud

Deploy the containerized app to a cloud platform like AWS ECS, GCP Cloud Run, or Azure Container Instances.
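As one example, deploying the pushed image to GCP Cloud Run might look like this (the service name, region, and public-access flag are assumptions to adapt to your setup):

```sh
# Sketch: deploy the pushed image to Cloud Run, exposing the MLflow port.
gcloud run deploy mlflow-server \
  --image <your-cloud-registry>/mlflowserver \
  --port 5000 \
  --region europe-west1 \
  --allow-unauthenticated
```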


Key Notes

  • If you want to know more about cloud deployment, check out my article published on Towards Data Science.
