The Segment Anything Model (SAM) is a cutting-edge, promptable deep learning model designed for 2D natural image segmentation. However, medical imaging, particularly brain tumor segmentation 🧠, presents unique challenges, including highly irregular tumor boundaries, variations in intensity, and the complex, multidimensional nature of medical scans.

To address these challenges, this project extends SAM's segmentation capabilities by integrating Parameter-Efficient Fine-Tuning (PEFT) techniques and training on the BraTS Intracranial Meningioma 2023 dataset. Furthermore, we introduce U-SAM, a novel framework that combines SAM's segmentation power with the precision and structural awareness of U-Net architectures, thereby enhancing segmentation accuracy for complex medical imaging tasks.
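As an illustration of how a U-Net prediction can drive SAM, the `--bbox_emulation` flag used by the U-SAM scripts implies that a bounding box is derived from the U-Net mask and supplied to SAM as a prompt. The sketch below shows one way such a box could be computed; `mask_to_bbox` is a hypothetical helper for illustration only, not this repo's actual implementation (see `U-SAM/U-SAM-Combo.py` for that).

```python
def mask_to_bbox(mask):
    """Tight (x_min, y_min, x_max, y_max) box around nonzero pixels of a 2D mask."""
    coords = [(x, y) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    if not coords:
        return None  # empty mask: no box prompt can be derived
    xs, ys = zip(*coords)
    return (min(xs), min(ys), max(xs), max(ys))

# A toy 4x4 binary mask with foreground in the middle:
mask = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 0]]
print(mask_to_bbox(mask))  # (1, 1, 2, 2)
```

SAM's predictor accepts boxes in this `(x_min, y_min, x_max, y_max)` layout, which is why the sketch returns that ordering.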
Given the difficulties encountered in programmatic data retrieval from Synapse, I have made our curated dataset (including our specific training/test split) and model checkpoints available for easy access: U-SAM Project SharePoint

If you experience any accessibility issues with the dataset or checkpoints, please feel free to contact me.
```
├── U-SAM/                      # Scripts for running and evaluating U-SAM
│   ├── U1-SAM2.py              > Runs U1-SAM2
│   └── U-SAM-Combo.py          > Runs the U-SAM combination
├── U-Net/                      # U-Net model scripts
│   ├── unet_test.py            > Tests U-Net
│   ├── unet_train.py           > Trains U-Net
│   └── unet3d.py               > 3D U-Net architecture implementation
├── SAM/                        # SAM model scripts
│   ├── sam_test.py             > Tests SAM
│   ├── sam_train.py            > Trains SAM
│   └── sam_tuning.py           > Tunes SAM hyperparameters
├── dataset/                    # Dataset folders (must be downloaded separately)
│   ├── BraTS-MEN-Test/
│   ├── BraTS-MEN-Train/
│   └── BraTS-MEN-Validate/
├── checkpoints/                # Pre-trained model checkpoints (must be downloaded separately)
│   ├── peft-sam_epoch_125.pth.tar
│   ├── unet_epoch_100.pth.tar
│   └── sam_vit_b_01ec64.pth
├── utilities/                  # Utility scripts
│   ├── brats_dataset.py        > Custom dataset loader for BraTS
│   ├── general_utils.py        > General helper functions
│   ├── model_utils.py          > Model-related utilities
│   ├── plot.py                 > Visualization functions
│   └── preprocess.py           > Preprocessing functions
└── requirements.txt            # Dependencies list
```
To set up the environment for this project:
- Create a Virtual Environment (recommended):

```shell
# Using venv
python3 -m venv samseg-env
source samseg-env/bin/activate

# Using conda
conda create --name samseg-env python=3.8
conda activate samseg-env
```

- Install Required Libraries:

```shell
pip install -r requirements.txt
```
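After installation, a quick stdlib-only sanity check can confirm that key packages actually resolve in the active environment. The helper and package names below are illustrative, not part of this repo; substitute the real import names from `requirements.txt`.

```python
import importlib.util

def missing_packages(names):
    """Return the subset of `names` that cannot be imported in this environment."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# Hypothetical example list; replace with the project's actual dependencies.
print(missing_packages(["json", "definitely_not_a_real_pkg"]))  # ['definitely_not_a_real_pkg']
```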
If you run into module import issues, ensure you're in the root project directory and run:

```shell
export PYTHONPATH=$(pwd):$PYTHONPATH
```

- Tune SAM:

```shell
python3 SAM/sam_tuning.py --checkpoint_path checkpoints/sam_vit_b_01ec64.pth \
    --output_dir SAM/tuning_outputs --n_trials 50
```

- Train SAM:

```shell
python3 SAM/sam_train.py --checkpoint_path checkpoints/peft-sam_epoch_125.pth.tar \
    --output_dir SAM/training_outputs --init_lr 0.00006 \
    --decay_rate 0.995 --max_epoch 50 --batch_size 1 \
    --samples_per_epoch 150 --backup_interval 10 \
    --validate --validation_interval 5
```

- Vanilla SAM:
```shell
python3 SAM/sam_test.py --vanilla --visualise --results_dir SAM/vanilla_test_results --batch_size 1
```
- PEFT-SAM:
```shell
python3 SAM/sam_test.py --peft_path checkpoints/peft-sam_epoch_125.pth.tar \
    --visualise --results_dir SAM/peft_test_results --batch_size 1
```
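A note on the `--init_lr` and `--decay_rate` flags used by the training commands: the values (`0.00006`, `0.995`) suggest an exponential decay schedule. Assuming the decay is applied once per epoch (an assumption — the exact schedule lives in the training scripts), the effective rate at epoch `t` would be:

```python
def lr_at_epoch(init_lr, decay_rate, epoch):
    """Exponentially decayed learning rate: init_lr * decay_rate ** epoch."""
    return init_lr * decay_rate ** epoch

# With the README's defaults, after the full 50 epochs:
print(lr_at_epoch(0.00006, 0.995, 50))  # ~4.67e-05, i.e. ~78% of the initial rate
```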
- Train U-Net:

```shell
python3 UNet/unet_train.py --checkpoint_path checkpoints/unet_epoch_100.pth.tar \
    --output_dir UNet/training_outputs --init_lr 0.00006 \
    --decay_rate 0.995 --max_epoch 50 --batch_size 1 \
    --samples_per_epoch 150 --backup_interval 10 \
    --validate --validation_interval 5
```

- Test U-Net:

```shell
python3 UNet/unet_test.py --checkpoint_path checkpoints/unet_epoch_100.pth.tar \
    --visualise --results_dir UNet/unet_test_results --batch_size 1
```

- Run U-SAM Combo:

```shell
python3 U-SAM/U-SAM-Combo.py --unet_checkpoint checkpoints/unet_epoch_100.pth.tar \
    --sam_checkpoint checkpoints/peft-sam_epoch_125.pth.tar \
    --visualise --output_dir U-SAM/u-sam-combo-results --bbox_emulation
```

- Run U1-SAM2:

```shell
python3 U-SAM/U1-SAM2.py --unet_checkpoint checkpoints/unet_epoch_100.pth.tar \
    --sam_checkpoint checkpoints/peft-sam_epoch_125.pth.tar \
    --visualise --output_dir U-SAM/u1-sam2-results --bbox_emulation
```

- Dataset & Checkpoints: The dataset must be downloaded separately from the SharePoint link above.
- Ensure paths are correct before running any scripts.
- GPU Support: Requires CUDA for optimal performance.
- Tested Environment: A100 Core GPU on UCT's HPC.
- Citations: Full citations can be found in the accompanying paper and code, with special thanks to the Kurtlab BraTS 2023 submission for their adaptable project setup.
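For context on evaluation: BraTS-style segmentation results are conventionally scored with the Dice coefficient, which measures overlap between predicted and ground-truth masks. The snippet below is a minimal, hypothetical illustration of that metric, not the implementation used by this repo's test scripts.

```python
def dice_score(pred, target):
    """Dice coefficient between two flat binary masks (iterables of 0/1)."""
    intersection = sum(p * t for p, t in zip(pred, target))
    total = sum(pred) + sum(target)
    # Convention: two empty masks count as a perfect match.
    return 2.0 * intersection / total if total else 1.0

print(dice_score([1, 1, 0, 0], [1, 0, 0, 0]))  # 2*1 / (2+1) ~ 0.667
```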
Happy segmenting! 🧠 For any questions or comments about this project, please don't hesitate to reach out!