
Quantimb-Lab/DeiT_Covid


Data-Efficient Training of Pure Vision Transformers for the Task of Chest X-ray Abnormality Detection Using Knowledge Distillation

The PyTorch implementation of the paper "Data-Efficient Training of Pure Vision Transformers for the Task of Chest X-ray Abnormality Detection Using Knowledge Distillation", presented at the 44th International Engineering in Medicine and Biology Conference (EMBC), Glasgow, United Kingdom.

The code and pretrained models are available in this repository. The code builds on the DeiT codebase available on GitHub.
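DeiT trains a pure vision transformer data-efficiently by distilling from a teacher network through a dedicated distillation token. As a rough sketch of the hard-label variant of that objective (function and argument names here are illustrative, not taken from this repository):

```python
import torch
import torch.nn.functional as F

def hard_distillation_loss(student_cls_logits: torch.Tensor,
                           student_dist_logits: torch.Tensor,
                           teacher_logits: torch.Tensor,
                           targets: torch.Tensor) -> torch.Tensor:
    """DeiT-style hard-label distillation loss (Touvron et al. [3]).

    The student's class-token logits are trained against the ground-truth
    labels, while its distillation-token logits are trained against the
    teacher's hard predictions; the two losses are averaged.
    """
    # Cross-entropy of the class token against the true labels
    ce_loss = F.cross_entropy(student_cls_logits, targets)
    # The teacher's argmax predictions act as pseudo-labels
    # for the distillation token
    teacher_labels = teacher_logits.argmax(dim=1)
    dist_loss = F.cross_entropy(student_dist_logits, teacher_labels)
    return 0.5 * ce_loss + 0.5 * dist_loss
```

At inference time, DeiT averages the predictions of the class and distillation heads.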

Datasets

COVID-19 Image Data Collection

ChestX-ray8: Hospital-scale Chest X-ray Database and Benchmarks on Weakly-Supervised Classification and Localization of Common Thorax Diseases

CheXNet: Radiologist-Level Pneumonia Detection on Chest X-Rays with Deep Learning

Labeled Optical Coherence Tomography (OCT) and Chest X-Ray Images for Classification

Models

All teacher and student models are available here and here
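The linked checkpoints can be restored into a matching architecture with `load_state_dict`. A minimal sketch, assuming the DeiT convention of storing the weights under a `"model"` key (the helper name and file path are placeholders, not from this repository):

```python
import torch
from torch import nn

def load_checkpoint(model: nn.Module, path: str) -> nn.Module:
    """Restore a saved teacher/student checkpoint into `model`.

    DeiT-style checkpoints typically nest the weights under a "model" key;
    fall back to treating the file as a bare state dict otherwise.
    """
    state = torch.load(path, map_location="cpu")
    model.load_state_dict(state.get("model", state))
    model.eval()  # switch off dropout etc. for evaluation
    return model
```

The model passed in must have the same architecture (and number of output classes) as the one used to produce the checkpoint.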

References

[1] A. Vaswani et al., “Attention Is All You Need,” in Advances in Neural Information Processing Systems, 2017, vol. 30, [Online]. Available: https://proceedings.neurips.cc/paper/2017/file/3f5ee243547dee91fbd053c1c4a845aa-Paper.pdf.

[2] A. Dosovitskiy et al., “An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale,” Oct. 2020, [Online]. Available: http://arxiv.org/abs/2010.11929.

[3] H. Touvron, M. Cord, M. Douze, F. Massa, A. Sablayrolles, and H. Jégou, “Training data-efficient image transformers & distillation through attention,” Dec. 2020, [Online]. Available: http://arxiv.org/abs/2012.12877.
