Optimizing the Hyperparameter Tuning of YOLOv5 for Breast Cancer Detection
Abstract
This research aims to find the best optimizer for the task while reducing training time. We optimized the YOLOv5s model, focusing on three optimizers: Stochastic Gradient Descent (SGD), Adaptive Moment Estimation (Adam), and Adam with Weight Decay Regularization (AdamW). The study used 1,471 mammogram images from the National Cancer Institute and Udonthani Cancer Hospital, Thailand. The mammograms were labeled into six classes (Masses Benign, Masses Malignant, Calcifications Benign, Calcifications Malignant, Associated Features Benign, and Associated Features Malignant) to classify the results accurately. We found that the SGD optimizer outperformed the others, achieving a mean average precision (mAP) of 0.91, a precision of 0.92, and a recall of 0.85, with the shortest training time of approximately 5.453 hours.
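The three optimizers compared above differ mainly in how they handle momentum and weight decay: SGD applies plain momentum, Adam folds L2 weight decay into its adaptive gradient statistics, and AdamW decouples the decay and applies it directly to the weights. The following is a minimal plain-Python sketch of a single parameter update for each; it is an illustration of the standard update rules, not the authors' training code, and the hyperparameter defaults (learning rates, momentum, decay) are illustrative assumptions rather than values from the paper.

```python
import math

def sgd_step(w, g, buf=0.0, lr=0.01, momentum=0.937):
    # SGD with momentum: v <- mu*v + g ; w <- w - lr*v
    buf = momentum * buf + g
    return w - lr * buf, buf

def adam_step(w, g, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8, wd=0.0):
    # Adam: any L2 weight decay is folded into the gradient,
    # so it is rescaled by the adaptive denominator below
    g = g + wd * w
    m = b1 * m + (1 - b1) * g          # first-moment estimate
    v = b2 * v + (1 - b2) * g * g      # second-moment estimate
    mhat = m / (1 - b1 ** t)           # bias correction
    vhat = v / (1 - b2 ** t)
    return w - lr * mhat / (math.sqrt(vhat) + eps), m, v

def adamw_step(w, g, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8, wd=0.01):
    # AdamW: weight decay is decoupled -- added to the update directly,
    # outside the adaptive gradient statistics
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    mhat = m / (1 - b1 ** t)
    vhat = v / (1 - b2 ** t)
    return w - lr * (mhat / (math.sqrt(vhat) + eps) + wd * w), m, v
```

In practice these correspond to the optimizer choices exposed by YOLOv5's training script, where the optimizer is selected by a command-line flag rather than implemented by hand.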