Filter Pruning with Convolutional Approximation Small Model Framework

dc.contributor.author: Monthon Intraraprasit
dc.contributor.author: Orachat Chitsobhuk
dc.date.accessioned: 2025-07-21T06:09:49Z
dc.date.issued: 2023-09-05
dc.description.abstract: Convolutional neural networks (CNNs) are extensively utilized in computer vision; however, they impose heavy demands on computation time and storage. Filter pruning is a well-known approach to reducing these costs, but fine-tuning a pruned model requires substantial computing power and a large retraining dataset. To restore model performance after pruning each layer, we propose the Convolutional Approximation Small Model (CASM) framework. CASM trains a compact model built from the remaining kernels, optimizing their weights so that the restored feature maps resemble those produced by the original kernels. This method has lower complexity and requires fewer training samples than basic fine-tuning. We evaluate CASM on the CIFAR-10 and ImageNet datasets using the VGG-16 and ResNet-50 models. The experimental results demonstrate that CASM surpasses the basic fine-tuning framework: it is 3.3× faster, requires a smaller dataset for performance recovery after pruning, and achieves higher accuracy.
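Below is a minimal sketch of the per-layer recovery idea described in the abstract, assuming PyTorch. One convolutional layer is pruned by L1 filter norm, and the remaining kernels are then optimized so the pruned layer's feature maps approximate those of the original layer on a small calibration batch. Every function name, hyperparameter, and the calibration setup here is illustrative, not the paper's exact CASM procedure.

```python
# Sketch of CASM-style recovery: prune filters from one conv layer, then
# train the remaining kernels to reproduce the original feature maps using
# only a small sample batch instead of full-network fine-tuning.
# All names and hyperparameters are hypothetical.
import torch
import torch.nn as nn

def prune_filters_l1(conv: nn.Conv2d, keep_ratio: float):
    """Keep the filters with the largest L1 norms; return the pruned
    layer and the indices of the kept filters."""
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    scores = conv.weight.detach().abs().sum(dim=(1, 2, 3))  # L1 norm per filter
    keep = torch.topk(scores, n_keep).indices.sort().values
    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    with torch.no_grad():
        pruned.weight.copy_(conv.weight[keep])
        if conv.bias is not None:
            pruned.bias.copy_(conv.bias[keep])
    return pruned, keep

def recover_layer(original: nn.Conv2d, pruned: nn.Conv2d,
                  keep: torch.Tensor, samples: torch.Tensor,
                  steps: int = 200, lr: float = 1e-3) -> None:
    """Optimize the remaining kernels so the pruned layer reproduces the
    original layer's feature maps (kept channels) on a small sample set."""
    with torch.no_grad():
        target = original(samples)[:, keep]  # reference feature maps
    opt = torch.optim.Adam(pruned.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.mse_loss(pruned(samples), target)
        loss.backward()
        opt.step()

# Usage: prune a VGG-style layer to half its filters and recover it
# with a 64-image calibration batch (random data stands in for images).
conv = nn.Conv2d(64, 128, kernel_size=3, padding=1)
pruned, keep = prune_filters_l1(conv, keep_ratio=0.5)
calib = torch.randn(64, 64, 32, 32)
recover_layer(conv, pruned, keep, calib)
```

In a full pipeline the next layer's input channels would also be sliced to match `keep`; that bookkeeping, and the paper's actual choice of reconstruction target, are left out of this sketch.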
dc.identifier.doi: 10.3390/computation11090176
dc.identifier.uri: https://dspace.kmitl.ac.th/handle/123456789/12791
dc.subject: Pruning
dc.subject: Feature map
dc.subject: FLOPs
dc.subject.classification: Advanced Neural Network Applications
dc.title: Filter Pruning with Convolutional Approximation Small Model Framework
dc.type: Article
