齐钰, 蔡春波, 向阳, et al. Study on Optimization of Transformer Model in Defect Recognition of Magnetic Powder Inspection[J]. Mechanical Science and Technology for Aerospace Engineering, 2026, 45(3): 509-517. DOI: 10.13433/j.cnki.1003-8728.20240072.
Transformer-based object detection models have gained prominence in recent years owing to their remarkable representational and learning capabilities across various domains. However, their large size and complex structure make them difficult to deploy on low-computation hardware. To address these challenges, a lightweight Transformer-based object detection model called Lite-Deformable-DETR is introduced. With minimal impact on accuracy, the proposed approach significantly reduces model complexity and computational load: it incorporates depthwise separable convolutions and inverted residual structures to lighten the Encoder layers and the Backbone network. Lite-Deformable-DETR is trained and compared with both Transformer-based models and traditional CNN models. On the target dataset, it achieves a 56% reduction in model size and a 43% reduction in computational workload, at the cost of only a 1.6% drop in mAP.
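The lightweighting described above rests largely on depthwise separable convolutions. A minimal arithmetic sketch (channel and kernel sizes are assumed for illustration, not taken from the paper) shows why replacing a standard convolution with a depthwise separable one shrinks the parameter count:

```python
def conv_params(c_in: int, c_out: int, k: int) -> int:
    """Parameter count of a standard k x k convolution (bias ignored)."""
    return c_in * c_out * k * k

def dsconv_params(c_in: int, c_out: int, k: int) -> int:
    """Depthwise separable conv: a depthwise k x k conv (one filter
    per input channel) followed by a pointwise 1 x 1 conv."""
    return c_in * k * k + c_in * c_out

# Example layer sizes (assumed, purely illustrative).
c_in, c_out, k = 256, 256, 3
std = conv_params(c_in, c_out, k)   # 256 * 256 * 9  = 589824
ds = dsconv_params(c_in, c_out, k)  # 2304 + 65536   = 67840
print(std, ds, round(ds / std, 3))  # -> 589824 67840 0.115
```

For these sizes the depthwise separable variant keeps roughly 11.5% of the parameters, which is the kind of saving that makes a Transformer backbone deployable on low-computation hardware.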