Automated Design of Attention-Integrated CNN Architectures


Akpolat M. Z., DEMİRCİ M. F.

9th International Symposium on Innovative Approaches in Smart Technologies, ISAS 2025, Gaziantep, Turkey, 27-28 June 2025 (Full Text)

  • Publication Type: Conference Paper / Full Text
  • DOI: 10.1109/isas66241.2025.11101897
  • City: Gaziantep
  • Country: Turkey
  • Keywords: Attention Mechanisms, CNN Architecture Optimization, Convolutional Block Attention Module (CBAM), Coordinate Attention (CA), Evolutionary Algorithms, Evolutionary Neural Architecture Search (ENAS), Multi-GPU Training, Squeeze-and-Excitation (SE)
  • Ankara Yıldırım Beyazıt University Affiliated: Yes

Abstract

This paper presents an enhanced evolutionary neural architecture search (ENAS) framework that integrates attention mechanisms into the search space as architectural modules. Specifically, we combine three widely used CNN building blocks (ResNet, DenseNet, and Inception) with three distinct attention mechanisms: Squeeze-and-Excitation (SE), Convolutional Block Attention Module (CBAM), and Coordinate Attention (CA). This results in a total of twelve modules: three base modules and nine attention-augmented variants. These components are treated as interchangeable elements within the search space and are assembled into diverse candidate architectures during the evolutionary process. While attention modules have not been a primary focus of search space design in most existing ENAS frameworks, our approach explores their integration and contributes to this area. The proposed approach is evaluated against two previous ENAS frameworks, AE-CNN and FA-CNN, and consistently achieves superior results in terms of average accuracy and convergence stability across generations. Our findings underscore the benefit of incorporating attention-augmented modules into ENAS workflows and provide a foundation for further research on attention-driven architecture design.
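
To make the notion of an attention-augmented module concrete, the following is a minimal illustrative sketch (not the paper's actual implementation, whose details are not given in the abstract) of one such search-space element: a ResNet-style block with a Squeeze-and-Excitation module inserted before the skip connection. The framework, class names, and reduction ratio are assumptions chosen for illustration; in the described search space, a block like this would be one of the twelve interchangeable module types that the evolutionary process composes into candidate architectures.

import torch
import torch.nn as nn


class SEBlock(nn.Module):
    """Squeeze-and-Excitation: reweights channels via a global-pooling bottleneck."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # squeeze: global spatial average
        self.fc = nn.Sequential(                     # excitation: two-layer bottleneck MLP
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                                 # channel-wise rescaling


class SEResidualBlock(nn.Module):
    """Residual block with an SE module applied to the body output before the skip addition."""

    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.se = SEBlock(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(x + self.se(self.body(x)))


if __name__ == "__main__":
    block = SEResidualBlock(64)
    print(block(torch.randn(2, 64, 32, 32)).shape)   # torch.Size([2, 64, 32, 32])

Analogous CBAM- or CA-augmented variants of the ResNet, DenseNet, and Inception blocks would follow the same pattern, yielding the nine attention-augmented modules described above.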