9th International Symposium on Innovative Approaches in Smart Technologies, ISAS 2025, Gaziantep, Turkey, 27 - 28 June 2025, (Full Text)
This paper presents an enhanced evolutionary neural architecture search (ENAS) framework that integrates attention mechanisms into the search space as architectural modules. Specifically, we combine three widely used CNN building blocks (ResNet, DenseNet, and Inception) with three distinct attention mechanisms: Squeeze-and-Excitation (SE), Convolutional Block Attention Module (CBAM), and Coordinate Attention (CA). This yields twelve modules in total: three base modules and nine attention-augmented variants. These components are treated as interchangeable elements within the search space and are assembled into diverse candidate architectures during the evolutionary process. While attention modules have not been a primary focus of search-space design in most existing ENAS frameworks, our approach explores their integration and contributes to this underexplored area. The proposed approach is evaluated against two previous ENAS frameworks, AE-CNN and FA-CNN, and consistently achieves superior average accuracy and convergence stability across generations. Our findings underscore the benefit of incorporating attention-augmented modules into ENAS workflows and provide a foundation for further research on attention-driven architecture design.
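The twelve-module search space described above (three base blocks, each optionally paired with one of three attention mechanisms) can be sketched as a simple enumeration. This is a hypothetical illustration, not the paper's implementation; the module names and pairing scheme are assumptions based on the abstract:

```python
from itertools import product

# Building blocks named in the abstract (names here are illustrative labels).
BASE_BLOCKS = ["ResNet", "DenseNet", "Inception"]
ATTENTION = ["SE", "CBAM", "CA"]

# Search space: 3 base modules plus 9 attention-augmented variants.
modules = list(BASE_BLOCKS) + [
    f"{block}+{attn}" for block, attn in product(BASE_BLOCKS, ATTENTION)
]

print(len(modules))  # 12 interchangeable modules
```

During the evolutionary search, candidate architectures would then be assembled by sampling and ordering elements from this module pool.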