Revisiting Convolutional Block Attention Module: Attention Enhancing Entropy For Semantic Segmentation of Images

Document Type

Conference Article

Publication Title

CINS 2024: 2nd International Conference on Computational Intelligence and Network Systems

Abstract

An attention mechanism measures the relevance of channels in a multi-channel convolutional neural network, helping the network concentrate on extracting the features from input channels that matter for classifying or segmenting objects of interest in an image. Squeeze-and-Excitation (SE), Attention Gate (AG), and the Convolutional Block Attention Module (CBAM) have been introduced to enhance the important features of channels. CBAM applies channel attention followed by spatial attention within the channels. The channel attention of CBAM prioritises essential channels using the global average and global maximum values of each channel, processed by a shared multi-layer perceptron (MLP); it scales the important channels and forwards them to the spatial attention module, which identifies the regions of interest (ROIs) for better feature processing. Drawing inspiration from CBAM, the channel attention mechanism is modified to use the entropy value of each channel to scale the channel's global average value. U-Net, with its skip connections, is used as the baseline architecture, and the skip connections are modified to apply the attention mechanism. Experiments show that the product of the entropy and the global average of each channel segments ROIs more accurately, providing at least a 1% improvement in Dice Similarity Coefficient (DSC) for semantic segmentation.
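
To make the described mechanism concrete, below is a minimal PyTorch sketch of a CBAM-style channel attention block where the per-channel descriptor is the product of the channel's entropy and its global average, as the abstract describes. The entropy estimator (softmax over spatial positions), the MLP reduction ratio, and the module name `EntropyChannelAttention` are assumptions for illustration, not the paper's exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EntropyChannelAttention(nn.Module):
    """Sketch of entropy-scaled channel attention (hypothetical details).

    Descriptor per channel = entropy * global average, passed through a
    shared MLP and a sigmoid to produce channel weights, as in CBAM.
    """

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Shared MLP with a bottleneck, mirroring CBAM's channel branch.
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        # Global average of each channel (CBAM's average-pool branch).
        avg = x.mean(dim=(2, 3))                        # (B, C)
        # Shannon entropy of each channel, treating the softmax over
        # spatial positions as a probability distribution (an assumption;
        # the paper may use a different entropy estimator).
        p = F.softmax(x.view(b, c, -1), dim=-1)         # (B, C, H*W)
        entropy = -(p * torch.log(p + 1e-12)).sum(-1)   # (B, C)
        # Product of entropy and global average forms the descriptor.
        weights = torch.sigmoid(self.mlp(entropy * avg))
        return x * weights.view(b, c, 1, 1)
```

Following the abstract, such a module would sit on the U-Net skip connections, re-weighting encoder feature maps before they are concatenated with the decoder path.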

DOI

10.1109/CINS63881.2024.10864429

Publication Date

1-1-2024
