Spherical Transformer on Cortical Surfaces

Mach Learn Med Imaging. 2022 Sep:2022:406-415. doi: 10.1007/978-3-031-21014-3_42. Epub 2022 Dec 16.

Abstract

Motivated by the recent remarkable success of attention modeling in computer vision, it is highly desirable to extend the Transformer architecture from the conventional Euclidean space to non-Euclidean spaces. Given the intrinsic spherical topology of brain cortical surfaces in neuroimaging, we propose a novel Spherical Transformer, an effective general-purpose backbone that uses the self-attention mechanism to analyze cortical surface data represented by triangular meshes. By mapping the cortical surface onto a sphere and splitting it uniformly into overlapping spherical surface patches, we encode the long-range dependencies within each patch with the self-attention operation and formulate cross-patch feature transmission via the overlapping regions. By limiting the self-attention computation to local patches, the proposed Spherical Transformer preserves detailed contextual information and achieves high efficiency, with computational complexity that is linear in the patch size. Moreover, to better process longitudinal cortical surfaces, which are increasingly common in neuroimaging studies, we propose a novel spatiotemporal self-attention operation that jointly extracts the spatial context and dynamic developmental patterns within a single layer, further enlarging the expressive power of the generated representation. To comprehensively evaluate our Spherical Transformer, we validate it on a surface-level prediction task and a vertex-level dense prediction task, namely cognition prediction and cortical thickness map development prediction, both of which are important for mapping early brain development. Both applications demonstrate the competitive performance of our Spherical Transformer in comparison with state-of-the-art methods.

Keywords: Cognition; Cortical Surface; Development Prediction; Transformer.
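
To make the patch-wise self-attention described in the abstract more concrete, below is a minimal PyTorch sketch, not the authors' implementation. It assumes the spherical mesh vertices have already been grouped into overlapping patches (given as a `patch_index` tensor); the `SphericalPatchAttention` class, its hyperparameters, and the averaging-based fusion over overlapping vertices are illustrative assumptions standing in for the paper's cross-patch feature transmission.

```python
# Minimal illustrative sketch of patch-wise self-attention on a spherical mesh.
# Hypothetical names and hyperparameters; patch construction is assumed given.
import torch
import torch.nn as nn


class SphericalPatchAttention(nn.Module):
    """Self-attention applied independently within each spherical surface patch.

    Vertex features shared by several overlapping patches are averaged back
    onto the sphere, one simple way to realize cross-patch feature transmission.
    """

    def __init__(self, dim: int = 64, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x: torch.Tensor, patch_index: torch.Tensor) -> torch.Tensor:
        # x:           (num_vertices, dim) vertex features on the spherical mesh
        # patch_index: (num_patches, patch_size) vertex indices of each
        #              (overlapping) patch
        patches = x[patch_index]                      # (P, S, dim)
        attended, _ = self.attn(patches, patches, patches)
        attended = self.norm(attended + patches)      # residual + layer norm

        # Scatter patch features back to vertices, averaging the contributions
        # a vertex receives from the overlapping patches that contain it.
        out = torch.zeros_like(x)
        count = torch.zeros(x.shape[0], 1, dtype=x.dtype)
        flat_idx = patch_index.reshape(-1)
        out.index_add_(0, flat_idx, attended.reshape(-1, x.shape[1]))
        count.index_add_(0, flat_idx, torch.ones(flat_idx.shape[0], 1, dtype=x.dtype))
        return out / count.clamp(min=1)


# Toy usage with random data standing in for an icosahedral spherical mesh.
if __name__ == "__main__":
    num_vertices, num_patches, patch_size, dim = 162, 12, 24, 64
    feats = torch.randn(num_vertices, dim)
    patches = torch.randint(0, num_vertices, (num_patches, patch_size))
    layer = SphericalPatchAttention(dim=dim, num_heads=4)
    print(layer(feats, patches).shape)  # torch.Size([162, 64])
```

Because attention is computed only within each fixed-size patch, the cost grows linearly with the number of patches covering the sphere, which reflects the efficiency argument made in the abstract.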