BaGFN: Broad Attentive Graph Fusion Network for High-Order Feature Interactions

IEEE Trans Neural Netw Learn Syst. 2023 Aug;34(8):4499-4513. doi: 10.1109/TNNLS.2021.3116209. Epub 2023 Aug 4.

Abstract

Modeling feature interactions is of crucial significance to high-quality feature engineering on multifield sparse data. At present, a series of state-of-the-art methods extract cross features in a rather implicit bitwise fashion and lack the comprehensiveness and flexibility to learn sophisticated interactions among different feature fields. In this article, we propose a new broad attentive graph fusion network (BaGFN) to better model high-order feature interactions in a flexible and explicit manner. On the one hand, we design an attentive graph fusion module to strengthen high-order feature representation under a graph structure. The graph-based module develops a new bilinear-cross aggregation function to aggregate the graph node information, employs the self-attention mechanism to learn the impact of neighborhood nodes, and updates the high-order representation of features through multihop fusion steps. On the other hand, we further construct a broad attentive cross module to refine high-order feature interactions at the bitwise level. The optimized module designs a new broad attention mechanism to dynamically learn the importance weights of cross features and efficiently conducts sophisticated high-order feature interactions at the granularity of feature dimensions. Experimental results demonstrate the effectiveness of the proposed model.
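To make the attentive graph fusion idea concrete, the sketch below shows one fusion hop over feature-field node embeddings: a plain scaled dot-product self-attention step weights neighborhood nodes, and a simple bilinear (Hadamard) product combines each node with its attended neighborhood. The exact form of the paper's bilinear-cross aggregation function is not given in the abstract, so the `bilinear_cross_aggregate` formulation here is a hypothetical illustration, not the authors' definition.

```python
import numpy as np

def self_attention(H):
    """Scaled dot-product self-attention over feature-field nodes.

    H: (n_fields, d) matrix of field embeddings (graph nodes).
    Returns an attention-weighted neighborhood summary per node.
    """
    d = H.shape[1]
    scores = H @ H.T / np.sqrt(d)                 # pairwise node affinities
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    A = np.exp(scores)
    A /= A.sum(axis=1, keepdims=True)             # row-normalized attention
    return A @ H                                  # aggregate neighbor info

def bilinear_cross_aggregate(H, W):
    """Hypothetical bilinear-cross aggregation: fuse each node with its
    attended neighborhood via an elementwise bilinear product."""
    N = self_attention(H)                         # neighborhood summaries
    return H * (N @ W)                            # second-order interaction

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_fields, d = 4, 8                            # toy sizes for illustration
    H = rng.normal(size=(n_fields, d))            # field embeddings
    W = rng.normal(size=(d, d)) / np.sqrt(d)      # learnable bilinear weight

    # One "fusion hop"; stacking such hops gives the multihop updates.
    H1 = bilinear_cross_aggregate(H, W)
    print(H1.shape)  # (4, 8)
```

Stacking several such hops, each raising the interaction order of the node representations by one, corresponds to the multihop fusion steps described above.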