Residual Attention Augmentation Graph Neural Network for Improved Node Classification

Authors

  • Muhammad Affan Abbas, Department of Electrical and Information Engineering, Control Science and Engineering, Tianjin University, China
  • Waqar Ali, Department of Environmental Sciences, Informatics, and Statistics, Ca' Foscari University of Venice, Italy
  • Florentin Smarandache, Mathematics, Physics, and Natural Science Division, University of New Mexico, USA
  • Sultan S. Alshamrani, Department of Information Technology, College of Computer and Information Technology, Taif University, Saudi Arabia
  • Muhammad Ahsan Raza, Department of Information Sciences, University of Education Lahore, Multan Campus, Pakistan
  • Abdullah Alshehri, Department of Information Technology, Faculty of Computing and Information, Al-Baha University, Saudi Arabia
  • Mubashir Ali, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, China
Volume: 14 | Issue: 2 | Pages: 13238-13242 | April 2024 | https://doi.org/10.48084/etasr.6844

Abstract

Graph Neural Networks (GNNs) have emerged as a powerful tool for node representation learning within graph structures. However, designing a robust GNN architecture for node classification remains a challenge. This study introduces an efficient and straightforward Residual Attention Augmentation GNN (RAA-GNN) model, which combines an attention mechanism with skip connections to selectively weight node features and mitigate the over-smoothing problem of GNNs. Additionally, a novel MixUp data augmentation method was developed to improve model training. The proposed approach was rigorously evaluated on various node classification benchmarks, encompassing both social and citation networks, where it outperformed state-of-the-art techniques by up to 1% in accuracy. Furthermore, when applied to the novel Twitch social network dataset, the proposed model yielded highly promising results. These findings provide valuable insights for researchers and practitioners working with graph-structured data.
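The abstract's core idea — attention-weighted neighbor aggregation combined with a skip (residual) connection so that deeper layers do not wash out node-specific features — can be sketched in NumPy as follows. This is an illustrative reconstruction, not the authors' implementation: the function name `raa_layer`, the GAT-style pairwise scoring, and the fixed mixing weight `alpha` are all assumptions for exposition.

```python
import numpy as np

def raa_layer(X, A, W, a, alpha=0.5):
    """One residual attention aggregation step (illustrative sketch).

    X: (n, d) node features; A: (n, n) adjacency with self-loops;
    W: (d, d) weights; a: (2d,) attention vector; alpha: skip weight.
    """
    H = X @ W                                      # linear transform
    d = H.shape[1]
    # GAT-style pairwise logits: e_ij = a[:d].h_i + a[d:].h_j
    e = (H @ a[:d])[:, None] + (H @ a[d:])[None, :]
    e = np.where(A > 0, e, -1e9)                   # mask non-edges
    e = e - e.max(axis=1, keepdims=True)           # numerically stable softmax
    att = np.exp(e)
    att = att / att.sum(axis=1, keepdims=True)     # row-normalized attention
    H_agg = att @ H                                # attention-weighted aggregation
    return alpha * H + (1 - alpha) * H_agg         # skip (residual) connection
```

The skip term `alpha * H` is what counters over-smoothing: even after repeated aggregation, each node retains a fraction of its own transformed features rather than converging to a neighborhood average.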

Keywords:

graph neural networks, node classification, over-smoothing, citation and social networks, mixup data augmentation
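The MixUp-style augmentation mentioned above can be sketched for node features as follows. This is a minimal sketch of generic node-level MixUp, not the paper's specific variant; the function name `mixup_nodes` and the Beta-distributed mixing coefficient are assumptions borrowed from the original MixUp formulation.

```python
import numpy as np

def mixup_nodes(X, y_onehot, alpha=0.2, rng=None):
    """Convexly mix pairs of node features and their one-hot labels.

    X: (n, d) node features; y_onehot: (n, c) one-hot labels;
    alpha: Beta(alpha, alpha) concentration; rng: seed or Generator.
    """
    rng = np.random.default_rng(rng)
    lam = rng.beta(alpha, alpha)        # mixing coefficient in [0, 1]
    perm = rng.permutation(X.shape[0])  # random partner for each node
    X_mix = lam * X + (1 - lam) * X[perm]
    y_mix = lam * y_onehot + (1 - lam) * y_onehot[perm]
    return X_mix, y_mix, lam
```

Training on such convex combinations encourages the classifier to behave linearly between examples, which typically regularizes node classification on small labeled sets.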


References

S. Khoshraftar and A. An, "A Survey on Graph Representation Learning Methods," ACM Transactions on Intelligent Systems and Technology, vol. 15, no. 1, Jan. 2024, Art. no. 19.

P. W. Battaglia et al., "Relational inductive biases, deep learning, and graph networks." arXiv, Oct. 17, 2018.

J. Zhou et al., "Graph neural networks: A review of methods and applications," AI Open, vol. 1, pp. 57–81, Jan. 2020.

A. Elhassouny and F. Smarandache, "Trends in deep convolutional neural networks architectures: a review," in 2019 International Conference of Computer Science and Renewable Energies (ICCSRE), Agadir, Morocco, Jul. 2019.

H. Guo, Y. Mao, and R. Zhang, "MixUp as Locally Linear Out-of-Manifold Regularization," Proceedings of the AAAI Conference on Artificial Intelligence, vol. 33, no. 01, pp. 3714–3722, Jul. 2019.

P. H. C. Avelar, A. R. Tavares, M. Gori, and L. C. Lamb, "Discrete and Continuous Deep Residual Learning Over Graphs." arXiv, Nov. 26, 2019.

R. Liao, Z. Zhao, R. Urtasun, and R. S. Zemel, "LanczosNet: Multi-Scale Deep Graph Convolutional Networks." arXiv, Oct. 23, 2019.

A. Deptuła, "Application of the Dependency Graph Method in the Analysis of Automatic Transmission Gearboxes," Engineering, Technology & Applied Science Research, vol. 11, no. 2, pp. 7033–7040, Apr. 2021.

K. Ding, Z. Xu, H. Tong, and H. Liu, "Data Augmentation for Deep Graph Learning: A Survey," ACM SIGKDD Explorations Newsletter, vol. 24, no. 2, pp. 61–77, Sep. 2022.

X. Han, Z. Jiang, N. Liu, and X. Hu, "G-Mixup: Graph Data Augmentation for Graph Classification," in Proceedings of the 39th International Conference on Machine Learning, Jun. 2022, pp. 8230–8248.

F. Smarandache, "Extension of HyperGraph to n-SuperHyperGraph and to Plithogenic n-SuperHyperGraph, and Extension of HyperAlgebra to n-ary (Classical-/Neutro-/Anti-)HyperAlgebra," Neutrosophic Sets and Systems, vol. 33, pp. 289–295, Feb. 2020.

K. Xu, M. Zhang, S. Jegelka, and K. Kawaguchi, "Optimization of Graph Neural Networks: Implicit Acceleration by Skip Connections and More Depth," in Proceedings of the 38th International Conference on Machine Learning, Jul. 2021, pp. 11592–11602.

T. K. Rusch, M. M. Bronstein, and S. Mishra, "A Survey on Oversmoothing in Graph Neural Networks." arXiv, Mar. 20, 2023.

D. D. Van, "Application of Advanced Deep Convolutional Neural Networks for the Recognition of Road Surface Anomalies," Engineering, Technology & Applied Science Research, vol. 13, no. 3, pp. 10765–10768, Jun. 2023.

H. Sasaki, S. Yamamoto, A. Agchbayar, and N. Nkhbayasgalan, "Extracting Problem Linkages to Improve Knowledge Exchange between Science and Technology Domains using an Attention-based Language Model," Engineering, Technology & Applied Science Research, vol. 10, no. 4, pp. 5903–5913, Aug. 2020.

J. Zhu, Y. Yan, L. Zhao, M. Heimann, L. Akoglu, and D. Koutra, "Beyond Homophily in Graph Neural Networks: Current Limitations and Effective Designs," in Advances in Neural Information Processing Systems, 2020, vol. 33, pp. 7793–7804.

M. Chen, Z. Wei, Z. Huang, B. Ding, and Y. Li, "Simple and Deep Graph Convolutional Networks," in Proceedings of the 37th International Conference on Machine Learning, Nov. 2020, pp. 1725–1735.

S. K. Maurya, X. Liu, and T. Murata, "Simplifying approach to node classification in Graph Neural Networks," Journal of Computational Science, vol. 62, Jul. 2022, Art. no. 101695.

B. Rozemberczki and R. Sarkar, "Twitch Gamers: a Dataset for Evaluating Proximity Preserving and Structural Role-based Node Embeddings." arXiv, Feb. 16, 2021.


How to Cite

[1] M. A. Abbas et al., "Residual Attention Augmentation Graph Neural Network for Improved Node Classification", Eng. Technol. Appl. Sci. Res., vol. 14, no. 2, pp. 13238–13242, Apr. 2024.

