PlumX Metrics

KNN-GNN: A powerful graph neural network enhanced by aggregating K-nearest neighbors in common subspace

Expert Systems with Applications, ISSN: 0957-4174, Vol: 253, Page: 124217
2024
  • Citations: 3
  • Usage: 0
  • Captures: 7
  • Mentions: 1
  • Social Media: 0

Metrics Details

  • Citations: 3
  • Captures: 7
  • Mentions: 1
    • News Mentions: 1
      • News: 1

Most Recent News

Researchers at Lanzhou University Report New Data on Networks (Knn-gnn: a Powerful Graph Neural Network Enhanced By Aggregating K-nearest Neighbors In Common Subspace)

2024 OCT 01 (NewsRx) -- By a News Reporter-Staff News Editor at Network Daily News -- Investigators publish new report on Networks. According to news

Article Description

It has been proven that graph neural networks (GNNs) are effective for a variety of graph learning-based applications. Typical GNNs iteratively aggregate messages from immediate neighbors under the homophily assumption. However, there are a large number of heterophilous networks in daily life, where the ability of these GNNs is limited. Recently, some GNN models have been proposed to handle networks with heterophily via key designs such as aggregating higher-order neighbors and combining intermediate representations. However, "noise" transmitted from different-order neighbors is injected into the node representations. In this paper, we propose a new GNN model, called KNN-GNN, to effectively perform node classification on networks with various homophily levels. The main idea of KNN-GNN is to learn a comprehensive and accurate representation for each node by integrating not only the local information from its neighborhood but also the non-local information held by similar nodes scattered across the network. Specifically, the local information of a node is generated from itself and its 1-hop neighbors. Then, all nodes are projected into a common subspace, where similar nodes are expected to be close to each other. The non-local information of a node is gathered by aggregating its K-nearest neighbors found in the common subspace. We evaluate the performance of KNN-GNN on both real and synthetic datasets, including networks with diverse homophily levels. The results demonstrate that KNN-GNN outperforms state-of-the-art baselines. Moreover, ablation experiments show that the core designs of KNN-GNN play a critical role in node representation learning.
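The two-branch aggregation described in the abstract (local 1-hop messages plus non-local messages from K-nearest neighbors in a learned common subspace) can be illustrated with a minimal NumPy sketch. The function name, weight matrices, mean-pooling, and tanh nonlinearity below are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def knn_gnn_layer(X, A, W_local, W_proj, k=3):
    """One KNN-GNN-style layer (a sketch, not the paper's exact model).

    X: (n, d) node feature matrix
    A: (n, n) binary adjacency matrix
    W_local, W_proj: (d, d) weight matrices
    k: number of nearest neighbors searched in the common subspace
    """
    n = X.shape[0]

    # Local information: mean-aggregate each node with its 1-hop
    # neighbors (self-loop added), then apply a linear transform.
    A_hat = A + np.eye(n)
    deg = A_hat.sum(axis=1, keepdims=True)
    local = (A_hat @ X / deg) @ W_local

    # Project all nodes into a common subspace where similar nodes
    # are expected to be close to each other.
    Z = X @ W_proj

    # Non-local information: for each node, find its K nearest
    # neighbors by Euclidean distance in the subspace and average
    # their features.
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)          # exclude the node itself
    knn = np.argsort(d2, axis=1)[:, :k]   # (n, k) neighbor indices
    non_local = X[knn].mean(axis=1) @ W_local

    # Combine local and non-local messages into the new representation.
    return np.tanh(local + non_local)
```

On a heterophilous graph, the non-local branch lets a node draw information from similar nodes that are not adjacent to it, which is the key difference from a purely 1-hop GNN.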
