KNN-GNN: A powerful graph neural network enhanced by aggregating K-nearest neighbors in common subspace
Expert Systems with Applications, ISSN: 0957-4174, Vol: 253, Page: 124217
2024
- 3 Citations
- 7 Captures
- 1 Mention
Most Recent News
Researchers at Lanzhou University Report New Data on Networks (Knn-gnn: a Powerful Graph Neural Network Enhanced By Aggregating K-nearest Neighbors In Common Subspace)
2024 OCT 01 (NewsRx) -- By a News Reporter-Staff News Editor at Network Daily News -- Investigators publish new report on Networks. According to news
Article Description
Graph neural networks (GNNs) have proven effective for a variety of graph learning-based applications. Typical GNNs iteratively aggregate messages from immediate neighbors under the homophily assumption. However, many real-world networks are heterophilous, and the ability of these GNNs on such networks is limited. Recently, some GNN models have been proposed to handle networks with heterophily via key designs such as aggregating higher-order neighbors and combining intermediate representations. However, "noise" transmitted from different-order neighbors is then injected into the node representations. In this paper, we propose a new GNN model, called KNN-GNN, to effectively perform node classification on networks with various homophily levels. The main idea of KNN-GNN is to learn a comprehensive and accurate representation for each node by integrating not only the local information from its neighborhood but also the non-local information held by similar nodes scattered across the network. Specifically, the local information of a node is generated from itself and its 1-hop neighbors. Then, we project all nodes into a common subspace, where similar nodes are expected to be close to each other. The non-local information of a node is gathered by aggregating its K-nearest neighbors found in the common subspace. We evaluate KNN-GNN on both real and synthetic datasets covering networks with diverse homophily levels. The results demonstrate that KNN-GNN outperforms state-of-the-art baselines. Moreover, ablation experiments show that the core designs in KNN-GNN play a critical role in node representation learning.
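The abstract only sketches the two aggregation paths: a local view built from a node's 1-hop neighborhood and a non-local view built from its K-nearest neighbors found in a learned common subspace. The following is a minimal, hypothetical PyTorch sketch of that idea, not the authors' implementation; the module name `KNNGNNLayer`, the dense-adjacency input, the mean aggregators, and the single combined layer are assumptions made purely for illustration.

```python
# Hypothetical sketch of the idea described in the abstract, not the authors'
# implementation: each node combines a local view (itself + 1-hop neighbors)
# with a non-local view (its K nearest neighbors in a learned common subspace).
import torch
import torch.nn as nn
import torch.nn.functional as F


class KNNGNNLayer(nn.Module):
    """One layer combining local (1-hop) and non-local (subspace-KNN) aggregation."""

    def __init__(self, in_dim: int, hid_dim: int, out_dim: int, k: int = 5):
        super().__init__()
        self.k = k
        self.subspace = nn.Linear(in_dim, hid_dim)  # projection into the common subspace
        self.local_fc = nn.Linear(in_dim, hid_dim)  # transform for local messages
        self.classify = nn.Linear(2 * hid_dim, out_dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x:   (N, in_dim) node feature matrix
        # adj: (N, N) dense adjacency matrix (1.0 where an edge exists)

        # Local information: mean over the node itself and its 1-hop neighbors.
        adj_self = adj + torch.eye(adj.size(0), device=adj.device)
        deg = adj_self.sum(dim=1, keepdim=True).clamp(min=1.0)
        h_local = self.local_fc(adj_self @ x / deg)

        # Common subspace: similar nodes should be projected close to each other.
        z = self.subspace(x)

        # Non-local information: find each node's K nearest neighbors in the
        # subspace (indices only, so no gradient flows through the search).
        with torch.no_grad():
            dist = torch.cdist(z, z)                  # (N, N) pairwise distances
            dist.fill_diagonal_(float("inf"))         # a node is not its own neighbor
            knn_idx = dist.topk(self.k, largest=False).indices  # (N, k)
        h_nonlocal = z[knn_idx].mean(dim=1)           # aggregate the k neighbors

        # Combine both views and classify.
        return self.classify(F.relu(torch.cat([h_local, h_nonlocal], dim=-1)))


# Tiny usage example on random data: 6 nodes, 8 features, 3 classes.
if __name__ == "__main__":
    x = torch.randn(6, 8)
    adj = (torch.rand(6, 6) > 0.5).float()
    adj = ((adj + adj.t()) > 0).float()               # symmetrize
    model = KNNGNNLayer(in_dim=8, hid_dim=16, out_dim=3, k=2)
    print(model(x, adj).shape)                        # torch.Size([6, 3])
```

A full model in the spirit of the paper would presumably stack such layers, train the subspace projection so that similar nodes end up close together, and use sparse adjacency plus approximate nearest-neighbor search on large graphs; the sketch above only shows how the local and non-local views can be combined.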
Bibliographic Details
- http://www.sciencedirect.com/science/article/pii/S0957417424010832
- http://dx.doi.org/10.1016/j.eswa.2024.124217
- http://www.scopus.com/inward/record.url?partnerID=HzOxMe3b&scp=85194381745&origin=inward
- https://linkinghub.elsevier.com/retrieve/pii/S0957417424010832
- https://dx.doi.org/10.1016/j.eswa.2024.124217
Elsevier BV