PlumX Metrics

FedGK: Communication-Efficient Federated Learning through Group-Guided Knowledge Distillation

ACM Transactions on Internet Technology, ISSN: 1557-6051, Vol: 24, Issue: 4, Page: 1-21
2024
  • Citations: 0
  • Usage: 0
  • Captures: 8
  • Mentions: 1
  • Social Media: 0

Metrics Details

  • Captures: 8
  • Mentions: 1
    • News Mentions: 1

Most Recent News

University of Helsinki Details Findings in Internet Technology (Fedgk: Communication-efficient Federated Learning Through Group-guided Knowledge Distillation)

2024 DEC 26 (NewsRx) -- By a News Reporter-Staff News Editor at Internet Daily News -- Fresh data on Internet and World Wide Web -

Article Description

Federated learning (FL) empowers a cohort of participating devices to contribute collaboratively to a global neural network model while their training data remains private and stored locally. Despite its advantages in computational efficiency and privacy preservation, FL grapples with the challenge of non-IID (not independent and identically distributed) data from diverse clients, which leads to discrepancies between local and global models and potential performance degradation. In this article, we propose FedGK, an innovative communication-efficient Group-Guided FL framework designed for heterogeneous data distributions. FedGK employs a localized-guided framework that enables each client to effectively assimilate key knowledge from teachers and peers while minimizing extraneous peer information in FL scenarios. We conduct an in-depth analysis of the dynamic similarities among clients over successive communication rounds and develop a novel clustering approach that accurately groups clients with diverse heterogeneities. We implement FedGK on public datasets with an innovative data transformation pattern called "cluster-shift non-IID", which mirrors data distributions prevalent in real-world settings, where clients can be grouped into clusters with similar distributions. Extensive experimental results on public datasets demonstrate that FedGK improves accuracy by up to 32.89% and reduces communication cost by up to 53.33% over state-of-the-art baselines.
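The description mentions two building blocks: knowledge distillation from teachers/peers and a similarity-based grouping of clients. The paper's exact formulation is not given here, so the following is only a minimal illustrative sketch, assuming a standard temperature-scaled KL distillation loss and a toy k-means grouping over flattened client model updates; the function names (`distillation_loss`, `group_clients`) and all hyperparameters are hypothetical, not from the article.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    e = np.exp((z - z.max(axis=-1, keepdims=True)) / T)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # Standard knowledge-distillation objective: KL(teacher || student)
    # on temperature-softened outputs, scaled by T^2 (Hinton-style).
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))) * T * T)

def group_clients(client_updates, n_groups=2, rounds=10, seed=0):
    # Toy k-means over flattened client updates: clients whose updates
    # point in similar directions land in the same group. FedGK's actual
    # clustering over per-round similarities is more elaborate.
    rng = np.random.default_rng(seed)
    X = np.stack([u.ravel() for u in client_updates])
    centers = X[rng.choice(len(X), n_groups, replace=False)]
    for _ in range(rounds):
        d = np.linalg.norm(X[:, None] - centers[None], axis=-1)
        labels = d.argmin(axis=1)
        for k in range(n_groups):
            if np.any(labels == k):
                centers[k] = X[labels == k].mean(axis=0)
    return labels
```

In a group-guided round, each client would distill from the logits of its own group's teacher rather than from all peers, which is one plausible way to "minimize extraneous peer information" while keeping only logits (not full models) on the wire.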

Bibliographic Details
