PlumX Metrics

A unified consensus-based parallel algorithm for high-dimensional regression with combined regularizations

Computational Statistics & Data Analysis, ISSN: 0167-9473, Vol: 203, Page: 108081
2025
  • Citations: 1
  • Usage: 0
  • Captures: 0
  • Mentions: 1
  • Social Media: 0

Metrics Details

  • Citations: 1
  • Mentions: 1
    • News Mentions: 1
      • News: 1

Most Recent News

Reports from Chongqing University Highlight Recent Findings in Statistics and Data Analysis (A Unified Consensus-based Parallel Algorithm for High-dimensional Regression With Combined Regularizations)

2025 MAR 05 (NewsRx) -- By a News Reporter-Staff News Editor at Information Technology Daily -- A new study on Information Technology - Statistics and

Article Description

Parallel algorithms are widely recognized for their effectiveness in handling large-scale datasets stored in a distributed manner, making them a popular choice for fitting statistical learning models. However, there is currently limited research on parallel algorithms specifically designed for high-dimensional regression with combined regularization terms. These terms, such as elastic-net, sparse group lasso, sparse fused lasso, and their nonconvex variants, have gained significant attention in various fields because they incorporate prior information and promote sparsity within specific groups or fused variables. The scarcity of parallel algorithms for combined regularizations can be attributed to the inherent nonsmoothness and complexity of these terms, as well as the absence of closed-form solutions for certain proximal operators associated with them. This paper proposes a unified constrained optimization formulation based on the consensus problem for these convex and nonconvex regression problems, and derives the corresponding parallel alternating direction method of multipliers (ADMM) algorithms. Furthermore, the proposed algorithm is proven not only to converge globally but also to exhibit a linear convergence rate. Notably, its computational complexity is the same across different regularization terms and loss functions, which underscores the universality of the algorithm. Extensive simulation experiments, along with a financial example, demonstrate the reliability, stability, and scalability of the algorithm. The R package implementing the proposed algorithm is available at https://github.com/xfwu1016/CPADMM.
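To make the consensus idea concrete, here is a minimal sketch of global-consensus ADMM for one of the simplest combined regularizers mentioned above, the elastic-net, applied to least-squares regression with the data split across blocks. This is an illustrative reconstruction in Python (the paper's own implementation is the CPADMM R package), and the function names, parameter choices, and block layout are assumptions, not the authors' code. Each block updates its local copy `x_i` independently (the step that would run in parallel), while the regularizer acts only on the shared consensus variable `z`, whose update has a closed form:

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding operator S_t(v) = sign(v) * max(|v| - t, 0)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def consensus_admm_enet(A_blocks, b_blocks, lam1=0.1, lam2=0.1,
                        rho=1.0, n_iter=200):
    """Global-consensus ADMM for elastic-net least squares.

    Solves min_z (1/2) * sum_i ||A_i x_i - b_i||^2 + lam1*||z||_1 + (lam2/2)*||z||^2
    subject to x_i = z for every block i. The regularizer touches only the
    consensus variable z, so its proximal step is a soft-threshold followed
    by a multiplicative shrinkage.
    """
    N = len(A_blocks)
    p = A_blocks[0].shape[1]
    x = np.zeros((N, p))          # local primal copies, one per data block
    u = np.zeros((N, p))          # scaled dual variables for x_i = z
    z = np.zeros(p)               # global consensus variable
    # Pre-factor each block's regularized normal equations
    # (in a real deployment each block would do this on its own worker)
    solvers = [np.linalg.inv(A.T @ A + rho * np.eye(p)) for A in A_blocks]
    for _ in range(n_iter):
        for i in range(N):        # embarrassingly parallel x-updates
            rhs = A_blocks[i].T @ b_blocks[i] + rho * (z - u[i])
            x[i] = solvers[i] @ rhs
        xbar, ubar = x.mean(axis=0), u.mean(axis=0)
        # z-update: prox of lam1*||z||_1 + (lam2/2)*||z||^2 at xbar + ubar
        z = soft_threshold(xbar + ubar, lam1 / (N * rho)) * (N * rho) / (lam2 + N * rho)
        u += x - z                # dual ascent on the consensus constraint
    return z
```

Swapping the elastic-net prox in the `z`-update for the proximal operator of sparse group lasso, sparse fused lasso, or a nonconvex variant leaves the rest of the iteration untouched, which is the sense in which the per-iteration cost stays the same across regularizers.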
