Manifold constrained joint sparse learning via non-convex regularization
Neurocomputing, ISSN: 0925-2312, Vol. 458, pp. 112-126, 2021
- 3 Citations
- 3 Captures
Article Description
The traditional robust principal component analysis (RPCA), which decomposes a data matrix into low-rank plus sparse components, offers a powerful framework for a wide variety of applications in computer vision. However, the reconstructed image suffers serious interference from Gaussian noise, which degrades image quality during denoising. To address this, a novel manifold constrained joint sparse learning (MCJSL) approach via non-convex regularization is proposed in this paper. Specifically, a manifold constraint is adopted to preserve local geometric structures, and non-convex joint sparsity is introduced to capture global row-wise sparse structures. To solve MCJSL, an efficient optimization algorithm based on the manifold alternating direction method of multipliers (MADMM) is designed with closed-form solutions, yielding a fast and convergent procedure. The convergence is analyzed both mathematically and numerically. Comparisons between the proposed MCJSL and several state-of-the-art solvers on publicly accessible datasets demonstrate its superiority in image denoising and background subtraction. The results highlight the importance of incorporating manifold learning and non-convex joint sparse regularization into a general RPCA framework.
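As context for the baseline framework named in the abstract, the sketch below illustrates classical RPCA, i.e. minimizing ||L||_* + λ||S||_1 subject to D = L + S, solved with a standard ADMM loop. It is not the authors' MCJSL/MADMM algorithm; the function names, default parameters, and stopping rule are illustrative assumptions.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: prox of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(M, tau):
    """Elementwise soft thresholding: prox of tau * l1 norm."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def rpca_admm(D, lam=None, mu=None, tol=1e-7, max_iter=500):
    """Classical RPCA sketch: min ||L||_* + lam*||S||_1  s.t.  D = L + S,
    solved with an ADMM / augmented-Lagrangian loop (illustrative defaults)."""
    m, n = D.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))          # common default weight
    if mu is None:
        mu = 0.25 * m * n / (np.abs(D).sum() + 1e-12)
    L = np.zeros_like(D)
    S = np.zeros_like(D)
    Y = np.zeros_like(D)                        # dual variable
    normD = np.linalg.norm(D, "fro") + 1e-12
    for _ in range(max_iter):
        L = svt(D - S + Y / mu, 1.0 / mu)       # low-rank update
        S = soft(D - L + Y / mu, lam / mu)      # sparse update
        R = D - L - S                           # primal residual
        Y = Y + mu * R                          # dual ascent
        if np.linalg.norm(R, "fro") / normD < tol:
            break
    return L, S
```

Per the abstract, MCJSL departs from this baseline by replacing the elementwise l1 penalty with a non-convex joint (row-wise) sparsity term and adding a manifold constraint, which the authors solve with a manifold ADMM (MADMM) variant with closed-form updates.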
Bibliographic Details
- http://www.sciencedirect.com/science/article/pii/S0925231221009036
- http://dx.doi.org/10.1016/j.neucom.2021.06.008
- http://www.scopus.com/inward/record.url?partnerID=HzOxMe3b&scp=85109186142&origin=inward
- https://linkinghub.elsevier.com/retrieve/pii/S0925231221009036
- https://dx.doi.org/10.1016/j.neucom.2021.06.008
Elsevier BV