dc.contributor.author | Huang, J | en_US |
dc.contributor.author | Wang, H | en_US |
dc.contributor.author | Wang, X | en_US |
dc.contributor.author | Ruzhansky, M | en_US |
dc.date.accessioned | 2023-02-21T14:54:11Z | |
dc.date.issued | 2021-07-01 | en_US |
dc.identifier.uri | https://qmro.qmul.ac.uk/xmlui/handle/123456789/84572 | |
dc.description.abstract | In this paper, we propose a semi-sparsity smoothing algorithm based on a novel sparsity-inducing optimization framework. The method is motivated by the observation that semi-sparsity prior knowledge is more universally applicable, especially in regions where sparsity does not fully hold, such as polynomial-smoothing surfaces. We show that this semi-sparsity can be identified as a generalized $L_0$-norm minimization in higher-order gradient domains, giving rise to a new "feature-aware" filtering method that simultaneously fits both sparse features (singularities and sharp edges) and non-sparse regions (polynomial-smoothing surfaces). Note that a direct solver is unavailable due to the non-convexity and combinatorial nature of $L_0$-norm minimization. Instead, we solve the model with an efficient half-quadratic splitting minimization, using fast Fourier transforms (FFTs) for acceleration. Finally, we demonstrate its versatility and benefits across a range of signal/image processing and computer vision applications. | en_US |
dc.relation.ispartof | IEEE Transactions on Image Processing, 2023 | en_US |
dc.rights | Attribution-NonCommercial-ShareAlike 3.0 United States | * |
dc.rights.uri | http://creativecommons.org/licenses/by-nc-sa/3.0/us/ | * |
dc.subject | cs.CV | en_US |
dc.title | Semi-Sparsity for Smoothing Filters | en_US |
dc.type | Article | |
pubs.author-url | http://arxiv.org/abs/2107.00627v3 | en_US |
pubs.notes | Not known | en_US |
pubs.publisher-url | http://dx.doi.org/10.1109/TIP.2023.3247181 | en_US |