Matches in SemOpenAlex for { <https://semopenalex.org/work/W4385520134> ?p ?o ?g. }
- W4385520134 abstract "The recently proposed sparsifying transform (ST) models incur low computational cost and have been applied to medical imaging. Meanwhile, deep models with nested network structure reveal great potential for learning features in different layers. In this study, we propose a network-structured ST learning approach for X-ray computed tomography (CT), which we refer to as multi-layer clustering-based residual sparsifying transform (MCST) learning. The proposed MCST scheme learns multiple different unitary transforms in each layer by dividing each layer's input into several classes. We apply the MCST model to low-dose CT (LDCT) reconstruction by deploying the learned MCST model into the regularizer in penalized weighted least squares (PWLS) reconstruction. The proposed MCST model combines a multi-layer sparse representation structure with multiple clusters for the features in each layer that are modeled by a rich collection of transforms. We train the MCST model in an unsupervised manner via a block coordinate descent (BCD) algorithm. Since our method is patch-based, the training can be performed with a limited set of images. For CT image reconstruction, we devise a novel algorithm called PWLS-MCST by integrating the pre-learned MCST signal model with PWLS optimization. We conducted LDCT reconstruction experiments on XCAT phantom data, the Numerical Mayo Clinical CT dataset, and the LDCT image and projection dataset (Clinical LDCT dataset). We trained the MCST model with two (or three) layers and with five clusters in each layer. The learned transforms in the same layer showed rich features, while additional information is extracted from representation residuals. Our simulation results and clinical results demonstrate that PWLS-MCST achieves better image reconstruction quality than the conventional filtered back-projection (FBP) method and PWLS with an edge-preserving (EP) regularizer.
It also outperformed recent advanced methods such as PWLS with a learned multi-layer residual sparsifying transform (MARS) prior and PWLS with a union of learned transforms (ULTRA), especially in displaying clear edges and preserving subtle details. In this work, a multi-layer sparse signal model with a nested network structure is proposed. We refer to this novel model as the MCST model; it exploits multi-layer residual maps to sparsify the underlying image and clusters the inputs in each layer for accurate sparsification. We presented a new PWLS framework with a learned MCST regularizer for LDCT reconstruction. Experimental results show that the proposed PWLS-MCST provides clearer reconstructions than several baseline methods. The code for PWLS-MCST is released at https://github.com/Xikai97/PWLS-MCST." @default.
- W4385520134 created "2023-08-04" @default.
- W4385520134 creator A5026237893 @default.
- W4385520134 creator A5046094203 @default.
- W4385520134 creator A5057762207 @default.
- W4385520134 creator A5058676706 @default.
- W4385520134 creator A5085257241 @default.
- W4385520134 date "2023-08-03" @default.
- W4385520134 modified "2023-10-12" @default.
- W4385520134 title "Multi‐layer clustering‐based residual sparsifying transform for low‐dose CT image reconstruction" @default.
- W4385520134 cites W1994281301 @default.
- W4385520134 cites W1995758426 @default.
- W4385520134 cites W2003624223 @default.
- W4385520134 cites W2017692724 @default.
- W4385520134 cites W2053958583 @default.
- W4385520134 cites W2057069782 @default.
- W4385520134 cites W2066771519 @default.
- W4385520134 cites W2075157914 @default.
- W4385520134 cites W2090457307 @default.
- W4385520134 cites W2094366314 @default.
- W4385520134 cites W2114122776 @default.
- W4385520134 cites W2120047933 @default.
- W4385520134 cites W2121058967 @default.
- W4385520134 cites W2128659236 @default.
- W4385520134 cites W2141039087 @default.
- W4385520134 cites W2142419873 @default.
- W4385520134 cites W2153663612 @default.
- W4385520134 cites W2157812230 @default.
- W4385520134 cites W2160547390 @default.
- W4385520134 cites W2189938900 @default.
- W4385520134 cites W2469946482 @default.
- W4385520134 cites W2512266304 @default.
- W4385520134 cites W2574952845 @default.
- W4385520134 cites W2584483805 @default.
- W4385520134 cites W2589565096 @default.
- W4385520134 cites W2952026614 @default.
- W4385520134 cites W2964182159 @default.
- W4385520134 cites W3103528285 @default.
- W4385520134 cites W3103702404 @default.
- W4385520134 cites W3177546765 @default.
- W4385520134 cites W3199623138 @default.
- W4385520134 doi "https://doi.org/10.1002/mp.16645" @default.
- W4385520134 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/37535932" @default.
- W4385520134 hasPublicationYear "2023" @default.
- W4385520134 type Work @default.
- W4385520134 citedByCount "0" @default.
- W4385520134 crossrefType "journal-article" @default.
- W4385520134 hasAuthorship W4385520134A5026237893 @default.
- W4385520134 hasAuthorship W4385520134A5046094203 @default.
- W4385520134 hasAuthorship W4385520134A5057762207 @default.
- W4385520134 hasAuthorship W4385520134A5058676706 @default.
- W4385520134 hasAuthorship W4385520134A5085257241 @default.
- W4385520134 hasBestOaLocation W43855201342 @default.
- W4385520134 hasConcept C104293457 @default.
- W4385520134 hasConcept C11413529 @default.
- W4385520134 hasConcept C120665830 @default.
- W4385520134 hasConcept C121332964 @default.
- W4385520134 hasConcept C141379421 @default.
- W4385520134 hasConcept C153180895 @default.
- W4385520134 hasConcept C154945302 @default.
- W4385520134 hasConcept C155512373 @default.
- W4385520134 hasConcept C17744445 @default.
- W4385520134 hasConcept C199539241 @default.
- W4385520134 hasConcept C2776359362 @default.
- W4385520134 hasConcept C31972630 @default.
- W4385520134 hasConcept C41008148 @default.
- W4385520134 hasConcept C57493831 @default.
- W4385520134 hasConcept C73555534 @default.
- W4385520134 hasConcept C94625758 @default.
- W4385520134 hasConceptScore W4385520134C104293457 @default.
- W4385520134 hasConceptScore W4385520134C11413529 @default.
- W4385520134 hasConceptScore W4385520134C120665830 @default.
- W4385520134 hasConceptScore W4385520134C121332964 @default.
- W4385520134 hasConceptScore W4385520134C141379421 @default.
- W4385520134 hasConceptScore W4385520134C153180895 @default.
- W4385520134 hasConceptScore W4385520134C154945302 @default.
- W4385520134 hasConceptScore W4385520134C155512373 @default.
- W4385520134 hasConceptScore W4385520134C17744445 @default.
- W4385520134 hasConceptScore W4385520134C199539241 @default.
- W4385520134 hasConceptScore W4385520134C2776359362 @default.
- W4385520134 hasConceptScore W4385520134C31972630 @default.
- W4385520134 hasConceptScore W4385520134C41008148 @default.
- W4385520134 hasConceptScore W4385520134C57493831 @default.
- W4385520134 hasConceptScore W4385520134C73555534 @default.
- W4385520134 hasConceptScore W4385520134C94625758 @default.
- W4385520134 hasLocation W43855201341 @default.
- W4385520134 hasLocation W43855201342 @default.
- W4385520134 hasLocation W43855201343 @default.
- W4385520134 hasOpenAccess W4385520134 @default.
- W4385520134 hasPrimaryLocation W43855201341 @default.
- W4385520134 hasRelatedWork W2004988775 @default.
- W4385520134 hasRelatedWork W2109481748 @default.
- W4385520134 hasRelatedWork W2144778520 @default.
- W4385520134 hasRelatedWork W2381719890 @default.
- W4385520134 hasRelatedWork W2417440389 @default.
- W4385520134 hasRelatedWork W2734382758 @default.
- W4385520134 hasRelatedWork W275168305 @default.
- W4385520134 hasRelatedWork W4206096448 @default.
- W4385520134 hasRelatedWork W4378746257 @default.
- W4385520134 hasRelatedWork W4385556839 @default.
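The listing above is the result of the triple pattern `{ <https://semopenalex.org/work/W4385520134> ?p ?o ?g. }` against SemOpenAlex. A minimal Python sketch of reproducing such a listing via a SPARQL SELECT query follows; the endpoint URL `https://semopenalex.org/sparql` and JSON results support are assumptions, not stated in the listing.

```python
# Sketch: fetch all (predicate, object) pairs for a SemOpenAlex work.
# Assumption: the public SPARQL endpoint lives at the URL below and
# accepts form-encoded POST queries returning SPARQL JSON results.
import json
import urllib.parse
import urllib.request

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL


def build_query(work_id: str) -> str:
    """Build the triple-pattern query shown in the header line."""
    uri = f"https://semopenalex.org/work/{work_id}"
    return f"SELECT ?p ?o WHERE {{ <{uri}> ?p ?o . }}"


def fetch_triples(work_id: str):
    """POST the query and return a list of (predicate, object) pairs."""
    data = urllib.parse.urlencode({"query": build_query(work_id)}).encode()
    req = urllib.request.Request(
        ENDPOINT,
        data=data,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        bindings = json.load(resp)["results"]["bindings"]
    return [(b["p"]["value"], b["o"]["value"]) for b in bindings]


# Usage (performs a network request):
#   for p, o in fetch_triples("W4385520134"):
#       print(p, o)
```

The quad position (`?g`) in the original pattern is dropped here for simplicity; adding `GRAPH ?g { ... }` around the pattern would recover it.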