Matches in SemOpenAlex for { <https://semopenalex.org/work/W3168283161> ?p ?o ?g. }
Showing items 1 to 100 of 100, with 100 items per page.
- W3168283161 endingPage "102624" @default.
- W3168283161 startingPage "102624" @default.
- W3168283161 abstract "An important challenge and limiting factor in deep learning methods for medical imaging segmentation is the lack of available annotated data to properly train models. For the specific task of tumor segmentation, the process entails clinicians labeling every slice of volumetric scans for every patient, which becomes prohibitive at the scale of datasets required to train neural networks to optimal performance. To address this, we propose a novel semi-supervised framework that allows training any segmentation (encoder-decoder) model using only information readily available in radiological data, namely the presence of a tumor in the image, in addition to a few annotated images. Specifically, we conjecture that a generative model performing domain translation on this weak label - healthy vs diseased scans - helps achieve tumor segmentation. The proposed GenSeg method first disentangles tumoral tissue from healthy background tissue. The latent representation is separated into (1) the common background information across both domains, and (2) the unique tumoral information. GenSeg then achieves diseased-to-healthy image translation by decoding a healthy version of the image from just the common representation, as well as a residual image that allows adding back the tumors. The same decoder that produces this residual tumor image also outputs a tumor segmentation. Implicit data augmentation is achieved by re-using the same framework for healthy-to-diseased image translation, where a residual tumor image is produced from a prior distribution. By performing both image translation and segmentation simultaneously, GenSeg allows training on only partially annotated datasets.
To test the framework, we trained U-Net-like architectures using GenSeg and evaluated their performance on 3 variants of a synthetic task, as well as on 2 benchmark datasets: brain tumor segmentation in MRI (derived from BraTS) and liver metastasis segmentation in CT (derived from LiTS). Our method outperforms the baseline semi-supervised (autoencoder and mean teacher) and supervised segmentation methods, with improvements ranging between 8-14% Dice score on the brain task and 5-8% on the liver task, when only 1% of the training images were annotated. These results show the proposed framework is well suited to addressing the problem of training deep segmentation models when a large portion of the available data is unlabeled and unpaired, a common issue in tumor segmentation." @default.
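The abstract's description of the GenSeg data flow (encode, split the latent code into common-background and tumor parts, decode a healthy image plus a residual and a segmentation) can be sketched with toy linear "networks". This is a hypothetical shape-level sketch, not the paper's implementation: the actual method uses U-Net-like convolutional architectures, and all dimensions and weight matrices below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; the paper uses U-Net-like conv nets on 2D slices)
D_IN, D_COMMON, D_TUMOR = 64, 32, 16

# Random linear maps standing in for the encoder and decoder branches
W_enc = rng.standard_normal((D_IN, D_COMMON + D_TUMOR)) * 0.1
W_bg = rng.standard_normal((D_COMMON, D_IN)) * 0.1   # healthy-image decoder
W_res = rng.standard_normal((D_TUMOR, D_IN)) * 0.1   # residual-tumor decoder
W_seg = rng.standard_normal((D_TUMOR, D_IN)) * 0.1   # segmentation head on same branch

def genseg_forward(x):
    """Diseased-to-healthy translation path: healthy image + residual + segmentation."""
    z = x @ W_enc                                          # encode
    z_common, z_tumor = z[:, :D_COMMON], z[:, D_COMMON:]   # disentangle the latent code
    healthy = z_common @ W_bg                              # healthy version from common info only
    residual = z_tumor @ W_res                             # residual image that adds tumors back
    seg = 1.0 / (1.0 + np.exp(-(z_tumor @ W_seg)))         # tumor segmentation (sigmoid)
    diseased_recon = healthy + residual                    # reconstruct the diseased input
    return healthy, residual, seg, diseased_recon

def healthy_to_diseased(x_healthy):
    """Implicit augmentation: sample a tumor code from a prior and add its residual."""
    z_common = (x_healthy @ W_enc)[:, :D_COMMON]
    z_tumor_prior = rng.standard_normal((x_healthy.shape[0], D_TUMOR))
    return z_common @ W_bg + z_tumor_prior @ W_res

x = rng.standard_normal((4, D_IN))  # a batch of 4 flattened "diseased" images
healthy, residual, seg, recon = genseg_forward(x)
fake_diseased = healthy_to_diseased(rng.standard_normal((4, D_IN)))
```

The key property the sketch preserves is that the diseased reconstruction is the sum of the healthy decoding and the tumor residual, and that the segmentation comes from the same tumor branch that produces the residual.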
- W3168283161 created "2021-06-22" @default.
- W3168283161 creator A5008902493 @default.
- W3168283161 creator A5031092503 @default.
- W3168283161 creator A5039187492 @default.
- W3168283161 creator A5056503617 @default.
- W3168283161 creator A5064825756 @default.
- W3168283161 creator A5066945976 @default.
- W3168283161 date "2022-11-01" @default.
- W3168283161 modified "2023-10-05" @default.
- W3168283161 title "Towards annotation-efficient segmentation via image-to-image translation" @default.
- W3168283161 cites W1677182931 @default.
- W3168283161 cites W2475287302 @default.
- W3168283161 cites W2794022343 @default.
- W3168283161 cites W2962793481 @default.
- W3168283161 cites W2963767194 @default.
- W3168283161 cites W2963800363 @default.
- W3168283161 cites W3101639073 @default.
- W3168283161 cites W3112701542 @default.
- W3168283161 cites W3159890710 @default.
- W3168283161 doi "https://doi.org/10.1016/j.media.2022.102624" @default.
- W3168283161 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/36208571" @default.
- W3168283161 hasPublicationYear "2022" @default.
- W3168283161 type Work @default.
- W3168283161 sameAs 3168283161 @default.
- W3168283161 citedByCount "6" @default.
- W3168283161 countsByYear W31682831612022 @default.
- W3168283161 countsByYear W31682831612023 @default.
- W3168283161 crossrefType "journal-article" @default.
- W3168283161 hasAuthorship W3168283161A5008902493 @default.
- W3168283161 hasAuthorship W3168283161A5031092503 @default.
- W3168283161 hasAuthorship W3168283161A5039187492 @default.
- W3168283161 hasAuthorship W3168283161A5056503617 @default.
- W3168283161 hasAuthorship W3168283161A5064825756 @default.
- W3168283161 hasAuthorship W3168283161A5066945976 @default.
- W3168283161 hasBestOaLocation W31682831612 @default.
- W3168283161 hasConcept C104317684 @default.
- W3168283161 hasConcept C105580179 @default.
- W3168283161 hasConcept C108583219 @default.
- W3168283161 hasConcept C111919701 @default.
- W3168283161 hasConcept C11413529 @default.
- W3168283161 hasConcept C115961682 @default.
- W3168283161 hasConcept C118505674 @default.
- W3168283161 hasConcept C124504099 @default.
- W3168283161 hasConcept C149364088 @default.
- W3168283161 hasConcept C153180895 @default.
- W3168283161 hasConcept C154945302 @default.
- W3168283161 hasConcept C155512373 @default.
- W3168283161 hasConcept C185592680 @default.
- W3168283161 hasConcept C199579030 @default.
- W3168283161 hasConcept C2776321320 @default.
- W3168283161 hasConcept C2779757391 @default.
- W3168283161 hasConcept C31972630 @default.
- W3168283161 hasConcept C41008148 @default.
- W3168283161 hasConcept C55493867 @default.
- W3168283161 hasConcept C89600930 @default.
- W3168283161 hasConcept C9417928 @default.
- W3168283161 hasConceptScore W3168283161C104317684 @default.
- W3168283161 hasConceptScore W3168283161C105580179 @default.
- W3168283161 hasConceptScore W3168283161C108583219 @default.
- W3168283161 hasConceptScore W3168283161C111919701 @default.
- W3168283161 hasConceptScore W3168283161C11413529 @default.
- W3168283161 hasConceptScore W3168283161C115961682 @default.
- W3168283161 hasConceptScore W3168283161C118505674 @default.
- W3168283161 hasConceptScore W3168283161C124504099 @default.
- W3168283161 hasConceptScore W3168283161C149364088 @default.
- W3168283161 hasConceptScore W3168283161C153180895 @default.
- W3168283161 hasConceptScore W3168283161C154945302 @default.
- W3168283161 hasConceptScore W3168283161C155512373 @default.
- W3168283161 hasConceptScore W3168283161C185592680 @default.
- W3168283161 hasConceptScore W3168283161C199579030 @default.
- W3168283161 hasConceptScore W3168283161C2776321320 @default.
- W3168283161 hasConceptScore W3168283161C2779757391 @default.
- W3168283161 hasConceptScore W3168283161C31972630 @default.
- W3168283161 hasConceptScore W3168283161C41008148 @default.
- W3168283161 hasConceptScore W3168283161C55493867 @default.
- W3168283161 hasConceptScore W3168283161C89600930 @default.
- W3168283161 hasConceptScore W3168283161C9417928 @default.
- W3168283161 hasLocation W31682831611 @default.
- W3168283161 hasLocation W31682831612 @default.
- W3168283161 hasLocation W31682831613 @default.
- W3168283161 hasOpenAccess W3168283161 @default.
- W3168283161 hasPrimaryLocation W31682831611 @default.
- W3168283161 hasRelatedWork W1669643531 @default.
- W3168283161 hasRelatedWork W2005437358 @default.
- W3168283161 hasRelatedWork W2120195071 @default.
- W3168283161 hasRelatedWork W2517104666 @default.
- W3168283161 hasRelatedWork W2790662084 @default.
- W3168283161 hasRelatedWork W2960184797 @default.
- W3168283161 hasRelatedWork W3048725906 @default.
- W3168283161 hasRelatedWork W3163744672 @default.
- W3168283161 hasRelatedWork W4224023746 @default.
- W3168283161 hasRelatedWork W4285827401 @default.
- W3168283161 hasVolume "82" @default.
- W3168283161 isParatext "false" @default.
- W3168283161 isRetracted "false" @default.
- W3168283161 magId "3168283161" @default.
- W3168283161 workType "article" @default.