Matches in SemOpenAlex for { <https://semopenalex.org/work/W4312961672> ?p ?o ?g. }
- W4312961672 endingPage "133094" @default.
- W4312961672 startingPage "133078" @default.
- W4312961672 abstract "Rapid development in sketch-to-image translation methods has boosted investigation procedures in law enforcement agencies. However, the large modality gap of manually generated sketches makes this task challenging. Generative adversarial networks (GANs) and encoder-decoder approaches are usually employed to accomplish sketch-to-image generation, with promising results. This paper targets sketch-to-image translation with heterogeneous face angles and lighting effects using a multi-level conditional generative adversarial network (cGAN). The proposed multi-level cGAN works in four phases. Three independent cGAN networks are incorporated, one at each stage, followed by a CNN classifier. The Adam stochastic gradient descent mechanism was used for training, with a learning rate of 0.0002 and momentum estimates β1 and β2 of 0.5 and 0.999, respectively. The multi-level 3D-convolutional architecture helps to preserve spatial facial attributes and pixel-level details. The 3D convolution and deconvolution guide G1, G2, and G3 to use additional features and attributes for encoding and decoding. This helps to preserve the direction and postures of targeted image attributes and the spatial relationships among the whole image's features. The proposed framework processes the 3D convolution and 3D deconvolution using vectorization. This process takes the same time as 2D convolution but extracts more features and facial attributes. We used pre-trained ResNet-50, ResNet-101, and MobileNet to classify high-resolution images generated from sketches. 
We have also developed a state-of-the-art Pakistani Politicians Face-sketch Dataset (PPFD) for experimental purposes. Results reveal that the proposed cGAN framework outperforms with respect to accuracy, structural similarity index measure (SSIM), signal-to-noise ratio (SNR), and peak signal-to-noise ratio (PSNR)." @default.
- W4312961672 created "2023-01-05" @default.
- W4312961672 creator A5058146554 @default.
- W4312961672 creator A5063522177 @default.
- W4312961672 creator A5068097238 @default.
- W4312961672 creator A5070832037 @default.
- W4312961672 creator A5073073869 @default.
- W4312961672 creator A5080561649 @default.
- W4312961672 creator A5080610155 @default.
- W4312961672 date "2022-01-01" @default.
- W4312961672 modified "2023-09-25" @default.
- W4312961672 title "Face Recognition via Multi-Level 3D-GAN Colorization" @default.
- W4312961672 cites W1529520183 @default.
- W4312961672 cites W1980093854 @default.
- W4312961672 cites W1999360130 @default.
- W4312961672 cites W2040708213 @default.
- W4312961672 cites W2069062751 @default.
- W4312961672 cites W2079349076 @default.
- W4312961672 cites W2130933227 @default.
- W4312961672 cites W2185498755 @default.
- W4312961672 cites W2242218935 @default.
- W4312961672 cites W2275363859 @default.
- W4312961672 cites W2326925005 @default.
- W4312961672 cites W2341755855 @default.
- W4312961672 cites W2560481159 @default.
- W4312961672 cites W2570189907 @default.
- W4312961672 cites W2579578355 @default.
- W4312961672 cites W2588361969 @default.
- W4312961672 cites W2612843093 @default.
- W4312961672 cites W2619170298 @default.
- W4312961672 cites W2734984521 @default.
- W4312961672 cites W2736633948 @default.
- W4312961672 cites W2750979989 @default.
- W4312961672 cites W2799839429 @default.
- W4312961672 cites W2891833067 @default.
- W4312961672 cites W2900551388 @default.
- W4312961672 cites W2913044631 @default.
- W4312961672 cites W2957909335 @default.
- W4312961672 cites W2962753657 @default.
- W4312961672 cites W2962793481 @default.
- W4312961672 cites W2962927829 @default.
- W4312961672 cites W2962931900 @default.
- W4312961672 cites W2963012812 @default.
- W4312961672 cites W2963470893 @default.
- W4312961672 cites W2963609056 @default.
- W4312961672 cites W2963626105 @default.
- W4312961672 cites W2963671154 @default.
- W4312961672 cites W2963800363 @default.
- W4312961672 cites W2964154847 @default.
- W4312961672 cites W2964204305 @default.
- W4312961672 cites W2964833232 @default.
- W4312961672 cites W2971124384 @default.
- W4312961672 cites W2972003492 @default.
- W4312961672 cites W2976288813 @default.
- W4312961672 cites W297909767 @default.
- W4312961672 cites W2989505547 @default.
- W4312961672 cites W3009852267 @default.
- W4312961672 cites W3082166846 @default.
- W4312961672 cites W3087866654 @default.
- W4312961672 cites W3096831136 @default.
- W4312961672 cites W3098848838 @default.
- W4312961672 cites W3099206234 @default.
- W4312961672 cites W3101162162 @default.
- W4312961672 cites W3105145243 @default.
- W4312961672 cites W3121667743 @default.
- W4312961672 cites W3131481615 @default.
- W4312961672 cites W4225696264 @default.
- W4312961672 cites W4225888087 @default.
- W4312961672 cites W4226348722 @default.
- W4312961672 cites W4282937861 @default.
- W4312961672 doi "https://doi.org/10.1109/access.2022.3226453" @default.
- W4312961672 hasPublicationYear "2022" @default.
- W4312961672 type Work @default.
- W4312961672 citedByCount "0" @default.
- W4312961672 crossrefType "journal-article" @default.
- W4312961672 hasAuthorship W4312961672A5058146554 @default.
- W4312961672 hasAuthorship W4312961672A5063522177 @default.
- W4312961672 hasAuthorship W4312961672A5068097238 @default.
- W4312961672 hasAuthorship W4312961672A5070832037 @default.
- W4312961672 hasAuthorship W4312961672A5073073869 @default.
- W4312961672 hasAuthorship W4312961672A5080561649 @default.
- W4312961672 hasAuthorship W4312961672A5080610155 @default.
- W4312961672 hasBestOaLocation W43129616721 @default.
- W4312961672 hasConcept C11413529 @default.
- W4312961672 hasConcept C153180895 @default.
- W4312961672 hasConcept C154945302 @default.
- W4312961672 hasConcept C160633673 @default.
- W4312961672 hasConcept C199360897 @default.
- W4312961672 hasConcept C2779231336 @default.
- W4312961672 hasConcept C41008148 @default.
- W4312961672 hasConcept C519991488 @default.
- W4312961672 hasConceptScore W4312961672C11413529 @default.
- W4312961672 hasConceptScore W4312961672C153180895 @default.
- W4312961672 hasConceptScore W4312961672C154945302 @default.
- W4312961672 hasConceptScore W4312961672C160633673 @default.
- W4312961672 hasConceptScore W4312961672C199360897 @default.
- W4312961672 hasConceptScore W4312961672C2779231336 @default.
- W4312961672 hasConceptScore W4312961672C41008148 @default.