Matches in SemOpenAlex for { <https://semopenalex.org/work/W2000100290> ?p ?o ?g. }
Showing items 1 to 68 of 68, with 100 items per page.
- W2000100290 abstract "According to the features of infrared and visible images, a method of image fusion based on the wavelet transform is proposed in this paper. Firstly, we apply the wavelet transform to the infrared and visible images after preprocessing and registration, obtaining the wavelet coefficients of the source images in different frequency bands. We then fuse each frequency band separately according to the rules: the low-frequency wavelet coefficients are weighted averaged, and for the high-frequency wavelet coefficients the larger value is selected. Meanwhile, we compare this with a spatial-domain fusion algorithm using the rules that pixel values are weighted averaged or the larger pixel value is selected. The simulation results show that the wavelet-based fusion method performs well on infrared and visible image fusion. Index Terms - Image fusion, Wavelet transform, Infrared image and visible image, Simulation experiment. I. Introduction. Image fusion (1) is a method that merges multiple feature images from different imaging devices or sensors into a single image, so that the resulting image or scene is more complete. Based on the stage at which fusion takes place, it is usually performed at three levels: pixel-level fusion, feature-level fusion, and decision-level fusion, with different methods used at each level. There are many pixel-level fusion methods, including the simple weighted-average method; the Laplacian pyramid method (2)(3) proposed by Burt; and the ratio low-pass pyramid, contrast pyramid, and gradient pyramid methods (3). In the 1990s, as wavelet-transform theory (4) came into wide use in image processing, the wavelet transform was also successfully applied to image fusion, and multiresolution processing based on the wavelet transform has become a hot topic in signal and image processing research in recent years.
The wavelet-transform method in this paper belongs to pixel-level fusion, which is fundamental and frequently used. It has the advantage of preserving the original information of the source images, while the fusion accuracy is high and the information loss is smaller than at the other fusion levels. This paper builds on multiresolution wavelet decomposition and uses the fusion rules: the low-frequency wavelet coefficients are weighted averaged, and for the high-frequency wavelet coefficients the larger value is selected. We then simulate the algorithm in Matlab and analyze the results. The fusion method studied here is aimed at infrared and visible images of the same objects captured at the same distance by different sensors; it can compensate for the imaging defects of a single infrared sensor or CCD sensor and meet the specific needs of the imaged scene. Using this method we can make full use of the complementary and redundant information in different images of the same scene, thereby obtaining a more detailed and specific description of the scene and providing the conditions for further target recognition and tracking. In the image processing experiments that follow, we use images that have been pretreated and registered and are of the same size. II. Common image fusion algorithms. A. Spatial-domain fusion methods. Pixel-level image fusion techniques (5) mainly comprise methods based on the spatial domain and on the frequency domain. Spatial-domain fusion selects and processes the source image pixel values directly. This paper adopts two such algorithms: the pixel-value weighted-average fusion method and the pixel-value select-large (small) fusion method." @default.
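The fusion rule described in the abstract (weighted average of low-frequency coefficients, select-larger for high-frequency coefficients) can be sketched as follows. This is a minimal illustration, not the paper's Matlab implementation: it assumes a single-level Haar transform written in plain NumPy, and the function names and the 0.5 weight are illustrative choices, since the paper does not specify the wavelet basis or decomposition depth.

```python
import numpy as np

def haar_dwt2(img):
    # Single-level 2-D Haar transform on an even-sized image:
    # returns the low-frequency band LL and detail bands (LH, HL, HH).
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    ll = (a + b + c + d) / 4.0
    lh = (a - b + c - d) / 4.0
    hl = (a + b - c - d) / 4.0
    hh = (a - b - c + d) / 4.0
    return ll, (lh, hl, hh)

def haar_idwt2(ll, details):
    # Exact inverse of haar_dwt2.
    lh, hl, hh = details
    h, w = ll.shape
    out = np.empty((2 * h, 2 * w))
    out[0::2, 0::2] = ll + lh + hl + hh
    out[0::2, 1::2] = ll - lh + hl - hh
    out[1::2, 0::2] = ll + lh - hl - hh
    out[1::2, 1::2] = ll - lh - hl + hh
    return out

def fuse(ir, vis, w=0.5):
    # Fusion rule from the abstract: weighted average in the
    # low-frequency band, larger-magnitude coefficient in the
    # high-frequency bands, then inverse transform.
    ll_ir, det_ir = haar_dwt2(ir)
    ll_vis, det_vis = haar_dwt2(vis)
    ll = w * ll_ir + (1.0 - w) * ll_vis
    det = tuple(np.where(np.abs(x) >= np.abs(y), x, y)
                for x, y in zip(det_ir, det_vis))
    return haar_idwt2(ll, det)
```

Fusing a uniform bright image with a uniform dark one under `w=0.5` yields the mid-level average, and the transform pair reconstructs any input exactly, which is what makes the per-band fusion rules well defined.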
- W2000100290 created "2016-06-24" @default.
- W2000100290 creator A5016413021 @default.
- W2000100290 creator A5035274514 @default.
- W2000100290 date "2013-01-01" @default.
- W2000100290 modified "2023-09-23" @default.
- W2000100290 title "Infrared and Vsible Image Fusion Method Based on Wavelet Transform" @default.
- W2000100290 cites W1996278907 @default.
- W2000100290 cites W2102954654 @default.
- W2000100290 cites W2132984323 @default.
- W2000100290 cites W2142796063 @default.
- W2000100290 cites W2148125137 @default.
- W2000100290 doi "https://doi.org/10.2991/icacsei.2013.151" @default.
- W2000100290 hasPublicationYear "2013" @default.
- W2000100290 type Work @default.
- W2000100290 sameAs 2000100290 @default.
- W2000100290 citedByCount "1" @default.
- W2000100290 countsByYear W20001002902015 @default.
- W2000100290 crossrefType "proceedings-article" @default.
- W2000100290 hasAuthorship W2000100290A5016413021 @default.
- W2000100290 hasAuthorship W2000100290A5035274514 @default.
- W2000100290 hasBestOaLocation W20001002901 @default.
- W2000100290 hasConcept C115961682 @default.
- W2000100290 hasConcept C120665830 @default.
- W2000100290 hasConcept C121332964 @default.
- W2000100290 hasConcept C138885662 @default.
- W2000100290 hasConcept C153180895 @default.
- W2000100290 hasConcept C154945302 @default.
- W2000100290 hasConcept C158355884 @default.
- W2000100290 hasConcept C158525013 @default.
- W2000100290 hasConcept C196216189 @default.
- W2000100290 hasConcept C31972630 @default.
- W2000100290 hasConcept C41008148 @default.
- W2000100290 hasConcept C41895202 @default.
- W2000100290 hasConcept C47432892 @default.
- W2000100290 hasConcept C69744172 @default.
- W2000100290 hasConceptScore W2000100290C115961682 @default.
- W2000100290 hasConceptScore W2000100290C120665830 @default.
- W2000100290 hasConceptScore W2000100290C121332964 @default.
- W2000100290 hasConceptScore W2000100290C138885662 @default.
- W2000100290 hasConceptScore W2000100290C153180895 @default.
- W2000100290 hasConceptScore W2000100290C154945302 @default.
- W2000100290 hasConceptScore W2000100290C158355884 @default.
- W2000100290 hasConceptScore W2000100290C158525013 @default.
- W2000100290 hasConceptScore W2000100290C196216189 @default.
- W2000100290 hasConceptScore W2000100290C31972630 @default.
- W2000100290 hasConceptScore W2000100290C41008148 @default.
- W2000100290 hasConceptScore W2000100290C41895202 @default.
- W2000100290 hasConceptScore W2000100290C47432892 @default.
- W2000100290 hasConceptScore W2000100290C69744172 @default.
- W2000100290 hasLocation W20001002901 @default.
- W2000100290 hasLocation W20001002902 @default.
- W2000100290 hasOpenAccess W2000100290 @default.
- W2000100290 hasPrimaryLocation W20001002901 @default.
- W2000100290 hasRelatedWork W2004163240 @default.
- W2000100290 hasRelatedWork W2057200091 @default.
- W2000100290 hasRelatedWork W2077145522 @default.
- W2000100290 hasRelatedWork W2349027074 @default.
- W2000100290 hasRelatedWork W2359631359 @default.
- W2000100290 hasRelatedWork W2387809053 @default.
- W2000100290 hasRelatedWork W2388838478 @default.
- W2000100290 hasRelatedWork W2389551095 @default.
- W2000100290 hasRelatedWork W3008421596 @default.
- W2000100290 hasRelatedWork W4382139744 @default.
- W2000100290 isParatext "false" @default.
- W2000100290 isRetracted "false" @default.
- W2000100290 magId "2000100290" @default.
- W2000100290 workType "article" @default.