Matches in SemOpenAlex for { <https://semopenalex.org/work/W2156812330> ?p ?o ?g. }
- W2156812330 abstract "Night vision cameras are widely used for military and law enforcement applications related to surveillance, reconnaissance, intelligence gathering, and security. The two most common night-time imaging systems are low-light-level (e.g., image-intensified) cameras, which amplify the reflected visible to near infrared (VNIR) light, and thermal infrared (IR) cameras, which convert thermal energy from the midwave (3 to 5 microns) or the long wave (8 to 12 microns) part of the spectrum into a visible image. These systems create images with a single (one-dimensional) output per pixel. As a result their ability to discriminate different materials is limited. This can be improved by combining systems that are sensitive to different parts of the electromagnetic spectrum, resulting in multiband or hyperspectral imagers. The number of different outputs increases dramatically by combining multiple sensors (e.g. up to N² for two sensors, when the number of different outputs for each sensor is N), which in turn leads to a significant increase in the number of materials that can be discriminated. The combination of multiple bands allows for meaningful color representation of the system output. It is therefore not surprising that the increasing availability of fused and multiband infrared and visual night vision systems (e.g. Bandara et al., 2003; Breiter et al., 2002; Cho et al., 2003; Cohen et al., 2005; Goldberg et al., 2003) has led to a growing interest in the (false) color display of night vision imagery (Li & Wang, 2007; Shi et al., 2005a; Shi et al., 2005b; Tsagaris & Anastasopoulos, 2006; Zheng et al., 2005). In principle, color imagery has several benefits over monochrome imagery for surveillance, reconnaissance, and security applications. The human eye can only distinguish about 100 shades of gray at any instant. As a result, grayscale night vision images are sometimes hard to interpret and may give rise to visual illusions and loss of situational awareness. Since people can discriminate several thousands of colors defined by varying hue, saturation, and brightness, a false color representation may facilitate night vision image recognition and interpretation. For instance, color may improve feature contrast, thus enabling better scene segmentation and object detection (Walls, 2006). This may allow an observer to construct a more complete mental representation of the perceived scene, resulting in better situational awareness. It has indeed been found that scene understanding and recognition, reaction time, and object identification are faster and more accurate with color imagery than with monochrome imagery (Cavanillas, 1999; Gegenfurtner & Rieger, 2000; Goffaux et al., 2005; Oliva & Schyns, 2000; Rousselet et al., 2005; Sampson, 1996; Spence et al., 2006; Wichmann et al., 2002). Also, observers are able to selectively attend to task-relevant color targets and to" @default.
- W2156812330 created "2016-06-24" @default.
- W2156812330 creator A5023873625 @default.
- W2156812330 creator A5059611394 @default.
- W2156812330 date "2010-08-12" @default.
- W2156812330 modified "2023-10-16" @default.
- W2156812330 title "Real-Time Full Color Multiband Night Vision" @default.
- W2156812330 cites W1489483414 @default.
- W2156812330 cites W1497597921 @default.
- W2156812330 cites W1628496745 @default.
- W2156812330 cites W1745487727 @default.
- W2156812330 cites W1943665722 @default.
- W2156812330 cites W1963709092 @default.
- W2156812330 cites W1968608328 @default.
- W2156812330 cites W1973667959 @default.
- W2156812330 cites W1975971601 @default.
- W2156812330 cites W1978524192 @default.
- W2156812330 cites W1979184482 @default.
- W2156812330 cites W1982302436 @default.
- W2156812330 cites W1990944834 @default.
- W2156812330 cites W1994714721 @default.
- W2156812330 cites W1999697401 @default.
- W2156812330 cites W2005702598 @default.
- W2156812330 cites W2010413353 @default.
- W2156812330 cites W2013651838 @default.
- W2156812330 cites W2013930083 @default.
- W2156812330 cites W2018272558 @default.
- W2156812330 cites W2018670717 @default.
- W2156812330 cites W2023156624 @default.
- W2156812330 cites W2024099448 @default.
- W2156812330 cites W2024445944 @default.
- W2156812330 cites W2024791204 @default.
- W2156812330 cites W2024935827 @default.
- W2156812330 cites W2028713401 @default.
- W2156812330 cites W2031906945 @default.
- W2156812330 cites W2032608955 @default.
- W2156812330 cites W2032713147 @default.
- W2156812330 cites W2032995696 @default.
- W2156812330 cites W2033387570 @default.
- W2156812330 cites W2045779956 @default.
- W2156812330 cites W2048397240 @default.
- W2156812330 cites W2050211795 @default.
- W2156812330 cites W2050320786 @default.
- W2156812330 cites W2056578470 @default.
- W2156812330 cites W2058215225 @default.
- W2156812330 cites W2059388575 @default.
- W2156812330 cites W2063310877 @default.
- W2156812330 cites W2063516968 @default.
- W2156812330 cites W2065930117 @default.
- W2156812330 cites W2066606046 @default.
- W2156812330 cites W2067945592 @default.
- W2156812330 cites W2070358294 @default.
- W2156812330 cites W2078509321 @default.
- W2156812330 cites W2079811652 @default.
- W2156812330 cites W2086373106 @default.
- W2156812330 cites W2091991399 @default.
- W2156812330 cites W2092407328 @default.
- W2156812330 cites W2095491228 @default.
- W2156812330 cites W2101787804 @default.
- W2156812330 cites W2104026718 @default.
- W2156812330 cites W2104713750 @default.
- W2156812330 cites W2107722806 @default.
- W2156812330 cites W2108395757 @default.
- W2156812330 cites W2118832297 @default.
- W2156812330 cites W2119076421 @default.
- W2156812330 cites W2120216513 @default.
- W2156812330 cites W2120461818 @default.
- W2156812330 cites W2121614221 @default.
- W2156812330 cites W2124864015 @default.
- W2156812330 cites W2127254134 @default.
- W2156812330 cites W2129112648 @default.
- W2156812330 cites W2132900222 @default.
- W2156812330 cites W2137186219 @default.
- W2156812330 cites W2155560762 @default.
- W2156812330 cites W2159630815 @default.
- W2156812330 cites W2167441249 @default.
- W2156812330 cites W243362898 @default.
- W2156812330 cites W266428288 @default.
- W2156812330 cites W28030013 @default.
- W2156812330 cites W311018198 @default.
- W2156812330 cites W313422068 @default.
- W2156812330 cites W3163022734 @default.
- W2156812330 cites W575889657 @default.
- W2156812330 cites W1984532694 @default.
- W2156812330 cites W2082964902 @default.
- W2156812330 cites W2084575960 @default.
- W2156812330 cites W2200814162 @default.
- W2156812330 doi "https://doi.org/10.5772/10136" @default.
- W2156812330 hasPublicationYear "2010" @default.
- W2156812330 type Work @default.
- W2156812330 sameAs 2156812330 @default.
- W2156812330 citedByCount "0" @default.
- W2156812330 crossrefType "book-chapter" @default.
- W2156812330 hasAuthorship W2156812330A5023873625 @default.
- W2156812330 hasAuthorship W2156812330A5059611394 @default.
- W2156812330 hasBestOaLocation W21568123301 @default.
- W2156812330 hasConcept C154945302 @default.
- W2156812330 hasConcept C2983470273 @default.
- W2156812330 hasConcept C31972630 @default.
- W2156812330 hasConcept C41008148 @default.