Matches in SemOpenAlex for { <https://semopenalex.org/work/W2015390216> ?p ?o ?g. }
Showing items 1 to 88 of 88, with 100 items per page.
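The listing below can also be retrieved programmatically. The following is a minimal sketch, assuming SemOpenAlex exposes a public SPARQL endpoint at https://semopenalex.org/sparql that accepts standard SELECT queries over HTTP and returns SPARQL JSON results; the endpoint URL and response handling are assumptions, not part of this listing.

```python
# Minimal sketch: fetch all predicate/object pairs for work W2015390216.
# The endpoint URL and the JSON result format are assumptions based on
# standard SPARQL-over-HTTP conventions, not taken from this listing.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint
QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W2015390216> ?p ?o .
}
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()

# Print each predicate/object pair, roughly mirroring the rows below.
for binding in resp.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```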
- W2015390216 abstract "Video retrieval is mostly based on using text from dialogue and this remains the most significant component, despite progress in other aspects. One problem with this arises when a searcher wants to locate video based on what is appearing in the video rather than what is being spoken about. Alternatives such as automatically detected features and image-based keyframe matching can be used, though these still need further improvement in quality. One other modality for video retrieval is based on segmenting objects from video and allowing end users to use these as part of querying. This uses similarity between query objects and objects from video, and in theory allows retrieval based on what is actually appearing on-screen. The main hurdles to greater use of this are the overhead of object segmentation on large amounts of video and the issue of whether we can actually achieve effective object-based retrieval. We describe a system to support object-based video retrieval where a user selects example video objects as part of the query. During a search a user builds up a set of these, which are matched against objects previously segmented from a video library. This match is based on MPEG-7 Dominant Colour, Shape Compaction and Texture Browsing descriptors. We use a user-driven semi-automated segmentation process to segment the video archive, which is very accurate and is faster than conventional video annotation." @default.
- W2015390216 created "2016-06-24" @default.
- W2015390216 creator A5006537336 @default.
- W2015390216 creator A5007592127 @default.
- W2015390216 creator A5055254350 @default.
- W2015390216 creator A5068280279 @default.
- W2015390216 creator A5081902636 @default.
- W2015390216 date "2005-10-23" @default.
- W2015390216 modified "2023-10-05" @default.
- W2015390216 title "Using video objects and relevance feedback in video retrieval" @default.
- W2015390216 cites W1485800236 @default.
- W2015390216 cites W1538510386 @default.
- W2015390216 cites W2010377559 @default.
- W2015390216 cites W2104830611 @default.
- W2015390216 cites W2107930181 @default.
- W2015390216 cites W2108411619 @default.
- W2015390216 cites W2112079408 @default.
- W2015390216 cites W2125148312 @default.
- W2015390216 cites W2131846894 @default.
- W2015390216 cites W2151575618 @default.
- W2015390216 cites W2161305861 @default.
- W2015390216 cites W2161765871 @default.
- W2015390216 cites W1560140077 @default.
- W2015390216 doi "https://doi.org/10.1117/12.629654" @default.
- W2015390216 hasPublicationYear "2005" @default.
- W2015390216 type Work @default.
- W2015390216 sameAs 2015390216 @default.
- W2015390216 citedByCount "6" @default.
- W2015390216 countsByYear W20153902162021 @default.
- W2015390216 crossrefType "proceedings-article" @default.
- W2015390216 hasAuthorship W2015390216A5006537336 @default.
- W2015390216 hasAuthorship W2015390216A5007592127 @default.
- W2015390216 hasAuthorship W2015390216A5055254350 @default.
- W2015390216 hasAuthorship W2015390216A5068280279 @default.
- W2015390216 hasAuthorship W2015390216A5081902636 @default.
- W2015390216 hasBestOaLocation W20153902162 @default.
- W2015390216 hasConcept C105795698 @default.
- W2015390216 hasConcept C106030495 @default.
- W2015390216 hasConcept C115961682 @default.
- W2015390216 hasConcept C154945302 @default.
- W2015390216 hasConcept C165064840 @default.
- W2015390216 hasConcept C1667742 @default.
- W2015390216 hasConcept C177264268 @default.
- W2015390216 hasConcept C199360897 @default.
- W2015390216 hasConcept C202474056 @default.
- W2015390216 hasConcept C23123220 @default.
- W2015390216 hasConcept C2775856596 @default.
- W2015390216 hasConcept C2779532271 @default.
- W2015390216 hasConcept C2781238097 @default.
- W2015390216 hasConcept C31972630 @default.
- W2015390216 hasConcept C33923547 @default.
- W2015390216 hasConcept C41008148 @default.
- W2015390216 hasConcept C89600930 @default.
- W2015390216 hasConceptScore W2015390216C105795698 @default.
- W2015390216 hasConceptScore W2015390216C106030495 @default.
- W2015390216 hasConceptScore W2015390216C115961682 @default.
- W2015390216 hasConceptScore W2015390216C154945302 @default.
- W2015390216 hasConceptScore W2015390216C165064840 @default.
- W2015390216 hasConceptScore W2015390216C1667742 @default.
- W2015390216 hasConceptScore W2015390216C177264268 @default.
- W2015390216 hasConceptScore W2015390216C199360897 @default.
- W2015390216 hasConceptScore W2015390216C202474056 @default.
- W2015390216 hasConceptScore W2015390216C23123220 @default.
- W2015390216 hasConceptScore W2015390216C2775856596 @default.
- W2015390216 hasConceptScore W2015390216C2779532271 @default.
- W2015390216 hasConceptScore W2015390216C2781238097 @default.
- W2015390216 hasConceptScore W2015390216C31972630 @default.
- W2015390216 hasConceptScore W2015390216C33923547 @default.
- W2015390216 hasConceptScore W2015390216C41008148 @default.
- W2015390216 hasConceptScore W2015390216C89600930 @default.
- W2015390216 hasLocation W20153902161 @default.
- W2015390216 hasLocation W20153902162 @default.
- W2015390216 hasOpenAccess W2015390216 @default.
- W2015390216 hasPrimaryLocation W20153902161 @default.
- W2015390216 hasRelatedWork W1487175407 @default.
- W2015390216 hasRelatedWork W1536471031 @default.
- W2015390216 hasRelatedWork W2015390216 @default.
- W2015390216 hasRelatedWork W2019566805 @default.
- W2015390216 hasRelatedWork W2099736636 @default.
- W2015390216 hasRelatedWork W2124913745 @default.
- W2015390216 hasRelatedWork W2383464976 @default.
- W2015390216 hasRelatedWork W2385949326 @default.
- W2015390216 hasRelatedWork W2789220062 @default.
- W2015390216 hasRelatedWork W1967061043 @default.
- W2015390216 isParatext "false" @default.
- W2015390216 isRetracted "false" @default.
- W2015390216 magId "2015390216" @default.
- W2015390216 workType "article" @default.
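The abstract above describes matching user-selected query objects against objects segmented from a video library using MPEG-7 Dominant Colour, Shape Compaction and Texture Browsing descriptors. The combination rule is not given in this record, so the sketch below is purely illustrative: it assumes each descriptor has already been reduced to a plain numeric feature vector and that per-descriptor Euclidean distances are combined as a weighted sum (the VideoObject fields, the default weights and the distance measure are all assumptions, not the paper's method).

```python
# Illustrative sketch only (not the paper's actual algorithm): rank library
# objects by a weighted sum of per-descriptor distances between a query
# object and each segmented video object. Reducing each MPEG-7 descriptor
# to a numeric vector and using equal default weights are assumptions.
from dataclasses import dataclass
from math import dist

@dataclass
class VideoObject:
    object_id: str
    dominant_colour: list[float]  # assumed numeric summary of the descriptor
    shape: list[float]
    texture: list[float]

def descriptor_distance(a: VideoObject, b: VideoObject,
                        weights=(1.0, 1.0, 1.0)) -> float:
    """Weighted sum of per-descriptor Euclidean distances (illustrative)."""
    w_col, w_shape, w_tex = weights
    return (w_col * dist(a.dominant_colour, b.dominant_colour)
            + w_shape * dist(a.shape, b.shape)
            + w_tex * dist(a.texture, b.texture))

def rank_objects(query: VideoObject, library: list[VideoObject],
                 weights=(1.0, 1.0, 1.0)) -> list[tuple[str, float]]:
    """Return library object IDs sorted by ascending distance to the query."""
    scored = [(obj.object_id, descriptor_distance(query, obj, weights))
              for obj in library]
    return sorted(scored, key=lambda item: item[1])
```

A search front end could presumably call rank_objects once per relevance-feedback iteration with updated weights, but how the described system actually re-weights descriptors between iterations is not stated in this record.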