Matches in SemOpenAlex for { <https://semopenalex.org/work/W1538577929> ?p ?o ?g. }
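The listing below enumerates every stored triple whose subject is <https://semopenalex.org/work/W1538577929>. As a minimal sketch of how such a listing might be reproduced programmatically, the Python snippet below queries a SPARQL endpoint for the same quad pattern, expressed with a GRAPH clause in standard SPARQL. The endpoint URL (https://semopenalex.org/sparql) and the JSON result handling are assumptions for illustration, not part of the original page.

```python
# Sketch: reproduce the { <work> ?p ?o ?g } listing via a SemOpenAlex SPARQL endpoint.
# Endpoint URL and result handling are assumptions; the quad pattern from the page
# header is rewritten with a GRAPH clause for standard SPARQL.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint
QUERY = """
SELECT ?p ?o ?g WHERE {
  GRAPH ?g { <https://semopenalex.org/work/W1538577929> ?p ?o . }
}
LIMIT 100
"""

response = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},  # standard SPARQL 1.1 Protocol
    timeout=30,
)
response.raise_for_status()

for binding in response.json()["results"]["bindings"]:
    # Each row corresponds to one "- W1538577929 <predicate> <object> <graph>" line below.
    print(binding["p"]["value"], binding["o"]["value"], binding["g"]["value"])
```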
Showing items 1 to 100 of 100, with 100 items per page.
- W1538577929 abstract "Almost from the very beginning of the digital age, people have sought better ways to communicate with computers. This research investigates how computers might be enabled to understand natural language in a more humanlike way. Based, in part, on cognitive development in infants, we introduce an open computational framework for visual perception and grounded language acquisition called Experience-Based Language Acquisition (EBLA). EBLA can “watch” a series of short videos and acquire a simple language of nouns and verbs corresponding to the objects and object-object relations in those videos. Upon acquiring this protolanguage, EBLA can perform basic scene analysis to generate descriptions of novel videos. The general architecture of EBLA is comprised of three stages: vision processing, entity extraction, and lexical resolution. In the vision processing stage, EBLA processes the individual frames in short videos, using a variation of the mean shift analysis image segmentation algorithm to identify and store information about significant objects. In the entity extraction stage, EBLA abstracts information about the significant objects in each video and the relationships among those objects into internal representations called entities. Finally, in the lexical acquisition stage, EBLA extracts the individual lexemes (words) from simple descriptions of each video and attempts to generate entity-lexeme mappings using an inference technique called cross-situational learning. EBLA is not primed with a base lexicon, so it faces the task of bootstrapping its lexicon from scratch. The performance of EBLA has been evaluated based on acquisition speed and accuracy of scene descriptions. For a test set of simple animations, EBLA had average acquisition success rates as high as 100% and average description success rates as high as 96.7%. For a larger set of real videos, EBLA had average acquisition success rates as high as 95.8% and average description success rates as high as 65.3%. The lower description success rate for the videos is attributed to the wide variance in entities across the videos. While there have been several systems capable of learning object or event labels for videos, EBLA is the first known system to acquire both nouns and verbs using a grounded computer vision system." @default.
- W1538577929 created "2016-06-24" @default.
- W1538577929 creator A5009505287 @default.
- W1538577929 creator A5017008492 @default.
- W1538577929 date "2022-06-10" @default.
- W1538577929 modified "2023-10-18" @default.
- W1538577929 title "Experience-based language acquisition: a computational model of human language acquisition" @default.
- W1538577929 cites W1034657477 @default.
- W1538577929 cites W1484557201 @default.
- W1538577929 cites W1485243506 @default.
- W1538577929 cites W1494492530 @default.
- W1538577929 cites W1499050190 @default.
- W1538577929 cites W1522692919 @default.
- W1538577929 cites W1537671687 @default.
- W1538577929 cites W1574901103 @default.
- W1538577929 cites W1576541020 @default.
- W1538577929 cites W1579838312 @default.
- W1538577929 cites W1581719612 @default.
- W1538577929 cites W159378994 @default.
- W1538577929 cites W1604972505 @default.
- W1538577929 cites W1608429706 @default.
- W1538577929 cites W1704165438 @default.
- W1538577929 cites W1771552368 @default.
- W1538577929 cites W1964443764 @default.
- W1538577929 cites W1968211338 @default.
- W1538577929 cites W1976274322 @default.
- W1538577929 cites W1984314602 @default.
- W1538577929 cites W2008318777 @default.
- W1538577929 cites W2044929988 @default.
- W1538577929 cites W2053000838 @default.
- W1538577929 cites W2067191022 @default.
- W1538577929 cites W2071402670 @default.
- W1538577929 cites W2100317815 @default.
- W1538577929 cites W2100435576 @default.
- W1538577929 cites W2107917162 @default.
- W1538577929 cites W2108020239 @default.
- W1538577929 cites W2119171928 @default.
- W1538577929 cites W2119232785 @default.
- W1538577929 cites W2160783091 @default.
- W1538577929 cites W2160995891 @default.
- W1538577929 cites W2165934840 @default.
- W1538577929 cites W2167077256 @default.
- W1538577929 cites W2323385789 @default.
- W1538577929 cites W2432517183 @default.
- W1538577929 cites W2467565240 @default.
- W1538577929 cites W2800024569 @default.
- W1538577929 cites W3129857645 @default.
- W1538577929 doi "https://doi.org/10.31390/gradschool_dissertations.2162" @default.
- W1538577929 hasPublicationYear "2022" @default.
- W1538577929 type Work @default.
- W1538577929 sameAs 1538577929 @default.
- W1538577929 citedByCount "2" @default.
- W1538577929 countsByYear W15385779292022 @default.
- W1538577929 crossrefType "dissertation" @default.
- W1538577929 hasAuthorship W1538577929A5009505287 @default.
- W1538577929 hasAuthorship W1538577929A5017008492 @default.
- W1538577929 hasBestOaLocation W15385779291 @default.
- W1538577929 hasConcept C106159729 @default.
- W1538577929 hasConcept C121934690 @default.
- W1538577929 hasConcept C138885662 @default.
- W1538577929 hasConcept C154945302 @default.
- W1538577929 hasConcept C162324750 @default.
- W1538577929 hasConcept C204321447 @default.
- W1538577929 hasConcept C207609745 @default.
- W1538577929 hasConcept C2775837122 @default.
- W1538577929 hasConcept C2778121359 @default.
- W1538577929 hasConcept C2781238097 @default.
- W1538577929 hasConcept C28490314 @default.
- W1538577929 hasConcept C41008148 @default.
- W1538577929 hasConcept C41895202 @default.
- W1538577929 hasConceptScore W1538577929C106159729 @default.
- W1538577929 hasConceptScore W1538577929C121934690 @default.
- W1538577929 hasConceptScore W1538577929C138885662 @default.
- W1538577929 hasConceptScore W1538577929C154945302 @default.
- W1538577929 hasConceptScore W1538577929C162324750 @default.
- W1538577929 hasConceptScore W1538577929C204321447 @default.
- W1538577929 hasConceptScore W1538577929C207609745 @default.
- W1538577929 hasConceptScore W1538577929C2775837122 @default.
- W1538577929 hasConceptScore W1538577929C2778121359 @default.
- W1538577929 hasConceptScore W1538577929C2781238097 @default.
- W1538577929 hasConceptScore W1538577929C28490314 @default.
- W1538577929 hasConceptScore W1538577929C41008148 @default.
- W1538577929 hasConceptScore W1538577929C41895202 @default.
- W1538577929 hasLocation W15385779291 @default.
- W1538577929 hasOpenAccess W1538577929 @default.
- W1538577929 hasPrimaryLocation W15385779291 @default.
- W1538577929 hasRelatedWork W1985896407 @default.
- W1538577929 hasRelatedWork W1993243953 @default.
- W1538577929 hasRelatedWork W2005254050 @default.
- W1538577929 hasRelatedWork W2029044131 @default.
- W1538577929 hasRelatedWork W2080588850 @default.
- W1538577929 hasRelatedWork W2338796508 @default.
- W1538577929 hasRelatedWork W2402539320 @default.
- W1538577929 hasRelatedWork W2757280630 @default.
- W1538577929 hasRelatedWork W2902818737 @default.
- W1538577929 hasRelatedWork W125387651 @default.
- W1538577929 isParatext "false" @default.
- W1538577929 isRetracted "false" @default.
- W1538577929 magId "1538577929" @default.
- W1538577929 workType "dissertation" @default.
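The abstract quoted above names cross-situational learning as the inference technique EBLA uses to build entity-lexeme mappings: candidate meanings for each word are narrowed by intersecting the entities observed across the scenes in which that word occurs. The following is a generic, minimal sketch of that technique under toy data, not EBLA's actual implementation.

```python
# Generic cross-situational learning sketch (illustrative only, not EBLA's code):
# for each word, intersect the sets of entities present in every scene where the
# word appeared; a word is resolved once its candidate set shrinks to one entity.

# Hypothetical training data: (entities observed in a scene, words in its description).
scenes = [
    ({"ball", "hand", "pickup"}, ["hand", "picks", "up", "ball"]),
    ({"ball", "table", "roll"}, ["ball", "rolls", "on", "table"]),
    ({"cup", "hand", "pickup"}, ["hand", "picks", "up", "cup"]),
]

candidates = {}  # word -> set of entities still consistent with every exposure
for entities, words in scenes:
    for word in words:
        if word not in candidates:
            candidates[word] = set(entities)
        else:
            candidates[word] &= entities  # keep only entities seen in every scene with this word

resolved = {word: next(iter(cands)) for word, cands in candidates.items() if len(cands) == 1}
print(resolved)  # with this toy data, only "ball" narrows to a single entity so far
```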