Matches in SemOpenAlex for { <https://semopenalex.org/work/W2294979170> ?p ?o ?g. } (a minimal SPARQL sketch for reproducing this listing follows the results below):
- W2294979170 abstract "Words are polysemous. However, most approaches to representation learning for lexical semantics assign a single vector to every surface word type. Meanwhile, lexical ontologies such as WordNet provide a source of complementary knowledge to distributional information, including a word sense inventory. In this paper we propose two novel and general approaches for generating sense-specific word embeddings that are grounded in an ontology. The first applies graph smoothing as a postprocessing step to tease the vectors of different senses apart, and is applicable to any vector space model. The second adapts predictive maximum likelihood models that learn word embeddings with latent variables representing senses grounded in a specified ontology. Empirical results on lexical semantic tasks show that our approaches effectively capture information from both the ontology and distributional statistics. Moreover, in most cases our sense-specific models outperform other models we compare against." @default.
- W2294979170 created "2016-06-24" @default.
- W2294979170 creator A5060225743 @default.
- W2294979170 creator A5061131385 @default.
- W2294979170 creator A5076052920 @default.
- W2294979170 date "2015-01-01" @default.
- W2294979170 modified "2023-10-13" @default.
- W2294979170 title "Ontologically Grounded Multi-sense Representation Learning for Semantic Vector Space Models" @default.
- W2294979170 cites W124595812 @default.
- W2294979170 cites W1503259811 @default.
- W2294979170 cites W1532325895 @default.
- W2294979170 cites W1567365482 @default.
- W2294979170 cites W1731542712 @default.
- W2294979170 cites W1973942085 @default.
- W2294979170 cites W1983578042 @default.
- W2294979170 cites W2038084848 @default.
- W2294979170 cites W2053921957 @default.
- W2294979170 cites W2077428231 @default.
- W2294979170 cites W2080100102 @default.
- W2294979170 cites W2081580037 @default.
- W2294979170 cites W2100062901 @default.
- W2294979170 cites W2101293500 @default.
- W2294979170 cites W2103318667 @default.
- W2294979170 cites W2108061274 @default.
- W2294979170 cites W2112184938 @default.
- W2294979170 cites W2117130368 @default.
- W2294979170 cites W2120861206 @default.
- W2294979170 cites W2137607259 @default.
- W2294979170 cites W2139823104 @default.
- W2294979170 cites W2141365610 @default.
- W2294979170 cites W2141599568 @default.
- W2294979170 cites W2143645432 @default.
- W2294979170 cites W2147152072 @default.
- W2294979170 cites W2162456950 @default.
- W2294979170 cites W2164019165 @default.
- W2294979170 cites W2250930514 @default.
- W2294979170 cites W2251044566 @default.
- W2294979170 cites W2251762914 @default.
- W2294979170 cites W2251797829 @default.
- W2294979170 cites W2251997481 @default.
- W2294979170 cites W2950577311 @default.
- W2294979170 cites W2951193962 @default.
- W2294979170 cites W2962689487 @default.
- W2294979170 cites W2962769333 @default.
- W2294979170 cites W2963056604 @default.
- W2294979170 cites W60571609 @default.
- W2294979170 doi "https://doi.org/10.3115/v1/n15-1070" @default.
- W2294979170 hasPublicationYear "2015" @default.
- W2294979170 type Work @default.
- W2294979170 sameAs 2294979170 @default.
- W2294979170 citedByCount "93" @default.
- W2294979170 countsByYear W22949791702015 @default.
- W2294979170 countsByYear W22949791702016 @default.
- W2294979170 countsByYear W22949791702017 @default.
- W2294979170 countsByYear W22949791702018 @default.
- W2294979170 countsByYear W22949791702019 @default.
- W2294979170 countsByYear W22949791702020 @default.
- W2294979170 countsByYear W22949791702021 @default.
- W2294979170 countsByYear W22949791702022 @default.
- W2294979170 countsByYear W22949791702023 @default.
- W2294979170 crossrefType "proceedings-article" @default.
- W2294979170 hasAuthorship W2294979170A5060225743 @default.
- W2294979170 hasAuthorship W2294979170A5061131385 @default.
- W2294979170 hasAuthorship W2294979170A5076052920 @default.
- W2294979170 hasBestOaLocation W22949791701 @default.
- W2294979170 hasConcept C111472728 @default.
- W2294979170 hasConcept C130318100 @default.
- W2294979170 hasConcept C138885662 @default.
- W2294979170 hasConcept C154945302 @default.
- W2294979170 hasConcept C157659113 @default.
- W2294979170 hasConcept C17744445 @default.
- W2294979170 hasConcept C184337299 @default.
- W2294979170 hasConcept C199360897 @default.
- W2294979170 hasConcept C199539241 @default.
- W2294979170 hasConcept C204321447 @default.
- W2294979170 hasConcept C2524010 @default.
- W2294979170 hasConcept C25810664 @default.
- W2294979170 hasConcept C2776359362 @default.
- W2294979170 hasConcept C2778828372 @default.
- W2294979170 hasConcept C2780276568 @default.
- W2294979170 hasConcept C33923547 @default.
- W2294979170 hasConcept C41008148 @default.
- W2294979170 hasConcept C89686163 @default.
- W2294979170 hasConcept C90805587 @default.
- W2294979170 hasConcept C94625758 @default.
- W2294979170 hasConceptScore W2294979170C111472728 @default.
- W2294979170 hasConceptScore W2294979170C130318100 @default.
- W2294979170 hasConceptScore W2294979170C138885662 @default.
- W2294979170 hasConceptScore W2294979170C154945302 @default.
- W2294979170 hasConceptScore W2294979170C157659113 @default.
- W2294979170 hasConceptScore W2294979170C17744445 @default.
- W2294979170 hasConceptScore W2294979170C184337299 @default.
- W2294979170 hasConceptScore W2294979170C199360897 @default.
- W2294979170 hasConceptScore W2294979170C199539241 @default.
- W2294979170 hasConceptScore W2294979170C204321447 @default.
- W2294979170 hasConceptScore W2294979170C2524010 @default.
- W2294979170 hasConceptScore W2294979170C25810664 @default.
- W2294979170 hasConceptScore W2294979170C2776359362 @default.
- W2294979170 hasConceptScore W2294979170C2778828372 @default.
- W2294979170 hasConceptScore W2294979170C2780276568 @default.
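For readers who want to reproduce the listing above programmatically, here is a minimal sketch of issuing the quad pattern from the header as a SPARQL query in Python. It assumes (not stated in the listing) that SemOpenAlex exposes a public SPARQL 1.1 endpoint at https://semopenalex.org/sparql that returns standard JSON result bindings; the `?g` graph column, shown as `@default` on every row above, is omitted from the query for simplicity.

```python
import requests

# Assumption (not stated in the listing): SemOpenAlex exposes a public
# SPARQL 1.1 endpoint at this URL that returns JSON result bindings.
ENDPOINT = "https://semopenalex.org/sparql"

# The ?g (graph) column of the listing is @default for every row, so this
# sketch only asks for the predicate/object pairs of the work.
QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W2294979170> ?p ?o .
}
"""

def fetch_work_triples():
    """Return (predicate, object) pairs for work W2294979170."""
    resp = requests.get(
        ENDPOINT,
        params={"query": QUERY},
        headers={"Accept": "application/sparql-results+json"},
        timeout=30,
    )
    resp.raise_for_status()
    bindings = resp.json()["results"]["bindings"]
    # Each binding corresponds to one "- W2294979170 <predicate> <object>" row above.
    return [(b["p"]["value"], b["o"]["value"]) for b in bindings]

if __name__ == "__main__":
    for predicate, obj in fetch_work_triples():
        print(predicate, obj)
```

Requesting `application/sparql-results+json` keeps parsing to a single `resp.json()` call; if the endpoint URL differs in your deployment, only the `ENDPOINT` constant needs to change.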
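The abstract stored above describes two approaches; the first, graph smoothing applied as a postprocessing step to any vector space model, can be illustrated in a few lines. The sketch below is not the paper's exact objective (the listing only provides the abstract); it is a generic retrofitting-style smoothing pass over hypothetical inputs (`word_vecs`, `sense_to_word`, `sense_edges`), showing how ontology neighbourhoods can pull per-sense copies of a word vector apart.

```python
import numpy as np

def smooth_sense_vectors(word_vecs, sense_to_word, sense_edges,
                         alpha=1.0, beta=1.0, iterations=10):
    """Illustrative graph smoothing over an ontology's sense graph.

    Each sense vector starts as a copy of its surface word's vector and is
    repeatedly pulled toward a weighted average of that word vector and the
    vectors of ontologically related senses, so different senses of the
    same word drift apart.

    word_vecs:     dict word -> np.ndarray (pre-trained word embeddings)
    sense_to_word: dict sense_id -> surface word
    sense_edges:   dict sense_id -> list of neighbouring sense_ids
                   (e.g. WordNet relations)
    """
    # Initialise every sense vector from its word's distributional vector.
    sense_vecs = {s: word_vecs[w].copy() for s, w in sense_to_word.items()}

    for _ in range(iterations):
        updated = {}
        for s, neighbours in sense_edges.items():
            # Weighted average of the word's vector and neighbouring senses.
            total = alpha * word_vecs[sense_to_word[s]]
            weight = alpha
            for n in neighbours:
                total = total + beta * sense_vecs[n]
                weight += beta
            updated[s] = total / weight
        # Senses without ontology edges simply keep the word's vector.
        sense_vecs.update(updated)
    return sense_vecs
```

The `alpha`/`beta` weights trade off fidelity to the distributional word vector against agreement with ontology neighbours; both names are illustrative, not taken from the paper.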