Matches in SemOpenAlex for { <https://semopenalex.org/work/W3117713602> ?p ?o ?g. }
- W3117713602 endingPage "424" @default.
- W3117713602 startingPage "409" @default.
- W3117713602 abstract "Transformer-based pre-trained language models (PLMs) have dramatically improved the state of the art in NLP across many tasks. This has led to substantial interest in analyzing the syntactic knowledge PLMs learn. Previous approaches to this question have been limited, mostly using test suites or probes. Here, we propose a novel fully unsupervised parsing approach that extracts constituency trees from PLM attention heads. We rank transformer attention heads based on their inherent properties, and create an ensemble of high-ranking heads to produce the final tree. Our method is adaptable to low-resource languages, as it does not rely on development sets, which can be expensive to annotate. Our experiments show that the proposed method often outperforms existing approaches when no development set is available. Our unsupervised parser can also be used as a tool to analyze the grammars PLMs learn implicitly. For this, we use the parse trees induced by our method to train a neural PCFG and compare it to a grammar derived from a human-annotated treebank." @default.
- W3117713602 created "2021-01-05" @default.
- W3117713602 creator A5009993083 @default.
- W3117713602 creator A5018370260 @default.
- W3117713602 creator A5043883660 @default.
- W3117713602 creator A5054936589 @default.
- W3117713602 date "2020-12-01" @default.
- W3117713602 modified "2023-09-27" @default.
- W3117713602 title "Heads-up! Unsupervised Constituency Parsing via Self-Attention Heads" @default.
- W3117713602 cites W1495446613 @default.
- W3117713602 cites W1502293651 @default.
- W3117713602 cites W1632114991 @default.
- W3117713602 cites W1970961429 @default.
- W3117713602 cites W2047706513 @default.
- W3117713602 cites W2152907450 @default.
- W3117713602 cites W2798569372 @default.
- W3117713602 cites W2888844359 @default.
- W3117713602 cites W2889260178 @default.
- W3117713602 cites W2910243263 @default.
- W3117713602 cites W2912206855 @default.
- W3117713602 cites W2914924671 @default.
- W3117713602 cites W2932376173 @default.
- W3117713602 cites W2932637973 @default.
- W3117713602 cites W2946359678 @default.
- W3117713602 cites W2949399644 @default.
- W3117713602 cites W2950784811 @default.
- W3117713602 cites W2963341956 @default.
- W3117713602 cites W2963403868 @default.
- W3117713602 cites W2963411763 @default.
- W3117713602 cites W2963580443 @default.
- W3117713602 cites W2963754491 @default.
- W3117713602 cites W2964303116 @default.
- W3117713602 cites W2965373594 @default.
- W3117713602 cites W2970120757 @default.
- W3117713602 cites W2970597249 @default.
- W3117713602 cites W2970862333 @default.
- W3117713602 cites W2971418718 @default.
- W3117713602 cites W2972324944 @default.
- W3117713602 cites W2972342261 @default.
- W3117713602 cites W2995856824 @default.
- W3117713602 cites W3023063853 @default.
- W3117713602 cites W3023593493 @default.
- W3117713602 cites W3034503989 @default.
- W3117713602 cites W3034763191 @default.
- W3117713602 cites W3035305735 @default.
- W3117713602 cites W3035390927 @default.
- W3117713602 cites W3099862735 @default.
- W3117713602 cites W3103427938 @default.
- W3117713602 cites W3118485687 @default.
- W3117713602 hasPublicationYear "2020" @default.
- W3117713602 type Work @default.
- W3117713602 sameAs 3117713602 @default.
- W3117713602 citedByCount "8" @default.
- W3117713602 countsByYear W31177136022020 @default.
- W3117713602 countsByYear W31177136022021 @default.
- W3117713602 countsByYear W31177136022022 @default.
- W3117713602 crossrefType "proceedings-article" @default.
- W3117713602 hasAuthorship W3117713602A5009993083 @default.
- W3117713602 hasAuthorship W3117713602A5018370260 @default.
- W3117713602 hasAuthorship W3117713602A5043883660 @default.
- W3117713602 hasAuthorship W3117713602A5054936589 @default.
- W3117713602 hasConcept C119857082 @default.
- W3117713602 hasConcept C121332964 @default.
- W3117713602 hasConcept C137293760 @default.
- W3117713602 hasConcept C154945302 @default.
- W3117713602 hasConcept C165801399 @default.
- W3117713602 hasConcept C186644900 @default.
- W3117713602 hasConcept C204321447 @default.
- W3117713602 hasConcept C206134035 @default.
- W3117713602 hasConcept C41008148 @default.
- W3117713602 hasConcept C53893814 @default.
- W3117713602 hasConcept C62520636 @default.
- W3117713602 hasConcept C66322947 @default.
- W3117713602 hasConceptScore W3117713602C119857082 @default.
- W3117713602 hasConceptScore W3117713602C121332964 @default.
- W3117713602 hasConceptScore W3117713602C137293760 @default.
- W3117713602 hasConceptScore W3117713602C154945302 @default.
- W3117713602 hasConceptScore W3117713602C165801399 @default.
- W3117713602 hasConceptScore W3117713602C186644900 @default.
- W3117713602 hasConceptScore W3117713602C204321447 @default.
- W3117713602 hasConceptScore W3117713602C206134035 @default.
- W3117713602 hasConceptScore W3117713602C41008148 @default.
- W3117713602 hasConceptScore W3117713602C53893814 @default.
- W3117713602 hasConceptScore W3117713602C62520636 @default.
- W3117713602 hasConceptScore W3117713602C66322947 @default.
- W3117713602 hasLocation W31177136021 @default.
- W3117713602 hasOpenAccess W3117713602 @default.
- W3117713602 hasPrimaryLocation W31177136021 @default.
- W3117713602 hasRelatedWork W2152346074 @default.
- W3117713602 hasRelatedWork W2250398693 @default.
- W3117713602 hasRelatedWork W2251314312 @default.
- W3117713602 hasRelatedWork W2252099096 @default.
- W3117713602 hasRelatedWork W2566475580 @default.
- W3117713602 hasRelatedWork W2612953412 @default.
- W3117713602 hasRelatedWork W2765558935 @default.
- W3117713602 hasRelatedWork W2948992961 @default.
- W3117713602 hasRelatedWork W2949399644 @default.
- W3117713602 hasRelatedWork W2963341956 @default.