Matches in SemOpenAlex for { <https://semopenalex.org/work/W3215788502> ?p ?o ?g. }
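A listing like the one below can be reproduced against SemOpenAlex's public SPARQL endpoint (assumed here to be https://semopenalex.org/sparql). The sketch below uses Python's SPARQLWrapper library; since every match in this listing comes from the default graph (`@default`), the graph variable `?g` from the quad pattern is dropped and only the predicate and object are selected.

```python
# Minimal sketch: fetch all (?p, ?o) pairs for the work in this listing.
# The endpoint URL and result handling are assumptions, not taken from the listing itself.
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("https://semopenalex.org/sparql")
endpoint.setQuery("""
    SELECT ?p ?o WHERE {
      <https://semopenalex.org/work/W3215788502> ?p ?o .
    }
""")
endpoint.setReturnFormat(JSON)

results = endpoint.query().convert()
for binding in results["results"]["bindings"]:
    # Each printed row corresponds to one "W3215788502 <predicate> <object>" line below.
    print(binding["p"]["value"], binding["o"]["value"])
```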
- W3215788502 abstract "The point cloud learning community witnesses a modeling shift from CNNs to Transformers, where pure Transformer architectures have achieved top accuracy on the major learning benchmarks. However, existing point Transformers are computationally expensive since they need to generate a large attention map, which has quadratic complexity (both in space and time) with respect to input size. To solve this shortcoming, we introduce Patch ATtention (PAT) to adaptively learn a much smaller set of bases upon which the attention maps are computed. By a weighted summation upon these bases, PAT not only captures the global shape context but also achieves linear complexity to input size. In addition, we propose a lightweight Multi-Scale aTtention (MST) block to build attentions among features of different scales, providing the model with multi-scale features. Equipped with the PAT and MST, we construct our neural architecture called PatchFormer that integrates both modules into a joint framework for point cloud learning. Extensive experiments demonstrate that our network achieves comparable accuracy on general point cloud learning tasks with 9.2x speed-up than previous point Transformers." @default.
- W3215788502 created "2021-12-06" @default.
- W3215788502 creator A5003096607 @default.
- W3215788502 creator A5003800748 @default.
- W3215788502 creator A5039878333 @default.
- W3215788502 creator A5091153454 @default.
- W3215788502 date "2021-10-30" @default.
- W3215788502 modified "2023-09-27" @default.
- W3215788502 title "PatchFormer: An Efficient Point Transformer with Patch Attention" @default.
- W3215788502 cites W1920022804 @default.
- W3215788502 cites W2553307952 @default.
- W3215788502 cites W2560609797 @default.
- W3215788502 cites W2612445135 @default.
- W3215788502 cites W2798777114 @default.
- W3215788502 cites W2902302021 @default.
- W3215788502 cites W2911489562 @default.
- W3215788502 cites W2942498895 @default.
- W3215788502 cites W2960986959 @default.
- W3215788502 cites W2962731536 @default.
- W3215788502 cites W2963046128 @default.
- W3215788502 cites W2963121255 @default.
- W3215788502 cites W2963125977 @default.
- W3215788502 cites W2963158438 @default.
- W3215788502 cites W2963341956 @default.
- W3215788502 cites W2963403868 @default.
- W3215788502 cites W2963509914 @default.
- W3215788502 cites W2963719584 @default.
- W3215788502 cites W2963727135 @default.
- W3215788502 cites W2963925437 @default.
- W3215788502 cites W2964110616 @default.
- W3215788502 cites W2970597249 @default.
- W3215788502 cites W2971230666 @default.
- W3215788502 cites W2979750740 @default.
- W3215788502 cites W2981199548 @default.
- W3215788502 cites W2990613095 @default.
- W3215788502 cites W2990775046 @default.
- W3215788502 cites W2996167479 @default.
- W3215788502 cites W3034239841 @default.
- W3215788502 cites W3034314779 @default.
- W3215788502 cites W3044594905 @default.
- W3215788502 cites W3119786062 @default.
- W3215788502 cites W3153465022 @default.
- W3215788502 cites W3171215128 @default.
- W3215788502 cites W3176258108 @default.
- W3215788502 cites W3179869055 @default.
- W3215788502 cites W3190216403 @default.
- W3215788502 cites W3202053489 @default.
- W3215788502 cites W3202406646 @default.
- W3215788502 cites W3203606893 @default.
- W3215788502 cites W3203701986 @default.
- W3215788502 doi "https://doi.org/10.48550/arxiv.2111.00207" @default.
- W3215788502 hasPublicationYear "2021" @default.
- W3215788502 type Work @default.
- W3215788502 sameAs 3215788502 @default.
- W3215788502 citedByCount "0" @default.
- W3215788502 crossrefType "posted-content" @default.
- W3215788502 hasAuthorship W3215788502A5003096607 @default.
- W3215788502 hasAuthorship W3215788502A5003800748 @default.
- W3215788502 hasAuthorship W3215788502A5039878333 @default.
- W3215788502 hasAuthorship W3215788502A5091153454 @default.
- W3215788502 hasBestOaLocation W32157885021 @default.
- W3215788502 hasConcept C11413529 @default.
- W3215788502 hasConcept C119599485 @default.
- W3215788502 hasConcept C123657996 @default.
- W3215788502 hasConcept C127413603 @default.
- W3215788502 hasConcept C129844170 @default.
- W3215788502 hasConcept C131979681 @default.
- W3215788502 hasConcept C142362112 @default.
- W3215788502 hasConcept C153349607 @default.
- W3215788502 hasConcept C154945302 @default.
- W3215788502 hasConcept C165801399 @default.
- W3215788502 hasConcept C2524010 @default.
- W3215788502 hasConcept C33923547 @default.
- W3215788502 hasConcept C41008148 @default.
- W3215788502 hasConcept C66322947 @default.
- W3215788502 hasConcept C80444323 @default.
- W3215788502 hasConceptScore W3215788502C11413529 @default.
- W3215788502 hasConceptScore W3215788502C119599485 @default.
- W3215788502 hasConceptScore W3215788502C123657996 @default.
- W3215788502 hasConceptScore W3215788502C127413603 @default.
- W3215788502 hasConceptScore W3215788502C129844170 @default.
- W3215788502 hasConceptScore W3215788502C131979681 @default.
- W3215788502 hasConceptScore W3215788502C142362112 @default.
- W3215788502 hasConceptScore W3215788502C153349607 @default.
- W3215788502 hasConceptScore W3215788502C154945302 @default.
- W3215788502 hasConceptScore W3215788502C165801399 @default.
- W3215788502 hasConceptScore W3215788502C2524010 @default.
- W3215788502 hasConceptScore W3215788502C33923547 @default.
- W3215788502 hasConceptScore W3215788502C41008148 @default.
- W3215788502 hasConceptScore W3215788502C66322947 @default.
- W3215788502 hasConceptScore W3215788502C80444323 @default.
- W3215788502 hasLocation W32157885021 @default.
- W3215788502 hasOpenAccess W3215788502 @default.
- W3215788502 hasPrimaryLocation W32157885021 @default.
- W3215788502 hasRelatedWork W1819952937 @default.
- W3215788502 hasRelatedWork W1887934433 @default.
- W3215788502 hasRelatedWork W2127108126 @default.
- W3215788502 hasRelatedWork W2364531466 @default.
- W3215788502 hasRelatedWork W2383828164 @default.
- W3215788502 hasRelatedWork W2919233342 @default.
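
The abstract in this record describes Patch ATtention as computing attention against a much smaller, adaptively learned set of bases, so the cost grows linearly with the number of input points rather than quadratically. The sketch below illustrates that general idea only; it is not the authors' implementation, and all names, shapes, and the soft-assignment step are illustrative assumptions.

```python
# Minimal NumPy sketch of attention over a small learned set of bases,
# in the spirit of the Patch ATtention described in the abstract above.
# NOT the paper's implementation; every name and shape here is an assumption.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def basis_attention(points, w_q, w_k, w_v, basis_weights):
    """Attention computed against M bases instead of all N points.

    points:        (N, d)  input point features
    w_q, w_k, w_v: (d, d)  illustrative linear projections
    basis_weights: (M, N)  soft assignment of points to M << N bases
    Cost is O(N * M * d) rather than the O(N^2 * d) of full self-attention.
    """
    bases = softmax(basis_weights, axis=-1) @ points          # (M, d) aggregated bases
    q = points @ w_q                                          # (N, d)
    k = bases @ w_k                                           # (M, d)
    v = bases @ w_v                                           # (M, d)
    attn = softmax(q @ k.T / np.sqrt(q.shape[-1]), axis=-1)   # (N, M) attention map
    return attn @ v                                           # (N, d) weighted summation

# Toy usage: 1024 points, 64-dim features, 32 bases.
rng = np.random.default_rng(0)
N, d, M = 1024, 64, 32
out = basis_attention(rng.normal(size=(N, d)),
                      rng.normal(size=(d, d)), rng.normal(size=(d, d)),
                      rng.normal(size=(d, d)), rng.normal(size=(M, N)))
print(out.shape)  # (1024, 64)
```

With a fixed number of bases (M = 32 in the toy usage), the attention map has shape N x M rather than N x N, which is where the abstract's claim of linear complexity in input size comes from.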