Matches in SemOpenAlex for { <https://semopenalex.org/work/W4293232189> ?p ?o ?g. }
Showing items 1 to 43 of 43, with 100 items per page.
- W4293232189 abstract "Sequential recommendation models the dynamics of a user's previous behaviors in order to forecast the next item, and has drawn a lot of attention. Transformer-based approaches, which embed items as vectors and use dot-product self-attention to measure the relationship between items, demonstrate superior capabilities among existing sequential methods. However, users' real-world sequential behaviors are uncertain rather than deterministic, posing a significant challenge to present techniques. We further suggest that dot-product-based approaches cannot fully capture collaborative transitivity, which can be derived from item-item transitions inside sequences and is beneficial for cold start items. We further argue that BPR loss has no constraint on positive and sampled negative items, which misleads the optimization. We propose a novel STOchastic Self-Attention (STOSA) to overcome these issues. STOSA, in particular, embeds each item as a stochastic Gaussian distribution, the covariance of which encodes the uncertainty. We devise a novel Wasserstein Self-Attention module to characterize item-item position-wise relationships in sequences, which effectively incorporates uncertainty into model training. Wasserstein attention also facilitates collaborative transitivity learning as it satisfies the triangle inequality. Moreover, we introduce a novel regularization term to the ranking loss, which ensures the dissimilarity between the positive and negative items. Extensive experiments on five real-world benchmark datasets demonstrate the superiority of the proposed model over state-of-the-art baselines, especially on cold start items. The code is available at https://github.com/zfan20/STOSA." @default.
- W4293232189 created "2022-08-27" @default.
- W4293232189 creator A5002591086 @default.
- W4293232189 creator A5017213607 @default.
- W4293232189 creator A5036357902 @default.
- W4293232189 creator A5055127234 @default.
- W4293232189 creator A5060335470 @default.
- W4293232189 creator A5069093595 @default.
- W4293232189 creator A5090233891 @default.
- W4293232189 date "2022-01-16" @default.
- W4293232189 modified "2023-09-26" @default.
- W4293232189 title "Sequential Recommendation via Stochastic Self-Attention" @default.
- W4293232189 doi "https://doi.org/10.48550/arxiv.2201.06035" @default.
- W4293232189 hasPublicationYear "2022" @default.
- W4293232189 type Work @default.
- W4293232189 citedByCount "0" @default.
- W4293232189 crossrefType "posted-content" @default.
- W4293232189 hasAuthorship W4293232189A5002591086 @default.
- W4293232189 hasAuthorship W4293232189A5017213607 @default.
- W4293232189 hasAuthorship W4293232189A5036357902 @default.
- W4293232189 hasAuthorship W4293232189A5055127234 @default.
- W4293232189 hasAuthorship W4293232189A5060335470 @default.
- W4293232189 hasAuthorship W4293232189A5069093595 @default.
- W4293232189 hasAuthorship W4293232189A5090233891 @default.
- W4293232189 hasBestOaLocation W42932321891 @default.
- W4293232189 hasConcept C41008148 @default.
- W4293232189 hasConceptScore W4293232189C41008148 @default.
- W4293232189 hasLocation W42932321891 @default.
- W4293232189 hasOpenAccess W4293232189 @default.
- W4293232189 hasPrimaryLocation W42932321891 @default.
- W4293232189 hasRelatedWork W1509640521 @default.
- W4293232189 hasRelatedWork W1521701603 @default.
- W4293232189 hasRelatedWork W1536489457 @default.
- W4293232189 hasRelatedWork W1562941967 @default.
- W4293232189 hasRelatedWork W1737489018 @default.
- W4293232189 hasRelatedWork W2137757517 @default.
- W4293232189 hasRelatedWork W2965938919 @default.
- W4293232189 hasRelatedWork W2990311505 @default.
- W4293232189 hasRelatedWork W3107474891 @default.
- W4293232189 hasRelatedWork W2523186608 @default.
- W4293232189 isParatext "false" @default.
- W4293232189 isRetracted "false" @default.
- W4293232189 workType "article" @default.
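The listing above is the result of the triple pattern in the header. A minimal sketch of retrieving the same predicate/object pairs programmatically is shown below; it assumes SemOpenAlex's public SPARQL endpoint at https://semopenalex.org/sparql (the endpoint URL and the `fetch_triples` helper are illustrative, not part of this dump).

```python
# Hypothetical sketch: query all triples about work W4293232189 from the
# SemOpenAlex SPARQL endpoint, mirroring { <...W4293232189> ?p ?o . }.
# The ENDPOINT URL is an assumption about the public service.
import json
import urllib.parse
import urllib.request

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint
WORK = "https://semopenalex.org/work/W4293232189"

def build_query(work_uri: str) -> str:
    """Return a SPARQL SELECT over all predicate/object pairs of the work."""
    return f"SELECT ?p ?o WHERE {{ <{work_uri}> ?p ?o . }}"

def fetch_triples(work_uri: str):
    """POST the query and decode SPARQL JSON results (requires network)."""
    data = urllib.parse.urlencode({"query": build_query(work_uri)}).encode()
    req = urllib.request.Request(
        ENDPOINT,
        data=data,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        bindings = json.load(resp)["results"]["bindings"]
    # Each binding maps variable names to {"type": ..., "value": ...} dicts.
    return [(b["p"]["value"], b["o"]["value"]) for b in bindings]

print(build_query(WORK))
```

Calling `fetch_triples(WORK)` would return (predicate, object) pairs corresponding to the rows listed above, e.g. the `title` and `hasAuthorship` entries.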