A transformer–convolution model for enhanced session-based recommendation

Jingjing Wang, Haoran Xie, Fu Lee Wang, Lap Kei Lee

Research output: Contribution to journal › Article › peer-review

7 Citations (Scopus)

Abstract

Session-based recommendation aims to predict a user's next action from a sequence of anonymous interactions and plays an essential role in online applications such as e-commerce and music streaming. Recently, transformer-based models have achieved results competitive with, or even surpassing, those of recurrent neural networks, owing to the transformer's strength in capturing long-distance dependencies. However, a transformer has a limited ability to mine local contextual information, which can be regarded as collective features, and researchers have sought to address this limitation by augmenting contextual transitions to improve session representation learning. Accordingly, in this paper, we enhance the transformer for session-based recommendation by introducing convolutional neural networks (CNNs) at the stage where item features with long- and short-distance dependencies are aggregated. We first borrow the self-attention module from the classic transformer to capture long-distance dependencies. We then propose horizontal and vertical convolutions to enhance local collective information, and obtain the session representation by integrating the two types of features. Extensive experiments on real-world datasets show that our method outperforms approaches that rely on a transformer or a CNN alone.
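To make the two ingredients named in the abstract concrete, the following is a minimal NumPy sketch of (a) scaled dot-product self-attention over a session's item embeddings and (b) Caser-style horizontal and vertical convolutions whose outputs are concatenated into a session representation. All shapes, filter counts, pooling choices, and the use of a single filter per height are illustrative assumptions for exposition, not the paper's actual architecture or hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d = 5, 8                        # session length, embedding size (illustrative)
E = rng.normal(size=(n, d))        # item embeddings for one session

def self_attention(X):
    """Scaled dot-product self-attention: captures long-distance dependencies."""
    Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)   # row-wise softmax
    return weights @ V                              # (n, d)

def horizontal_conv(X, heights=(2, 3)):
    """Horizontal filters span the full embedding width and slide over time,
    picking up local sequential (collective) patterns; max-pooled over time.
    One filter per height here for brevity (a real model would use many)."""
    feats = []
    for h in heights:
        F = rng.normal(size=(h, d))
        vals = [np.sum(X[t:t + h] * F) for t in range(n - h + 1)]
        feats.append(max(vals))
    return np.array(feats)                          # (len(heights),)

def vertical_conv(X):
    """Vertical filter spans the full session length and slides over
    embedding dimensions: a learned weighted sum of the item embeddings."""
    w = rng.normal(size=(n,))
    return w @ X                                    # (d,)

A = self_attention(E)
# Integrate long-distance (last attended position) and local CNN features
s = np.concatenate([A[-1], horizontal_conv(E), vertical_conv(E)])
```

The concatenated vector `s` stands in for the session representation; in the actual model the aggregation of the two feature types would be learned rather than a plain concatenation.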
Original language: English
Pages (from-to): 21-33
Number of pages: 13
Journal: Neurocomputing
Volume: 531
DOIs
Publication status: Published - 2023

Keywords

  • Convolutional neural network
  • Recommender system
  • Session-based recommendation
  • Transformer
