TY - JOUR
T1 - A transformer–convolution model for enhanced session-based recommendation
AU - Wang, Jingjing
AU - Xie, Haoran
AU - Wang, Fu Lee
AU - Lee, Lap Kei
PY - 2023
Y1 - 2023
N2 - Session-based recommendation aims to predict a user's next action based on a series of anonymous sequences and plays an essential role in various online applications, such as e-commerce and music applications. Recently, transformer-based models have obtained results that are competitive with or even surpass those of recurrent neural networks, because of the good performance of transformer models in capturing long-distance dependencies. However, a transformer has a limited ability to mine local contextual information, which can be regarded as collective features. Researchers are seeking to address this limitation by augmenting the contextual transition to boost session representation learning. Accordingly, in this paper, we enhance the capabilities of a transformer in a session-based recommendation task by introducing convolutional neural networks (CNNs) at the stage of aggregating the item features with long- and short-distance dependencies. We first borrow a self-attention module from the classic transformer model to explore the long-distance dependencies. We next propose horizontal and vertical convolutions for enhancing the local collective information and then obtain a session representation by integrating the two types of features. Extensive experiments on real-world datasets show that our method outperforms those that rely on a transformer or a CNN alone.
KW - Convolutional neural network
KW - Recommender system
KW - Session-based recommendation
KW - Transformer
DO - 10.1016/j.neucom.2023.01.083
M3 - Article
SN - 0925-2312
VL - 531
SP - 21
EP - 33
JO - Neurocomputing
JF - Neurocomputing
ER -