Multi-task Contrastive Learning Model With Semantic Enhancement for Sequential Recommendation
Graphical Abstract
Abstract
To address the data sparsity problem in sequential recommendation, a multi-task contrastive learning model with semantic enhancement for sequential recommendation (MCLM-SE4SRec) is proposed. The method uses multi-task joint training to combine two contrastive learning tasks with the recommendation task. The data-level augmentation contrastive learning task augments user interaction sequences while taking both item correlation and sequence length into account. The semantic-level clustering contrastive learning task mines users' latent semantic information through clustering to learn better sequence representations. In addition, a negative sample selection optimization strategy is applied to the data-level augmentation task: by identifying false negative samples, a more reasonable negative sample set is obtained and model performance is further improved. The model achieves excellent performance on three public datasets.
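As a concrete illustration of the multi-task joint training objective summarized above, the following minimal PyTorch-style sketch combines a recommendation loss with the two contrastive losses. The InfoNCE formulation, the weighting coefficients lambda_aug and lambda_sem, and all function names are illustrative assumptions, not the paper's exact definitions.

    # Minimal sketch (assumptions, not the paper's exact losses): the joint
    # objective adds two weighted contrastive terms to the recommendation loss.
    import torch
    import torch.nn.functional as F

    def info_nce(z1, z2, temperature=0.5):
        """Contrastive (InfoNCE) loss between two batches of sequence embeddings."""
        z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
        logits = z1 @ z2.t() / temperature      # pairwise similarities
        labels = torch.arange(z1.size(0))       # positives lie on the diagonal
        return F.cross_entropy(logits, labels)

    def joint_loss(rec_loss, z_aug1, z_aug2, z_seq, z_centroid,
                   lambda_aug=0.1, lambda_sem=0.1):
        """Recommendation loss plus the two contrastive objectives."""
        l_aug = info_nce(z_aug1, z_aug2)        # two data-level augmented views
        l_sem = info_nce(z_seq, z_centroid)     # sequence vs. its cluster centroid
        return rec_loss + lambda_aug * l_aug + lambda_sem * l_sem

    # Usage example with random embeddings (batch of 4, dimension 64)
    z = torch.randn(4, 64)
    loss = joint_loss(torch.tensor(1.0), z, torch.randn(4, 64), z, torch.randn(4, 64))

In a full implementation, the negative sample selection strategy described above would further mask out false negatives (e.g., other sequences judged too similar to the anchor) before computing the data-level contrastive term; that filtering step is omitted here for brevity.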