
Computer Engineering ›› 2026, Vol. 52 ›› Issue (4): 163-175. doi: 10.19678/j.issn.1000-3428.0070143

• Computational Intelligence and Pattern Recognition •

XSGCL: A Lightweight Graph Contrastive Learning Framework for Recommendation

ZHANG Zhen1, YOU Lan1, PENG Qingxi2,*(), JIN Hong1, ZENG Haoqiu1, XIA Yuchun1   

  1. School of Computer Science and Information Engineering, Hubei University, Wuhan 430062, Hubei, China
    2. School of Information Engineering, Wuhan College, Wuhan 430212, Hubei, China
  • Received:2024-07-17 Revised:2024-10-11 Online:2026-04-15 Published:2024-11-25
  • Contact: PENG Qingxi

  • About the authors:

    ZHANG Zhen, male, master's student; research interests: recommendation systems and deep learning

    YOU Lan, professor, Ph.D.

    PENG Qingxi (CCF member, corresponding author), professor, Ph.D.

    JIN Hong, lecturer, Ph.D.

    ZENG Haoqiu, master's student

    XIA Yuchun, master's student

  • Supported by:
    Key Research and Development Program of Hubei Province (2022BAA044); Outstanding Young and Middle-aged Science and Technology Innovation Team Project of Hubei Provincial Universities (T2022055)

Abstract:

Traditional recommendation models based on contrastive learning first perform data augmentation on the original interaction graph and then strive to improve the consistency of representations encoded from different views. Although this approach has proven effective, recent research has found that graph augmentation often introduces bias owing to the power-law distribution of node edges in graph data, and such biases are detrimental to contrastive learning. In addition, perturbing the graph structure makes processing large-scale datasets computationally intensive, limiting the flexibility of contrastive-learning-based models. To address these challenges, this study proposes a High-Low Variance Separation feature enhancement method (HLVS), which not only avoids direct perturbations to the graph structure but also alleviates the semantic bias problem present in traditional feature perturbation methods. Simultaneously, to mitigate popularity bias in recommendation systems, popularity metrics are introduced into the main task, and a new loss function, the Popularity Bayesian Personalized Ranking (PBPR) loss, is designed to balance the representations of popular and unpopular nodes. Finally, by integrating contrastive learning, HLVS, and PBPR, a lightweight and parameter-free graph contrastive learning framework, eXtremely Simple Graph Contrastive Learning (XSGCL), is designed, which can be naturally integrated into recommendation models to improve training efficiency and performance. Extensive experiments on five public datasets show that integrating XSGCL into LightGCN not only significantly improves training efficiency but also achieves performance better than or comparable to that of state-of-the-art models. For example, on the Yelp2018 dataset, the proposed model improves training efficiency by 91.2% compared with LightGCN; on the Alibaba-iFashion dataset, the Recall@10 and NDCG@10 metrics increase by 32.21% and 33.73%, respectively.
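The abstract does not give the exact form of HLVS, so the following is only an illustrative sketch of the general idea it describes: augmenting node features directly, rather than perturbing the graph, while treating embedding dimensions differently according to their variance. The function name `hlvs_augment`, the split ratio, and the choice to perturb only the high-variance dimensions are all assumptions for illustration, not the paper's actual method.

```python
import numpy as np

def hlvs_augment(emb, noise_scale=0.1, split_ratio=0.5, rng=None):
    """Illustrative sketch (not the paper's method): separate embedding
    dimensions by their variance across nodes, then add Gaussian noise
    only to the high-variance dimensions, leaving the low-variance
    dimensions untouched. The graph structure is never modified."""
    rng = np.random.default_rng(0) if rng is None else rng
    var = emb.var(axis=0)                      # per-dimension variance over all nodes
    k = int(emb.shape[1] * split_ratio)        # number of dims treated as high-variance
    high = np.argsort(var)[::-1][:k]           # indices of the k highest-variance dims
    out = emb.copy()
    out[:, high] += noise_scale * rng.standard_normal((emb.shape[0], k))
    return out
```

In a contrastive setup, two independent calls to such an augmentation would produce the two views whose representations are pulled together, avoiding any edge dropping or graph perturbation.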

Key words: recommendation system, contrastive learning, data augmentation, popularity bias, Graph Neural Network (GNN), collaborative filtering
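The abstract states only that PBPR introduces popularity metrics into the BPR ranking objective to balance popular and unpopular nodes; its precise formula is not given. As a hedged sketch of one plausible form, the standard BPR pairwise loss can be reweighted by the inverse popularity of the positive item, so that long-tail items contribute more strongly to the gradient. The weighting scheme and the exponent `gamma` below are assumptions for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def pbpr_loss(pos_scores, neg_scores, pos_pop, gamma=0.5):
    """Illustrative popularity-weighted BPR sketch (assumed form, not the
    paper's exact loss): the usual pairwise term -log(sigmoid(s_pos - s_neg))
    is scaled by an inverse-popularity weight, so pairs whose positive item
    is unpopular receive a larger loss. gamma controls the correction strength."""
    w = (1.0 / np.maximum(pos_pop, 1)) ** gamma        # inverse-popularity weight
    pairwise = -np.log(sigmoid(pos_scores - neg_scores) + 1e-10)
    return float(np.mean(w * pairwise))
```

With `gamma = 0`, the weight is constant and the loss reduces to plain BPR, which makes the popularity correction easy to ablate.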
