
计算机工程 (Computer Engineering)



Recommendation Research Combining Node Similarity Metrics with Hypergraph Attention Mechanism

  • Published:2025-09-09


Abstract: To address the limitations of traditional Graph Neural Networks (GNNs) in modeling higher-order relationships and multi-way interactions, this paper proposes a novel heterogeneous hypergraph recommendation model, termed HNSGCN, which integrates node similarity associations with a hypergraph attention mechanism. Within this framework, users are abstracted as hyperedges and items as nodes. Leveraging contextual semantic features of both users and items, the model constructs user-user and item-item similarity matrices using cosine similarity and the Jaccard similarity coefficient, thereby transforming the conventional dyadic interaction network into a heterogeneous hypernetwork. Building on this hypergraph structure, the model incorporates hypergraph convolutional operations and a hierarchical attention mechanism, enabling adaptive aggregation of structural information across different levels. This allows the model to capture complex higher-order latent relationships between users and items and significantly improves recommendation accuracy. To validate the model's efficacy, comparative experiments were conducted on two real-world datasets, Amazon and Yelp-1K. Comparisons against multiple state-of-the-art recommendation baselines demonstrate that the proposed HNSGCN model achieves significantly superior performance on all three evaluation metrics: Recall@K, Precision@K, and NDCG@K. Furthermore, ablation studies confirm that both the incorporation of node similarity associations and the multi-layer attention aggregation mechanism play crucial roles in the model's performance gains.
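The abstract describes building user-user and item-item similarity matrices from cosine and Jaccard similarity before rewiring the bipartite interaction network into a hypernetwork. The paper's own implementation is not shown here; the following is a minimal NumPy sketch of how such matrices are commonly computed, assuming a dense feature matrix for cosine similarity and a binary interaction matrix for Jaccard similarity (both matrices and the toy data below are illustrative assumptions, not the authors' data).

```python
import numpy as np

def cosine_similarity_matrix(X):
    """Pairwise cosine similarity between rows of a feature matrix X.

    Illustrative assumption: rows are entities (users or items),
    columns are contextual/semantic features.
    """
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    norms[norms == 0] = 1.0          # guard against zero rows
    Xn = X / norms                   # row-normalize
    return Xn @ Xn.T                 # inner products of unit rows

def jaccard_similarity_matrix(B):
    """Pairwise Jaccard similarity between rows of a binary matrix B.

    Illustrative assumption: B[i, j] = 1 if entity i participated
    in interaction j (e.g. item i was bought by user j).
    """
    inter = B @ B.T                                  # |A ∩ B| per pair
    row_sums = B.sum(axis=1)
    union = row_sums[:, None] + row_sums[None, :] - inter  # |A ∪ B|
    union[union == 0] = 1.0          # guard against empty rows
    return inter / union

# Toy example: 3 items, each a binary vector over 4 users.
B = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 0, 1, 1]], dtype=float)
J = jaccard_similarity_matrix(B)     # item-item similarity matrix
C = cosine_similarity_matrix(B)      # cosine over the same rows
```

In a pipeline like the one the abstract sketches, entries of `J` or `C` above a threshold would add similarity edges between items (and likewise between users), densifying the dyadic network into the heterogeneous hypernetwork on which the hypergraph convolution and attention layers then operate.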