
计算机工程 (Computer Engineering)


Robust Text Classification with Dynamic Low-Rank Attention

  • Published: 2026-04-02

Abstract: To address the performance degradation of efficient Transformer models on text classification under noise, this study proposes a robust and efficient classification method that combines dynamic low-rank attention with a dual-view consistency constraint. The method adaptively adjusts the attention rank according to the variance of the input features: semantically complex samples receive higher ranks to strengthen representation capacity, while simpler samples use lower ranks to keep computation near-linear, striking a dynamic balance between expressiveness and efficiency. During training, a dual-view consistency mechanism constructs clean and perturbed views of each text and enforces agreement between their semantic representations, suppressing noise-induced shifts of the decision boundary and further improving robustness. Systematic experiments on multiple Chinese and English text classification datasets, covering sentiment analysis, topic identification, and fine-grained emotion classification, show that the proposed method outperforms fixed-rank baselines in accuracy and remains more stable across a range of noise types and intensities. The study offers a new approach to efficient and robust text classification in complex noisy environments.
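The first component, variance-driven dynamic low-rank attention, can be illustrated with a minimal NumPy sketch. The variance-to-rank rule, the rank bounds, and the random sequence-compression matrix below are illustrative assumptions (in the style of Linformer-type key/value compression), not the paper's actual implementation:

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def select_rank(x, r_min=4, r_max=64, tau=1.0):
    # Hypothetical variance-to-rank rule: higher feature variance is
    # treated as "semantically complex" and mapped to a larger rank.
    frac = min(float(np.var(x)) / tau, 1.0)
    return int(round(r_min + frac * (r_max - r_min)))

def dynamic_low_rank_attention(x, Wq, Wk, Wv, seed=0):
    # x: (n, d) token features. Keys and values are compressed along
    # the sequence axis to r rows, so the score matrix is (n, r)
    # rather than (n, n), giving O(n * r * d) cost.
    n, d = x.shape
    r = select_rank(x)
    E = np.random.default_rng(seed).standard_normal((r, n)) / np.sqrt(n)
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    K_r, V_r = E @ K, E @ V                  # (r, d) each
    A = softmax(Q @ K_r.T / np.sqrt(d))      # (n, r) attention weights
    return A @ V_r, r                        # (n, d) output, chosen rank
```

A low-variance (simple) input thus hits the `r_min` floor and pays near-linear cost, while a high-variance input is allotted the full `r_max` rank.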
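The second component, the dual-view consistency constraint, can likewise be sketched as a training loss: cross-entropy on the clean view plus a term that pulls the predictive distributions of the clean and perturbed views together. The symmetric-KL choice and the weight `lam` are assumptions for illustration; the paper does not specify its exact loss form here:

```python
import numpy as np

def _softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def dual_view_loss(logits_clean, logits_pert, labels, lam=0.5, eps=1e-8):
    # Cross-entropy on the clean view plus a symmetric-KL consistency
    # term between the two views' class distributions. `lam` trades off
    # task fit against robustness to the perturbation.
    p, q = _softmax(logits_clean), _softmax(logits_pert)
    n = len(labels)
    ce = -np.mean(np.log(p[np.arange(n), labels] + eps))
    kl_pq = np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)
    kl_qp = np.sum(q * (np.log(q + eps) - np.log(p + eps)), axis=-1)
    consistency = np.mean(0.5 * (kl_pq + kl_qp))
    return ce + lam * consistency
```

When the two views agree exactly the consistency term vanishes and the loss reduces to plain cross-entropy; any noise-induced drift in the perturbed view's predictions is penalized, which is the intended regularizing effect.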