
Computer Engineering ›› 2025, Vol. 51 ›› Issue (8): 120-130. doi: 10.19678/j.issn.1000-3428.0069739

• Artificial Intelligence and Pattern Recognition •

Personalized Forgetting Modeling for Knowledge Tracing via Transformers

ZHANG Zhaoli1, LI Jiahao1, LIU Hai1,2,*, SHI Fobo1, HE Jiawen1

  1. Faculty of Artificial Intelligence in Education, Central China Normal University, Wuhan 430000, Hubei, China
    2. Shenzhen Research Institute of Central China Normal University, Shenzhen 518000, Guangdong, China
  • Received:2024-04-15 Revised:2024-07-01 Online:2025-08-15 Published:2025-08-15
  • Contact: LIU Hai

  • Supported by:
    National Natural Science Foundation of China General Program (6247077114); National Natural Science Foundation of China General Program (62377037); National Natural Science Foundation of China General Program (62277041); National Natural Science Foundation of China General Program (62173286); National Natural Science Foundation of China General Program (62177019); National Natural Science Foundation of China General Program (62177018); Hubei Provincial Natural Science Foundation General Program (2022CFB971); Hubei Provincial Natural Science Foundation General Program (2022CFB529); Shenzhen Natural Science Foundation General Program (JCYJ20230807152900001); Jiangxi Provincial Natural Science Foundation Youth Program (20242BAB2S107); Jiangxi Provincial Natural Science Foundation Youth Program (20232BAB212026); Jiangxi Provincial Higher Education Teaching Reform Research Project (JXJG-23-27-6); Hubei Provincial Natural Science Foundation Innovation and Development Joint Fund (2025AFD621); Guangdong Basic and Applied Basic Research Foundation (2025A1515010266); Fundamental Research Funds for the Central Universities, Central China Normal University (CCNU25ai012)

Abstract:

Traditional Knowledge Tracing (KT) models struggle to capture changes in learners' knowledge states over long interaction sequences. Attention-based models, represented by the Transformer, can capture latent information in learners' long interaction sequences and have shown good performance. However, when modeling the learning process, existing models often ignore differences in learners' abilities and focus mainly on the accumulation of knowledge mastery, failing to adequately model the forgetting effect. This study proposes a Knowledge Tracing method based on Personalized Forgetting modeling (PFKT), which models learners' answering ability by introducing additional feature information and further explores learners' differentiated memory and forgetting abilities. Specifically, the method starts from learners' historical interaction sequences and jointly considers the acquisition and forgetting of knowledge points to capture learners' true knowledge mastery states. Combined with the additional feature information, it models the personalized forgetting phenomenon more accurately. Experimental results demonstrate that PFKT outperforms existing models on the ASSISTments2017 and Algebra 2005-2006 datasets.
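The idea of combining attention over past interactions with a personalized forgetting decay can be sketched as follows. This is an illustrative reconstruction, not the paper's formulation: the function name `forgetting_aware_attention`, the exponential decay form `exp(-theta * gap)`, and the per-learner forgetting rate `theta` are all assumptions introduced for illustration.

```python
import numpy as np

def forgetting_aware_attention(q, k, v, time_gaps, theta=0.5):
    """Causal scaled dot-product attention with an exponential forgetting decay.

    q, k, v     : (T, d) query/key/value matrices over T interactions.
    time_gaps   : (T, T) matrix, time_gaps[i, j] = steps elapsed since step j when at step i.
    theta       : assumed per-learner forgetting rate; larger theta = faster forgetting.
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)            # (T, T) raw attention scores
    decay = np.exp(-theta * time_gaps)       # older interactions are discounted more
    # causal mask: a step may attend only to itself and earlier steps
    mask = np.tril(np.ones_like(scores, dtype=bool))
    scores = np.where(mask, scores + np.log(decay + 1e-12), -np.inf)
    # numerically stable softmax over each row
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                       # (T, d) forgetting-weighted summary

# Usage with a simple gap matrix (gap = difference in step indices):
rng = np.random.default_rng(0)
T, d = 5, 8
q, k, v = (rng.normal(size=(T, d)) for _ in range(3))
gaps = np.maximum(np.arange(T)[:, None] - np.arange(T)[None, :], 0)
out = forgetting_aware_attention(q, k, v, gaps, theta=0.5)
```

Adding the decay in log space before the softmax keeps the weights a proper probability distribution while down-weighting distant interactions; a learnable per-learner `theta` is one way to make the forgetting personalized.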

Key words: Knowledge Tracing (KT), attention mechanism, forgetting modeling, Transformer, feature information
