
Computer Engineering (计算机工程)



Generalizable graph knowledge distillation for fault detection in oil-immersed power transformers

  • Published:2025-04-21


Abstract: Dissolved Gas Analysis (DGA) aims to identify faults in oil-immersed power transformers by monitoring the gases dissolved in the insulating oil. However, the effectiveness of current DGA methods is hindered by the scarcity of labeled data, resulting in sub-optimal performance. To address this limitation, this study introduces a novel Graph Knowledge Distillation approach (GKDG) to enhance the accuracy and efficiency of DGA. The approach employs a dual-perspective graph construction strategy to obtain additional supervision from sample neighborhoods, aggregating information directly from other samples through propagation. Furthermore, by distilling knowledge from a teacher Graph Neural Network (GNN) into a student GNN, the student model is equipped to capture and interpret the intricate relationships among dissolved gases. Diverse forms of knowledge are then introduced to align the student and teacher graphs in the embedding space, enhancing the learning capacity of the student model so that it benefits more effectively from the teacher. Extensive experiments confirm the effectiveness of this method in improving DGA performance, providing robust support for the maintenance and fault detection of power equipment.
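The abstract does not give the concrete form of GKDG's graph construction or distillation losses, but the two generic building blocks it names can be sketched. The following is a minimal NumPy illustration, under stated assumptions: samples are vectors of dissolved-gas concentrations, the neighborhood graph is a simple symmetric k-nearest-neighbour graph, propagation is mean-neighbour message passing, and the distillation objective is the common temperature-softened KL divergence between teacher and student logits. All function names and parameter choices here are hypothetical, not the paper's.

```python
import numpy as np

def knn_graph(X, k=2):
    """Symmetric k-nearest-neighbour adjacency over sample feature vectors.

    X: (n, d) array, e.g. per-sample dissolved-gas concentrations.
    """
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)              # a sample is not its own neighbour
    A = np.zeros_like(d)
    for i, nbrs in enumerate(np.argsort(d, axis=1)[:, :k]):
        A[i, nbrs] = 1.0
    return np.maximum(A, A.T)                # symmetrize

def propagate(A, H):
    """One round of mean-neighbour message passing with a self-loop."""
    A_hat = A + np.eye(len(A))
    return (A_hat @ H) / A_hat.sum(axis=1, keepdims=True)

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=1, keepdims=True)     # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def distill_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence between temperature-softened teacher and student outputs."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.mean(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=1)))
```

In this sketch, `propagate` is what lets a sample "aggregate information directly from other samples", and `distill_loss` is one standard way a student GNN can be pulled toward a teacher's predictions; the paper's actual graph views and alignment terms may differ.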