[1] Liao W, Yang D, Wang Y, et al. Fault diagnosis of power
transformers using graph convolutional network[J].
CSEE Journal of Power and Energy Systems, 2020, 7(2):
241-249.
[2] 罗文萱. 基于胶囊神经网络的电力变压器故障诊断方法研究[J]. 高压电器, 2024, 60(5): 92-98.
Luo Wenxuan. Research on fault diagnosis method of power transformer based on capsule networks[J]. High Voltage Apparatus, 2024, 60(5): 92-98.
[3] Chen H C, Zhang Y. Rethinking Shallow and Deep
Learning for Transformer Dissolved Gas Analysis: A
Review[J]. IEEE Transactions on Dielectrics and
Electrical Insulation, 2025.
[4] Patil M, Paramane A, Das S, et al. Hybrid algorithm for
dynamic fault prediction of HVDC converter transformer
using DGA data[J]. IEEE Transactions on Dielectrics and
Electrical Insulation, 2024.
[5] 毛业栋, 张春辉, 陈杰. 融合特征分析及机器学习的可
演进变压器故障诊断模型[J]. 计算机工程, 2024, 50(8):
379-388.
Mao Yedong, Zhang Chunhui, Chen Jie. Evolvable transformer fault diagnosis model combining feature analysis and machine learning[J]. Computer Engineering, 2024, 50(8): 379-388.
[6] Zhang X, Yang K. Transformer fault diagnosis method
based on MTF and GhostNet[J]. Measurement, 2025:
117056.
[7] Gouda O E, El‐Hoshy S H, EL‐Tamaly H H. Condition
assessment of power transformers based on dissolved gas
analysis[J]. IET Generation, Transmission & Distribution,
2019, 13(12): 2299-2310.
[8] 王宇, 祁琦, 王纯, 许才. 储能变流器信号高精度故障
诊断方法[J]. 计算机工程, 2024, 50(8): 389-396.
Wang Yu, Qi Qi, Wang Chun, Xu Cai. High-precision fault diagnosis method for energy storage inverter signals[J]. Computer Engineering, 2024, 50(8): 389-396.
[9] Chen H C, Zhang Y. Dissolved Gas Analysis Using
Knowledge-Filtered Oversampling-Based Diverse Stack
Learning[J]. IEEE Transactions on Instrumentation and
Measurement, 2025.
[10] Thote P B, Daigavane M B, Daigavane P M, et al. An
intelligent hybrid approach using KNN-GA to enhance the
performance of digital protection transformer scheme[J].
Canadian Journal of Electrical and Computer Engineering,
2017, 40(3): 151-161.
[11] Menezes A G C, Araujo M M, Almeida O M, et al.
Induction of decision trees to diagnose incipient faults in power transformers[J]. IEEE Transactions on Dielectrics and Electrical Insulation, 2022, 29(1): 279-286.
[12] Li K, Yang G, Wang K, et al. Fault diagnosis method of transformer based on WSO and SVM[C]//2023 IEEE 7th Conference on Energy Internet and Energy System Integration (EI2). 2023.
[13] Jin Y, Wu H, Zheng J, et al. Power transformer fault
diagnosis based on improved BP neural network[J].
Electronics, 2023, 12(16): 3526.
[14] Shu K, Ma H, Yang J, et al. GraphSmin: Imbalanced
dissolved gas analysis with contrastive dual-channel graph
filters[J]. Advanced Engineering Informatics, 2024, 62:
102839.
[15] Zhang Y, Ma H, Zhang D, et al. Graph Contrastive
Learning for Dissolved Gas Analysis[C]//International
Conference on Advanced Data Mining and Applications.
Singapore: Springer Nature Singapore, 2024: 178-190.
[16] Jin W, Ma H, Zhang Y, et al. Multi-view discriminative
edge heterophily contrastive learning network for
attributed graph anomaly detection[J]. Expert Systems
with Applications, 2024: 124460.
[17] Tian Y, Pei S, Zhang X, et al. Knowledge distillation on
graphs: A survey[J]. ACM Computing Surveys, 2023.
[18] Hinton G, Vinyals O, Dean J. Distilling the knowledge in a neural network[J]. arXiv preprint arXiv:1503.02531, 2015.
[19] Liu J, Ke W, Wang P, et al. Towards continual knowledge graph embedding via incremental distillation[C]//Proceedings of the AAAI Conference on Artificial Intelligence. 2024, 38(8): 8759-8768.
[20] Sun W, Chen D, Lyu S, et al. Knowledge distillation with
refined logits[J]. arXiv preprint arXiv:2408.07703, 2024.
[21] Wang Q, Zhou J. Multi-perspective Contrastive Logit
Distillation[J]. arXiv preprint arXiv:2411.10693, 2024.
[22] Tian Y, Xu S, Li M. Decoupled graph knowledge
distillation: A general logits-based method for learning
mlps on graphs[J]. Neural Networks, 2024, 179: 106567.
[23] Guo Z, Wang D, He Q, et al. Leveraging logit uncertainty
for better knowledge distillation[J]. Scientific Reports,
2024, 14(1): 31249.
[24] Xu L, Wang Z, Bai L, et al. Multi-Level Knowledge
Distillation with Positional Encoding Enhancement[J].
Pattern Recognition, 2025: 111458.
[25] Ma Y, Chen Y, Akata Z. Distilling knowledge from
self-supervised teacher by embedding graph alignment[J].
arXiv preprint arXiv:2211.13264, 2022.