Transformer surrogate model based on attention and long short-term memory
JIN Liang1,2, FENG Yulin1, CAO Jiahao1, WANG Yanyang1
1. Tianjin Key Laboratory of Advanced Technology of Electrical Engineering and Energy, Tiangong University, Tianjin 300387, China; 2. State Key Laboratory of Reliability and Intelligence of Electrical Equipment, Hebei University of Technology, Tianjin 300130, China
Abstract: The design parameters and performance data of power transformers are highly complex, as they involve many factors such as energy conversion efficiency, noise, volume, and weight, so establishing a transformer surrogate model is a pressing problem. Surrogate-based optimization (SBO) can effectively address the long runtimes of direct optimization. In this paper, a deep learning surrogate model relating transformer design parameters to performance data is established to predict the transformer performance optimization objectives with high accuracy and to substantially reduce the time required for transformer performance analysis and optimization. First, a long short-term memory (LSTM) network is used to establish the nonlinear mapping between the parameters of an amorphous alloy transformer, and an attention mechanism is added to enhance the model's predictions. Finally, the proposed deep learning surrogate model is verified against finite element simulation experiments and compared with other commonly used surrogate models. The results show that the attention-LSTM surrogate model achieves superior prediction accuracy.
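To make the described architecture concrete, the following is a minimal sketch of an attention-augmented LSTM surrogate. PyTorch is assumed, and the layer sizes, feature count, number of targets, and the treatment of the design parameters as a one-step-per-parameter sequence are illustrative assumptions, not the paper's actual configuration:

```python
# Minimal sketch of an attention-LSTM surrogate model (assumptions: PyTorch,
# design parameters fed as a sequence, illustrative layer sizes).
import torch
import torch.nn as nn

class AttentionLSTMSurrogate(nn.Module):
    def __init__(self, n_features: int, hidden_size: int = 64, n_targets: int = 1):
        super().__init__()
        # LSTM learns a nonlinear mapping over the input parameter sequence
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        # Additive attention assigns a score to each step of the LSTM output
        self.attn = nn.Linear(hidden_size, 1)
        # Regression head maps the attended context to the performance targets
        self.head = nn.Linear(hidden_size, n_targets)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_features) -> treat each design parameter as one time step
        h, _ = self.lstm(x.unsqueeze(-1))       # (batch, n_features, hidden)
        w = torch.softmax(self.attn(h), dim=1)  # attention weights over steps
        context = (w * h).sum(dim=1)            # weighted sum of hidden states
        return self.head(context)               # predicted performance objectives

# Usage: a batch of 16 candidate designs with 8 parameters, 3 objectives each
model = AttentionLSTMSurrogate(n_features=8, n_targets=3)
pred = model(torch.randn(16, 8))                # pred shape: (16, 3)
```

Trained on parameter/performance pairs from finite element simulation, such a model can then stand in for the simulator inside an optimization loop, which is the role the abstract describes for SBO.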
JIN Liang, FENG Yulin, CAO Jiahao, WANG Yanyang. Transformer surrogate model based on attention and long short-term memory[J]. Electrical Engineering, 2021, 22(7): 65-71.