2024 article

BERT-PIN: A BERT-Based Framework for Recovering Missing Data Segments in Time-Series Load Profiles

Hu, Y., Ye, K., Kim, H., & Lu, N. (2024, July 1). IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS.


author keywords: Load modeling; Transformers; Encoding; Data models; Bidirectional control; Power systems; Adaptation models; Bidirectional encoder representations from transformers (BERT); conservation voltage reduction; machine learning; missing data restoration; power system; transformer
Source: Web Of Science
Added: July 17, 2024

Restoring missing data is of paramount importance in power system analysis. Traditional recovery methods typically offer only a single solution, lacking adaptability and depth. To bridge this gap, we introduce BERT-PIN, an approach that harnesses bidirectional encoder representations from transformers (BERT) for profile inpainting. The technique recovers multiple segments of missing data by leveraging power system load and temperature profiles. Our results show that BERT-PIN improves accuracy by 5%–30% over existing techniques and can restore numerous missing data segments across extended periods. We have applied BERT-PIN to two power system applications: recovering missing data segments and estimating conservation voltage reduction baselines. Serving as a versatile pretrained model, BERT-PIN also supports downstream tasks such as classification and super-resolution, reducing both the amount of training data and the training time required.
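The problem setup, masking one or more segments of a load profile and then filling them from the surrounding context, can be sketched as follows. This is an illustrative sketch, not the paper's implementation: the function names and the linear-interpolation filler are assumptions, standing in for the kind of traditional baseline that BERT-PIN's transformer-based inpainting is compared against.

```python
# Illustrative sketch of profile-inpainting setup (assumed names, not
# the BERT-PIN codebase). A real BERT-PIN model would predict the masked
# points from surrounding load and temperature context; linear
# interpolation stands in here as the simplest traditional baseline.

MASK = None  # placeholder for a missing data point

def mask_segments(profile, segments):
    """Replace each (start, end) segment (end exclusive) with MASK."""
    masked = list(profile)
    for start, end in segments:
        for i in range(start, end):
            masked[i] = MASK
    return masked

def interpolate_fill(masked):
    """Fill each MASKed run by linear interpolation between its known
    left and right neighbors (a single deterministic solution)."""
    filled = list(masked)
    n = len(filled)
    i = 0
    while i < n:
        if filled[i] is MASK:
            j = i
            while j < n and filled[j] is MASK:
                j += 1
            left = filled[i - 1] if i > 0 else filled[j]
            right = filled[j] if j < n else filled[i - 1]
            gap = j - i + 1
            for k in range(i, j):
                t = (k - i + 1) / gap
                filled[k] = left + t * (right - left)
            i = j
        else:
            i += 1
    return filled

# 24-point daily load profile (kW) with a 4-hour outage masked out
load = [3.1, 2.9, 2.8, 2.7, 2.8, 3.0, 3.6, 4.2, 4.8, 5.1, 5.3, 5.4,
        5.2, 5.0, 4.9, 5.1, 5.6, 6.2, 6.5, 6.1, 5.4, 4.6, 3.9, 3.4]
masked = mask_segments(load, [(8, 12)])
restored = interpolate_fill(masked)
```

Because the baseline returns only one deterministic fill, it cannot express uncertainty over plausible completions; a BERT-style masked model can instead produce multiple candidate restorations for the same gap, which is the adaptability gap the abstract refers to.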