Computer and Modernization ›› 2025, Vol. 0 ›› Issue (07): 55-62.doi: 10.3969/j.issn.1006-2475.2025.07.008


Long- and Short-Term Air Pollutant Concentration Forecasting Based on Optimized Transformer

  



  1. (College of Information Science and Technology, Hangzhou Normal University, Hangzhou 311121, China)
  • Online: 2025-07-22  Published: 2025-07-22

Abstract: To address the low prediction accuracy, limited forecasting horizon, and difficulty in capturing spatiotemporal features in air pollutant concentration prediction, a Transformer architecture based on conditional mask self-attention, named CondMSA-Transformer, is proposed. The model improves the multi-head self-attention mechanism of the Transformer by introducing the idea of sparse attention. By incorporating critical environmental factors such as wind speed and wind direction, it intelligently “masks” data from unnecessary monitoring stations, focusing on extracting the most valuable information along the spatiotemporal dimensions. This strategy effectively avoids interference from the weak signals of remote stations, reduces computational complexity, and enhances the model’s ability to capture core features. Comprehensive experiments on two real-world datasets from Beijing demonstrate that CondMSA-Transformer performs robustly in both short-term and long-term prediction scenarios, improving the mean absolute error (MAE) of PM2.5 prediction by up to 14.67% compared with existing methods, which shows its strong application potential and advancement in the field of air quality prediction.
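The conditional masking idea summarized in the abstract can be pictured with a short sketch. The following PyTorch snippet is a minimal illustration under stated assumptions, not the authors' implementation: it assumes a small gating network (here called gate) that maps per-station environmental conditions (e.g. wind speed, wind direction, distance) to a keep/drop score, and feeds the resulting boolean mask into a standard multi-head attention layer as its key padding mask, so that low-scoring (weak-signal, remote) stations are ignored. All module names, dimensions, and the thresholding rule are illustrative assumptions.

    # Minimal sketch (not the paper's code) of attention whose mask is
    # conditioned on environmental factors such as wind speed/direction.
    import torch
    import torch.nn as nn


    class CondMaskSelfAttention(nn.Module):
        def __init__(self, d_model: int, n_heads: int, cond_dim: int):
            super().__init__()
            self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            # Hypothetical gating network: per-station conditions -> keep score in (0, 1).
            self.gate = nn.Sequential(
                nn.Linear(cond_dim, d_model), nn.ReLU(), nn.Linear(d_model, 1)
            )

        def forward(self, x: torch.Tensor, cond: torch.Tensor, threshold: float = 0.5):
            # x:    (batch, n_stations, d_model)  station/time embeddings
            # cond: (batch, n_stations, cond_dim) environmental conditions
            keep_score = torch.sigmoid(self.gate(cond)).squeeze(-1)  # (B, S)
            drop = keep_score < threshold                            # True = mask out
            # Always keep the best-scoring station so no row is fully masked.
            top = keep_score.argmax(dim=-1, keepdim=True)
            drop.scatter_(-1, top, False)
            # key_padding_mask: True entries are ignored by attention,
            # i.e. unnecessary stations are "masked" away.
            out, _ = self.attn(x, x, x, key_padding_mask=drop)
            return out


    if __name__ == "__main__":
        B, S, D, C = 2, 12, 64, 4          # batch, stations, model dim, condition dim
        layer = CondMaskSelfAttention(d_model=D, n_heads=4, cond_dim=C)
        x = torch.randn(B, S, D)
        cond = torch.randn(B, S, C)        # e.g. wind speed, sin/cos of direction, distance
        print(layer(x, cond).shape)        # torch.Size([2, 12, 64])

In this sketch the masking is a hard threshold on a learned gate; the paper's actual conditioning on wind speed and direction may differ, but the mechanism of suppressing attention to selected stations is the same in spirit.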

Key words: air pollutant concentration prediction, long-term prediction, Transformer, attention mechanism, conditional mask

CLC Number: