[1] 刘晓燕,刘扬,孙建刚,等. 输油管道运行优化研究[J]. 工程热物理学报, 2004,25(4):558-561.
[2] 杨树人,廖震,敖娟,等. 混输管道数据库压降插值计算方法研究[J]. 数学的实践与认识, 2017,47(18):107-110.
[3] 袁志萍. 塔河油田稠油外输管道水力热力计算[J]. 油气田地面工程, 2011,30(7):31-33.
[4] 张维志. 超稠油管道输送水力及热力计算[J]. 油气储运, 2007(4):11-13.
[5] 柳歆,张劲军,宇波. 热油管道间歇输送热力水力特性[J]. 油气储运, 2011,30(6):419-422.
[6] 战征. 顺北油气田集输管网气液混输特性模拟[J]. 油气储运, 2022,41(4):424-430.
[7] 赵洪洋,魏立新,刘书孟,等. 油田污水管道水力计算修正研究[J]. 当代化工, 2020,49(2):441-445.
[8] 杨凯. 庆哈输油管道混输条件下的安全经济运行方案研究[D]. 大庆:东北石油大学, 2017.
[9] ATTIA M, ABDULRAHEEM A, MAHMOUD M A. Pressure drop due to multiphase flow using four artificial intelligence methods[C]// SPE North Africa Technical Conference and Exhibition. 2015:170-179.
[10] ZABIHI R, MOWLA D, KARAMI H R. Artificial intelligence approach to predict drag reduction in crude oil pipelines[J]. Journal of Petroleum Science and Engineering, 2019,178:586-593.
[11] WEI L X, ZHANG Y, JI L L, et al. Pressure drop prediction of crude oil pipeline based on PSO-BP neural network[J]. Energies, 2022,15(16):5880. DOI: 10.3390/en15165880.
[12] MOAYEDI H, FOONG L K, NGUYEN H. Soft computing method for predicting pressure drop reduction in crude oil pipelines based on machine learning methods[J]. Journal of the Brazilian Society of Mechanical Sciences and Engineering, 2020,42(11):562. DOI: 10.1007/s40430-020-02613-x.
[13] 陈新果,冷绪林,安云朋,等. 基于深度学习结构网络的输气管道水力预测模型[J]. 油气田地面工程, 2018,37(8):52-57.
[14] 李树杉,张宇,周明,等. 混沌粒子群优化的RBF神经网络在热油管道仿真中的应用[J]. 纳米技术与精密工程, 2017,15(3):181-186.
[15] 王力,陈双庆,王佳楠,等. 基于面向对象和二叉树的油气集输管网水力计算[J]. 油气储运, 2019,38(12):1359-1365.
[16] SHADLOO M S, RAHMAT A, KARIMIPOUR A, et al. Estimation of pressure drop of two-phase flow in horizontal long pipes using artificial neural networks[J]. Journal of Energy Resources Technology, Transactions of the ASME, 2020,142(11). DOI: 10.1115/1.4047593.
[17] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. 2017:6000-6010.
[18] WANG C J, CHEN Y Y, ZHANG S Q, et al. Stock market index prediction using deep Transformer model[J]. Expert Systems with Applications, 2022,208. DOI: 10.1016/j.eswa.2022.118128.
[19] DAIYA D, LIN C. Stock movement prediction and portfolio management via multimodal learning with Transformer[C]// 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). 2021:3305-3309.
[20] XU M X, DAI W R, LIU C M, et al. Spatial-temporal transformer networks for traffic flow forecasting[J]. arXiv preprint arXiv:2001.02908, 2020.
[21] ZHANG Z Z, SONG W, LI Q Q. Dual-aspect self-attention based on transformer for remaining useful life prediction[J]. IEEE Transactions on Instrumentation and Measurement, 2022,71. DOI: 10.1109/TIM.2022.3160561.
[22] MO Y, WU Q H, LI X, et al. Remaining useful life estimation via transformer encoder enhanced by a gated convolutional unit[J]. Journal of Intelligent Manufacturing, 2021,32(7):1997-2006.
[23] ZHOU H Y, ZHANG S H, PENG J Q, et al. Informer: beyond efficient transformer for long sequence time-series forecasting[J]. arXiv preprint arXiv:2012.07436, 2020.
[24] CHUA L O, ROSKA T. The CNN paradigm[J]. IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications, 1993,40(3):147-156.
[25] FABBRI M, MORO G. Dow Jones trading with deep learning: The unreasonable effectiveness of recurrent neural networks[C]// Proceedings of the 7th International Conference on Data Science, Technology and Applications. 2018:142-153.
[26] LIN M, CHEN Q, YAN S C. Network in network[J]. arXiv preprint arXiv:1312.4400, 2013.
[27] HU J, SHEN L, ALBANIE S, et al. Squeeze-and-excitation networks[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020,42(8):2011-2023.
[28] KINGMA D, BA J. Adam: A method for stochastic optimization[J]. arXiv preprint arXiv:1412.6980, 2014.
[29] LIAW A, WIENER M. Classification and regression by randomForest[J]. R News, 2002,2(3):18-22.
[30] KE G L, MENG Q, FINLEY T, et al. LightGBM: A highly efficient gradient boosting decision tree[C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. 2017:3149-3157.
[31] CHEN T Q, HE T, BENESTY M, et al. XGBoost: Extreme Gradient Boosting[EB/OL]. [2022-10-18]. https://cran.r-project.org/web/packages/xgboost/xgboost.pdf.
[32] STEINBERG D. CART: Classification and regression trees[M]// The Top Ten Algorithms in Data Mining. Chapman and Hall/CRC, 2009:193-216.
[33] SMOLA A J, SCHOLKOPF B. A tutorial on support vector regression[J]. Statistics and Computing, 2004,14(3):199-222.
[34] ZHU J, ZOU H, ROSSET S, et al. Multi-class AdaBoost[J]. Statistics and its Interface, 2009,2(3):349-360.
[35] RAMCHOUN H, IDRISSI J M A, GHANOU Y, et al. Multilayer perceptron: Architecture optimization and training[C]// Proceedings of the 2nd International Conference on Big Data, Cloud and Applications. 2017:1-6.