[1] HOSSEINI S, TURHAN B, MANTYLA M. A benchmark study on the effectiveness of search-based data selection and feature selection for cross project defect prediction[J]. Information and Software Technology, 2018,95:296-312.
[2] GANGANWAR V. An overview of classification algorithms for imbalanced datasets[J]. International Journal of Emerging Technology and Advanced Engineering, 2012,2(4):42-47.
[3] MAHMOOD Z, BOWES D, HALL T, et al. Reproducibility and replicability of software defect prediction studies[J]. Information and Software Technology, 2018,99(7):148-163.
[4] JIAN Y H, YU X. Software defect number prediction method based on data oversampling and ensemble learning[J]. Journal of Computer Applications, 2018,38(9):2637-2643.
[5] MALHOTRA R, KAMAL S. An empirical study to investigate oversampling methods for improving software defect prediction using imbalanced data[J]. Neurocomputing, 2019,343(4):120-140.
[6] DAI X, MAO Y G. Research on software defect prediction based on ensemble hybrid sampling[J]. Computer Engineering & Science, 2015,37(5):930-936.
[7] OH J H, HONG J Y, BAEK J G. Oversampling method using outlier detectable generative adversarial network[J]. Expert Systems with Applications, 2019,133(1):1-8.
[8] DONG Y, WANG X. A new over-sampling approach: Random-SMOTE for learning from imbalanced data sets[C]// International Conference on Knowledge Science, Engineering and Management. 2011:343-352.
[9] CHAWLA N V, BOWYER K W, HALL L O, et al. SMOTE: Synthetic minority over-sampling technique[J]. Journal of Artificial Intelligence Research, 2002,16(1):321-357.
[10] HAN H, WANG W Y, MAO B H. Borderline-SMOTE: A new over-sampling method in imbalanced data sets learning[C]// International Conference on Intelligent Computing. 2005:878-887.
[11] HE H, BAI Y, GARCIA E A, et al. ADASYN: Adaptive synthetic sampling approach for imbalanced learning[C]// 2008 IEEE International Joint Conference on Neural Networks (IEEE World Congress on Computational Intelligence). 2008:1322-1328.
[12] FENG H M, LI M W, HOU X L, et al. Research on network intrusion detection method based on SMOTE and GBDT[J]. Application Research of Computers, 2017,34(12):3745-3748.
[13] WANG S, YAO X. Using class imbalance learning for software defect prediction[J]. IEEE Transactions on Reliability, 2013,62(2):434-443.
[14] JAYANTHI R, FLORENCE L. Software defect prediction techniques using metrics based on neural network classifier[J]. Cluster Computing, 2019,22(1):77-88.
[15] BHAGAT R C, PATIL S S. Enhanced SMOTE algorithm for classification of imbalanced big-data using random forest[C]// 2015 IEEE International Advance Computing Conference. 2015:403-408.
[16] NGUYEN H M, COOPER E W, KAMEI K. Borderline over-sampling for imbalanced data classification[C]// Proceedings of the 5th IEEE International Workshop on Computational Intelligence & Applications. 2009:24-29.
[17] BAUER E, KOHAVI R. An empirical comparison of voting classification algorithms: Bagging, boosting, and variants[J]. Machine Learning, 1999,36(1-2):105-139.
[18] DIETTERICH T G. An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization[J]. Machine Learning, 2000,40(2):139-157.
[19] YANG J, YAN X F, ZHANG D P. Cost-sensitive software defect prediction method based on Boosting[J]. Computer Science, 2017,44(8):176-180.
[20] BENNIN K E, KEUNG J W, MONDEN A. On the relative value of data resampling approaches for software defect prediction[J]. Empirical Software Engineering, 2019,24(2):602-636.
[21] FENTON N E, NEIL M. A critique of software defect prediction models[J]. IEEE Transactions on Software Engineering, 1999,25(5):675-689.
[22] WAN H, WU G, YU M, et al. Software defect prediction based on cost-sensitive dictionary learning[J]. International Journal of Software Engineering and Knowledge Engineering, 2019,29(9):1219-1243.
[23] RODRIGUEZ-TORRES F, CARRASCO-OCHOA J A, MARTINEZ-TRINIDAD J F. Deterministic oversampling methods based on SMOTE[J]. Journal of Intelligent & Fuzzy Systems, 2019,36(5):4945-4955.
[24] ALSAEEDI A, KHAN M Z. Software defect prediction using supervised machine learning and ensemble techniques: A comparative study[J]. Journal of Software Engineering and Applications, 2019,12(5):85-100.