[1] Waegeman W, Dembczyński K, Hüllermeier E. Tutorial on multi-target prediction[C]// Proceedings of the International Conference on Machine Learning (ICML). 2013.
[2] Dembczyński K, Waegeman W, Cheng W, et al. On label dependence and loss minimization in multi-label classification[J]. Machine Learning, 2012,88(1-2):5-45.
[3] Balasubramanian K, Lebanon G. The landmark selection method for multiple output prediction[C]// Proceedings of the 29th International Conference on Machine Learning. 2012.
[4] Kužnar D, Možina M, Bratko I. Curve prediction with kernel regression[C]// Proceedings of the 1st Workshop on Learning from Multi-Label Data. 2009:61-68.
[5] Džeroski S, Kobler A, Gjorgjioski V, et al. Using decision trees to predict forest stand height and canopy cover from LANDSAT and LIDAR data[C]// The 20th International Conference on Informatics for Environmental Protection. 2006:125-133.
[6] Kocev D, Džeroski S, White M D, et al. Using single and multi-target regression trees and ensembles to model a compound index of vegetation condition[J]. Ecological Modelling, 2009,220(8):1159-1168.
[7] Izenman A J. Modern Multivariate Statistical Techniques: Regression, Classification, and Manifold Learning[M]. New York: Springer, 2008.
[8] Godbole S, Sarawagi S. Discriminative methods for multi-labeled classification[C]// Pacific-Asia Conference on Knowledge Discovery and Data Mining. Springer Berlin Heidelberg, 2004:22-30.
[9] Read J, Pfahringer B, Holmes G, et al. Classifier chains for multi-label classification[C]// Proceedings of the European Conference on Machine Learning and Knowledge Discovery in Databases. 2009:254-269.
[10] Grossmann E. AdaTree: Boosting a weak classifier into a decision tree[C]// Proceedings of the 2004 Conference on Computer Vision and Pattern Recognition Workshop. 2004:105.
[11] Read J, Pfahringer B, Holmes G, et al. Classifier chains for multi-label classification[C]// Joint European Conference on Machine Learning and Knowledge Discovery in Databases. Springer Berlin Heidelberg, 2009:254-269.
[12] Kocev D, Vens C, Struyf J, et al. Tree ensembles for predicting structured outputs[J]. Pattern Recognition, 2013,46(3):817-833.
[13] Tsoumakas G, Spyromitros-Xioufis E, Vilcek J, et al. Mulan: A Java library for multi-label learning[J]. Journal of Machine Learning Research, 2011,12:2411-2414.
[14] Witten I H, Frank E. Data Mining: Practical Machine Learning Tools and Techniques[M]. Morgan Kaufmann, 2005.
[15] Tsoumakas G, Spyromitros-Xioufis E, Vrekou A, et al. Multi-target regression via random linear target combinations[C]//Joint European Conference on Machine Learning and Knowledge Discovery in Databases. Springer Berlin Heidelberg, 2014:225-240.
[16] Breiman L, Friedman J, Stone C J, et al. Classification and regression trees[M]. CRC Press, 1984.
[17] Demšar J. Statistical comparisons of classifiers over multiple data sets[J]. Journal of Machine Learning Research, 2006,7(1):1-30.
[18] Litchfield J T, Wilcoxon F. A simplified method of evaluating dose-effect experiments[J]. Journal of Pharmacology and Experimental Therapeutics, 1949,96(2):99-113.
[19] Sun Y, Xiong Y, Xu Q, et al. A hadoop-based method to predict potential effective drug combination[J]. BioMed Research International, 2014, doi:10.1155/2014/196858.
[20] Sharmila K, Vethamanickam S A. Survey on data mining algorithm and its application in healthcare sectors using Hadoop platform[J]. International Journal of Emerging Technology and Advanced Engineering, 2015,5(1):567-571.
[21] Bifet A. Mining big data in real time[J]. Informatica, 2013,37(1):15-20.
[22] Agneeswaran V S. Big Data Analytics Beyond Hadoop: Real-Time Applications with Storm, Spark, and More Hadoop Alternatives[M]. Pearson Education, 2014.