[1]LeCun Y, Bottou L, Bengio Y. Reading checks with multilayer graph transformer networks[C]// 1997 IEEE International Conference on Acoustics, Speech, and Signal Processing. 1997,1:151-154.
[2]LeCun Y, Bottou L, Bengio Y, et al. Gradient-based learning applied to document recognition[J]. Proceedings of the IEEE, 1998,86(11):2278-2324.
[3]Krizhevsky A, Sutskever I, Hinton G E. ImageNet classification with deep convolutional neural networks[C]// Proceedings of the 25th International Conference on Neural Information Processing Systems. 2012:1097-1105.
[4]Sun Yi, Chen Yuheng, Wang Xiaogang, et al. Deep learning face representation by joint identification-verification[C]// Advances in Neural Information Processing Systems. 2014:1988-1996.
[5]Lin M, Chen Q, Yan S. Network in network[J]. arXiv preprint arXiv:1312.4400, 2013.
[6]Glorot X, Bordes A, Bengio Y. Deep sparse rectifier neural networks[C]// Proceedings of the 14th International Conference on Artificial Intelligence and Statistics. 2011:315-323.
[7]Hinton G E, Salakhutdinov R R. Reducing the dimensionality of data with neural networks[J]. Science, 2006,313(5786):504-507.
[8]Masci J, Meier U, Ciresan D, et al. Stacked convolutional auto-encoders for hierarchical feature extraction[M]// Artificial Neural Networks and Machine Learning-ICANN 2011. Springer Berlin Heidelberg, 2011:52-59.
[9]Makhzani A, Frey B J. Winner-take-all autoencoders[C]// Advances in Neural Information Processing Systems. 2015:2773-2781.
[10]Makhzani A, Frey B. K-sparse autoencoders[J]. arXiv preprint arXiv:1312.5663, 2013.
[11]Ngiam J, Chen Z, Bhaskar S A, et al. Sparse filtering[C]// Advances in Neural Information Processing Systems. 2011:1125-1133.
[12]Zeiler M D, Krishnan D, Taylor G W, et al. Deconvolutional networks[C]// 2010 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). 2010:2528-2535.
[13]Saxe A M, Koh P W, Chen Z, et al. On random weights and unsupervised feature learning[C]// Proceedings of the 28th International Conference on Machine Learning (ICML-11). 2011:1089-1096.