Xue Hui, Shi Na. Two-phase subspace learning based on Kreǐn spaces[J]. Journal of Southeast University (Natural Science Edition), 2019, 49(3): 589-594. [doi:10.3969/j.issn.1001-0505.2019.03.026]

Two-phase subspace learning based on Kreǐn spaces

Journal of Southeast University (Natural Science Edition) [ISSN: 1001-0505 / CN: 32-1178/N]

Volume:
49
Issue:
2019, No. 3
Pages:
589-594
Section:
Computer Science and Engineering
Publication date:
2019-05-20

Article Info

Title:
Two-phase subspace learning based on Kreǐn spaces
Author(s):
Xue Hui, Shi Na
School of Computer Science and Engineering, Southeast University, Nanjing 211189, China
Key Laboratory of Computer Network and Information Integration of Ministry of Education, Southeast University, Nanjing 211189, China
Keywords:
Kreǐn spaces; indefinite kernel; indefinite kernel Fisher discriminant analysis; indefinite kernel canonical correlation analysis; indefinite kernel principal component analysis
CLC number:
TP391.4
DOI:
10.3969/j.issn.1001-0505.2019.03.026
Abstract:
To solve the ill-conditioning problem of indefinite kernel Fisher discriminant analysis (IKFDA) on small-sample, high-dimensional data, a two-phase IKFDA (TP-IKFDA) learning framework was proposed based on Kreǐn spaces. To solve the over-fitting problem of indefinite kernel canonical correlation analysis (IKCCA) on such data, a two-phase IKCCA (TP-IKCCA) learning framework was likewise proposed. First, indefinite kernel principal component analysis (IKPCA) is used to reduce the dimensionality of the data, alleviating the negative impact of high-dimensional features. Then, Fisher discriminant analysis (FDA) or canonical correlation analysis (CCA) is performed in the reduced space. Experimental results on real-world datasets demonstrate that, compared with IKPCA, IKFDA and improved IKFDA algorithms, TP-IKFDA significantly improves classification accuracy, and that TP-IKCCA further improves generalization performance over existing IKCCA models. Therefore, on small-sample, high-dimensional data, the practical generalization performance of TP-IKFDA and TP-IKCCA is superior to that of existing indefinite kernel subspace learning techniques.
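The two-phase pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact formulation: the sigmoid (tanh) kernel, the toy data, the component count m = 10, and the truncation by largest |eigenvalue| (keeping negative eigenvalues, in line with the Kreǐn-space view) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "small sample, high dimension" setting: 40 points in 100 dimensions.
n, d = 40, 100
y = np.repeat([0, 1], n // 2)
X = rng.normal(size=(n, d))
X[y == 1, :10] += 3.0  # class 1 is shifted in the first 10 dimensions

# An indefinite (sigmoid/tanh) kernel: its Gram matrix can have negative
# eigenvalues, so it induces a Krein space rather than a Hilbert space.
def sigmoid_kernel(A, B, alpha=0.01, c=-1.0):
    return np.tanh(alpha * A @ B.T + c)

K = sigmoid_kernel(X, X)

# Phase 1 (IKPCA sketch): center the Gram matrix, eigendecompose, and keep
# the m components with the largest |eigenvalue|; negative eigenvalues are
# retained rather than clipped, giving a pseudo-Euclidean embedding.
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J
w_eig, U = np.linalg.eigh(Kc)
order = np.argsort(-np.abs(w_eig))[:10]          # m = 10 components
Z = U[:, order] * np.sqrt(np.abs(w_eig[order]))  # reduced representation

# Phase 2 (FDA sketch): the classic two-class Fisher direction
# w = S_w^{-1} (mu_1 - mu_0), now well conditioned because m << d.
m0, m1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
Sw = np.cov(Z[y == 0].T) + np.cov(Z[y == 1].T)
w = np.linalg.solve(Sw + 1e-8 * np.eye(Z.shape[1]), m1 - m0)
scores = Z @ w
thresh = (scores[y == 0].mean() + scores[y == 1].mean()) / 2
acc = ((scores > thresh).astype(int) == y).mean()
```

Running FDA on the 10-dimensional embedding instead of the raw kernel matrix avoids the near-singular within-class scatter that arises when the sample size is small relative to the feature dimension, which is the motivation for the two-phase design.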

References:

[1] Yang J, Frangi A F, Yang J Y, et al. KPCA plus LDA: A complete kernel Fisher discriminant framework for feature extraction and recognition[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005, 27(2): 230-244. DOI:10.1109/tpami.2005.33.
[2] Liwicki S, Zafeiriou S, Tzimiropoulos G, et al. Efficient online subspace learning with an indefinite kernel for visual tracking and recognition[J]. IEEE Transactions on Neural Networks and Learning Systems, 2012, 23(10): 1624-1636. DOI:10.1109/tnnls.2012.2208654.
[3] Ling H, Jacobs D W. Using the inner-distance for classification of articulated shapes[C]//2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition. San Diego, CA, USA, 2005: 719-726. DOI:10.1109/CVPR.2005.362.
[4] Xu W P, Wilson R C, Hancock E R. Determining the cause of negative dissimilarity eigenvalues[M]//Computer Analysis of Images and Patterns. Berlin, Heidelberg: Springer, 2011: 589-597. DOI:10.1007/978-3-642-23672-3_71.
[5] Huang X L,Suykens J A K, Wang S N, et al. Classification with truncated distance kernel[J]. IEEE Transactions on Neural Networks and Learning Systems, 2018, 29(5): 2025-2030. DOI:10.1109/tnnls.2017.2668610.
[6] Simard P Y, Le Cun Y A, Denker J S, et al. Transformation invariance in pattern recognition: Tangent distance and propagation[J]. International Journal of Imaging Systems and Technology, 2000, 11(3): 181-197. DOI:10.1002/1098-1098(2000)11:3<181::aid-ima1003>3.3.co;2-5.
[7] Jacobs D W,Weinshall D, Gdalyahu Y. Classification with nonmetric distances: Image retrieval and class representation[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000, 22(6): 583-600. DOI:10.1109/34.862197.
[8] Samarov D, Marron J S, Liu Y F, et al. Local kernel canonical correlation analysis with application to virtual drug screening[J]. The Annals of Applied Statistics, 2011, 5(3): 2169-2196. DOI:10.1214/11-aoas472.
[9] Haasdonk B, Pekalska E. Indefinite kernel fisher discriminant[C]//2008 19th International Conference on Pattern Recognition. Tampa, FL, USA, 2008: 1-4. DOI:10.1109/ICPR.2008.4761718.
[10] Haasdonk B, Pękalska E. Indefinite kernel discriminant analysis[C]//Proceedings of COMPSTAT’2010. Paris, France, 2010: 221-230. DOI:10.1007/978-3-7908-2604-3_20.
[11] Yang J, Fan L Y. A novel indefinite kernel dimensionality reduction algorithm: Weighted generalized indefinite kernel discriminant analysis[J]. Neural Processing Letters, 2014, 40(3): 301-313. DOI:10.1007/s11063-013-9330-9.
[12] Schleif F M, Gisbrecht A, Tino P. Large scale indefinite kernel Fisher discriminant[M]//Similarity-Based Pattern Recognition. Cham: Springer, 2015: 160-170. DOI:10.1007/978-3-319-24261-3_13.
[13] Zafeiriou S. Subspace learning in Kreǐn spaces: Complete kernel Fisher discriminant analysis with indefinite kernels[C]//2012 European Conference on Computer Vision. Florence, Italy, 2012: 488-501. DOI:10.1007/978-3-642-33765-9_35.
[14] Cui Y, Fan L Y. Comparison of KPCA transformation matrices with definite and indefinite kernels for high-dimensional data[J]. Journal of Shandong University (Engineering Science), 2011, 41(1): 17-23, 39. (in Chinese)
[15] Lancewicki T. Regularization of the kernel matrix via covariance matrix shrinkage estimation[EB/OL]. (2017-07-19)[2018-05-20]. https://arxiv.org/pdf/1707.06156.pdf.
[16] Sun T K. Research on enhanced canonical correlation analysis with applications[D]. Nanjing: Nanjing University of Aeronautics and Astronautics, 2006. (in Chinese)
[17] Huang X L, Maier A,Hornegger J, et al. Indefinite kernels in least squares support vector machines and principal component analysis[J]. Applied and Computational Harmonic Analysis, 2017, 43(1): 162-172. DOI:10.1016/j.acha.2016.09.001.
[18] Relator R, Kato T, Lemence R. Improved protein-ligand prediction using kernel weighted canonical correlation analysis[J]. IPSJ Transactions on Bioinformatics, 2013, 6: 18-28. DOI:10.2197/ipsjtbio.6.18.
[19] Melzer T, Reiter M, Bischof H. Appearance models based on kernel canonical correlation analysis[J]. Pattern Recognition, 2003, 36(9): 1961-1971. DOI:10.1016/s0031-3203(03)00058-x.
[20] Li J D, Cheng K W, Wang S H, et al. Feature selection[J].ACM Computing Surveys, 2018, 50(6): 1-45. DOI:10.1145/3136625.
[21] Kuncheva L I, Whitaker C J. Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy[J]. Machine Learning, 2003, 51(2): 181-207. DOI:10.1023/A:1022859003006.
[22] Blum A, Mitchell T. Combining labeled and unlabeled data with co-training[C]//Proceedings of the Eleventh Annual Conference on Computational Learning Theory. Madison, USA, 1998: 92-100. DOI: 10.1145/279943.279962.
[23] Li F F, Fergus R, Perona P. One-shot learning of object categories[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2006, 28(4): 594-611. DOI:10.1109/tpami.2006.79.

Memo:
Received: 2018-10-21.
Biography: Xue Hui (1979—), female, Ph.D., associate professor, doctoral supervisor, hxue@seu.edu.cn.
Foundation items: The National Key R&D Program of China (No. 2017YFB1002801), the National Natural Science Foundation of China (No. 61876091).
Citation: Xue Hui, Shi Na. Two-phase subspace learning based on Kreǐn spaces[J]. Journal of Southeast University (Natural Science Edition), 2019, 49(3): 589-594. DOI:10.3969/j.issn.1001-0505.2019.03.026.
Last Update: 2019-05-20