Comparison of Matrix Decomposition in Null Space-Based LDA Method

  • Carissa Devina Usman, Diponegoro University
  • Farikhin, Diponegoro University
  • Titi Udjiani, Diponegoro University
Keywords: linear discriminant analysis, small sample size, singular value decomposition (SVD), Cholesky decomposition, QR decomposition

Abstract

Problems with small sample sizes and high dimensionality are common in pattern recognition. Almost all machine learning algorithms degrade on high-dimensional data, and for Linear Discriminant Analysis (LDA) this degradation appears as the technique's main problem: the scatter matrices become singular. Null space-based LDA (NLDA) was developed to address this singularity issue. NLDA maximizes the distance between classes within the null space of the within-class scatter matrix. The original NLDA method was implemented by computing an eigenvalue decomposition and a singular value decomposition (SVD). Later work produced new implementations of NLDA that rely on other matrix decompositions, namely NLDA using Cholesky decomposition and NLDA using QR decomposition. This paper compares the performance of three NLDA methods that use different matrix decompositions: SVD, Cholesky decomposition, and QR decomposition. Two datasets were used in experiments with the three NLDA algorithms, and the classification accuracy of each method was measured using the confusion matrix. The results show that NLDA using SVD performs best of the three methods, achieving an accuracy of 77.8% on the Colon dataset and 98.8% on the TKI-resistance dataset.
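As a rough illustration of the idea summarized above, the Python sketch below computes an NLDA projection via SVD: it builds the within-class and between-class scatter matrices, takes a basis of the null space of the within-class scatter from its SVD, and then maximizes the between-class scatter inside that null space. The function name nlda_svd, the tolerance tol, and the nearest-class-mean usage note are assumptions made for illustration only; the paper's actual implementations (including the Cholesky- and QR-based variants) may differ.

import numpy as np

def nlda_svd(X, y, tol=1e-10):
    """Sketch of null space-based LDA: maximize between-class scatter in null(Sw)."""
    classes = np.unique(y)
    d = X.shape[1]
    mean_total = X.mean(axis=0)

    # Within-class (Sw) and between-class (Sb) scatter matrices.
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        Sw += (Xc - mean_c).T @ (Xc - mean_c)
        diff = (mean_c - mean_total).reshape(-1, 1)
        Sb += Xc.shape[0] * (diff @ diff.T)

    # Null space of Sw: right singular vectors whose singular values are (near) zero.
    _, s, Vt = np.linalg.svd(Sw)
    null_basis = Vt[s <= tol * s.max()].T        # shape (d, r), r = dim null(Sw)

    # Maximize the between-class scatter inside that null space.
    Sb_null = null_basis.T @ Sb @ null_basis
    eigvals, eigvecs = np.linalg.eigh(Sb_null)
    top = np.argsort(eigvals)[::-1][:len(classes) - 1]
    return null_basis @ eigvecs[:, top]          # projection matrix W, shape (d, c-1)

# Usage sketch (hypothetical variable names):
#   W = nlda_svd(X_train, y_train); Z_train = X_train @ W
# A classifier in the reduced space (e.g., nearest class mean) can then be evaluated
# with a confusion matrix to obtain the accuracy figures the abstract reports.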

References

A. Benouareth, “An efficient face recognition approach combining likelihood-based sufficient dimension reduction and LDA,” Multimed. Tools Appl., vol. 80, no. 1, pp. 1457–1486, 2021, doi: 10.1007/s11042-020-09527-9.

B. Li, Z. T. Fan, X. L. Zhang, and D. S. Huang, “Robust dimensionality reduction via feature space to feature space distance metric learning,” Neural Networks, vol. 112, pp. 1–14, 2019, doi: 10.1016/j.neunet.2019.01.001.

K. S. Chandrasekar and P. Geetha, “A new formation of supervised dimensionality reduction method for moving vehicle classification,” Neural Comput. Appl., vol. 33, no. 13, pp. 7839–7850, 2021, doi: 10.1007/s00521-020-05524-z.

J. P. Gygi, S. H. Kleinstein, and L. Guan, “Predictive overfitting in immunological applications: Pitfalls and solutions,” Hum. Vaccines Immunother., vol. 19, no. 2, 2023, doi: 10.1080/21645515.2023.2251830.

Y. Li, B. Liu, Y. Yu, H. Li, J. Sun, and J. Cui, “3E-LDA: Three Enhancements to Linear Discriminant Analysis,” ACM Trans. Knowl. Discov. Data, vol. 15, no. 4, 2021, doi: 10.1145/3442347.

Z. He, M. Wu, X. Zhao, S. Zhang, and J. Tan, “Representative null space LDA for discriminative dimensionality reduction,” Pattern Recognit., vol. 111, p. 107664, 2021, doi: 10.1016/j.patcog.2020.107664.

R. O. Duda, P. E. Hart, and D. G. Stork, Pattern Classification, 2nd ed. New York: John Wiley & Sons, 2000.

A. Sharma and K. K. Paliwal, “Linear discriminant analysis for the small sample size problem: an overview,” Int. J. Mach. Learn. Cybern., vol. 6, no. 3, pp. 443–454, 2015, doi: 10.1007/s13042-013-0226-9.

I. Husein and R. Widyasari, “Algorithm Symmetric 2-DLDA for Recognizing Handwritten Capital Letters,” MATRIK J. Manajemen, Tek. Inform. dan Rekayasa Komput., vol. 21, no. 2, pp. 389–402, 2022, doi: 10.30812/matrik.v21i2.1254.

S. Benkhaira and A. Layeb, “Face recognition using RLDA method based on mutated cuckoo search algorithm to extract optimal features,” Int. J. Appl. Metaheuristic Comput., vol. 11, no. 2, pp. 118–133, 2020, doi: 10.4018/IJAMC.2020040106.

M. I. Afjal, M. N. I. Mondal, and M. Al Mamun, “Segmented Linear Discriminant Analysis for Hyperspectral Image Classification,” in Proc. 12th Int. Conf. Electr. Comput. Eng. (ICECE 2022), 2022, pp. 204–207, doi: 10.1109/ICECE57408.2022.10088677.

J. Ye and T. Xiong, “Computational and theoretical analysis of null space and orthogonal linear discriminant analysis,” J. Mach. Learn. Res., vol. 7, pp. 1183–1204, 2006.

A. A. Joseph et al., “Online Person Identification based on Multitask Learning,” Int. J. Integr. Eng., vol. 13, no. 2, pp. 119–126, 2021, doi: 10.30880/ijie.2021.13.02.014.

J. Liu, X. Cai, and M. Niranjan, “GO-LDA: Generalised Optimal Linear Discriminant Analysis,” pp. 1–15, 2023.

R. Huang, Q. Liu, H. Lu, and S. Ma, “Solving the small sample size problem of LDA,” in Proc. Int. Conf. Pattern Recognit., vol. 16, no. 3, pp. 29–32, 2002, doi: 10.1109/icpr.2002.1047787.

D. Chu and G. S. Thye, “A new and fast implementation for null space based linear discriminant analysis,” Pattern Recognit., vol. 43, no. 4, pp. 1373–1379, 2010, doi: 10.1016/j.patcog.2009.10.004.

G. F. Lu and Y. Wang, “Feature extraction using a fast null space based linear discriminant analysis algorithm,” Inf. Sci. (Ny)., vol. 193, pp. 72–80, 2012, doi: 10.1016/j.ins.2012.01.015.

A. Sharma and K. K. Paliwal, “A new perspective to null linear discriminant analysis method and its fast implementation using random matrix multiplication with scatter matrices,” Pattern Recognit., vol. 45, no. 6, pp. 2205–2213, 2012, doi: 10.1016/j.patcog.2011.11.018.

G. F. Lu and W. Zheng, “Complexity-reduced implementations of complete and null-space-based linear discriminant analysis,” Neural Networks, vol. 46, pp. 165–171, 2013, doi: 10.1016/j.neunet.2013.05.010.

A. Zaib, T. Ballal, S. Khattak, and T. Y. Al-Naffouri, “A doubly regularized linear discriminant analysis classifier with automatic parameter selection,” IEEE Access, vol. 9, pp. 51343–51354, 2021, doi: 10.1109/ACCESS.2021.3068611.

L. F. Chen, H. Y. M. Liao, M. T. Ko, J. C. Lin, and G. J. Yu, “New LDA-based face recognition system which can solve the small sample size problem,” Pattern Recognit., vol. 33, no. 10, pp. 1713–1726, 2000, doi: 10.1016/S0031-3203(99)00139-9.

B. H. S. Utami, T. Trisnawati, R. Pratiwi, and M. Gumanti, “Robust Singular Value Decomposition Method on Minor Outlier Data,” J. Varian, vol. 4, no. 1, pp. 19–24, 2020, doi: 10.30812/varian.v4i1.857.

M. Jurek and M. Katzfuss, “Hierarchical sparse Cholesky decomposition with applications to high-dimensional spatio-temporal filtering,” Stat. Comput., vol. 32, no. 1, pp. 1–29, 2022, doi: 10.1007/s11222-021-10077-9.

I. A. Olajide, “Examination of QR Decomposition and the Singular Value Decomposition Methods,” J. Multidiscip. Eng. Sci. Stud., vol. 7, no. 4, 2021.

F. Konietschke, K. Schwab, and M. Pauly, “Small sample sizes: A big data problem in high-dimensional data analysis,” Stat. Methods Med. Res., vol. 30, no. 3, pp. 687–701, 2021, doi: 10.1177/0962280220970228.

P. Kokol, M. Kokol, and S. Zagoranski, “Machine learning on small size samples: A synthetic knowledge synthesis,” Sci. Prog., vol. 105, no. 1, pp. 1–16, 2022, doi: 10.1177/00368504211029777.

I. Markoulidakis, I. Rallis, I. Georgoulas, G. Kopsiaftis, A. Doulamis, and N. Doulamis, “Multiclass Confusion Matrix Reduction Method and Its Application on Net Promoter Score Classification Problem,” Technologies, vol. 9, no. 4, 2021, doi: 10.3390/technologies9040081.

“Analysis and Classification of Customer Churn Using Machine Learning,” Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi), vol. 5, no. 158, pp. 1253–1259, 2023.

Published
2024-06-04
How to Cite
Usman, C. D., Farikhin, & Titi Udjiani. (2024). Comparison of Matrix Decomposition in Null Space-Based LDA Method. Jurnal RESTI (Rekayasa Sistem Dan Teknologi Informasi), 8(3), 361 - 367. https://doi.org/10.29207/resti.v8i3.5637
Section
Information Technology Articles