Performance of Proposed Ridge – PCA Estimators: Simulation Evidence and Real Data Applications
DOI: https://doi.org/10.62054/ijdm/0301.11

Abstract
This study examines the longstanding problem of multicollinearity in Gaussian linear regression models and introduces novel estimation techniques aimed at improving estimator stability and predictive accuracy. While Ordinary Least Squares (OLS) estimators are efficient under ideal conditions, their performance deteriorates in the presence of high collinearity among explanatory variables. The study proposes hybrid estimators that combine Ridge Regression and Principal Component Analysis (PCA): four new ridge parameters were developed and integrated with PCA to construct hybrid estimators designed to address multicollinearity. Monte Carlo simulations were conducted under varying levels of multicollinearity, sample sizes, and error variances, and the proposed estimators were compared with existing methods using Mean Squared Error (MSE) as the evaluation criterion. The results consistently indicate that the proposed Ridge-PCA hybrid estimators, particularly the minimum version of the Chand and Kibria (2024) ridge parameter combined with the Principal Component estimator (PCARCK2MIN) and the Chand and Kibria (2024) ridge parameter combined with the Principal Component estimator (PCARCK1), outperform existing methods. Applications to real-life datasets, including the Portland cement and Longley data, validate the efficiency and practical relevance of the proposed estimators for regression analysis under multicollinearity.
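To make the hybrid idea concrete, the NumPy sketch below applies ridge shrinkage to retained principal components and maps the result back to the original coefficient space, then compares coefficient MSE against OLS on a simulated collinear design. This is only a generic illustration of the Ridge-PCA mechanism: the ridge constant, the number of retained components, and the simulated design are assumptions for the demo, and the paper's four proposed ridge parameters (and the PCARCK1/PCARCK2MIN estimators) are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)

def ridge_pca(X, y, k, n_components):
    """Generic Ridge-PCA hybrid: ridge shrinkage on the retained principal
    components, back-transformed to the original coefficients. A sketch of
    the idea only, not the paper's specific estimators."""
    eigvals, T = np.linalg.eigh(X.T @ X)       # eigenvalues in ascending order
    Tr = T[:, ::-1][:, :n_components]          # keep the leading eigenvectors
    Z = X @ Tr                                 # principal-component scores
    # Ridge estimate of the component coefficients: (Z'Z + kI)^{-1} Z'y
    alpha = np.linalg.solve(Z.T @ Z + k * np.eye(n_components), Z.T @ y)
    return Tr @ alpha                          # map back to the beta space

# Collinear design: four regressors sharing one strong common factor
n, p, rho = 50, 4, 0.99
W = rng.standard_normal((n, p + 1))
X = np.sqrt(1.0 - rho**2) * W[:, :p] + rho * W[:, [p]]
beta = np.ones(p)                              # true coefficients

reps = 500
sse_ols = sse_rpca = 0.0
for _ in range(reps):
    y = X @ beta + rng.standard_normal(n)
    b_ols = np.linalg.solve(X.T @ X, X.T @ y)
    # In this design the signal lies along the leading component,
    # so retaining one component is enough for the illustration.
    b_rp = ridge_pca(X, y, k=1.0, n_components=1)
    sse_ols += np.sum((b_ols - beta) ** 2)
    sse_rpca += np.sum((b_rp - beta) ** 2)

avg_mse_ols, avg_mse_rpca = sse_ols / reps, sse_rpca / reps
print(f"OLS MSE: {avg_mse_ols:.3f}  Ridge-PCA MSE: {avg_mse_rpca:.3f}")
```

Under this level of collinearity the OLS coefficient MSE is inflated by the near-zero eigenvalues of X'X, while the hybrid estimator, by shrinking in the well-conditioned component space, attains a much smaller MSE, mirroring the pattern the simulations in the paper report.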
References
Alabi, R. E., Alabi, O. O. and Ojo, O. O. (2025). Development of hybrid ridge–PCA estimators for addressing multicollinearity in Gaussian linear regression models. World Journal of Advanced Research and Reviews, 27(1), 942–957. https://doi.org/10.30574/wjarr.2025.27.1.2559
Aladesuyi, A., Ayinde, K. and Fayose, T. S. (2025). Assessing the role of significant roots in parameter estimation of linear regression models under multicollinearity. Tech–Sphere Journal of Pure and Applied Sciences, 2(1), 1–16. https://doi.org/10.5281/zenodo.15470100
Ayinde, K., Lukman, A. F., Alabi, O. O. and Bello, H. A. (2020). A new approach of principal component regression estimator with applications to collinear data. International Journal of Engineering Research and Technology, 13(7), 1616–1622.
Badawaire, A. B., Dawoud, I., Lukman, A. F., Laoye, V., and Arowolo, O. (2023). Biasing estimator to mitigate multicollinearity in linear regression model. Al-Bahir Journal for Engineering and Pure Sciences, 2(1), 1–9. https://doi.org/10.55810/2313-0083.1011
Belsley, D. A., Kuh, E. and Welsch, R. E. (1980). Regression diagnostics: Identifying influential data and sources of collinearity. John Wiley and Sons.
Bühlmann, P. and van de Geer, S. (2011). Statistics for high-dimensional data: Methods, theory and applications. Springer.
Buonaccorsi, J. P. (1996). A modified estimating equation approach to correcting for measurement error in regression. Biometrika, 83, 433–440.
Chand, S. and Kibria, B. M. G. (2024). A new ridge type estimator and its performance for the linear regression model: Simulation and application. Hacettepe Journal of Mathematics and Statistics, 53(3), 837–850. https://doi.org/10.15672/hujms.1359446
Chang, Y. and Yang, Y. (2012). Analysis of multicollinearity in regression models with an application to a real data example. Journal of Statistical Theory and Practice, 6(3), 455–470.
Chatterjee, S. and Hadi, A. S. (1977). Regression analysis by example. Wiley.
Chatterjee, S., Hadi, A. S., and Price, B. (2000). Regression analysis by example (3rd ed.). Wiley.
Dawoud, I. and Kibria, B. M. G. (2020). A new biased estimator to combat multicollinearity of the Gaussian linear regression model. Stats, 3, 526–541. https://doi.org/10.3390/stats3030032
Dawoud, I., Lukman, A. F., and Haadi, A. (2022). A new biased regression estimator: Theory, simulation and application. Scientific African, 15, e01100. https://doi.org/10.1016/j.sciaf.2022.e01100
Fayose, T. S., and Ayinde, K. (2019). Different forms of biasing parameter for generalized ridge regression estimator. International Journal of Computer Applications, 181, 21–29.
Fayose, T. S., Ayinde, K. and Alabi, O. O. (2023a). M-robust weighted ridge estimator in linear regression model. African Scientific Reports, 2(123), 1–28.
Fayose, T. S., Ayinde, K., Alabi, O. O. and Bello, A. H. (2023b). Robust weighted ridge regression based on S-estimator. African Scientific Reports, 2(126), 1–28.
Gujarati, D. N., Porter, D. C. and Gunasekar, S. (2012). Basic econometrics (5th ed.). Tata McGraw-Hill.
Hoerl, A. E. and Kennard, R. W. (1970). Ridge regression: Biased estimation for non-orthogonal problems. Technometrics, 12(1), 55–67. https://doi.org/10.1080/00401706.1970.10488634
Hoerl, A. E. and Kennard, R. W. (1975). Ridge regression: Iterative estimation of the biasing parameter. Communications in Statistics – Theory and Methods, 4(1), 77–88. https://doi.org/10.1080/03610927508827232
Huang, J. and Wang, H. (2018). A regularized estimator for high-dimensional generalized linear models with PCA. Journal of Statistical Computation and Simulation, 88(4), 675–686. https://doi.org/10.1080/00949655.2017.1417839
Jolliffe, I. T. (1986). Principal component analysis. Springer.
Jolliffe, I. T. (2002). Principal component analysis (2nd ed.). Springer.
Kibria, B. M. G. and Lukman, A. F. (2020). A new ridge-type estimator for the linear regression model: Simulations and applications. Scientifica. https://doi.org/10.1155/2020/9758378
Li, Y. and Yang, H. (2012). A new Liu-type estimator in linear regression model. Statistical Papers, 53, 427–437. https://doi.org/10.1007/s00362-010-0336-4
Liu, K. (1993). A new class of biased estimate in linear regression. Communications in Statistics – Theory and Methods, 22, 393–402. https://doi.org/10.1080/03610929308831027
Longley, J. W. (1967). An appraisal of least squares programs for the electronic computer. Journal of the American Statistical Association, 62, 819–841. https://doi.org/10.1080/01621459.1967.10482997
Trenkler, G. (1984). On the performance of biased estimators in the linear regression model with correlated or heteroscedastic errors. Journal of Econometrics, 25, 179–190.
Zou, H. and Hastie, T. (2005). Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society: Series B, 67(2), 301–320. https://doi.org/10.1111/j.1467-9868.2005.00503.x
License
Copyright (c) 2026 Remilekun E. Alabi , Olatayo O. Alabi, Oluwadare O. Ojo (Author)

This work is licensed under a Creative Commons Attribution 4.0 International License.
Authors are solely responsible for obtaining permission to reproduce any copyrighted material contained in the manuscript as submitted. Any instance of possible prior publication in any form must be disclosed at the time the manuscript is submitted, and a copy of or link to the publication must be provided.
The Journal articles are open access and are distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivs 4.0 IGO License, which permits use, distribution, and reproduction in any medium, provided the original work is properly cited. No modifications or commercial use of the articles are permitted.




