Peer-Reviewed

A Comparative Study of Methods of Remedying Multicolinearity

Received: 13 July 2023     Accepted: 3 August 2023     Published: 28 August 2023
Abstract

This study investigates the impact of multicollinearity on a model's predictive accuracy and assesses the effectiveness of different techniques for handling multicollinearity. The purpose of the study is to compare several methods of addressing multicollinearity in regression analysis and to determine their effectiveness in improving the accuracy and reliability of the results. The methods compared are Ordinary Least Squares (OLS) regression, Two-Stage Least Squares (Two-Stage) regression, Ridge regression, and Lasso regression. The study simulated six predictor variables with high levels of multicollinearity and compared the performance of the four models using the Variance Inflation Factor (VIF), root mean squared error (RMSE), Akaike information criterion (AIC), Bayesian information criterion (BIC), and adjusted R-squared. The results showed that the Ridge and Lasso regression models were more effective in handling multicollinearity than the OLS and Two-Stage models. Ridge regression had the lowest RMSE and the best predictive performance among the models, and Ridge and Lasso regression showed better model fit and were more effective in controlling overfitting than OLS and Two-Stage regression. The study concludes that using Ridge or Lasso regression can improve a model's predictive accuracy and reduce the impact of multicollinearity on the model.
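To make the comparison concrete, the following is a minimal, illustrative sketch (not the author's original code) of how such a study can be set up in Python with numpy, statsmodels, and scikit-learn: six strongly correlated predictors are simulated, their VIFs are computed, and OLS, Ridge, and Lasso fits are compared on held-out RMSE. The correlation level, coefficient values, and penalty strengths below are placeholder assumptions, and Two-Stage Least Squares is omitted because it requires instrumental variables that the abstract does not describe.

    # Illustrative sketch only -- not the code used in the study.
    import numpy as np
    from sklearn.linear_model import LinearRegression, Ridge, Lasso
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_squared_error
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    rng = np.random.default_rng(42)
    n, p = 500, 6

    # Simulate six predictors with strong pairwise correlation (rho = 0.95),
    # a design that pushes VIFs far above the conventional cutoff of 10.
    rho = 0.95
    cov = rho * np.ones((p, p)) + (1 - rho) * np.eye(p)
    X = rng.multivariate_normal(np.zeros(p), cov, size=n)
    beta = np.array([3.0, 1.5, 0.0, 2.0, 0.0, 1.0])  # hypothetical true coefficients
    y = X @ beta + rng.normal(scale=2.0, size=n)

    # Variance Inflation Factor for each simulated predictor.
    print("VIFs:", [round(variance_inflation_factor(X, j), 1) for j in range(p)])

    # Compare OLS, Ridge, and Lasso on held-out RMSE.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    models = {
        "OLS": LinearRegression(),
        "Ridge": Ridge(alpha=1.0),  # penalty strengths are placeholders;
        "Lasso": Lasso(alpha=0.1),  # in practice they are tuned by cross-validation
    }
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
        print(f"{name}: RMSE = {rmse:.3f}")

With a design like this, the penalized fits typically shrink the coefficients of the redundant predictors and give lower out-of-sample RMSE than OLS, which is the pattern the study reports for Ridge and Lasso.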

Published in American Journal of Theoretical and Applied Statistics (Volume 12, Issue 4)
DOI 10.11648/j.ajtas.20231204.14
Page(s) 87-91
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2023. Published by Science Publishing Group

Keywords

Multicollinearity, Two-Stage Least Squares, Lasso Regression, Ridge Regression

References
[1] Marquardt, D. W. (1970). Generalized Inverses, Ridge Regression, Biased Linear Estimation, and Nonlinear Estimation. Technometrics, 12, 591-612.
[2] Gujarati, D. N., & Porter, D. C. (2009). Basic Econometrics (5th ed.). New York: McGraw-Hill Education.
[3] Belsley, D. A., Kuh, E., & Welsch, R. E. (1980). Regression diagnostics: Identifying influential data and sources of collinearity. New York: John Wiley & Sons.
[4] Kennedy, P. E. (1982). Eliminating problems caused by multicollinearity: A warning. Journal of Economic Education, 13(1), 62-64.
[5] Johnston, J. (1972). Econometric Methods (Second ed.). New York: McGraw-Hill. pp. 159-168.
[6] Aiken, L. S., & West, S. G. (1991). Multiple Regression: Testing and Interpreting Interactions. Newbury Park, CA: Sage.
[7] Chatterjee, S., & Hadi, A. S. (2015). Regression analysis by example. John Wiley & Sons.
[8] Davidson, R., & MacKinnon, J. G. (1993). Estimation and inference in econometrics. Oxford: Oxford University Press.
[9] Jöreskog, K. G., & Sörbom, D. (1989). LISREL 7: User’s Reference Guide. Chicago, IL: Scientific Software.
[10] Hoerl, A. E., & Kennard, R. W. (1970). Ridge regression: Biased estimation for nonorthogonal problems. Technometrics, 12 (1), 55-67.
[11] Miller, A. J. (1984). Selection of subsets of regression variables. Journal of the Royal Statistical Society: Series A, 147, 389-425.
[12] Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society: Series B (Methodological), 58 (1), 267-288.
[13] Hair, J. F. (2010). Multivariate data analysis: A global perspective. Upper Saddle River, New Jersey: Pearson Prentice Hall.
[14] Kim, J. O., & Mueller, C. W., (1978). Introduction to factor analysis: What it is and how to do it. Beverly Hills, CA: Sage.
[15] Kutner, M. H., Nachtsheim, C. J., Neter, J., & Li, W. (2005). Applied linear statistical models (5th ed.). McGraw-Hill.
[16] Efron, B. (1979). Bootstrap methods: Another look at the jackknife. The Annals of Statistics, 7 (1), 1-26.
[17] Montgomery, D. C., Peck, E. A., & Vining, G. G. (2012). Introduction to linear regression analysis. Hoboken, NJ: Wiley.
[18] Adnan, N. (2006). Comparing three methods of handling multicollinearity using simulation approach. Master's thesis, Faculty of Sciences, Universiti Teknologi Malaysia.
[19] Barrios, E. B., & Vargas, G. A. (2007). Forecasting from an Additive Model in the Presence of Multicollinearity. 10th National Convention on Statistics (NCS), EDSA Shangri-La Hotel.
[20] Chatelain, J. B., & Ralf, K. (2012). Spurious Regressions and Near-Multicollinearity, with an Application to Aid, Policies and Growth. MPRA Paper No. 42533.
[21] Courville, T., & Thompson, B. (2001). Use of structure coefficients in published multiple regression articles: β is not enough. Educational and Psychological Measurement, 61, 229-248.
[22] Zou, H., & Hastie, T. (2005). Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 67 (2), 301-320.
Cite This Article
  • APA Style

    Rita Obikimari Efeizomor. (2023). A Comparative Study of Methods of Remedying Multicolinearity. American Journal of Theoretical and Applied Statistics, 12(4), 87-91. https://doi.org/10.11648/j.ajtas.20231204.14


    ACS Style

    Rita Obikimari Efeizomor. A Comparative Study of Methods of Remedying Multicolinearity. Am. J. Theor. Appl. Stat. 2023, 12(4), 87-91. doi: 10.11648/j.ajtas.20231204.14


    AMA Style

    Rita Obikimari Efeizomor. A Comparative Study of Methods of Remedying Multicolinearity. Am J Theor Appl Stat. 2023;12(4):87-91. doi: 10.11648/j.ajtas.20231204.14


  • @article{10.11648/j.ajtas.20231204.14,
      author = {Rita Obikimari Efeizomor},
      title = {A Comparative Study of Methods of Remedying Multicolinearity},
      journal = {American Journal of Theoretical and Applied Statistics},
      volume = {12},
      number = {4},
      pages = {87-91},
      doi = {10.11648/j.ajtas.20231204.14},
      url = {https://doi.org/10.11648/j.ajtas.20231204.14},
      eprint = {https://article.sciencepublishinggroup.com/pdf/10.11648.j.ajtas.20231204.14},
      abstract = {This study investigates the impact of multicollinearity on a model's predictive accuracy and assesses the effectiveness of different techniques for handling multicollinearity. The purpose of the study is to compare several methods of addressing multicollinearity in regression analysis and to determine their effectiveness in improving the accuracy and reliability of the results. The methods compared are Ordinary Least Squares (OLS) regression, Two-Stage Least Squares (Two-Stage) regression, Ridge regression, and Lasso regression. The study simulated six predictor variables with high levels of multicollinearity and compared the performance of the four models using the Variance Inflation Factor (VIF), root mean squared error (RMSE), Akaike information criterion (AIC), Bayesian information criterion (BIC), and adjusted R-squared. The results showed that the Ridge and Lasso regression models were more effective in handling multicollinearity than the OLS and Two-Stage models. Ridge regression had the lowest RMSE and the best predictive performance among the models, and Ridge and Lasso regression showed better model fit and were more effective in controlling overfitting than OLS and Two-Stage regression. The study concludes that using Ridge or Lasso regression can improve a model's predictive accuracy and reduce the impact of multicollinearity on the model.},
     year = {2023}
    }
    


  • TY  - JOUR
    T1  - A Comparative Study of Methods of Remedying Multicolinearity
    AU  - Rita Obikimari Efeizomor
    Y1  - 2023/08/28
    PY  - 2023
    N1  - https://doi.org/10.11648/j.ajtas.20231204.14
    DO  - 10.11648/j.ajtas.20231204.14
    T2  - American Journal of Theoretical and Applied Statistics
    JF  - American Journal of Theoretical and Applied Statistics
    JO  - American Journal of Theoretical and Applied Statistics
    SP  - 87
    EP  - 91
    PB  - Science Publishing Group
    SN  - 2326-9006
    UR  - https://doi.org/10.11648/j.ajtas.20231204.14
    AB  - This study investigates the impact of multicollinearity on a model's predictive accuracy and assesses the effectiveness of different techniques for handling multicollinearity. The purpose of the study is to compare several methods of addressing multicollinearity in regression analysis and to determine their effectiveness in improving the accuracy and reliability of the results. The methods compared are Ordinary Least Squares (OLS) regression, Two-Stage Least Squares (Two-Stage) regression, Ridge regression, and Lasso regression. The study simulated six predictor variables with high levels of multicollinearity and compared the performance of the four models using the Variance Inflation Factor (VIF), root mean squared error (RMSE), Akaike information criterion (AIC), Bayesian information criterion (BIC), and adjusted R-squared. The results showed that the Ridge and Lasso regression models were more effective in handling multicollinearity than the OLS and Two-Stage models. Ridge regression had the lowest RMSE and the best predictive performance among the models, and Ridge and Lasso regression showed better model fit and were more effective in controlling overfitting than OLS and Two-Stage regression. The study concludes that using Ridge or Lasso regression can improve a model's predictive accuracy and reduce the impact of multicollinearity on the model.
    VL  - 12
    IS  - 4
    ER  - 


Author Information
  • Department of Sociology, Faculty of Management and Social Sciences, University of Delta, Agbor, Nigeria
