References

  • Breiman, L. (1992), “The Little Bootstrap and Other Methods for Dimensionality Selection in Regression: X-Fixed Prediction Error,” Journal of the American Statistical Association, 87, 738–754.

  • Burnham, K. P. and Anderson, D. R. (2002), Model Selection and Multimodel Inference, Second Edition, New York: Springer-Verlag.

  • Darlington, R. B. (1968), “Multiple Regression in Psychological Research and Practice,” Psychological Bulletin, 69, 161–182.

  • Donoho, D. L. and Johnstone, I. M. (1994), “Ideal Spatial Adaptation via Wavelet Shrinkage,” Biometrika, 81, 425–455.

  • Draper, N. R., Guttman, I., and Kanemasu, H. (1971), “The Distribution of Certain Regression Statistics,” Biometrika, 58, 295–298.

  • Efron, B., Hastie, T., Johnstone, I., and Tibshirani, R. (2004), “Least Angle Regression (with Discussion),” Annals of Statistics, 32, 407–499.

  • Efron, B. and Tibshirani, R. (1993), An Introduction to the Bootstrap, New York: Chapman & Hall.

  • Eilers, P. H. C. and Marx, B. D. (1996), “Flexible Smoothing with B-Splines and Penalties (with Discussion),” Statistical Science, 11, 89–121.

  • Foster, D. P. and Stine, R. A. (2004), “Variable Selection in Data Mining: Building a Predictive Model for Bankruptcy,” Journal of the American Statistical Association, 99, 303–313.

  • Harrell, F. E. (2001), Regression Modeling Strategies, New York: Springer-Verlag.

  • Hastie, T., Tibshirani, R., and Friedman, J. (2001), The Elements of Statistical Learning, New York: Springer-Verlag.

  • Hocking, R. R. (1976), “The Analysis and Selection of Variables in a Linear Regression,” Biometrics, 32, 1–50.

  • Hurvich, C. M., Simonoff, J. S., and Tsai, C.-L. (1998), “Smoothing Parameter Selection in Nonparametric Regression Using an Improved Akaike Information Criterion,” Journal of the Royal Statistical Society, Series B, 60, 271–293.

  • Hurvich, C. M. and Tsai, C.-L. (1989), “Regression and Time Series Model Selection in Small Samples,” Biometrika, 76, 297–307.

  • Judge, G. G., Griffiths, W. E., Hill, R. C., Lütkepohl, H., and Lee, T. C. (1985), The Theory and Practice of Econometrics, Second Edition, New York: John Wiley & Sons.

  • Mallows, C. L. (1967), “Choosing a Subset Regression,” unpublished report, Bell Telephone Laboratories.

  • Mallows, C. L. (1973), “Some Comments on $C_p$,” Technometrics, 15, 661–675.

  • Miller, A. (2002), Subset Selection in Regression, Second Edition, Boca Raton, FL: Chapman & Hall/CRC.

  • Osborne, M., Presnell, B., and Turlach, B. (2000), “A New Approach to Variable Selection in Least Squares Problems,” IMA Journal of Numerical Analysis, 20, 389–404.

  • Raftery, A. E., Madigan, D., and Hoeting, J. A. (1997), “Bayesian Model Averaging for Linear Regression Models,” Journal of the American Statistical Association, 92, 179–191.

  • Reichler, J. L., ed. (1987), The 1987 Baseball Encyclopedia Update, New York: Macmillan.

  • Sarle, W. S. (2001), “Donoho-Johnstone Benchmarks: Neural Net Results,” ftp://ftp.sas.com/pub/neural/dojo/dojo.html, last accessed March 27, 2007.

  • Sawa, T. (1978), “Information Criteria for Discriminating Among Alternative Regression Models,” Econometrica, 46, 1273–1282.

  • Schwarz, G. (1978), “Estimating the Dimension of a Model,” Annals of Statistics, 6, 461–464.

  • Tibshirani, R. (1996), “Regression Shrinkage and Selection via the Lasso,” Journal of the Royal Statistical Society, Series B, 58, 267–288.

  • Time Inc. (1987), “What They Make,” Sports Illustrated, 54–81.

  • Zou, H. (2006), “The Adaptive Lasso and Its Oracle Properties,” Journal of the American Statistical Association, 101, 1418–1429.