The last 30 years have seen extraordinary development of new tools for the prediction of numerical and binary responses. Examples include the LASSO and the elastic net for regularization and variable selection in regression, quantile regression for heteroscedastic data, and machine learning predictive methods such as classification and regression trees (CART), multivariate adaptive regression splines (MARS), random forests, gradient boosting machines (GBM), and support vector machines (SVM). All of these methods are implemented in SAS®, giving the user a remarkable toolkit of predictive methods. In fact, the set of available methods is so rich that it raises the question: when should I use one or a subset of these methods instead of the others? In this talk I hope to provide a partial answer to this question by applying several of these methods to real data sets with numerical and binary response variables.

Richard Cutler, Utah State University
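The abstract's central question — which method to prefer for a given data set — is often explored empirically by comparing cross-validated accuracy. The talk uses SAS procedures; as a rough illustration only, here is a minimal Python sketch with scikit-learn comparing two of the methods named above (the LASSO and a random forest) on a synthetic nonlinear regression problem. The dataset, hyperparameters, and library choice are my assumptions, not from the talk.

```python
# Illustrative sketch (scikit-learn stand-in, NOT the SAS procedures the talk
# covers): compare the LASSO and a random forest by cross-validated R^2.
from sklearn.datasets import make_friedman1
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Lasso
from sklearn.model_selection import cross_val_score

# Friedman #1 benchmark: a nonlinear response built from five informative
# predictors (plus five noise predictors), so a flexible learner is expected
# to outperform a sparse linear model here.
X, y = make_friedman1(n_samples=500, n_features=10, noise=1.0, random_state=0)

scores = {}
for name, model in [
    ("LASSO", Lasso(alpha=0.1)),
    ("Random forest", RandomForestRegressor(n_estimators=200, random_state=0)),
]:
    # 5-fold cross-validated R^2, averaged over folds.
    scores[name] = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean CV R^2 = {scores[name]:.2f}")
```

On a strongly nonlinear response like this one the random forest typically wins; on a truly sparse linear signal the ranking can reverse, which is exactly the kind of trade-off the talk examines.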
Session 1967
Prediction and Interpretation for Machine Learning Regression Methods
support:sgf-papers
year:2018
software:STAT
support:sgf-papers/session-type/breakout
support:sgf-papers/skill-level/intermediate
support:sgf-papers/topic/analytics/data-mining-predictive-modeling
support:customer-roles/professor