Record ID | marc_loc_2016/BooksAll.2016.part29.utf8:183982028:3707 |
Source | Library of Congress |
Download Link | /show-records/marc_loc_2016/BooksAll.2016.part29.utf8:183982028:3707?format=raw |
LEADER: 03707cam a2200277 a 4500
001 2002020214
003 DLC
005 20060728202639.0
008 020125s2002 flua b 001 0 eng
010 $a 2002020214
020 $a1584881712 (acid-free paper)
040 $aDLC$cDLC$dDLC
050 00 $aQA278.2$b.M56 2002
082 00 $a519.5/36$221
100 1 $aMiller, Alan J.
245 10 $aSubset selection in regression /$cAlan Miller.
250 $a2nd ed.
260 $aBoca Raton :$bChapman & Hall/CRC,$cc2002.
300 $axvii, 238 p. :$bill. ;$c24 cm.
440 0 $aMonographs on statistics and applied probability ;$v95
504 $aIncludes bibliographical references (p. 223-234) and index.
505 8  $aMachine generated contents note: 1 Objectives -- 1.1 Prediction, explanation, elimination or what? -- 1.2 How many variables in the prediction formula? -- 1.3 Alternatives to using subsets -- 1.4 'Black box' use of best-subsets techniques -- 2 Least-squares computations -- 2.1 Using sums of squares and products matrices -- 2.2 Orthogonal reduction methods -- 2.3 Gauss-Jordan v. orthogonal reduction methods -- 2.4 Interpretation of projections -- Appendix A. Operation counts for all-subsets regression -- A.1 Garside's Gauss-Jordan algorithm -- A.2 Planar rotations and a Hamiltonian cycle -- A.3 Planar rotations and a binary sequence -- A.4 Fast planar rotations -- 3 Finding subsets which fit well -- 3.1 Objectives and limitations of this chapter -- 3.2 Forward selection -- 3.3 Efroymson's algorithm -- 3.4 Backward elimination -- 3.5 Sequential replacement algorithms -- 3.6 Replacing two variables at a time -- 3.7 Generating all subsets -- 3.8 Using branch-and-bound techniques -- 3.9 Grouping variables -- 3.10 Ridge regression and other alternatives -- 3.11 The nonnegative garrote and the lasso -- 3.12 Some examples -- 3.13 Conclusions and recommendations -- Appendix A. An algorithm for the lasso -- 4 Hypothesis testing -- 4.1 Is there any information in the remaining variables? -- 4.2 Is one subset better than another? -- 4.2.1 Applications of Spjotvoll's method -- 4.2.2 Using other confidence ellipsoids -- Appendix A. Spjotvoll's method - detailed description -- 5 When to stop? -- 5.1 What criterion should we use? -- 5.2 Prediction criteria -- 5.2.1 Mean squared errors of prediction (MSEP) -- 5.2.2 MSEP for the fixed model -- 5.2.3 MSEP for the random model -- 5.2.4 A simulation with random predictors -- 5.3 Cross-validation and the PRESS statistic -- 5.4 Bootstrapping -- 5.5 Likelihood and information-based stopping rules -- 5.5.1 Minimum description length (MDL) -- Appendix A. Approximate equivalence of stopping rules -- A.1 F-to-enter -- A.2 Adjusted R2 or Fisher's A-statistic -- A.3 Akaike's information criterion (AIC) -- 6 Estimation of regression coefficients -- 6.1 Selection bias -- 6.2 Choice between two variables -- 6.3 Selection bias reduction -- 6.3.1 Monte Carlo estimation of bias in forward selection -- 6.3.2 Shrinkage methods -- 6.3.3 Using the jack-knife -- 6.3.4 Independent data sets -- 6.4 Conditional likelihood estimation -- 6.5 Estimation of population means -- 6.6 Estimating least-squares projections -- Appendix A. Changing projections to equate sums of squares -- 7 Bayesian methods -- 7.1 Bayesian introduction -- 7.2 'Spike and slab' prior -- 7.3 Normal prior for regression coefficients -- 7.4 Model averaging -- 7.5 Picking the best model -- 8 Conclusions and some recommendations -- References -- Index.
650 0 $aRegression analysis.
650 0 $aLeast squares.
856 41 $3Table of contents only$uhttp://www.loc.gov/catdir/toc/fy022/2002020214.html
856 42 $3Publisher description$uhttp://www.loc.gov/catdir/enhancements/fy0646/2002020214-d.html