Edition | Availability |
---|---|
Subset selection in regression. 2nd ed. 2002, Chapman & Hall/CRC. In English. ISBN 1584881712, 9781584881711. | |
Book Details
Table of Contents
1 Objectives
1.1 Prediction, explanation, elimination or what?
1.2 How many variables in the prediction formula?
1.3 Alternatives to using subsets
1.4 'Black box' use of best-subsets techniques
2 Least-squares computations
2.1 Using sums of squares and products matrices
2.2 Orthogonal reduction methods
2.3 Gauss-Jordan v. orthogonal reduction methods
2.4 Interpretation of projections
Appendix A. Operation counts for all-subsets regression
A.1 Garside's Gauss-Jordan algorithm
A.2 Planar rotations and a Hamiltonian cycle
A.3 Planar rotations and a binary sequence
A.4 Fast planar rotations
3 Finding subsets which fit well
3.1 Objectives and limitations of this chapter
3.2 Forward selection
3.3 Efroymson's algorithm
3.4 Backward elimination
3.5 Sequential replacement algorithms
3.6 Replacing two variables at a time
3.7 Generating all subsets
3.8 Using branch-and-bound techniques
3.9 Grouping variables
3.10 Ridge regression and other alternatives
3.11 The nonnegative garrote and the lasso
3.12 Some examples
3.13 Conclusions and recommendations
Appendix A. An algorithm for the lasso
4 Hypothesis testing
4.1 Is there any information in the remaining variables?
4.2 Is one subset better than another?
4.2.1 Applications of Spjøtvoll's method
4.2.2 Using other confidence ellipsoids
Appendix A. Spjøtvoll's method - detailed description
5 When to stop?
5.1 What criterion should we use?
5.2 Prediction criteria
5.2.1 Mean squared errors of prediction (MSEP)
5.2.2 MSEP for the fixed model
5.2.3 MSEP for the random model
5.2.4 A simulation with random predictors
5.3 Cross-validation and the PRESS statistic
5.4 Bootstrapping
5.5 Likelihood and information-based stopping rules
5.5.1 Minimum description length (MDL)
Appendix A. Approximate equivalence of stopping rules
A.1 F-to-enter
A.2 Adjusted R2 or Fisher's A-statistic
A.3 Akaike's information criterion (AIC)
6 Estimation of regression coefficients
6.1 Selection bias
6.2 Choice between two variables
6.3 Selection bias reduction
6.3.1 Monte Carlo estimation of bias in forward selection
6.3.2 Shrinkage methods
6.3.3 Using the jack-knife
6.3.4 Independent data sets
6.4 Conditional likelihood estimation
6.5 Estimation of population means
6.6 Estimating least-squares projections
Appendix A. Changing projections to equate sums of squares
7 Bayesian methods
7.1 Bayesian introduction
7.2 'Spike and slab' prior
7.3 Normal prior for regression coefficients
7.4 Model averaging
7.5 Picking the best model
8 Conclusions and some recommendations
References
Index.
Edition Notes
Includes bibliographical references (p. 223-234) and index.
History
Date | Editor | Change |
---|---|---|
July 26, 2024 | MARC Bot | import existing book |
December 14, 2022 | MARC Bot | import existing book |
September 15, 2021 | ImportBot | import existing book |
December 4, 2010 | Open Library Bot | Added subjects from MARC records. |
December 10, 2009 | WorkBot (created) | add works page |