Contents
1 Approaching Data Analysis
Introduction
1A The staircase and the shortcut to inference
1B Student's true contribution
1C Distributions and their troubles
1D A classical example: Wilson and Hilferty's analysis of Peirce's data
1E Kinds of nonnormality and robustness
1F The role of vague concepts
1G Other vague concepts
1H Indication, determination, or inference
Summary: Data analysis
2 Indication and Indicators
2A The value of indication
2B Examples of stopping with indication
2C Concealed inference
2D Choice of indicators
2E An example of choice of indicator
2F Indications of quality: Cross-validation
Summary: Indication and indicators
3 Displays and Summaries for Batches
3A Stem-and-leaf
3B Medians, hinges, etc.
3C Mids and spreads
3D Subsampling
3E Exploratory plotting
3F Trends and running medians
3G Smoothing nonlinear regressions
3H Looking for patterns
3I Residuals more generally
3J Plotting and smoothing
Summary: Batch summaries and displays
4 Straightening Curves and Plots
The idea of straightening out curves
4A The ladder of re-expressions
4B Re-expressing y = x²
4C The bulging rule
4D More complicated curves
4E Scatter plots
Summary: Straightening curves
5 The Practice of Re-Expression
5A Kinds of numbers
5B Quick logs
5C Quick (square) roots and reciprocals
5D Quick re-expressions of counted fractions, percentages, etc.
5E Matching for powers and logs
5F Re-expressions for grades
5G Re-expressing ranks
5H First aid in re-expression
5I What to do with zeros—and infinities
Summary: Re-expression
6 Need We Re-Express?
General hints when the re-expressed carrier is log x
7 Hunting out the Real Uncertainty
7A How σ/√n can mislead
7B A further example of the need for direct assessment of variability
7C Choosing an error term
7D More detailed choices of error terms
7E Making direct assessment possible
7F Difficulties with direct assessment
7G Supplementary uncertainty and its combination with internal uncertainty
Summary: Hunting out the real uncertainty
8 A Method of Direct Assessment
8A The jackknife
Appendix to 8A
8B Examples with individuals
8C Jackknife using groups: Ratio estimation for a sample survey
8D A more complex example
8E Cross-validation in the example
8F Two simultaneous uses of "leave out one"
8G Dispersion of the p's
8H Further discussion of the example
Summary: The jackknife
9 Two- and More-Way Tables
9A PLUS analyses
9B Looking at two-way PLUS analyses
9C Taking advantage of levels
9D Polishing additive fits
9E Fitting one more constant
9F Using re-expression
9G Three- and more-way analyses
Summary: Two-way tables of responses
10 Robust and Resistant Measures
10A Resistance
10B Robustness
10C Robust and resistant estimates of location
10D Robust estimates of scale
10E Robust and resistant intervals
10F Resistant and robust regression
10G Multiple-component data
10H Closing comment
Summary: Resistant and robust techniques
11 Standardizing for Comparison
11A The simplest case
11B Direct standardization
11C Precision of directly standardized values
11D Difficulties with direct standardization
11E Indirect standardizing
11F Adjustment for broad categories
11G More than two broad categories
Summary: Standardizing for comparison
12 Regression for Fitting
Introduction
12A The two meanings of regression
12B Purposes of regression
12C Graphical fitting by stages
12D Collinearity
12E Linear dependence, exact and approximate
12F Keeping out what is imprecisely measured–Regression as exclusion
*12G Which straight line? (Optional)
12H Using subsamples
Summary: Regression
13 Woes of Regression Coefficients
13A Meaning of coefficients in multiple regression
13B Linear adjustment as a mode of description
13C Examples of linear adjustment
13D The relative unimportance of the exact carrier
13E Proxy phenomena
13F Sometimes x's can be "held constant"
13G Experiments, closed systems, and physical versus social sciences, with examples
*13H Estimated variances are not enough
Summary: Woes of regression coefficients
14 A Class of Mechanisms for Fitting
14A Fitting lines—some through the origin
14B Matching as a way of fitting
14C Matchers tuned to a single coefficient—and catchers
14D Ordinary least squares
14E Tuning for ordinary least squares
14F Weighted least squares
14G Influence curves for location
14H Iteratively weighted linear least squares
14I Least absolute deviations (Optional)
14J Analyzing troubles
14K Proof of the statement of Section 13B
Summary: Mechanisms for fitting regression
15 Guided Regression
15A How can we be guided in what to fit?
15B Stepwise techniques
15C All-subset techniques
15D Combined techniques
15E Rearranging carriers—judgment components
15F Principal components
15G How much are we likely to learn?
15H Several y's or several studies
15I Regression starting where?
15J Arbitrary adjustment
Summary: Guided regression
16 Examining Regression Residuals
16A Examining y
16B Variables—and other carriers
16C The next step: Looking with regard to an old variable, t_old
16D Looking with regard to a new variable, t_new
16E Looking for additional product terms
16F In what order?
Summary: Examining regression residuals
Appendix: Details About the Need to Re-Express
Problems
Data Exhibits for Problems
Index