Generalized, linear, and mixed models /

Subtitle: none

Authors: Charles E. McCulloch, Shayle R. Searle

ISBN: 9780471193647


Summary

Summary 1: For graduate students and practicing statisticians, McCulloch (biostatistics, U. of California, San Francisco) and Searle (biometry, Cornell U.) begin by reviewing the basics of linear models and linear mixed models, in which the variance structure is built from random effects and their variance components. They then move into the more difficult terrain of generalized linear models, generalized linear mixed models, and some nonlinear models. The early chapters could provide the core of a one-quarter or one-semester course, or part of a course on linear models. (Annotation c. Book News, Inc., Portland, OR, booknews.com)

Summary 2 (Publisher): Wiley Series in Probability and Statistics. A modern perspective on mixed models. The availability of powerful computing methods in recent decades has thrust linear and nonlinear mixed models into the mainstream of statistical application. This volume offers a modern perspective on generalized, linear, and mixed models, presenting a unified and accessible treatment of the newest statistical methods for analyzing correlated, nonnormally distributed data. A follow-up to Searle's classic Linear Models and to Variance Components by Searle, Casella, and McCulloch, this work progresses from the basic one-way classification to generalized linear mixed models. A variety of statistical methods are explained and illustrated, with an emphasis on maximum likelihood and restricted maximum likelihood.

An invaluable resource for applied statisticians and industrial practitioners, as well as students interested in the latest results, Generalized, Linear, and Mixed Models features:

* A review of the basics of linear models and linear mixed models
* Descriptions of models for nonnormal data, including generalized linear and nonlinear models
* Analysis and illustration of techniques for a variety of real data sets
* Information on the accommodation of longitudinal data using these models
* Coverage of the prediction of realized values of random effects
* A discussion of the impact of computing issues on mixed models

Contents

CONTENTS 7
PREFACE 20
1 INTRODUCTION 24
1.1 MODELS 24
a. Linear models (LM) and linear mixed models (LMM) 24
b. Generalized models (GLMs and GLMMs) 25
1.2 FACTORS, LEVELS, CELLS, EFFECTS AND DATA 25
1.3 FIXED EFFECTS MODELS 28
a. Example 1: Placebo and a drug 29
b. Example 2: Comprehension of humor 30
c. Example 3: Four dose levels of a drug 31
1.4 RANDOM EFFECTS MODELS 31
a. Example 4: Clinics 31
b. Notation 32
i. Properties of random effects in LMMs 32
ii. The notation of mathematical statistics 33
iii. Variance of y 34
iv. Variance and conditional expected values 34
c. Example 5: Ball bearings and calipers 35
1.5 LINEAR MIXED MODELS (LMMs) 36
a. Example 6: Medications and clinics 36
b. Example 7: Drying methods and fabrics 36
c. Example 8: Potomac River Fever 37
d. Regression models 37
e. Longitudinal data 37
f. Model equations 39
1.6 FIXED OR RANDOM? 39
a. Example 9: Clinic effects 39
b. Making a decision 40
1.7 INFERENCE 41
a. Estimation 43
i. Maximum likelihood (ML) 43
ii. Restricted maximum likelihood (REML) 44
iii. Solutions and estimators 44
iv. Bayes theorem 45
v. Quasi–likelihood estimation 46
vi. Generalized estimating equations 46
b. Testing 46
i. Likelihood ratio test (LRT) 47
ii. Wald's procedure 47
c. Prediction 47
1.8 COMPUTER SOFTWARE 48
1.9 EXERCISES 48
2 ONE-WAY CLASSIFICATIONS 51
2.1 NORMALITY AND FIXED EFFECTS 52
a. Model 52
b. Estimation by ML 52
c. Generalized likelihood ratio test 54
d. Confidence intervals 55
i. For means 56
ii. For differences in means 56
iii. For linear combinations 57
iv. For the variance 57
e. Hypothesis tests 57
2.2 NORMALITY, RANDOM EFFECTS AND ML 57
a. Model 57
i. Covariances caused by random effects 58
ii. Likelihood 59
b. Balanced data 60
i. Likelihood 60
ii. ML equations and their solutions 60
iii. ML estimators 61
iv. Expected values and bias 62
v. Asymptotic sampling variances 63
vi. REML estimation 65
c. Unbalanced data 65
i. Likelihood 65
ii. ML equations and their solutions 65
iii. ML estimators 66
d. Bias 67
e. Sampling variances 67
2.3 NORMALITY, RANDOM EFFECTS AND REML 68
a. Balanced data 68
i. Likelihood 68
ii. REML equations and their solutions 69
iii. REML estimators 69
iv. Comparison with ML 70
v. Bias 70
vi. Sampling variances 71
b. Unbalanced data 71
2.4 MORE ON RANDOM EFFECTS AND NORMALITY 71
a. Tests and confidence intervals 71
i. For the overall mean, μ 71
ii. For σ² 72
iii. For σ_a² 72
b. Predicting random effects 72
i. A basic result 72
ii. In a 1-way classification 73
2.5 BERNOULLI DATA: FIXED EFFECTS 74
a. Model equation 74
b. Likelihood 74
c. ML equations and their solutions 75
d. Likelihood ratio test 75
e. The usual chi-square test 75
f. Large-sample tests and intervals 77
g. Exact tests and confidence intervals 78
h. Example: Snake strike data 79
2.6 BERNOULLI DATA: RANDOM EFFECTS 80
a. Model equation 80
b. Beta-binomial model 80
i. Means, variances, and covariances 81
ii. Overdispersion 82
iii. Likelihood 83
iv. ML estimation 83
v. Large-sample variances 84
vi. Large-sample tests and intervals 85
vii. Prediction 86
c. Logit-normal model 87
i. Likelihood 87
ii. Calculation of the likelihood 88
iii. Means, variances, and covariances 88
iv. Large-sample tests and intervals 89
v. Prediction 90
d. Probit-normal model 90
2.7 COMPUTING 91
2.8 EXERCISES 91
3 SINGLE-PREDICTOR REGRESSION 94
3.1 INTRODUCTION 94
3.2 NORMALITY: SIMPLE LINEAR REGRESSION 95
a. Model 95
b. Likelihood 96
c. Maximum likelihood estimators 96
d. Distributions of MLEs 97
e. Tests and confidence intervals 98
f. Illustration 98
3.3 NORMALITY: A NONLINEAR MODEL 99
a. Model 99
b. Likelihood 99
c. Maximum likelihood estimators 99
d. Distributions of MLEs 101
3.4 TRANSFORMING VERSUS LINKING 101
a. Transforming 101
b. Linking 102
c. Comparisons 102
3.5 RANDOM INTERCEPTS: BALANCED DATA 102
a. The model 103
b. Estimating μ and β 105
i. Estimation 105
ii. Unbiasedness 107
iii. Sampling distributions 107
c. Estimating variances 108
i. When ML solutions are estimators 108
ii. When an ML solution is negative 110
d. Tests of hypotheses – using LRT 111
i. Using the maximized log likelihood l*(θ) 111
ii. Testing the hypothesis H_0: σ_a² = 0 112
iii. Testing H_0: β = 0 113
e. Illustration 114
f. Predicting the random intercepts 115
3.6 RANDOM INTERCEPTS: UNBALANCED DATA 117
a. The model 118
b. Estimating μ and β when variances are known 119
i. ML estimators 119
ii. Unbiasedness 122
iii. Sampling variances 122
iv. Predicting a_i 122
3.7 BERNOULLI - LOGISTIC REGRESSION 123
a. Logistic regression model 123
b. Likelihood 125
c. ML equations 126
d. Large-sample tests and intervals 128
3.8 BERNOULLI - LOGISTIC WITH RANDOM INTERCEPTS 129
a. Model 129
b. Likelihood 131
c. Large-sample tests and intervals 131
d. Prediction 132
e. Conditional Inference 132
3.9 EXERCISES 134
4 LINEAR MODELS (LMs) 136
4.1 A GENERAL MODEL 137
4.2 A LINEAR MODEL FOR FIXED EFFECTS 138
4.3 MLE UNDER NORMALITY 139
4.4 SUFFICIENT STATISTICS 140
4.5 MANY APPARENT ESTIMATORS 141
a. General result 141
b. Mean and variance 142
c. Invariance properties 142
d. Distributions 143
4.6 ESTIMABLE FUNCTIONS 143
a. Introduction 143
b. Definition 144
c. Properties 144
d. Estimation 145
4.7 A NUMERICAL EXAMPLE 145
4.8 ESTIMATING RESIDUAL VARIANCE 147
a. Estimation 147
b. Distribution of estimators 148
4.9 COMMENTS ON 1- AND 2-WAY CLASSIFICATIONS 149
a. The 1-way classification 149
b. The 2-way classification 150
4.10 TESTING LINEAR HYPOTHESES 151
a. Using the likelihood ratio 152
4.11 t-TESTS AND CONFIDENCE INTERVALS 153
4.12 UNIQUE ESTIMATION USING RESTRICTIONS 154
4.13 EXERCISES 155
5 GENERALIZED LINEAR MODELS (GLMs) 158
5.1 INTRODUCTION 158
5.2 STRUCTURE OF THE MODEL 160
a. Distribution of y 160
b. Link function 161
c. Predictors 161
d. Linear models 162
5.3 TRANSFORMING VERSUS LINKING 162
5.4 ESTIMATION BY MAXIMUM LIKELIHOOD 162
a. Likelihood 162
b. Some useful identities 163
c. Likelihood equations 164
d. Large-sample variances 166
e. Solving the ML equations 166
f. Example: Potato flour dilutions 167
5.5 TESTS OF HYPOTHESES 170
a. Likelihood ratio tests 170
b. Wald tests 171
c. Illustration of tests 172
d. Confidence intervals 172
e. Illustration of confidence intervals 173
5.6 MAXIMUM QUASI–LIKELIHOOD 173
a. Introduction 173
b. Definition 174
5.7 EXERCISES 177
6 LINEAR MIXED MODELS (LMMs) 179
6.1 A GENERAL MODEL 179
a. Introduction 179
b. Basic properties 180
6.2 ATTRIBUTING STRUCTURE TO VAR(y) 181
a. Example 181
b. Taking covariances between factors as zero 181
c. The traditional variance components model 183
i. Customary notation 183
ii. Amended notation 184
d. An LMM for longitudinal data 185
6.3 ESTIMATING FIXED EFFECTS FOR V KNOWN 185
6.4 ESTIMATING FIXED EFFECTS FOR V UNKNOWN 187
a. Estimation 187
b. Sampling variance 187
c. Bias in the variance 189
d. Approximate F-statistics 190
6.5 PREDICTING RANDOM EFFECTS FOR V KNOWN 191
6.6 PREDICTING RANDOM EFFECTS FOR V UNKNOWN 193
a. Estimation 193
b. Sampling variance 193
c. Bias in the variance 194
6.7 ANOVA ESTIMATION OF VARIANCE COMPONENTS 194
a. Balanced data 195
b. Unbalanced data 196
6.8 MAXIMUM LIKELIHOOD (ML) ESTIMATION 197
a. Estimators 197
b. Information matrix 198
c. Asymptotic sampling variances 199
6.9 RESTRICTED MAXIMUM LIKELIHOOD (REML) 199
a. Estimation 199
b. Sampling variances 200
6.10 ML OR REML? 200
6.11 OTHER METHODS FOR ESTIMATING VARIANCES 201
6.12 APPENDIX 201
a. Differentiating a log likelihood 201
i. A general likelihood under normality 201
ii. First derivatives 202
iii. Information matrix 202
b. Differentiating a generalized inverse 204
c. Differentiation for the variance components model 205
6.13 EXERCISES 207
7 LONGITUDINAL DATA 210
7.1 INTRODUCTION 210
7.2 A MODEL FOR BALANCED DATA 211
a. Prescription 211
b. Estimating the mean 211
c. Estimating V_0 211
7.3 A MIXED MODEL APPROACH 212
a. Fixed and random effects 213
b. Variances 213
7.4 PREDICTING RANDOM EFFECTS 214
a. Uncorrelated subjects 215
b. Uncorrelated between, and within, subjects 215
c. Uncorrelated between, and autocorrelated within, subjects 216
d. Correlated between, but not within, subjects 216
7.5 ESTIMATING PARAMETERS 218
a. The general case 218
b. Uncorrelated subjects 219
c. Uncorrelated between, and within, subjects 220
d. Uncorrelated between, and autocorrelated within, subjects 222
e. Correlated between, but not within, subjects 224
7.6 UNBALANCED DATA 225
a. Example and model 225
b. Uncorrelated subjects 226
i. Matrix V and its inverse 226
ii. Estimating the fixed effects 227
iii. Predicting the random effects 227
c. Uncorrelated between, and within, subjects 227
i. Matrix V and its inverse 227
ii. Estimating the fixed effects 228
iii. Predicting the random effects 228
d. Correlated between, but not within, subjects 229
7.7 AN EXAMPLE OF SEVERAL TREATMENTS 229
7.8 GENERALIZED ESTIMATING EQUATIONS 231
7.9 A SUMMARY OF RESULTS 235
a. Balanced data 235
i. With some generality 235
ii. Uncorrelated subjects 236
iii. Uncorrelated between, and within, subjects 236
iv. Uncorrelated between, and autocorrelated within, subjects 236
v. Correlated between, but not within, subjects 237
b. Unbalanced data 237
i. Uncorrelated subjects 237
ii. Uncorrelated between, and within, subjects 237
iii. Correlated between, but not within, subjects 237
7.10 APPENDIX 238
a. For Section 7.4a 238
b. For Section 7.4b 238
c. For Section 7.4d 238
7.11 EXERCISES 241
8 GLMMs 243
8.1 INTRODUCTION 243
8.2 STRUCTURE OF THE MODEL 244
a. Conditional distribution of y 244
8.3 CONSEQUENCES OF HAVING RANDOM EFFECTS 245
a. Marginal versus conditional distribution 245
b. Mean of y 245
c. Variances 246
d. Covariances and correlations 247
8.4 ESTIMATION BY MAXIMUM LIKELIHOOD 248
a. Likelihood 248
b. Likelihood equations 250
i. For the fixed effects parameters 250
ii. For the random effects parameters 251
8.5 MARGINAL VERSUS CONDITIONAL MODELS 251
8.6 OTHER METHODS OF ESTIMATION 254
a. Generalized estimating equations 254
b. Penalized quasi–likelihood 255
c. Conditional likelihood 257
d. Simpler models 261
8.7 TESTS OF HYPOTHESES 262
a. Likelihood ratio tests 262
b. Asymptotic variances 263
c. Wald tests 263
d. Score tests 263
8.8 ILLUSTRATION: CHESTNUT LEAF BLIGHT 264
a. A random effects probit model 265
i. The fixed effects 265
ii. The random effects 266
iii. Consequences of having random effects 266
iv. Likelihood analysis 267
v. Results 268
8.9 EXERCISES 269
9 PREDICTION 270
9.1 INTRODUCTION 270
9.2 BEST PREDICTION (BP) 271
a. The best predictor 271
b. Mean and variance properties 272
c. A correlation property 272
d. Maximizing a mean 272
e. Normality 273
9.3 BEST LINEAR PREDICTION (BLP) 273
a. BLP(u) 273
b. Example 274
c. Derivation 275
d. Ranking 276
9.4 LINEAR MIXED MODEL PREDICTION (BLUP) 277
a. BLUE(Xβ) 277
b. BLUP(t'Xβ + s'u) 278
c. Two variances 279
d. Other derivations 279
9.5 REQUIRED ASSUMPTIONS 279
9.6 ESTIMATED BEST PREDICTION 280
9.7 HENDERSON'S MIXED MODEL EQUATIONS 281
a. Origin 281
b. Solutions 282
c. Use in ML estimation of variance components 282
i. ML estimation 282
ii. REML estimation 283
9.8 APPENDIX 283
a. Verification of (9.5) 283
b. Verification of (9.7) and (9.8) 284
9.9 EXERCISES 285
10 COMPUTING 286
10.1 INTRODUCTION 286
10.2 COMPUTING ML ESTIMATES FOR LMMs 286
a. The EM algorithm 286
i. EM for ML 288
ii. EM (a variant) for ML 288
iii. EM for REML 288
b. Using E[u|y] 289
c. Newton–Raphson method 290
10.3 COMPUTING ML ESTIMATES FOR GLMMs 292
a. Numerical quadrature 292
i. Gauss–Hermite quadrature 293
ii. Likelihood calculations 295
iii. Limits of numerical quadrature 296
b. EM algorithm 297
c. Markov chain Monte Carlo algorithms 298
i. Metropolis 299
ii. Monte Carlo Newton–Raphson 300
d. Stochastic approximation algorithms 301
e. Simulated maximum likelihood 303
10.4 PENALIZED QUASI–LIKELIHOOD AND LAPLACE 304
10.5 EXERCISES 307
11 NONLINEAR MODELS 309
11.1 INTRODUCTION 309
11.2 EXAMPLE: CORN PHOTOSYNTHESIS 309
11.3 PHARMACOKINETIC MODELS 312
11.4 COMPUTATIONS FOR NONLINEAR MIXED MODELS 313
11.5 EXERCISES 313
APPENDIX M: SOME MATRIX RESULTS 314
M.1 VECTORS AND MATRICES OF ONES 314
M.2 KRONECKER (OR DIRECT) PRODUCTS 315
M.3 A MATRIX NOTATION 315
M.4 GENERALIZED INVERSES 316
a. Definition 316
b. Generalized inverses of X'X 317
c. Two results involving X(X'V⁻¹X)⁻X'V⁻¹ 318
d. Solving linear equations 319
e. Rank results 319
f. Vectors orthogonal to columns of X 319
g. A theorem for K' with K'X being null 319
M.5 DIFFERENTIAL CALCULUS 320
a. Definition 320
b. Scalars 320
c. Vectors 320
d. Inner products 320
e. Quadratic forms 321
f. Inverse matrices 321
g. Determinants 322
APPENDIX S: SOME STATISTICAL RESULTS 323
S.1 MOMENTS 323
a. Conditional moments 323
b. Mean of a quadratic form 324
c. Moment generating function 324
S.2 NORMAL DISTRIBUTIONS 325
a. Univariate 325
b. Multivariate 325
c. Quadratic forms in normal variables 326
i. The non-central χ² 326
ii. Properties of y'Ay when y ~ N(μ, V) 326
S.3 EXPONENTIAL FAMILIES 327
S.4 MAXIMUM LIKELIHOOD 327
a. The likelihood function 327
b. Maximum likelihood estimation 328
c. Asymptotic variance-covariance matrix 328
d. Asymptotic distribution of MLEs 329
S.5 LIKELIHOOD RATIO TESTS 329
S.6 MLE UNDER NORMALITY 330
a. Estimation of β 330
b. Estimation of variance components 331
c. Asymptotic variance-covariance matrix 331
d. Restricted maximum likelihood (REML) 332
i. Estimation 332
ii. Asymptotic variance 333
REFERENCES 334
INDEX 344
