Foundations of Agnostic Statistics

Reflecting a sea change in how empirical research has been conducted over the past three decades, Foundations of Agnostic Statistics presents an innovative treatment of modern statistical theory for the social and health sciences. This book develops the fundamentals of what the authors call agnostic statistics, which considers what can be learned about the world without assuming that there exists a simple generative model that can be known to be true. Aronow and Miller provide the foundations for statistical inference for researchers unwilling to make assumptions beyond what they or their audience would find credible. Building from first principles, the book covers topics including estimation theory, regression, maximum likelihood, missing data, and causal inference. Using these principles, readers will be able to formally articulate their targets of inquiry, distinguish substantive assumptions from statistical assumptions, and ultimately engage in cutting-edge quantitative empirical research that contributes to human knowledge.
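To make the agnostic approach concrete, here is a minimal illustrative sketch (not taken from the book) of a plug-in estimator of a population mean with a normal-approximation confidence interval. The only assumptions are i.i.d. sampling and finite variance; no parametric model for the data-generating process is imposed. The function name and data are hypothetical, chosen for illustration.

```python
# Illustrative sketch, not the authors' code: plug-in estimation of a
# population mean, assuming only i.i.d. sampling with finite variance.
import math
import random

def plug_in_mean_ci(sample, z=1.96):
    """Sample mean with an asymptotic (normal-approximation) 95% CI."""
    n = len(sample)
    mean = sum(sample) / n
    # Plug-in variance estimate: substitute the empirical distribution
    # for the unknown population distribution (divides by n, not n - 1).
    var = sum((x - mean) ** 2 for x in sample) / n
    se = math.sqrt(var / n)  # standard error of the sample mean
    return mean, (mean - z * se, mean + z * se)

random.seed(0)
data = [random.uniform(0, 1) for _ in range(1000)]  # arbitrary, non-normal data
mean, (lo, hi) = plug_in_mean_ci(data)
```

By the weak law of large numbers and the central limit theorem, the interval covers the true mean (here 0.5) with approximately 95% probability in large samples, regardless of the shape of the underlying distribution.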
Contents
Summarizing Distributions | 44 |
Learning from Random Samples | 91 |
Regression | 143 |
Parametric Models | 178 |
Missing Data | 207 |
Causal Inference | 235 |
Glossary of Mathematical Notation | 282 |