Introduction to Econometrics Made Easy: Stock & Watson Key Concepts (Free PDF)
PART ONE Introduction and Review
1.1 Cross-Sectional, Time Series, and Panel Data 53
2.1 Expected Value and the Mean 60
2.2 Variance and Standard Deviation 61
2.3 Means, Variances, and Covariances of Sums of Random Variables 74
2.4 Computing Probabilities Involving Normal Random Variables 76
2.5 Simple Random Sampling and i.i.d. Random Variables 82
2.6 Convergence in Probability, Consistency, and the Law of Large Numbers 86
2.7 The Central Limit Theorem 89
3.1 Estimators and Estimates 105
3.2 Bias, Consistency, and Efficiency 105
3.3 Efficiency of Ȳ: Ȳ Is BLUE 107
3.4 The Standard Error of Ȳ 113
3.5 The Terminology of Hypothesis Testing 115
3.6 Testing the Hypothesis E(Y) = μY,0 Against the Alternative E(Y) ≠ μY,0 116
3.7 Confidence Intervals for the Population Mean 118
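To see Key Concepts 3.4 through 3.7 in action, here is a minimal Python sketch (using numpy and scipy on simulated data, not any example from the book) that computes a sample mean, its standard error, the t-statistic for a hypothesized population mean, and a 95% confidence interval.

```python
# Illustrative sketch of Key Concepts 3.4-3.7: standard error, hypothesis test,
# and confidence interval for a population mean. All numbers are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
y = rng.normal(loc=20, scale=12, size=200)    # hypothetical i.i.d. sample of Y
mu0 = 22                                      # hypothesized mean under H0

y_bar = y.mean()                              # sample mean, the estimator of E(Y)
se = y.std(ddof=1) / np.sqrt(len(y))          # standard error of the sample mean
t_stat = (y_bar - mu0) / se                   # t-statistic for H0: E(Y) = mu0
p_value = 2 * stats.norm.sf(abs(t_stat))      # large-sample two-sided p-value
ci_low, ci_high = y_bar - 1.96 * se, y_bar + 1.96 * se   # 95% confidence interval

print(f"Y-bar = {y_bar:.2f}, SE = {se:.2f}, t = {t_stat:.2f}, p = {p_value:.3f}")
print(f"95% CI for E(Y): [{ci_low:.2f}, {ci_high:.2f}]")
```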
PART TWO Fundamentals of Regression Analysis
4.1 Terminology for the Linear Regression Model with a Single Regressor 146
4.2 The OLS Estimator, Predicted Values, and Residuals 150
4.3 The Least Squares Assumptions for Causal Inference 160
4.4 Large-Sample Distributions of β̂0 and β̂1 162
5.1 General Form of the t-Statistic 179
5.2 Testing the Hypothesis β1 = β1,0 Against the Alternative β1 ≠ β1,0 181
5.3 Confidence Interval for β1 185
5.4 Heteroskedasticity and Homoskedasticity 190
5.5 The Gauss–Markov Theorem for β̂1 195
6.1 Omitted Variable Bias in Regression with a Single Regressor 213
6.2 The Multiple Regression Model 219
6.3 The OLS Estimators, Predicted Values, and Residuals in the Multiple Regression Model 221
6.4 The Least Squares Assumptions for Causal Inference in the Multiple Regression Model 227
6.5 Large-Sample Distribution of β̂0, β̂1, …, β̂k 228
6.6 The Least Squares Assumptions for Causal Inference in the Multiple Regression Model with Control Variables 233
7.1 Testing the Hypothesis βj = βj,0 Against the Alternative βj ≠ βj,0 249
7.2 Confidence Intervals for a Single Coefficient in Multiple Regression 250
7.3 R2 and R̄2: What They Tell You—and What They Don’t 263
8.1 The Expected Change in Y from a Change in X1 in the Nonlinear Regression Model [Equation (8.3)] 283
8.2 Logarithms in Regression: Three Cases 295
8.3 A Method for Interpreting Coefficients in Regressions with Binary Variables 299
8.4 Interactions Between Binary and Continuous Variables 302
8.5 Interactions in Multiple Regression 306
9.1 Internal and External Validity 331
9.2 Omitted Variable Bias: Should I Include More Variables in My Regression? 335
9.3 Functional Form Misspecification 336
9.4 Errors-in-Variables Bias 338
9.5 Sample Selection Bias 340
9.6 Simultaneous Causality Bias 343
9.7 Threats to the Internal Validity of a Multiple Regression Study 344
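The core of Part Two is the OLS estimator together with inference that is robust to heteroskedasticity (Key Concepts 4.2, 5.4, and 6.3). The sketch below is an illustrative Python example using statsmodels on simulated data; the variable names, coefficient values, and the HC1 covariance choice are placeholders for illustration, not the textbook's application.

```python
# Illustrative sketch: multiple regression by OLS with heteroskedasticity-robust
# (HC1) standard errors. Data and variable names are simulated placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x1 = rng.uniform(14, 26, size=n)                  # hypothetical regressor of interest
x2 = rng.normal(20, 5, size=n)                    # hypothetical control variable
u = rng.normal(0, 1 + 0.1 * x2, size=n)           # heteroskedastic error term
y = 700 - 2.3 * x1 + 1.5 * x2 + u

X = sm.add_constant(np.column_stack([x1, x2]))    # add intercept column
model = sm.OLS(y, X).fit(cov_type="HC1")          # robust standard errors
print(model.summary())
```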
PART THREE Further Topics in Regression Analysis
10.1 Notation for Panel Data 362
10.2 The Fixed Effects Regression Model 369
10.3 The Fixed Effects Regression Assumptions 375
11.1 The Linear Probability Model 396
11.2 The Probit Model, Predicted Probabilities, and Estimated Effects 400
11.3 Logit Regression 402
12.1 The General Instrumental Variables Regression Model and Terminology 438
12.2 Two Stage Least Squares 440
12.3 The Two Conditions for Valid Instruments 441
12.4 The IV Regression Assumptions 442
12.5 A Rule of Thumb for Checking for Weak Instruments 446
12.6 The Overidentifying Restrictions Test (The J-Statistic) 449
14.1 m-Fold Cross Validation 523
14.2 The Principal Components of X 535
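Chapter 11's key concepts compare the linear probability, probit, and logit models for a binary dependent variable. Here is a rough Python sketch with statsmodels on simulated data; the single regressor and the evaluation point X = 0.3 are assumptions made purely for illustration.

```python
# Illustrative sketch of Key Concepts 11.1-11.3: linear probability, probit,
# and logit models for a binary outcome, fit to simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
x = rng.uniform(0.1, 0.9, size=n)                 # hypothetical single regressor
latent = -2.0 + 3.0 * x + rng.normal(size=n)
y = (latent > 0).astype(int)                      # binary dependent variable

X = sm.add_constant(x)
lpm = sm.OLS(y, X).fit(cov_type="HC1")            # linear probability model (11.1)
probit = sm.Probit(y, X).fit(disp=0)              # probit regression (11.2)
logit = sm.Logit(y, X).fit(disp=0)                # logit regression (11.3)

x0 = np.array([[1.0, 0.3]])                       # constant plus X = 0.3
print("LPM:", lpm.predict(x0))
print("Probit:", probit.predict(x0))
print("Logit:", logit.predict(x0))
```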
PART FOUR Regression Analysis of Economic Time Series Data
15.1 Lags, First Differences, Logarithms, and Growth Rates 557
15.2 Autocorrelation (Serial Correlation) and Autocovariance 559
15.3 Stationarity 562
15.4 Autoregressions 568
15.5 The Autoregressive Distributed Lag Model 571
15.6 The Least Squares Assumptions for Forecasting with Time Series Data 572
15.7 Pseudo Out-of-Sample Forecasts 575
15.8 The QLR Test for Coefficient Stability 592
16.1 The Distributed Lag Model and Exogeneity 616
16.2 The Distributed Lag Model Assumptions 618
16.3 HAC Standard Errors 624
17.1 Vector Autoregressions 650
17.2 Iterated Multi-period Forecasts 656
17.3 Direct Multi-period Forecasts 658
17.4 Orders of Integration, Differencing, and Stationarity 660
17.5 Cointegration 664
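To connect the time series concepts, here is a small Python sketch that fits an AR(1) by OLS on a simulated series and forms a one-step-ahead forecast (Key Concept 15.4), reporting HAC standard errors in the spirit of Key Concept 16.3; the AR coefficient and the lag truncation of 4 are arbitrary illustrative choices.

```python
# Illustrative sketch: AR(1) estimated by OLS on a simulated series, with HAC
# (Newey-West) standard errors and a one-step-ahead forecast.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
T = 300
y = np.zeros(T)
for t in range(1, T):                     # simulate an AR(1) with coefficient 0.7
    y[t] = 0.5 + 0.7 * y[t - 1] + rng.normal()

y_lag = y[:-1]                            # Y_{t-1}
y_cur = y[1:]                             # Y_t
X = sm.add_constant(y_lag)
ar1 = sm.OLS(y_cur, X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})
beta0, beta1 = ar1.params
forecast = beta0 + beta1 * y[-1]          # one-step-ahead forecast of Y_{T+1}
print("Estimated coefficients:", ar1.params)
print("Forecast of Y_{T+1}:", forecast)
```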
PART FIVE The Econometric Theory of Regression Analysis
18.1 The Extended Least Squares Assumptions for Regression with a Single Regressor 689
19.1 The Extended Least Squares Assumptions in the Multiple Regression Model 715
19.2 The Multivariate Central Limit Theorem 718
19.3 Gauss–Markov Theorem for Multiple Regression 727
19.4 The GLS Assumptions 729
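Key Concept 19.4 lists the GLS assumptions. As a rough illustration only, the sketch below compares OLS and GLS when the error covariance matrix is known and follows an assumed AR(1) pattern; none of the numbers come from the book.

```python
# Illustrative sketch: GLS with a known error covariance matrix versus OLS,
# under an assumed AR(1) correlation structure. All values are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, rho = 200, 0.6
x = rng.normal(size=n)

# Build the AR(1) covariance matrix Omega with Omega[i, j] = rho**|i - j|
idx = np.arange(n)
omega = rho ** np.abs(idx[:, None] - idx[None, :])
u = np.linalg.cholesky(omega) @ rng.normal(size=n)    # serially correlated errors
y = 1.0 + 2.0 * x + u

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()
gls = sm.GLS(y, X, sigma=omega).fit()                 # exploits the known Omega
print("OLS estimates:", ols.params)
print("GLS estimates:", gls.params)
```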