Model Estimation and Data Analysis: Linear Regression Models

LIMDEP and NLOGIT software offer a complete set of powerful tools for linear regression estimation, hypothesis testing, specification analysis and simulation.

Least Squares Regression

  • Least squares
  • Instrumental variables and 2SLS
  • Least absolute deviations with bootstrapped standard errors
  • Extreme accuracy
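As a sketch of what the least squares estimator computes, the fragment below fits OLS on synthetic data via the normal equations and derives classical standard errors. This is illustrative Python only, not LIMDEP command syntax:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)

# Design matrix with a constant column
X = np.column_stack([np.ones(n), x])

# OLS coefficients from the normal equations: b = (X'X)^-1 X'y
b = np.linalg.solve(X.T @ X, X.T @ y)

# Classical covariance matrix: s^2 (X'X)^-1
e = y - X @ b
s2 = e @ e / (n - X.shape[1])
cov = s2 * np.linalg.inv(X.T @ X)
se = np.sqrt(np.diag(cov))

print(b, se)
```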

Summary Statistics

  • Fit measures: F statistic, R-squared, adjusted R-squared, sums of squares
  • Information criteria
  • Likelihood function
  • Durbin-Watson

Predictions and Residuals

  • List, plot, retain
  • Standardized residuals
  • Confidence interval for predictions
  • Impute missing values 
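A confidence interval for a prediction follows from the estimated coefficient covariance. The sketch below (synthetic data, not LIMDEP code) computes a 95% interval for the conditional mean at a chosen point `x0`:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100
x = rng.normal(size=n)
y = 2.0 + 1.0 * x + rng.normal(size=n)
X = np.column_stack([np.ones(n), x])

b = np.linalg.lstsq(X, y, rcond=None)[0]
e = y - X @ b
s2 = e @ e / (n - 2)                 # residual variance estimate
XtX_inv = np.linalg.inv(X.T @ X)

# 95% confidence interval for the conditional mean at x0 = (1, 0.5)
x0 = np.array([1.0, 0.5])
pred = x0 @ b
se_pred = np.sqrt(s2 * x0 @ XtX_inv @ x0)
lo, hi = pred - 1.96 * se_pred, pred + 1.96 * se_pred
print(pred, (lo, hi))
```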

Robust Estimation

  • White heteroscedasticity-adjusted covariance matrix
  • Newey-West estimator
  • Cluster corrected covariance matrix
  • Kernel density regression
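The White heteroscedasticity-robust covariance matrix listed above has the familiar sandwich form; a minimal Python sketch on synthetic heteroscedastic data (illustrative only, not LIMDEP syntax):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
# Heteroscedastic errors: variance grows with |x|
y = 1.0 + 0.5 * x + rng.normal(size=n) * (0.5 + np.abs(x))

X = np.column_stack([np.ones(n), x])
b = np.linalg.lstsq(X, y, rcond=None)[0]
e = y - X @ b

# White (HC0) sandwich: (X'X)^-1 [X' diag(e^2) X] (X'X)^-1
XtX_inv = np.linalg.inv(X.T @ X)
meat = (X * e[:, None] ** 2).T @ X
cov_white = XtX_inv @ meat @ XtX_inv
se_white = np.sqrt(np.diag(cov_white))
print(se_white)
```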

Panel Data

  • Analysis of variance and covariance
  • Fixed effects
  • Random effects
  • Random parameters (GLS, hierarchical)
  • Balanced or unbalanced panels
  • Cluster correction
  • Heteroscedasticity and autocorrelation tests
  • LM and Hausman tests for effects
  • White and Newey-West robust estimators
  • Dynamic linear models (Arellano/Bond)
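The fixed effects estimator above can be sketched as the within transformation: demean each variable by group, then run least squares on the demeaned data. A Python illustration on synthetic panel data (not LIMDEP code; group effects are deliberately correlated with the regressor so pooled OLS would be biased):

```python
import numpy as np

rng = np.random.default_rng(2)
G, T = 50, 5                           # 50 groups observed for 5 periods
g = np.repeat(np.arange(G), T)
alpha = rng.normal(size=G)[g]          # group effects
x = rng.normal(size=G * T) + alpha     # regressor correlated with the effects
y = alpha + 1.5 * x + rng.normal(scale=0.5, size=G * T)

# Within transformation: subtract the group mean from each variable
def demean(v, g, G):
    means = np.bincount(g, weights=v, minlength=G) / np.bincount(g, minlength=G)
    return v - means[g]

y_w = demean(y, g, G)
x_w = demean(x, g, G)

# Fixed-effects slope from the demeaned data
b_fe = (x_w @ y_w) / (x_w @ x_w)
print(b_fe)
```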

Heteroscedasticity

  • Weighted least squares
  • Multiplicative heteroscedasticity, maximum likelihood
  • Goldfeld-Quandt, Breusch-Pagan tests
  • Groupwise heteroscedasticity
  • Heteroscedastic fixed and random effects
  • ARCH, GARCH, GARCH in mean
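The Breusch-Pagan test listed above regresses the squared OLS residuals on the regressors and refers LM = n * R-squared to a chi-squared distribution. A hedged Python sketch on synthetic data (illustrative, not LIMDEP syntax):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400
x = rng.normal(size=n)
y = 1.0 + x + rng.normal(size=n) * np.exp(0.5 * x)   # heteroscedastic errors

X = np.column_stack([np.ones(n), x])
b = np.linalg.lstsq(X, y, rcond=None)[0]
e = y - X @ b

# Breusch-Pagan: regress squared residuals on the regressors, LM = n * R^2
u = e ** 2
c = np.linalg.lstsq(X, u, rcond=None)[0]
fitted = X @ c
r2 = 1 - np.sum((u - fitted) ** 2) / np.sum((u - u.mean()) ** 2)
lm = n * r2

# 5% critical value for chi-squared with 1 degree of freedom is 3.841
print(lm, "reject homoscedasticity:", lm > 3.841)
```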

Specification Tests

  • Omitted variables
  • Structural change
  • J, Cox, PE tests

Restrictions

  • F and Wald tests for linear restrictions
  • Restricted regression
  • Inequality restricted regression
  • Wald tests for nonlinear restrictions
  • Lagrange multiplier, likelihood ratio tests
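The F test for linear restrictions can be sketched by comparing restricted and unrestricted sums of squared residuals. In this synthetic Python example (not LIMDEP code) the restriction is equality of two slopes, imposed by regressing on the sum of the two variables:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 300
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 0.8 * x1 + 0.2 * x2 + rng.normal(size=n)

# Unrestricted model
X = np.column_stack([np.ones(n), x1, x2])
b = np.linalg.lstsq(X, y, rcond=None)[0]
ssr_u = np.sum((y - X @ b) ** 2)

# Restriction b1 = b2: restricted model regresses y on (x1 + x2)
Xr = np.column_stack([np.ones(n), x1 + x2])
br = np.linalg.lstsq(Xr, y, rcond=None)[0]
ssr_r = np.sum((y - Xr @ br) ** 2)

q, k = 1, 3   # one restriction, three unrestricted parameters
F = ((ssr_r - ssr_u) / q) / (ssr_u / (n - k))
print(F)
```

Since the true slopes (0.8 and 0.2) violate the restriction, the statistic should exceed the 5% F(1, 297) critical value of about 3.87.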

Systems of Linear Equations

  • 2SLS
  • 3SLS
  • Seemingly unrelated regressions
    • Autocorrelation
    • Heteroscedasticity
    • Singular equation systems with constraints
    • GLS and maximum likelihood
  • Cross and within equation constraints
  • Covariance structures
    • OLS, GLS
    • Panel corrected standard errors
    • Grouping of observation units

Linear Regression Example

Linear regression of household income on age, education, and marital status for women, with a residual plot.

-----------------------------------------------------------------------------
Ordinary     least squares regression ............
LHS=HHNINC   Mean                 =         .34489
             Standard deviation   =         .16279
             No. of observations  =           1000  Degrees of freedom
Regression   Sum of Squares       =        4.58257           3
Residual     Sum of Squares       =        21.8912         996
Total        Sum of Squares       =        26.4738         999
             Standard error of e  =         .14825
Fit          R-squared            =         .17310  R-bar squared =   .17061
Model test   F[  3,   996]        =       69.49881  Prob F > F*   =   .00000
Diagnostic   Log likelihood       =      491.89637  Akaike I.C.   = -3.81367
             Restricted (b=0)     =      396.86157  Bayes  I.C.   = -3.79404
             Chi squared [  3]    =      190.06961  Prob C2 > C2* =   .00000
--------+--------------------------------------------------------------------
        |                  Standard            Prob.      95% Confidence
  HHNINC|  Coefficient       Error       z    |z|>Z*         Interval
--------+--------------------------------------------------------------------
Constant|    -.04595         .03469    -1.32  .1854     -.11394    .02205
     AGE|     .00054         .00046     1.16  .2474     -.00037    .00145
    EDUC|     .02379***      .00198    11.99  .0000      .01990    .02767
 MARRIED|     .11794***      .01193     9.88  .0000      .09455    .14133
--------+--------------------------------------------------------------------
Note: ***, **, * ==>  Significance at 1%, 5%, 10% level.
-----------------------------------------------------------------------------
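The fit statistics in the output above can be reproduced directly from the reported sums of squares. Note that LIMDEP prints the Akaike and Bayes information criteria in log form, log(e'e/n) + 2K/n and log(e'e/n) + K log(n)/n, which matches the printed values:

```python
import math

# Figures taken from the LIMDEP output above
n, K = 1000, 4                 # observations, parameters (incl. constant)
ss_reg, ss_res, ss_tot = 4.58257, 21.8912, 26.4738

r2 = ss_reg / ss_tot                        # R-squared
r2_adj = 1 - (1 - r2) * (n - 1) / (n - K)   # adjusted (R-bar) squared
F = (r2 / (K - 1)) / ((1 - r2) / (n - K))   # model F statistic
aic = math.log(ss_res / n) + 2 * K / n              # Akaike I.C. (log form)
bic = math.log(ss_res / n) + K * math.log(n) / n    # Bayes I.C. (log form)

print(round(r2, 5), round(r2_adj, 5), round(F, 5), round(aic, 5), round(bic, 5))
```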

Linear regression using LIMDEP