Multiple Regression

Chapter Goals

After completing this chapter, you should be able to:

§     Apply multiple regression analysis to business decision-making situations

§     Analyze and interpret the computer output for a multiple regression model

§     Perform a hypothesis test for all regression coefficients or for a subset of coefficients

§     Fit and interpret nonlinear regression models

§     Incorporate qualitative variables into the regression model by using dummy variables

§     Discuss model specification and analyze residuals

The Multiple Regression Model

Multiple Regression Equation

Population model:

yi = β0 + β1x1i + β2x2i + … + βKxKi + εi

Estimated multiple regression equation:

ŷi = b0 + b1x1i + b2x2i + … + bKxKi

Standard Multiple Regression Assumptions

§    The values  xi  and the error terms  εi  are independent

§    The error terms are random variables with mean 0 and a constant variance, σ².

(The constant variance property is called homoscedasticity)

Standard Multiple Regression Assumptions

§    The random error terms, εi, are not correlated with one another, so that

E[εi εj] = 0   for all i ≠ j

§    It is not possible to find a set of numbers, c0, c1, . . . , cK, such that

c0 + c1x1i + c2x2i + … + cKxKi = 0

(This is the property of no linear relation for the Xj's)

Example:
2 Independent Variables

§    A distributor of frozen dessert pies wants to evaluate factors thought to influence demand

§   Dependent variable:     Pie sales (units per week)

§   Independent variables:  Price (in $)
                            Advertising (in $100s)

§    Data are collected for 15 weeks

Pie Sales Example

Sales = b0 + b1 (Price) + b2 (Advertising)

Estimating a Multiple Linear
Regression Equation

§     Excel will be used to generate the coefficients and measures of goodness of fit for multiple regression

§     Excel:

§    Tools / Data Analysis… / Regression

§     PHStat:

§    PHStat / Regression / Multiple Regression…
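Outside Excel or PHStat, any least-squares routine produces the same coefficients. Below is a minimal Python sketch; the data values are hypothetical placeholders, not the 15 observed weeks from the example.

```python
import numpy as np

# Hypothetical stand-in data; replace with the 15 observed weeks of
# pie sales, prices, and advertising from the example
price       = np.array([5.50, 7.50, 8.00, 6.80, 7.00])
advertising = np.array([3.3, 3.0, 4.5, 3.6, 3.5])   # in $100s
sales       = np.array([350.0, 430.0, 350.0, 400.0, 380.0])

# Design matrix with a leading column of 1s for the intercept b0
X = np.column_stack([np.ones_like(price), price, advertising])

# Ordinary least squares: b = (b0, b1, b2)
b, *_ = np.linalg.lstsq(X, sales, rcond=None)
print("Sales = %.3f + %.3f (Price) + %.3f (Advertising)" % tuple(b))
```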

Multiple Regression Output

The Multiple Regression Equation

Coefficient of Determination, R²

§    Reports the proportion of total variation in  y explained by all  x  variables taken together

§    This is the ratio of the explained variability to total sample variability:

R² = SSR / SST = regression sum of squares / total sum of squares
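As a quick numeric check, R² can be computed directly from observed and predicted values; the arrays below are hypothetical.

```python
import numpy as np

# Hypothetical observed and predicted values of the dependent variable
y     = np.array([350.0, 460.0, 350.0, 430.0, 350.0])
y_hat = np.array([360.0, 440.0, 345.0, 425.0, 370.0])

sse = np.sum((y - y_hat) ** 2)       # unexplained (error) variation
sst = np.sum((y - y.mean()) ** 2)    # total variation
r2 = 1 - sse / sst                   # equivalently SSR / SST
print(r2)
```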

Estimation of Error Variance

§     Consider the population regression model

yi = β0 + β1x1i + β2x2i + … + βKxKi + εi

§     The unbiased estimate of the variance of the errors is

se² = SSE / (n – K – 1)

where  SSE = Σ ei² = Σ (yi – ŷi)²

§     The square root of the variance, se, is called the standard error of the estimate

Standard Error, se

se = √( SSE / (n – K – 1) )

Adjusted Coefficient of Determination, R̄²

§    R² never decreases when a new  X  variable is added to the model, even if the new variable is not an important predictor variable

§   This can be a disadvantage when comparing models

§    What is the net effect of adding a new variable?

§   We lose a degree of freedom when a new  X variable is added

§   Did the new  X  variable add enough explanatory power to offset the loss of one degree of freedom?

Adjusted Coefficient of Determination, R̄²

§     Used to correct for the fact that adding non-relevant independent variables will still reduce the error sum of squares:

R̄² = 1 – [(n – 1) / (n – K – 1)] (1 – R²)

(where n = sample size, K = number of independent variables)

§    Adjusted R² provides a better comparison between multiple regression models with different numbers of independent variables

§    Penalizes excessive use of unimportant independent variables

§    Smaller than R²
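A one-line helper makes the penalty explicit; the R² value and sample sizes in the usage line are hypothetical.

```python
def adjusted_r2(r2: float, n: int, k: int) -> float:
    """Adjusted R²: penalizes predictors that do not reduce SSE
    enough to offset the lost degree of freedom."""
    return 1 - (n - 1) / (n - k - 1) * (1 - r2)

# Example: R² = 0.80 with n = 15 observations and K = 2 predictors
print(adjusted_r2(0.80, n=15, k=2))   # ≈ 0.767, slightly below R²
```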

Coefficient of Multiple Correlation

§     The coefficient of multiple correlation is the correlation between the predicted value and the observed value of the dependent variable

§     It is the square root of the coefficient of determination:  R = √R²

§     Used as another measure of the strength of the linear relationship between the dependent variable and the independent variables

§     Comparable to the correlation between Y and X in simple regression

Evaluating Individual
Regression Coefficients

§    Use t-tests for individual coefficients

§    Shows if a specific independent variable is conditionally important

§    Hypotheses:

§   H0: βj = 0  (no linear relationship)

§   H1: βj ≠ 0  (linear relationship does exist between xj and y)

Evaluating Individual
Regression Coefficients

H0: βj = 0  (no linear relationship)

H1: βj ≠ 0  (linear relationship does exist between xj and y)

Test Statistic:

t = bj / sbj        (df = n – K – 1)

Evaluating Individual
Regression Coefficients

H0: βj = 0

H1: βj ≠ 0
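A sketch of the two-tailed test in Python; the coefficient, standard error, and sample sizes are hypothetical placeholders.

```python
from scipy import stats

# Hypothetical estimate bj and its standard error sbj
b_j, s_bj = -24.9, 10.8
n, K = 15, 2                      # sample size, number of predictors

t_stat = b_j / s_bj
df = n - K - 1
p_value = 2 * stats.t.sf(abs(t_stat), df)   # two-tailed p-value
print(t_stat, p_value)            # reject H0 if p_value < α
```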

Confidence Interval Estimate for the Slope

Confidence interval limits for the population slope βj:

bj ± t(n–K–1, α/2) · sbj

where t has (n – K – 1) degrees of freedom

Test on All Coefficients

§    F-Test for Overall Significance of the Model

§    Shows if there is a linear relationship between all of the  X  variables considered together and  Y

§    Use F test statistic

§    Hypotheses:

H0: β1 = β2 = … = βK = 0  (no linear relationship)

H1: at least one βi ≠ 0   (at least one independent variable affects Y)

F-Test for Overall Significance

§    Test statistic:

F = MSR / MSE = (SSR / K) / (SSE / (n – K – 1))

where F has K (numerator) and (n – K – 1) (denominator) degrees of freedom

§    The decision rule is: reject H0 if F > F(K, n–K–1, α), the critical value from the F distribution

F-Test for Overall Significance

H0: β1 = β2 = 0

H1: β1 and β2 not both zero

α = .05

df1 = 2      df2 = 12
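The critical value for this decision rule can be looked up programmatically; a quick check with scipy:

```python
from scipy import stats

alpha, df1, df2 = 0.05, 2, 12
f_crit = stats.f.ppf(1 - alpha, df1, df2)
print(f_crit)   # ≈ 3.89: reject H0 if the computed F exceeds this
```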

Tests on a Subset of Regression Coefficients

§     Consider a multiple regression model involving variables  xj  and  zj

yi = β0 + β1x1i + … + βKxKi + α1z1i + … + αrzri + εi

and the null hypothesis that the  z  variable coefficients are all zero:

H0: α1 = α2 = … = αr = 0

H1: at least one αj ≠ 0

Tests on a Subset of Regression Coefficients

§     Goal: compare the error sum of squares for the complete model with the error sum of squares for the restricted model

§    First run a regression for the complete model and obtain SSE

§    Next run a restricted regression that excludes the  z  variables (the number of variables excluded is  r)  and obtain the restricted error sum of squares  SSE(r)

§    Compute the  F  statistic

F = [ (SSE(r) – SSE) / r ] / se²

(where se² = SSE / (n – K – r – 1) comes from the complete model) and apply the decision rule for a significance level  α:  reject H0 if  F > F(r, n–K–r–1, α)
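A sketch of the computation, with the two error sums of squares and the model dimensions as hypothetical placeholders:

```python
from scipy import stats

# Hypothetical error sums of squares from the two fitted models
sse_full, sse_restricted = 2650.0, 3600.0
n, K, r = 30, 3, 2            # n obs, K x-variables, r z-variables tested

df_error = n - K - r - 1
f_stat = ((sse_restricted - sse_full) / r) / (sse_full / df_error)
f_crit = stats.f.ppf(0.95, r, df_error)
print(f_stat, f_stat > f_crit)   # prints the statistic and whether H0 is rejected
```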

Prediction

§     Given a population regression model

yi = β0 + β1x1i + β2x2i + … + βKxKi + εi

§     then given a new observation of a data point

(x1,n+1, x2,n+1, . . . , xK,n+1)

the best linear unbiased forecast of  yn+1  is

ŷn+1 = b0 + b1x1,n+1 + b2x2,n+1 + … + bKxK,n+1

§      It is risky to forecast for new X values outside the range of the data used to estimate the model coefficients, because we do not have data to support that the linear model extends beyond the observed range.
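Once the coefficients are estimated, a forecast is just a dot product; the coefficients and new observation below are hypothetical.

```python
import numpy as np

# Hypothetical fitted coefficients (b0, b1, b2) and a new observation
b = np.array([300.0, -25.0, 74.0])     # placeholders, not fitted values
x_new = np.array([1.0, 5.50, 3.5])     # leading 1 multiplies the intercept

y_forecast = x_new @ b
print(y_forecast)
```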

Using The Equation to Make Predictions

Predictions in PHStat

§     PHStat | Regression | Multiple Regression …


Residuals in Multiple Regression

Nonlinear Regression Models

§    The relationship between the dependent variable and an independent variable may not be linear

§    Can review the scatter diagram to check for nonlinear relationships

§    Example: Quadratic model

§ The second independent variable is the square of the first variable

Quadratic Regression Model

yi = β0 + β1xi + β2xi² + εi

§     where:

β0 = Y intercept

β1 = regression coefficient for linear effect of X on Y

β2 = regression coefficient for quadratic effect on Y

εi = random error in Y for observation i
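A quadratic model is fit by adding the squared term as a second predictor; a minimal sketch with hypothetical data:

```python
import numpy as np

# Hypothetical (time, purity) data; replace with the observed values
time   = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
purity = np.array([3.0, 8.5, 17.0, 29.0, 45.5, 65.0])

# polyfit returns coefficients highest power first: b2, b1, b0
b2, b1, b0 = np.polyfit(time, purity, deg=2)
print("ŷ = %.3f + %.3f Time + %.3f Time²" % (b0, b1, b2))
```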

Linear vs. Nonlinear Fit

Quadratic Regression Model

Testing for Significance: Quadratic Effect

§    Testing the Quadratic Effect

§    Compare the linear regression estimate

ŷ = b0 + b1x

§    with the quadratic regression estimate

ŷ = b0 + b1x + b2x²

§    Hypotheses

§   H0: β2 = 0  (The quadratic term does not improve the model)

§   H1: β2 ≠ 0  (The quadratic term improves the model)

Testing for Significance: Quadratic Effect

§    Testing the Quadratic Effect

Hypotheses

§   H0: β2 = 0  (The quadratic term does not improve the model)

§   H1: β2 ≠ 0  (The quadratic term improves the model)

§     The test statistic is

t = b2 / sb2        (df = n – 3)

Testing for Significance: Quadratic Effect

§    Testing the Quadratic Effect

Compare  R²  from the simple (linear) regression to  R²  from the quadratic model

§    If R² from the quadratic model is meaningfully larger than R² from the simple model, the quadratic model is the better fit (recall that R² never decreases when a term is added, so also check the t-test on b2)

Example: Quadratic Model

§    Purity increases as filter time increases:

Example: Quadratic Model

§     Simple regression results:

ŷ = -11.283 + 5.985 Time

Example: Quadratic Model

The Log Transformation

§     Original multiplicative model:

Y = β0 X1^β1 X2^β2 … XK^βK ε

§     Transformed multiplicative model:

log Y = log β0 + β1 log X1 + β2 log X2 + … + βK log XK + log ε

Interpretation of coefficients

For the multiplicative model:

§    When both dependent and independent variables are logged:

§    The coefficient of the independent variable  Xk  can be interpreted as follows: a 1 percent change in  Xk  leads to an estimated  bk  percent change in the average value of  Y

§    bk is the elasticity of Y with respect to a change in Xk
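A log-log fit makes the elasticity readable directly off the slope; the data below are hypothetical.

```python
import numpy as np

# Hypothetical data for a log-log (multiplicative) model
x = np.array([10.0, 20.0, 40.0, 80.0, 160.0])
y = np.array([100.0, 85.0, 70.0, 60.0, 50.0])

# Regress log(y) on log(x); b[1] estimates the elasticity
X = np.column_stack([np.ones_like(x), np.log(x)])
b, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
print("elasticity =", b[1])   # % change in y per 1% change in x
```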

Dummy Variables

§    A dummy variable is a categorical independent variable with two levels:

§   yes or no, on or off, male or female

§   recorded as 0 or 1

§    Regression intercepts are different if the variable is significant

§    Assumes equal slopes for other variables

§    If more than two levels, the number of dummy variables needed is (number of levels – 1)
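As a concrete sketch (all values hypothetical), suppose x2 = 1 for a holiday week and 0 otherwise; the dummy shifts the intercept while the price slope is shared by both groups.

```python
import numpy as np

# Hypothetical data: price, a 0/1 holiday dummy, and weekly sales
price   = np.array([5.5, 6.0, 6.5, 7.0, 7.5, 8.0])
holiday = np.array([1.0, 0.0, 1.0, 0.0, 1.0, 0.0])
sales   = np.array([480.0, 400.0, 455.0, 370.0, 440.0, 350.0])

X = np.column_stack([np.ones_like(price), price, holiday])
b0, b1, b2 = np.linalg.lstsq(X, sales, rcond=None)[0]

# b2 is the estimated intercept shift for holiday weeks;
# both groups share the same slope b1 on price
print(b0, b1, b2)
```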

Dummy Variable Example

Dummy Variable Example

Interpreting the
Dummy Variable Coefficient

Interaction Between Explanatory Variables

§    Hypothesizes interaction between pairs of  x variables

§    Response to one  x  variable may vary at different levels of another  x  variable

§    Contains two-way cross product terms, for example:

y = β0 + β1x1 + β2x2 + β3x1x2 + ε

Effect of Interaction

§    Given:   y = β0 + β1x1 + β2x2 + β3x1x2 + ε

§ Without the interaction term, the effect of X1 on Y is measured by β1

§ With the interaction term, the effect of X1 on Y is measured by β1 + β3X2

§    Effect changes as X2 changes

Interaction Example

Significance of Interaction Term

§     The coefficient  b3 is an estimate of the difference in the coefficient of  x1 when  x2 = 1  compared to when  x2 = 0

§     The t statistic for  b3  can be used to test the hypothesis  H0: β3 = 0  against  H1: β3 ≠ 0

§     If we reject the null hypothesis we conclude that there is a difference in the slope coefficient for the two subgroups

Multiple Regression Assumptions

Assumptions:

§    The errors are normally distributed

§    Errors have a constant variance

§    The model errors are independent

Analysis of Residuals
in Multiple Regression

§    These residual plots are used in multiple regression:

§ Residuals vs. ŷi (the predicted values)

§   Residuals vs. x1i

§   Residuals vs. x2i

§ Residuals vs. time (if time series data)
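For instance, a residuals-versus-predicted plot (hypothetical values below) should show no pattern around the zero line:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical predicted values and residuals from a fitted model
y_hat     = np.array([420.0, 380.0, 440.0, 365.0, 430.0, 350.0])
residuals = np.array([12.0, -8.0, 5.0, -15.0, 9.0, -3.0])

plt.scatter(y_hat, residuals)
plt.axhline(0, linestyle="--")     # residuals should straddle zero
plt.xlabel("Predicted y")
plt.ylabel("Residual")
plt.title("Residuals vs. predicted values")
plt.show()
```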

Chapter Summary

§     Developed the multiple regression model

§     Tested the significance of the multiple regression model

§     Discussed the adjusted coefficient of determination, R̄²

§     Tested individual regression coefficients

§     Tested portions of the regression model

§     Used quadratic terms and log transformations in regression models

§     Used dummy variables

§     Evaluated interaction effects

§     Discussed using residual plots to check model assumptions