
Moderated Multiple Regression (MMR)

(example from Aiken & West, 1991)

Imagine we are trying to predict the self-assurance of managers (Y) from two
predictors: their length of time in the managerial position (X) and their managerial
ability (Z). We predict that time in the position (X) and managerial ability (Z) will
interact in their effect on self-assurance (Y).

In order to test the interaction, we must first create the interaction term. Because of
concerns about multicollinearity, we will center each variable first. To do this, we
compute the mean of each variable involved in the interaction. The mean length of
time in position is 5.0 years; the mean level of managerial ability is 10. We create two
new, centered variables by subtracting these means from the raw scores. We then create
the interaction term by computing the product of the centered X and Z.
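
As a concrete sketch of the centering step in Python (the raw scores below are
hypothetical, chosen so their means match the handout's 5.0 and 10; the handout does
not give the raw data):

# Hypothetical raw scores; only the means (5.0 and 10) come from the handout.
x_raw = [4.0, 6.5, 5.5, 3.0, 6.0]      # years in position
z_raw = [9.0, 12.0, 8.5, 10.5, 10.0]   # managerial ability

x_mean = sum(x_raw) / len(x_raw)   # 5.0
z_mean = sum(z_raw) / len(z_raw)   # 10.0

# Center each predictor, then form the product term from the centered scores.
x_c = [x - x_mean for x in x_raw]
z_c = [z - z_mean for z in z_raw]
xz = [x * z for x, z in zip(x_c, z_c)]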

To see why centering reduces multicollinearity with the interaction term, compare the
correlation matrices below:

Correlation Matrix for Centered Data

        X       Z       XZ      Y
X       -       0.42    0.10    0.17
Z               -       0.04    0.31
XZ                      -       0.21

Correlation Matrix for Uncentered Data

        X       Z       XZ      Y
X       -       0.42    0.81    0.17
Z               -       0.86    0.31
XZ                      -       0.21

Notice that X and Z are highly correlated with the interaction term in the uncentered
data, but have very low correlations with it in the centered data.
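
This pattern is easy to reproduce with simulated data; a minimal sketch, assuming
numpy is installed (the data are generated around the handout's means and SDs, so the
exact correlations will differ from the matrices above):

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(5.0, 0.95, size=200)   # time in position: mean 5.0, SD 0.95
z = rng.normal(10.0, 2.20, size=200)  # ability: mean 10, SD 2.20

xc, zc = x - x.mean(), z - z.mean()

print(np.corrcoef(x, x * z)[0, 1])     # X with uncentered product: large
print(np.corrcoef(xc, xc * zc)[0, 1])  # centered X with centered product: near 0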

Testing the Interaction

We now regress Y on X, Z, and XZ. The resulting equation is:

Yhat = 2.54 + 1.14X + 3.58Z + 2.58XZ

The b weights for Z and XZ are significant. Because the b weight for XZ is significant,
we know that there is a significant interaction between X and Z. Note that this test of the
b weight for the interaction term is equivalent to running a hierarchical regression and
testing the incremental R² for the interaction term when it is entered after X and Z.
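
A minimal sketch of this analysis, assuming numpy and statsmodels are installed; the
data are simulated around the handout's fitted equation, so the estimates will only
approximate the values above:

import numpy as np
import statsmodels.api as sm

# Simulated (not the Aiken & West) data, built around the fitted equation.
rng = np.random.default_rng(0)
xc = rng.normal(0, 0.95, 200)   # centered time in position
zc = rng.normal(0, 2.20, 200)   # centered managerial ability
y = 2.54 + 1.14 * xc + 3.58 * zc + 2.58 * xc * zc + rng.normal(0, 3, 200)

# Regress Y on X, Z, and the XZ product term.
full = sm.OLS(y, sm.add_constant(np.column_stack([xc, zc, xc * zc]))).fit()
print(full.params)    # b0, bX, bZ, bXZ
print(full.pvalues)   # the t test on bXZ is the test of the interaction

# Equivalent hierarchical test: incremental R^2 when XZ enters after X and Z.
reduced = sm.OLS(y, sm.add_constant(np.column_stack([xc, zc]))).fit()
print(full.rsquared - reduced.rsquared)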

The significant interaction indicates that the regression of Y on X depends on the value of
Z. To better understand the interaction, we will plot it. Because our predictors are both
continuous, we will have to choose values to use in plotting the regression lines. A
common rule of thumb is to choose three values for each predictor: the mean, one
standard deviation below the mean, and one standard deviation above the mean.

For X:
1 SD above = +0.95
Mean = 0
1 SD below = -0.95

For Z:
1 SD above = +2.20
Mean = 0
1 SD below = -2.20

We can compute three regression lines showing the relationship between Y and X at
each of three values of Z. To generate these regression lines, we rearrange the overall
regression equation to express the regression of Y on X at a given level of Z:

Yhat = (1.14 + 2.58Z)X + (3.58Z + 2.54)
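
In code, this rearrangement is a small helper; a sketch using the handout's coefficients
(the function name simple_slope is ours, not from the source):

# Simple slope and intercept of Y on X at a given value of (centered) Z,
# using the coefficients from the fitted equation above.
def simple_slope(z, b0=2.54, bx=1.14, bz=3.58, bxz=2.58):
    slope = bx + bxz * z       # coefficient on X at this level of Z
    intercept = b0 + bz * z    # intercept at this level of Z
    return slope, intercept

for z in (+2.20, 0.0, -2.20):   # +1 SD, mean, -1 SD of Z
    print(z, simple_slope(z))   # matches 6.82X + 10.42, etc., after rounding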

Now we plug in the three values of Z (+2.20, 0, -2.20) and we find the following
regression equations:

Z = +2.20
Yhat = [1.14 + 2.58(2.20)]X + [3.58(2.20) + 2.54]
Yhat = [1.14 + 5.68]X + [7.88 + 2.54]
Yhat = 6.82X + 10.42

Z=0
Yhat = [1.14 + 2.58(0)]X + [3.58(0) + 2.54]
Yhat = 1.14X + 2.54

Z = -2.20
Yhat = [1.14 + 2.58(-2.20)]X + [3.58(-2.20) + 2.54]
Yhat = [1.14 - 5.68]X + [-7.88 + 2.54]
Yhat = -4.54X - 5.34

We can now plug the three values of X (+0.95, 0, -0.95) into each equation, which
yields:

At Z = +2.20: Yhat = 6.82X + 10.42

Yhat = 6.82(+0.95) + 10.42 = 16.90
Yhat = 6.82(0) + 10.42 = 10.42
Yhat = 6.82(-0.95) + 10.42 = 3.94

At Z = 0: Yhat = 1.14X + 2.54

Yhat = 1.14(+0.95) + 2.54 = 3.62
Yhat = 1.14(0) + 2.54 = 2.54
Yhat = 1.14(-0.95) + 2.54 = 1.46

At Z = -2.20: Yhat = -4.54X - 5.34

Yhat = -4.54(+0.95) - 5.34 = -9.65
Yhat = -4.54(0) - 5.34 = -5.34
Yhat = -4.54(-0.95) - 5.34 = -1.03
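
The same nine values can be generated in a loop; a sketch assuming the simple_slope
helper from above (results match the hand calculations up to rounding):

for z in (+2.20, 0.0, -2.20):           # levels of Z
    slope, intercept = simple_slope(z)
    for x in (+0.95, 0.0, -0.95):       # levels of X
        print(f"Z={z:+.2f}, X={x:+.2f}: Yhat = {slope * x + intercept:.2f}")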

The plot of this data looks like this:

[Figure: Predicted Self-Assurance (Y) plotted against Time as manager (X) at three
levels of Z (Z-above, Z-mean, Z-below), using the nine predicted values computed
above.]
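
A sketch of this plot, assuming matplotlib is installed (the labels follow the figure's
legend and axis titles):

import matplotlib.pyplot as plt

xs = [-0.95, 0.0, +0.95]   # -1 SD, mean, +1 SD of centered X
for z, label in [(+2.20, "Z-above"), (0.0, "Z-mean"), (-2.20, "Z-below")]:
    # Slope and intercept at this level of Z, from the rearranged equation.
    ys = [(1.14 + 2.58 * z) * x + (3.58 * z + 2.54) for x in xs]
    plt.plot(xs, ys, marker="o", label=label)

plt.xlabel("Time as manager (X)")
plt.ylabel("Self-Assurance")
plt.legend()
plt.show()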
