Interactions, part 2

PSY 716

Single-predictor linear regression

$$Y_i = \beta_0 + \beta_1 X_i + \epsilon_i$$

where:

  • \( Y_i \) is the dependent variable
  • \( X_i \) is the independent variable (group membership or continuous covariate)
  • \( \beta_0 \) is the intercept
  • \( \beta_1 \) is the regression coefficient
  • \( \epsilon_i \) is the error term

Adding an interaction

$$Y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \beta_3 x_{i1} x_{i2} + \epsilon_i$$

where:

  • \( Y_i \) is the dependent variable
  • \( x_{i1} \) is an independent variable
  • \( x_{i2} \) is another independent variable
  • \( \beta_0 \) is the intercept
  • \( \beta_j \) is the regression coefficient for the \( j^{\text{th}} \) term
  • \( \epsilon_i \) is the error term
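
As a concrete sketch, the interaction model above can be fit by ordinary least squares once the product term is added as its own column of the design matrix. The data and coefficient values below are simulated for illustration, not taken from the lecture:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Simulated predictors and outcome from a known interaction model:
# Y = 1.0 + 0.5*x1 - 0.3*x2 + 0.4*x1*x2 + noise
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 0.5 * x1 - 0.3 * x2 + 0.4 * x1 * x2 + rng.normal(size=n)

# The interaction enters simply as a product column in the design matrix
X = np.column_stack([np.ones(n), x1, x2, x1 * x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print("estimates (b0, b1, b2, b3):", np.round(b, 2))
```

The estimates should land near the generating values (1.0, 0.5, -0.3, 0.4), with sampling error shrinking as `n` grows.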

Adding an interaction

Grouping terms by each predictor in turn, the same model can be written as

$$Y_i = \beta_0 + (\beta_1 + \beta_3 x_{i2})x_{i1} + \beta_2 x_{i2} + \epsilon_i$$

$$Y_i = \beta_0 + \beta_1 x_{i1} + (\beta_2 + \beta_3 x_{i1})x_{i2} + \epsilon_i$$

Adding an interaction

In a model with an interaction, the main effect of one variable is its effect when the other variable is zero.

When \( x_{i1} \) is zero...

$$Y_i = \beta_0 + (\beta_1 + \beta_3 x_{i2})(0) + (\beta_2 + \beta_3 (0))x_{i2} + \epsilon_i$$

$$Y_i = \beta_0 + \beta_2 x_{i2} + \epsilon_i$$

Question: What does this look like when \( x_{i2} \) is zero?

Simple Intercepts

Simple intercepts give the predicted value of \( Y \) when the focal predictor equals zero, conditional on the value of the other predictor.

$$\text{Simple intercept for } x_{i1} = \beta_0 + \beta_2 x_{i2}$$

$$\text{Simple intercept for } x_{i2} = \beta_0 + \beta_1 x_{i1}$$

Simple intercepts tell us where the regression line for one predictor crosses the y-axis, conditional on the value of the other predictor.

Simple Slopes

Simple slopes represent the effect of one IV on the DV at a specific value of the other IV.

$$\text{Simple slope for } x_{i1} = \beta_1 + \beta_3 x_{i2}$$

$$\text{Simple slope for } x_{i2} = \beta_2 + \beta_3 x_{i1}$$

  • The effect of \( x_{i1} \) on \( Y \) depends on the value of \( x_{i2} \)
  • The coefficient \( \beta_3 \) represents...
    • how much the effect of \( x_{i1} \) changes when \( x_{i2} \) increases by one unit
    • how much the effect of \( x_{i2} \) changes when \( x_{i1} \) increases by one unit

Simple slopes are essential for interpreting interactions because main effects are conditional.
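
Both quantities can be read directly off the coefficients. A minimal sketch, using made-up coefficient values (the function names are illustrative, not from the lecture):

```python
# Hypothetical fitted coefficients from Y = b0 + b1*x1 + b2*x2 + b3*x1*x2
b0, b1, b2, b3 = 2.0, 0.5, -0.3, 0.4

def simple_intercept_x1(x2):
    """Predicted Y at x1 = 0, conditional on the moderator x2: b0 + b2*x2."""
    return b0 + b2 * x2

def simple_slope_x1(x2):
    """Effect of x1 on Y, conditional on the moderator x2: b1 + b3*x2."""
    return b1 + b3 * x2

for x2 in (-1.0, 0.0, 1.0):
    print(f"x2 = {x2:+.1f}: intercept = {simple_intercept_x1(x2):.2f}, "
          f"slope = {simple_slope_x1(x2):.2f}")
```

Note that at x2 = 0 the simple slope reduces to b1, which is why the lower-order coefficients are conditional effects rather than overall effects.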

Probing Interactions

  1. Pick-a-point approach: Examine simple slopes at meaningful values

    • For categorical moderators: use each group
    • For continuous moderators: often use Mean, Mean±1SD
  2. Plotting predicted values: Create line graphs showing the relationship at different moderator values

    • x-axis: focal predictor
    • separate lines: different values of the moderator
    • y-axis: predicted values of the outcome
  3. Johnson-Neyman technique: Find regions of significance (discussed next)

The goal is to determine when and how the relationship between \( x_{i1} \) and \( Y \) changes as a function of \( x_{i2} \).
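
The pick-a-point step can be sketched as follows; the coefficient values and the moderator's mean and SD are hypothetical:

```python
# Hypothetical fitted coefficients and moderator summary statistics
b1, b3 = 0.5, 0.4          # coefficient for x1 and for the x1*x2 product
m_mean, m_sd = 0.1, 1.2    # mean and SD of the moderator x2

# Simple slope of x1 at conventional "pick-a-point" moderator values
for label, v in [("Mean - 1 SD", m_mean - m_sd),
                 ("Mean",        m_mean),
                 ("Mean + 1 SD", m_mean + m_sd)]:
    print(f"{label:>11}: simple slope of x1 = {b1 + b3 * v:.3f}")
```

The same three values of the moderator are typically used as the separate lines when plotting predicted values.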

Johnson-Neyman Technique

The Johnson-Neyman technique identifies the specific values of the moderator where the relationship between the focal predictor and outcome transitions between significance and non-significance.

$$\text{Simple slope for } x_{i1} = \beta_1 + \beta_3 x_{i2}$$

$$\text{Standard error for simple slope} = \sqrt{\text{Var}(\beta_1) + x_{i2}^2 \text{Var}(\beta_3) + 2x_{i2}\text{Cov}(\beta_1, \beta_3)}$$

This technique avoids arbitrary selection of moderator values and provides a more complete picture of the interaction.

Johnson-Neyman Technique

$$\text{Simple slope for } x_{i1} = \beta_1 + \beta_3 x_{i2}$$

$$\text{Standard error for simple slope} = \sqrt{\text{Var}(\beta_1) + x_{i2}^2 \text{Var}(\beta_3) + 2x_{i2}\text{Cov}(\beta_1, \beta_3)}$$

Steps:

  1. Calculate the simple slope for all possible values of the moderator
  2. Calculate the standard error for each simple slope
  3. Find the values of the moderator where

    $$ t = \frac{\text{simple slope}}{\text{SE}} = \text{critical value} $$

This technique avoids arbitrary selection of moderator values and provides a more complete picture of the interaction.
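
Step 3 amounts to solving a quadratic in the moderator: setting \( t = (\beta_1 + \beta_3 x_{i2})/\text{SE} \) equal to the critical value and squaring gives \( (\beta_1 + \beta_3 x_{i2})^2 = t_{\text{crit}}^2 [\text{Var}(\beta_1) + x_{i2}^2 \text{Var}(\beta_3) + 2x_{i2}\text{Cov}(\beta_1, \beta_3)] \). A minimal sketch with made-up estimates and a large-sample critical value:

```python
import math

# Hypothetical estimates and (co)variances for beta_1 and beta_3
b1, b3 = 0.5, 0.4
var_b1, var_b3, cov_b13 = 0.04, 0.01, -0.005
t_crit = 1.96  # approximate two-tailed .05 critical value (large df assumed)

# (b1 + b3*m)^2 = t^2 * (var_b1 + m^2*var_b3 + 2*m*cov_b13)
# rearranges to a*m^2 + bq*m + c = 0 in the moderator m
a = b3**2 - t_crit**2 * var_b3
bq = 2 * (b1 * b3 - t_crit**2 * cov_b13)
c = b1**2 - t_crit**2 * var_b1
disc = bq**2 - 4 * a * c

if disc >= 0 and a != 0:
    roots = sorted([(-bq - math.sqrt(disc)) / (2 * a),
                    (-bq + math.sqrt(disc)) / (2 * a)])
    print("Johnson-Neyman bounds on the moderator:",
          [round(r, 3) for r in roots])
else:
    print("The simple slope's significance status never changes")
```

With these particular values the bounds come out near -3.37 and -0.24: the simple slope of x1 is significant for moderator values outside that interval and non-significant inside it.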

Regions of Significance

Regions of significance are the ranges of the moderator variable where the relationship between the focal predictor and the outcome is statistically significant. They:

  • Show precisely where the effect of \( x_{i1} \) on \( Y \) is significant
  • Can have lower and/or upper bounds
  • Sometimes extend indefinitely in one direction
  • Provide more information than simple slopes at arbitrary points

Interpretation example: "The relationship between anxiety and performance is significantly negative when social support is below 2.34, not significant when social support is between 2.34 and 4.56, and significantly positive when social support exceeds 4.56."

Interactions - part 2

By Veronica Cole
