Interaction Effect
An interaction effect in user experiments or statistical analysis refers to a situation where the impact of one variable on an outcome depends on the level of another variable. Related: Independence. Interaction can be examined in different types of models, such as in regression analysis or analysis of variance (ANOVA), but the basic idea is the same.
In the case of two-way interaction (interaction between two independent variables), let's denote our variables as follows:
- $X_1$ and $X_2$ are your independent variables.
- $Y$ is your dependent variable.
- $X_1 X_2$ represents the interaction between $X_1$ and $X_2$.
In a regression model that includes an interaction term, the model would look like this:

$$Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \beta_3 X_1 X_2 + \varepsilon$$

Here, $\beta_0$ is the intercept, $\beta_1$ and $\beta_2$ are the main effects of $X_1$ and $X_2$ respectively, $\beta_3$ represents the interaction effect of $X_1$ and $X_2$ on $Y$, and $\varepsilon$ is the error term.
To calculate the interaction effect, you need to estimate the regression coefficients $\beta_0$, $\beta_1$, $\beta_2$, and $\beta_3$. This is typically done through a method called Ordinary Least Squares (OLS) regression, which minimizes the sum of the squared residuals. In a factorial ANOVA setting, you would instead calculate the interaction effect as the difference between the effect of one factor at different levels of the other factor.
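As a rough illustration of the factorial-ANOVA view, the sketch below uses made-up cell means from a hypothetical 2×2 design and computes the interaction as a difference of differences: the effect of one factor at one level of the other factor, minus its effect at the other level.

```python
import numpy as np

# Hypothetical mean outcomes from a 2x2 factorial experiment
# (rows: factor A at levels a0/a1; columns: factor B at levels b0/b1).
cell_means = np.array([
    [10.0, 12.0],   # A = a0: mean outcome at B = b0, B = b1
    [11.0, 17.0],   # A = a1: mean outcome at B = b0, B = b1
])

# Effect of B at each level of A.
effect_b_at_a0 = cell_means[0, 1] - cell_means[0, 0]   # 2.0
effect_b_at_a1 = cell_means[1, 1] - cell_means[1, 0]   # 6.0

# Interaction: how much the effect of B changes across levels of A.
interaction = effect_b_at_a1 - effect_b_at_a0          # 4.0
print(interaction)
```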
Calculating
Linear Regression
In the regression model:

$$Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \beta_3 X_1 X_2 + \varepsilon$$

the coefficients $\beta_0$, $\beta_1$, $\beta_2$, and $\beta_3$ are typically estimated using the method of Ordinary Least Squares (OLS). OLS minimizes the sum of the squared residuals (the differences between the observed and predicted values of the dependent variable $Y$). In simple linear regression (only one predictor), the formulas to estimate the coefficients are:

$$\hat{\beta}_1 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}, \qquad \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$$
where
- $x_i$ and $y_i$ are the individual observations,
- $\bar{x}$ and $\bar{y}$ are the means of $x$ and $y$ respectively,
- $n$ is the number of observations.
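As a quick numerical check of these two formulas, here is a minimal NumPy sketch; the data values are made up purely for illustration.

```python
import numpy as np

# Hypothetical observations for a single predictor x and outcome y.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

x_bar, y_bar = x.mean(), y.mean()

# Slope: sum of cross-deviations over sum of squared deviations in x.
beta1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)

# Intercept: mean of y minus slope times mean of x.
beta0 = y_bar - beta1 * x_bar

print(beta0, beta1)
```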
For multiple predictors and interaction terms, we typically use matrix notation and some linear algebra to solve a system of linear equations to estimate the coefficients. This process requires several assumptions to be valid, including linearity, independence, homoscedasticity, and normally distributed errors. If these assumptions are violated, other methods might be more appropriate to estimate the coefficients.
Multiple Linear Regression
For multiple linear regression (which includes multiple predictors and interaction terms), as in the case of our model:

$$Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \beta_3 X_1 X_2 + \varepsilon$$

the calculation of the coefficients $\beta_0$, $\beta_1$, $\beta_2$, and $\beta_3$ becomes more complex. The formula that generalizes the one for simple linear regression involves matrix operations.
If we denote:
- $\mathbf{X}$ as a matrix that includes a column of ones (for the intercept) and the values of the predictor variables (and their products for interaction terms),
- $\mathbf{y}$ as a column vector of the outcome variable,
- $\boldsymbol{\beta}$ as a column vector of the coefficients to be estimated,
Then the formula for the least squares estimates in multiple regression is:

$$\hat{\boldsymbol{\beta}} = (\mathbf{X}^\top \mathbf{X})^{-1} \mathbf{X}^\top \mathbf{y}$$

where $\mathbf{X}^\top$ denotes the transpose of $\mathbf{X}$ and $(\mathbf{X}^\top \mathbf{X})^{-1}$ denotes the inverse of $\mathbf{X}^\top \mathbf{X}$.
As in the simple regression case, these estimates are based on minimizing the sum of the squared residuals (i.e., differences between observed and predicted values of the outcome variable), and the validity of the estimates depends on several assumptions, including linearity, independence, homoscedasticity (constant variance of errors), and normally distributed errors.
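To tie the pieces together, here is a small NumPy sketch, using simulated data with arbitrary true coefficients, that builds the design matrix (a column of ones, $X_1$, $X_2$, and their product) and applies the formula above. In practice a routine such as `numpy.linalg.lstsq` or a statistics library is preferable to explicitly inverting $\mathbf{X}^\top \mathbf{X}$, which can be numerically unstable.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Simulated predictors and an outcome with a known interaction effect
# (the "true" coefficients 1.0, 2.0, -1.0, 0.5 are chosen for illustration).
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.0 * x2 + 0.5 * x1 * x2 + rng.normal(scale=0.3, size=n)

# Design matrix: intercept column, X1, X2, and the interaction term X1*X2.
X = np.column_stack([np.ones(n), x1, x2, x1 * x2])

# Least squares estimates: beta_hat = (X'X)^{-1} X'y.
beta_hat = np.linalg.inv(X.T @ X) @ X.T @ y
print(beta_hat)   # close to [1.0, 2.0, -1.0, 0.5]

# A more numerically stable equivalent:
beta_hat_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
```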