Variance is a statistical measure that describes how spread out or dispersed a data set is around its mean or expected value. It measures the average squared deviation of each data point from the mean.

The variance of a set of $n$ data points, denoted by $\text{Var}(X)$ or $\sigma^2$, is calculated by first finding the mean or expected value of the data set ($\mu$), and then dividing the sum of the squared differences between each data point and the mean by $n-1$:

$$\text{Var}(X) = \frac{1}{n-1}\sum_{i=1}^{n}(X_i - \mu)^2$$

where $X_i$ is the $i$-th data point in the data set.
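The formula above can be sketched in a few lines of Python. The data set here is made up for illustration; the result is checked against the standard library's `statistics.variance`, which computes the same sample variance.

```python
import math
import statistics

# Hypothetical sample data set (not from the original text).
data = [2, 4, 4, 4, 5, 5, 7, 9]

n = len(data)
mu = sum(data) / n  # mean of the data set

# Sum of squared differences from the mean, divided by n - 1.
variance = sum((x - mu) ** 2 for x in data) / (n - 1)

# The standard library computes the same quantity.
assert math.isclose(variance, statistics.variance(data))
```

For this data set the mean is 5 and the sum of squared differences is 32, so the variance is 32/7.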

The variance is always non-negative; it is zero only when all the data points are identical. The larger the variance, the more spread out the data is.

Variance is a useful statistical measure because it helps to describe the variability or dispersion of a set of data points. It is used in various statistical techniques, such as regression analysis, hypothesis testing, and confidence intervals. In addition, the square root of the variance, known as the standard deviation, is a commonly used measure of statistical variability.
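The relationship between variance and standard deviation noted above can be verified directly: the standard deviation is the square root of the variance. A minimal sketch, again with a made-up data set:

```python
import math
import statistics

# Hypothetical data set for illustration.
data = [2, 4, 4, 4, 5, 5, 7, 9]

variance = statistics.variance(data)  # sample variance
std_dev = math.sqrt(variance)         # standard deviation = sqrt(variance)

# statistics.stdev computes the same value directly.
assert math.isclose(std_dev, statistics.stdev(data))
```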