Mean Deviation

  • Quantifies how far, on average, data points are from the mean.
  • Computed by averaging the absolute differences between each value and the mean.
  • Useful for gauging typical deviation without squaring differences.

Mean deviation is a statistical measure used to describe the dispersion of a dataset. It is calculated by taking the absolute difference between each data point and the mean of the data, then averaging those differences.

\text{Mean Deviation} = \frac{1}{n}\sum_{i=1}^{n} \left|x_i - \bar{x}\right|

To compute mean deviation:

  • Find the mean (average) of the dataset.
  • For each data point, compute the absolute difference from the mean.
  • Average those absolute differences to obtain the mean deviation.

This result expresses, in the same units as the data, how far data points are from the mean on average.
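The three steps above translate directly into a short Python sketch (the function name `mean_deviation` is just illustrative, not a standard library routine):

```python
def mean_deviation(data):
    """Average absolute difference between each value and the mean."""
    mean = sum(data) / len(data)                 # step 1: the mean
    abs_diffs = [abs(x - mean) for x in data]    # step 2: absolute differences
    return sum(abs_diffs) / len(abs_diffs)       # step 3: average them
```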

Consider a dataset of five numbers: 1, 2, 3, 4, and 5. The mean is 3. The absolute differences from the mean are:

|1 - 3| = 2
|2 - 3| = 1
|3 - 3| = 0
|4 - 3| = 1
|5 - 3| = 2

The average of these differences is (2 + 1 + 0 + 1 + 2) / 5 = 1.2, so the mean deviation is 1.2.
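Running the `mean_deviation` sketch above on this dataset reproduces the same result:

```python
# Reusing the mean_deviation sketch on the example dataset 1..5.
print(mean_deviation([1, 2, 3, 4, 5]))  # -> 1.2
```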

If a dataset of stock prices has a mean price of $100 and a mean deviation of $10, that indicates the stock prices typically deviate by about $10 from the mean.

Mean deviation is useful for:

  • Understanding the dispersion of a dataset.
  • Supporting predictions or decisions by conveying how much values typically vary from the mean.

Related terms:

  • Mean
  • Dispersion