Debug School

rakesh kumar

How to measure dispersion in descriptive statistics

Measures of dispersion are used in descriptive statistics to describe the spread or variability of a dataset. There are several measures of dispersion, but the most commonly used are the range, variance, and standard deviation.

Range: The range is the difference between the maximum and minimum values in a dataset. For example, consider the following dataset of exam scores: 85, 90, 75, 80, 95. The range of these scores is calculated as 95 - 75 = 20.
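As a quick check, the range calculation above can be sketched in Python using the example scores:

```python
# Range: difference between the largest and smallest value
scores = [85, 90, 75, 80, 95]
score_range = max(scores) - min(scores)
print(score_range)  # 20
```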

Variance: The variance is a measure of how much the data deviates from the mean. It is calculated by taking the sum of the squared differences between each data point and the mean, and dividing by the number of data points. For example, consider the same dataset of exam scores as before. The mean of these scores is (85 + 90 + 75 + 80 + 95) / 5 = 85, and the population variance is calculated as ((85-85)^2 + (90-85)^2 + (75-85)^2 + (80-85)^2 + (95-85)^2) / 5 = (0 + 25 + 100 + 25 + 100) / 5 = 50. (Dividing by n gives the population variance; dividing by n - 1 instead gives the sample variance.)
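The same variance calculation can be sketched in Python; the standard library's `statistics.pvariance` gives the identical population-variance result:

```python
import statistics

scores = [85, 90, 75, 80, 95]

# Population variance: mean of squared deviations from the mean
mean = sum(scores) / len(scores)                               # 85.0
variance = sum((x - mean) ** 2 for x in scores) / len(scores)  # 50.0
print(variance)                                                # 50.0

# Same result via the standard library
print(statistics.pvariance(scores))                            # 50
```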

Standard deviation: The standard deviation is the square root of the variance. It measures the amount of variation or dispersion in the dataset relative to the mean, expressed in the same units as the data. For example, using the same dataset of exam scores, the standard deviation is calculated as the square root of the variance, which is the square root of 50, or approximately 7.07.
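The standard-deviation step can be sketched in Python by taking the square root of the variance computed above; `statistics.pstdev` does both steps at once:

```python
import math
import statistics

scores = [85, 90, 75, 80, 95]

# Standard deviation: square root of the population variance
mean = sum(scores) / len(scores)
variance = sum((x - mean) ** 2 for x in scores) / len(scores)
std_dev = math.sqrt(variance)
print(round(std_dev, 2))                       # 7.07

# Same result via the standard library
print(round(statistics.pstdev(scores), 2))     # 7.07
```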

In summary, measures of dispersion provide information about the spread or variability of a dataset. The range, variance, and standard deviation are commonly used measures of dispersion, and each one provides different information about the data. The range is simple to compute but depends only on the two extreme values, so it is very sensitive to outliers; the variance and standard deviation take every data point into account, though because they square the deviations they are also affected by extreme values.
