Variance is a measure of the spread, or variability, of a distribution: it quantifies how far the values in a dataset lie from the mean. Here's how to calculate the variance step by step, using an example:
Consider the following dataset of exam scores: 85, 90, 75, 80, 95.
Step 1: Calculate the mean
Add up all the values in the dataset and divide by the total number of values to find the mean:
(85 + 90 + 75 + 80 + 95) / 5 = 425 / 5 = 85
The mean is 85.
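In code, this step might look like the following Python sketch (using the example scores above; the variable names are just illustrative):

```python
# Step 1: the mean is the sum of the scores divided by how many there are.
scores = [85, 90, 75, 80, 95]   # the example dataset above
mean = sum(scores) / len(scores)
print(mean)  # 85.0
```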
Step 2: Calculate the deviations from the mean
Calculate the deviation of each value from the mean by subtracting the mean from each value:
85 - 85 = 0
90 - 85 = 5
75 - 85 = -10
80 - 85 = -5
95 - 85 = 10
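As a rough Python sketch of this step (same example data):

```python
# Step 2: subtract the mean from each score to get the deviations.
scores = [85, 90, 75, 80, 95]
mean = sum(scores) / len(scores)           # 85.0, from Step 1
deviations = [x - mean for x in scores]
print(deviations)  # [0.0, 5.0, -10.0, -5.0, 10.0]
```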
Step 3: Square the deviations
Square each deviation so that positive and negative deviations don't cancel each other out:
0^2 = 0
5^2 = 25
(-10)^2 = 100
(-5)^2 = 25
10^2 = 100
Step 4: Calculate the sum of the squared deviations
Add up all the squared deviations:
0 + 25 + 100 + 25 + 100 = 250
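Steps 3 and 4 together, as an illustrative Python sketch:

```python
# Steps 3 and 4: square each deviation, then add the squares together.
scores = [85, 90, 75, 80, 95]
mean = sum(scores) / len(scores)
squared_deviations = [(x - mean) ** 2 for x in scores]
print(squared_deviations)       # [0.0, 25.0, 100.0, 25.0, 100.0]
print(sum(squared_deviations))  # 250.0
```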
Step 5: Calculate the variance
Divide the sum of the squared deviations by the total number of values in the dataset:
250 / 5 = 50
The variance is 50.
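Putting the previous steps together in Python (note this divides by n, the population-variance convention used here; dividing by n - 1 would give the sample variance instead):

```python
# Step 5: population variance = sum of squared deviations / n
scores = [85, 90, 75, 80, 95]
mean = sum(scores) / len(scores)
variance = sum((x - mean) ** 2 for x in scores) / len(scores)
print(variance)  # 50.0
```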
Step 6: Calculate the standard deviation
The standard deviation is the square root of the variance. In this example, the standard deviation is:
√50 ≈ 7.07
The standard deviation is approximately 7.07.
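And the final step as a sketch:

```python
# Step 6: the standard deviation is the square root of the variance.
import math

scores = [85, 90, 75, 80, 95]
mean = sum(scores) / len(scores)
variance = sum((x - mean) ** 2 for x in scores) / len(scores)
print(math.sqrt(variance))  # 7.0710678118654755
```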
Therefore, the variance of the exam scores is 50, and the standard deviation is approximately 7.07.
This means that the exam scores are somewhat spread out from the mean of 85. A higher variance and standard deviation indicate that the data is more spread out, while a lower variance and standard deviation indicate that the data is more tightly clustered around the mean.
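If you want to verify the result without doing the arithmetic by hand, Python's standard library offers a quick cross-check (a sketch; statistics.pvariance and statistics.pstdev compute the population variance and standard deviation, matching the calculation above):

```python
# Cross-check using Python's built-in statistics module.
# pvariance/pstdev use the population formulas (dividing by n).
import statistics

scores = [85, 90, 75, 80, 95]
print(statistics.pvariance(scores))  # 50
print(statistics.pstdev(scores))     # 7.0710678118654755
```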