Standard Deviation

What is Standard Deviation?

Standard deviation is a measure of dispersion, or how spread out the values in a dataset are. It is denoted by the lowercase sigma (σ) symbol and is found by taking the square root of the variance, where the variance is the average of the squared differences from the mean. Unlike variance, standard deviation is expressed in the same units as the data.
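The definition above can be sketched in a few lines of Python. This is a minimal illustration using a made-up dataset; the `statistics.pstdev` call at the end is the standard library's population standard deviation, used here only to cross-check the manual computation.

```python
import math
import statistics

# Hypothetical sample dataset (illustrative values only).
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

mean = sum(data) / len(data)

# Variance: the average of the squared differences from the mean.
variance = sum((x - mean) ** 2 for x in data) / len(data)

# Standard deviation: the square root of the variance,
# expressed in the same units as the data.
std_dev = math.sqrt(variance)

print(std_dev)                  # 2.0
print(statistics.pstdev(data))  # matches the manual result
```

Note that this computes the *population* standard deviation (dividing by n); for a sample drawn from a larger population, dividing by n − 1 (as `statistics.stdev` does) gives an unbiased estimate of the variance.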


How is Standard Deviation Used in Machine Learning?

Using this metric to calculate the variability of a population or sample is a useful check of a machine learning model's accuracy against real-world data: the standard deviation of a model's prediction errors shows how consistently the model performs. In addition, standard deviation underlies confidence intervals, which quantify how much trust to place in a model's statistical conclusions.


Other Basic Statistics Used in Machine Learning: