Bias-Variance
Bias and Variance
Suppose the data $\{(x_i, y_i)\}$ is generated by a perfect model $f$, so that

$$ y = f(x). $$
On the other hand, we build another model using a specific method such as k-nearest neighbors, which is denoted as $\hat f(x)$.
The bias measures the deficit between the specific model $\hat f(x)$ and the perfect model $f(x)$:

$$ \operatorname{Bias}\left(\hat f(x)\right) = E\left[\hat f(x)\right] - f(x). $$
Zero bias means the model matches the perfect model on average.
The variance of the model is a measurement of the consistency of the model across different training sets:

$$ \operatorname{Var}\left(\hat f(x)\right) = E\left[\left(\hat f(x) - E\left[\hat f(x)\right]\right)^2\right]. $$
The larger the variance, the more wiggly the model is.
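To make these definitions concrete, here is a minimal simulation sketch, assuming NumPy and scikit-learn are available; the perfect model $f$, the test point, and the sample sizes are illustrative choices, not from this article. It estimates the bias and variance of a k-nearest-neighbors model at a single point by retraining on freshly drawn training sets:

```python
# A minimal sketch (assumptions: NumPy, scikit-learn; f, x0, and the
# sample sizes are illustrative). Bias and variance of k-NN at a single
# test point, estimated by retraining on freshly drawn training sets.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(42)

def f(x):
    """The 'perfect' model that generates the data (illustrative)."""
    return np.sin(2 * np.pi * x)

def fit_and_predict(x0, k=5, n_samples=50, n_runs=500):
    """Collect k-NN predictions at x0 over many fresh training sets."""
    preds = np.empty(n_runs)
    for i in range(n_runs):
        x = rng.uniform(0, 1, size=(n_samples, 1))
        y = f(x).ravel()  # y = f(x): zero irreducible error, as above
        preds[i] = KNeighborsRegressor(n_neighbors=k).fit(x, y).predict([[x0]])[0]
    return preds

preds = fit_and_predict(x0=0.3)
bias = preds.mean() - f(0.3)  # Bias(f_hat) = E[f_hat] - f
variance = preds.var()        # Var(f_hat) = E[(f_hat - E[f_hat])^2]
print(f"bias = {bias:.4f}, variance = {variance:.4f}")
```

The spread of `preds` across runs is exactly the consistency that the variance measures: each training set puts different neighbors around the test point, so the prediction wiggles from run to run.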
Mean Squared Error
Bias measures the deficit between the specific model and the perfect model. To measure the deficit between the specific model and the actual data point, we need the Mean Squared Error (MSE).
The Mean Squared Error (MSE) is defined as

$$ \text{MSE} = E\left[\left(y - \hat f(x)\right)^2\right]. $$
This form of expected error can also be used to evaluate models, i.e., calculate expectations by varying models. A straightforward decomposition using the definition of the perfect model gives

$$
\begin{aligned}
\text{MSE} &= E\left[\left(y - \hat f\right)^2\right] \\
&= E\left[\left(y - f + f - \hat f\right)^2\right] \\
&= {\color{red} E\left[(y - f)^2\right]} + {\color{red} 2 E\left[(y - f)\left(f - \hat f\right)\right]} + E\left[\left(f - \hat f\right)^2\right] \\
&= E\left[\left(f - E[\hat f] + E[\hat f] - \hat f\right)^2\right] \\
&= \left(f - E[\hat f]\right)^2 + 2\left(f - E[\hat f]\right) E\left[E[\hat f] - \hat f\right] + E\left[\left(E[\hat f] - \hat f\right)^2\right] \\
&= \operatorname{Bias}\left(\hat f\right)^2 + \operatorname{Var}\left(\hat f\right),
\end{aligned}
$$

where the middle term in the last expansion vanishes because $E\left[E[\hat f] - \hat f\right] = E[\hat f] - E[\hat f] = 0$.
In this derivation, we've used several relations:

- We used $E[f(x)] = f(x)$ because the term is constant, thus its expectation value is itself.
- We have dropped the terms in red. They are related to the fact that the irreducible error is required to be zero, i.e., $y - f(x) = 0$. If it is not zero, then the model is not perfect.
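As a quick sanity check on the decomposition, we can compare the simulated MSE against $\text{Bias}^2 + \text{Var}$, continuing from the hypothetical sketch above (reusing `f`, `preds`, `bias`, and `variance`):

```python
# Continuing the sketch above: with zero irreducible error (y = f(x)),
# the simulated MSE equals Bias^2 + Var up to floating point.
mse = np.mean((f(0.3) - preds) ** 2)
print(f"MSE          = {mse:.6f}")
print(f"Bias^2 + Var = {bias**2 + variance:.6f}")
```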
Bias-Variance Tradeoff
The more parameters we introduce into the model, the more likely we are to reduce the bias. However, at some point, the more complexity the model has, the more wiggles it will have, and thus the larger the variance becomes.
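To see this tradeoff numerically, one can sweep the k-NN neighborhood size in the same hypothetical setup: a smaller $k$ gives a more flexible, more wiggly model. The grid of $k$ values below is an illustrative choice:

```python
# Continuing the sketch above: smaller k means a more complex model.
# Expect bias^2 to grow and variance to shrink as k increases.
for k in (1, 5, 15, 35):
    preds_k = fit_and_predict(x0=0.3, k=k)
    b2 = (preds_k.mean() - f(0.3)) ** 2
    print(f"k={k:2d}  bias^2={b2:.5f}  variance={preds_k.var():.5f}")
```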