
Parsimony of Models

GROWING This note is being developed. It is readable, but changes are still being applied.

For models with many parameters, the goodness-of-fit is likely to be very high. However, such models are also likely to generalize badly, so we need a measure of generalizability.

Here parsimony gives us a few advantages:

  • the model is easier to perceive and interpret
  • it tends to generalize better (see the sketch below)
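
To make this concrete, here is a minimal sketch (not from the original note; the sine-plus-noise data and the polynomial degrees are assumptions chosen for illustration). It fits polynomials of increasing degree to a small training set and compares training error (goodness-of-fit) against error on held-out data (generalization).

```python
# Minimal sketch: parsimonious vs. over-parameterized polynomial fits.
# Assumptions for illustration: data are sin(2*pi*x) plus Gaussian noise,
# and we compare polynomial degrees 1, 3, and 9.
import numpy as np

rng = np.random.default_rng(42)

def make_data(n):
    x = np.sort(rng.uniform(0.0, 1.0, n))
    y = np.sin(2.0 * np.pi * x) + rng.normal(0.0, 0.2, n)
    return x, y

x_train, y_train = make_data(20)   # small training set
x_test, y_test = make_data(200)    # held-out data to probe generalization

for degree in (1, 3, 9):
    # Least-squares fit with degree + 1 parameters
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE = {train_mse:.3f}, test MSE = {test_mse:.3f}")
```

Typically the highest-degree fit achieves the lowest training error (best goodness-of-fit) but a larger test error than the more parsimonious degree-3 model, which is the trade-off described above.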

Planted: 2020-11-08 by L Ma;

References:
  1. Vandekerckhove, J., & Matzke, D. (2015). Model comparison and the principle of parsimony. Oxford Library of Psychology.