Machine Learning Basics
Some basics of machine learning
3 KL Divergence
Category: { statistics }
References:
- Kullback–Leibler divergence @ Wikipedia
- Kullback-Leibler Divergence Explained @ COUNT BAYESIE by Will Kurt
Summary: The Kullback–Leibler divergence measures how one probability distribution differs from a second, reference distribution.
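For two discrete distributions P and Q over the same support, the divergence is D_KL(P ‖ Q) = Σ_i p_i log(p_i / q_i); note that it is asymmetric, so D_KL(P ‖ Q) ≠ D_KL(Q ‖ P) in general. A minimal NumPy sketch of this formula (the function name and the coin example are illustrative, not taken from the references):

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_i p_i * log(p_i / q_i) for discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i == 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Illustrative example: a fair coin measured against a biased coin.
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))  # ~0.51 nats
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))  # ~0.37 nats; the divergence is asymmetric
```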
2 Bias-Variance
Category: { Machine Learning } { Basics }
References:
- Understanding the Bias-Variance Tradeoff
- The Bias-Variance Trade-off: Explanation and Demo
- The Elements of Statistical Learning
- James, G., Witten, D., Hastie, T., & Tibshirani, R. (2013). An Introduction to Statistical Learning. Springer Texts in Statistics. Springer.
Summary: The bias-variance trade-off is a key concept in statistical learning: models that are too simple underfit (high bias), while models that are too flexible overfit (high variance).
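Because the expected squared error at a point decomposes as bias² + variance + irreducible noise, the trade-off can be observed directly by simulation. Here is a sketch under assumed settings (the sine target, polynomial degrees, and sample sizes are all illustrative choices, not from the cited texts): repeatedly fit polynomials to fresh noisy samples and estimate the bias² and variance of the prediction at one fixed test point.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    return np.sin(2 * np.pi * x)

def bias_variance_at_point(degree, x0=0.3, n_trials=500, n_train=30, noise=0.3):
    """Estimate bias^2 and variance of a degree-d polynomial fit at x0."""
    preds = np.empty(n_trials)
    for t in range(n_trials):
        x = rng.uniform(0.0, 1.0, n_train)               # fresh training set
        y = true_f(x) + rng.normal(0.0, noise, n_train)  # noisy observations
        coef = np.polyfit(x, y, degree)                  # least-squares polynomial fit
        preds[t] = np.polyval(coef, x0)
    bias_sq = (preds.mean() - true_f(x0)) ** 2  # (E[f_hat(x0)] - f(x0))^2
    variance = preds.var()                      # E[(f_hat - E[f_hat])^2]
    return bias_sq, variance

for d in (1, 3, 9):
    b, v = bias_variance_at_point(d)
    print(f"degree {d}: bias^2 = {b:.4f}, variance = {v:.4f}")
```

With these settings the low-degree fit shows large bias² and small variance, and the pattern reverses as the degree grows; adding the noise variance σ² to either estimate gives the full expected test error at x0.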
1 Confusion Matrix (Contingency Table)
Category: { Machine Learning }
References:
- Fawcett, T. (2006). An introduction to ROC analysis. Pattern Recognition Letters, 27(8), 861–874.
- Confusion_matrix#Table_of_confusion @ Wikipedia
- F1 Score
Summary: It is much easier to understand the confusion matrix if we use a binary classification problem as an example. Suppose we have a collection of cat photos that users have labeled as cute or not cute, and we use this labeled data to train a cute-or-not binary classifier.
When we then apply the classifier to the test dataset, every prediction falls into one of exactly four kinds of results.
                                      Labeled as Cute       Labeled as Not Cute
Classifier Predicted to be Cute       True Positive (TP)    False Positive (FP)
Classifier Predicted to be Not Cute   False Negative (FN)   True Negative (TN)

This table is easy enough to comprehend.
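To make the four outcomes concrete, here is a small sketch that tallies them from parallel label/prediction arrays (the toy data and the `confusion_counts` helper are hypothetical, not from the post):

```python
import numpy as np

def confusion_counts(y_true, y_pred):
    """Tally TP, FP, FN, TN for binary labels (1 = cute, 0 = not cute)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    tp = int(np.sum((y_pred == 1) & (y_true == 1)))  # predicted cute, labeled cute
    fp = int(np.sum((y_pred == 1) & (y_true == 0)))  # predicted cute, labeled not cute
    fn = int(np.sum((y_pred == 0) & (y_true == 1)))  # predicted not cute, labeled cute
    tn = int(np.sum((y_pred == 0) & (y_true == 0)))  # predicted not cute, labeled not cute
    return tp, fp, fn, tn

# Hypothetical test-set labels and classifier predictions.
labels      = [1, 0, 1, 1, 0, 0, 1, 0]
predictions = [1, 0, 0, 1, 1, 0, 1, 0]
tp, fp, fn, tn = confusion_counts(labels, predictions)
print(f"TP={tp} FP={fp} FN={fn} TN={tn}")  # TP=3 FP=1 FN=1 TN=3
```

From these four counts, metrics such as precision TP/(TP+FP), recall TP/(TP+FN), and the F1 score follow directly.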