Machine Learning Basics

Some basics of machine learning

KL Divergence

Category: { statistics }
Summary: Kullback–Leibler divergence measures the difference between two probability distributions
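
For reference, the standard discrete form of the divergence this summary refers to, with P and Q denoting the two distributions being compared (a textbook definition, not a formula quoted from the post), is

    D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}

It is zero exactly when P and Q agree, and it is not symmetric in its two arguments.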

Confusion Matrix (Contingency Table)

Category: { Machine Learning }
Summary: It is much easier to understand the confusion matrix if we use a binary classification problem as an example. Suppose we have a collection of cat photos that users have labeled as "cute" or "not cute". We use this labeled data to train a cute-or-not binary classifier and then apply the classifier to the test dataset, where we would find only four different kinds of results:

                                          Labeled as Cute        Labeled as Not Cute
    Classifier Predicted to be Cute       True Positive (TP)     False Positive (FP)
    Classifier Predicted to be Not Cute   False Negative (FN)    True Negative (TN)

This table is easy enough to comprehend.
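
As a rough sketch of how the four cells in the table above can be counted for a binary classifier, here is a small self-contained Python example; the function name, the 1 = cute / 0 = not cute encoding, and the toy data are illustrative assumptions rather than anything taken from the post:

    # Count the four confusion-matrix cells for a binary "cute or not" classifier.
    # The 1 = cute / 0 = not cute encoding and the toy data below are
    # illustrative assumptions, not values from the post.
    def confusion_counts(y_true, y_pred):
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
        fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives
        tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # true negatives
        return tp, fp, fn, tn

    # Six hypothetical test photos: user labels vs. classifier predictions.
    labels      = [1, 1, 0, 0, 1, 0]
    predictions = [1, 0, 0, 1, 1, 0]
    print(confusion_counts(labels, predictions))  # -> (2, 1, 1, 2)

Each photo falls into exactly one of the four cells, so the four counts always sum to the size of the test set.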