Discriminative model: the conditional probability of the class label given the data (the posterior), $p(C_k\mid x)$ …
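A minimal sketch of a discriminative model, assuming scikit-learn is available; a logistic regression is one model that outputs the class posterior $p(C_k\mid x)$ directly (the toy data here is illustrative):

```python
# Minimal sketch: a discriminative model outputs p(C_k | x) directly.
import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.random.randn(100, 2)                # toy features
y = (X[:, 0] + X[:, 1] > 0).astype(int)    # toy binary labels

clf = LogisticRegression().fit(X, y)
posterior = clf.predict_proba(X[:1])       # p(C_k | x) for each class k
print(posterior)                           # e.g., [[0.2, 0.8]]
```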
Contrastive models learn to compare. They use special objective functions such as [[NCE]] …
The task of a GAN is to generate features $X$ from some noise $\xi$ and class labels $Y$: $$\xi, Y \to X$$ …
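A minimal sketch of this mapping $(\xi, Y) \to X$, assuming PyTorch; the layer sizes and the embedding-based label conditioning are illustrative assumptions, not details from the note:

```python
# Sketch of a conditional generator (xi, Y) -> X.
import torch
import torch.nn as nn

class Generator(nn.Module):
    def __init__(self, noise_dim=16, n_classes=10, feat_dim=32):
        super().__init__()
        self.embed = nn.Embedding(n_classes, n_classes)  # label conditioning
        self.net = nn.Sequential(
            nn.Linear(noise_dim + n_classes, 64),
            nn.ReLU(),
            nn.Linear(64, feat_dim),
        )

    def forward(self, xi, y):
        # concatenate noise with an embedding of the class label
        return self.net(torch.cat([xi, self.embed(y)], dim=-1))

g = Generator()
xi = torch.randn(4, 16)            # noise xi
y = torch.randint(0, 10, (4,))     # class labels Y
x_fake = g(xi, y)                  # generated features X
```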
In contrastive methods, we can manipulate the data to create new data entries and train the model to infer the changes …
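One concrete instance of such a manipulate-then-infer task (an assumption here, not stated in the note) is rotation prediction, where each image is rotated and a classifier is trained to infer which rotation was applied:

```python
# Sketch of a pretext task: rotate images, infer the rotation.
import numpy as np

def make_rotation_task(images):
    """images: array of shape (N, H, W). Returns rotated images and labels."""
    ks = np.random.randint(0, 4, size=len(images))   # 0, 90, 180, 270 degrees
    rotated = np.stack([np.rot90(img, k) for img, k in zip(images, ks)])
    return rotated, ks   # a classifier is then trained to predict k
```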
The essence of [[GAN]]: the task of a GAN is to generate features $X$ from some noise $\xi$ and …
An autoregressive (AR) model factorizes the likelihood over time steps, $$\log p_\theta (x) = \sum_{t=1}^T \log p_\theta (x_t \mid x_{<t})$$ …
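A minimal sketch of this factorization, assuming PyTorch and a `model` that maps a (possibly empty) prefix $x_{<t}$ to logits over the vocabulary; that interface is an assumption for illustration:

```python
# Sketch of the AR factorization: log p(x) = sum_t log p(x_t | x_<t).
import torch
import torch.nn.functional as F

def ar_log_prob(model, x):
    """x: LongTensor of shape (T,). Sums per-step conditional log-probs."""
    log_p = 0.0
    for t in range(len(x)):
        logits = model(x[:t])    # condition on the prefix x_<t
        log_p = log_p + F.log_softmax(logits, dim=-1)[x[t]]
    return log_p
```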
It was discovered that the success of [[mutual information based contrastive learning]] …
A normalizing flow is a method to convert a complicated distribution $p(x)$ into a simpler distribution …
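For reference, the standard change-of-variables identity behind normalizing flows: with an invertible map $z = f(x)$ from the complicated variable to the simple one, $$p(x) = p(z)\left\lvert \det \frac{\partial f(x)}{\partial x} \right\rvert.$$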
In a GAN, the latent-space input is usually random noise, e.g., Gaussian noise. The objective of …
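For reference, the minimax objective from the original GAN paper (Goodfellow et al., 2014) reads $$\min_G \max_D \; \mathbb{E}_{x \sim p_{\text{data}}}\left[\log D(x)\right] + \mathbb{E}_{\xi \sim p(\xi)}\left[\log\left(1 - D(G(\xi))\right)\right].$$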
Contrastive Predictive Coding, aka CPC, is an autoregressive model combined with the InfoNCE loss. …
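A minimal sketch of InfoNCE with in-batch negatives, assuming PyTorch; the temperature value and the cosine normalization are common choices, not prescriptions from the note:

```python
# Sketch of the InfoNCE loss: positive pairs sit on the diagonal of the
# score matrix; all other entries in a row act as negatives.
import torch
import torch.nn.functional as F

def info_nce(queries, keys, temperature=0.1):
    """queries, keys: (N, D) tensors; row i of each forms a positive pair."""
    q = F.normalize(queries, dim=-1)
    k = F.normalize(keys, dim=-1)
    logits = q @ k.t() / temperature      # (N, N) similarity scores
    labels = torch.arange(len(q))         # positives on the diagonal
    return F.cross_entropy(logits, labels)
```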
Max Global Mutual Information: why not just use the global mutual information of the input and …
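For reference, the mutual information between two variables $X$ and $Z$ is defined as $$I(X; Z) = \mathbb{E}_{p(x,z)}\left[\log \frac{p(x,z)}{p(x)\,p(z)}\right].$$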
Autoencoders (AE) are models that encode inputs into a compact latent space. The simplest …
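A minimal sketch of an autoencoder, assuming PyTorch; the single-linear-layer encoder/decoder and the MSE reconstruction loss are the simplest illustrative choices:

```python
# Sketch of an autoencoder: encode x into a compact latent code z,
# then decode z back to a reconstruction of x.
import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    def __init__(self, in_dim=784, latent_dim=8):
        super().__init__()
        self.encoder = nn.Linear(in_dim, latent_dim)   # x -> z
        self.decoder = nn.Linear(latent_dim, in_dim)   # z -> x_hat

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z)

model = AutoEncoder()
x = torch.randn(4, 784)
loss = nn.functional.mse_loss(model(x), x)  # reconstruction objective
```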
In an inference problem, we compute the posterior $p(z\vert x)$, which is used to infer $z$ from $x$: $$ p(z\vert x) = \frac{p(x\vert z)\, p(z)}{p(x)} $$ …
Generative self-supervised learning models can utilize more data, since training requires only unlabeled inputs
Contrastive self-supervised learning models can likewise utilize more data, for the same label-free reason
Adversarial models use a generator and a discriminator trained against each other
Review of self-supervised learning.