# self-supervised learning

Discriminative model: the conditional probability of the class label given the data (the posterior) $p(C_k\mid x)$ …
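By Bayes' theorem (a standard identity, added here for contrast), this posterior is related to the class-conditional density that a generative model would fit:

$$p(C_k \mid x) = \frac{p(x \mid C_k)\,p(C_k)}{p(x)}.$$

A discriminative model fits $p(C_k \mid x)$ directly, while a generative model fits $p(x \mid C_k)$ (or $p(x)$) and can recover the posterior through this identity.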

Contrastive models learn to compare. Contrastive models use special objective functions such as [[NCE]] …

The task of a GAN is to generate features $X$ from some noise $\xi$ and class labels $Y$: $$\xi, Y \to X.$$ In contrastive methods, we can manipulate the data to create data entries and infer the changes …

An autoregressive (AR) model factorizes the log-likelihood over steps, $$\log p_\theta (x) = \sum_{t=1}^T \log p_\theta (x_t \mid x_{<t}).$$
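The autoregressive factorization can be sketched with a toy bigram model, where the context is only the previous token. Everything below (the vocabulary size, the uniform start distribution, the random probability table) is an illustrative assumption, not part of the source.

```python
import numpy as np

# Toy bigram model illustrating the autoregressive factorization
# log p(x) = sum_t log p(x_t | x_{<t}), with context = previous token.
rng = np.random.default_rng(0)
V = 4  # toy vocabulary size (assumption)

# Row-stochastic table: P[i, j] = p(next token = j | previous token = i)
P = rng.random((V, V))
P /= P.sum(axis=1, keepdims=True)

# p(x_1): uniform start distribution (assumption)
p_start = np.full(V, 1.0 / V)

def log_likelihood(x):
    """Sum of per-step conditional log-probabilities."""
    logp = np.log(p_start[x[0]])
    for t in range(1, len(x)):
        logp += np.log(P[x[t - 1], x[t]])
    return logp

print(log_likelihood([0, 2, 1, 3]))
```

Each factor only conditions on earlier tokens, which is exactly what lets such models be trained by next-step prediction.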

It was discovered that the success of [[mutual information based contrastive learning]] …

For a probability density $p(x)$ and a coordinate transformation $x=g(z)$ or $z=f(x)$, the …
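The change-of-variables formula (a standard identity, stated here for completeness) relates the density in the two coordinates:

$$p_x(x) = p_z(f(x))\,\left|\det \frac{\partial f(x)}{\partial x}\right| = p_z(z)\,\left|\det \frac{\partial g(z)}{\partial z}\right|^{-1}.$$

The Jacobian determinant accounts for how the transformation stretches or compresses volume, which is what keeps the density normalized.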

In a GAN, the latent-space input is usually random noise, e.g., Gaussian noise. The objective of …
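The standard GAN objective (Goodfellow et al.) pits the generator $G$ against the discriminator $D$ in a minimax game, with the noise $\xi$ as generator input:

$$\min_G \max_D \; \mathbb{E}_{x \sim p_{\text{data}}(x)}\big[\log D(x)\big] + \mathbb{E}_{\xi \sim p(\xi)}\big[\log\big(1 - D(G(\xi))\big)\big].$$

$D$ is trained to tell real samples from generated ones; $G$ is trained to fool $D$.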

Contrastive Predictive Coding (CPC) is an autoregressive model combined with the InfoNCE loss. …
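The InfoNCE loss can be sketched in a few lines: score one positive sample against negatives and take the negative log-softmax of the positive's score. The dot-product score, the single context vector, and the positive-at-index-0 convention below are simplifying assumptions; CPC itself uses a learned bilinear score between the AR context and future latents.

```python
import numpy as np

def info_nce(context, candidates, pos_index=0):
    """-log softmax probability of the positive among all candidates."""
    scores = candidates @ context              # score f(x_j, c) = dot product (assumption)
    scores -= scores.max()                     # numerical stability
    log_softmax = scores - np.log(np.exp(scores).sum())
    return -log_softmax[pos_index]

rng = np.random.default_rng(0)
c = rng.standard_normal(8)                     # context vector from the AR model
pos = c + 0.1 * rng.standard_normal(8)         # positive: correlated with the context
negs = rng.standard_normal((4, 8))             # negatives: unrelated samples
loss = info_nce(c, np.vstack([pos[None, :], negs]))
print(loss)
```

Minimizing this loss pushes the positive's score above the negatives', which is what ties InfoNCE to a lower bound on mutual information.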

Max Global Mutual Information: why not just use the global mutual information of the input and …
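For reference, the global mutual information in question is the KL divergence between the joint distribution and the product of marginals:

$$I(X; Z) = \mathbb{E}_{p(x,z)}\left[\log \frac{p(x,z)}{p(x)\,p(z)}\right] = D_{\mathrm{KL}}\big(p(x,z)\,\|\,p(x)\,p(z)\big).$$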

Autoencoders (AE) are models that encode inputs into a compact latent space. The simplest …
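A minimal sketch of the idea, under strong simplifying assumptions: a linear encoder/decoder with tied weights $W$, a squared-error reconstruction loss, and plain gradient descent. Real autoencoders use nonlinear, deeper encoders and decoders.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 6)) @ rng.standard_normal((6, 6))  # toy data (assumption)
X -= X.mean(axis=0)

d, k = X.shape[1], 2                # input dim, bottleneck (latent) dim
W = 0.1 * rng.standard_normal((d, k))
lr = 1e-3

def recon_error(W):
    Z = X @ W                       # encode: compact latent code
    X_hat = Z @ W.T                 # decode: reconstruct the input
    return np.mean((X - X_hat) ** 2)

initial = recon_error(W)
for _ in range(500):
    E = X @ W @ W.T - X             # reconstruction residual
    # Gradient of the mean squared error w.r.t. the tied weights W
    grad = (2.0 / X.size) * (X.T @ E @ W + E.T @ X @ W)
    W -= lr * grad

final = recon_error(W)
print(initial, final)
```

With a linear model and tied weights, minimizing this loss drives $W$ toward the top-$k$ principal subspace of the data.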

A Variational Auto-Encoder (VAE) is very different from [[Generative Model: Auto-Encoder]] …
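The VAE maximizes the evidence lower bound (ELBO) rather than the exact likelihood:

$$\log p_\theta(x) \ge \mathbb{E}_{q_\phi(z \mid x)}\big[\log p_\theta(x \mid z)\big] - D_{\mathrm{KL}}\big(q_\phi(z \mid x)\,\|\,p(z)\big).$$

The first term rewards reconstruction; the KL term keeps the approximate posterior close to the prior, which is what distinguishes a VAE from a plain autoencoder.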

- Generative self-supervised learning models can utilize more data.
- Contrastive self-supervised learning models can utilize more data.
- Adversarial models use a generator and a discriminator.

Review of self-supervised learning.