Negative Sampling

#Word2vec

Knowledge of CBOW or skip-gram is required.

A naive way to train a word embedding model (see the sketch after this list) is to

  1. encode the input words and output words as vectors,
  2. use the input word vector to predict the output word vector,
  3. calculate the error between the predicted output word vector and the real output word vector,
  4. minimize the error.

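A minimal sketch of this naive approach, assuming a toy vocabulary; the names (`W_in`, `W_out`, `V`, `D`) are illustrative, not from any particular library:

```python
import numpy as np

V, D = 10_000, 300                           # vocabulary size, embedding dimension
rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.01, size=(V, D))   # input (center) word vectors
W_out = rng.normal(scale=0.01, size=(V, D))  # output (context) word vectors

def naive_softmax_step(center_id, context_id):
    """One training example: predict the context word from the center word."""
    h = W_in[center_id]                  # 1. encode the input word as a vector
    scores = W_out @ h                   # 2. score EVERY word in the vocabulary (V dot products)
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()                 #    softmax over all V words
    loss = -np.log(probs[context_id])    # 3. error against the real output word
    return loss                          # 4. minimizing this touches all V rows of W_out
```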
However, it is very expensive to project onto all of the output words and calculate the error every time, because the cost scales with the vocabulary size. A trick is to use negative sampling.

Negative sampling reframes the task as binary classification: a new target column is added to the data that marks whether the output word is a real neighbour of the input word.

| Input (Center Word) | Output (Context) | Target (Is Neighbour) |
| --- | --- | --- |
| intended | extravagant | 1 |
| intended | display | 1 |
| intended | to | 1 |
| intended | attract | 1 |
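The positive rows above can be generated by sliding a window over text. A minimal sketch; the sentence and window size are assumed here purely to match the table:

```python
# Pair each center word with its neighbours inside the window, labelled 1.
sentence = "an extravagant display intended to attract a mate".split()
window = 2  # context words on each side of the center word

positive_pairs = []
for i, center in enumerate(sentence):
    for j in range(max(0, i - window), min(len(sentence), i + window + 1)):
        if j != i:
            positive_pairs.append((center, sentence[j], 1))  # (input, output, target=1)

# e.g. ('intended', 'extravagant', 1), ('intended', 'display', 1),
#      ('intended', 'to', 1), ('intended', 'attract', 1), ...
```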

Now we have a problem: the target is always 1, so this dataset might lead to a network that outputs 1 all the time. We need some negative samples to make the data noisy, so we randomly sample words from the vocabulary and label them 0.

| Input (Center Word) | Output (Context) | Target (Is Neighbour) |
| --- | --- | --- |
| intended | extravagant | 1 |
| intended | display | 1 |
| intended | to | 1 |
| intended | attract | 1 |
| intended | I | 0 |
| intended | a | 0 |
| intended | intellect | 0 |
| intended | mating | 0 |
| intended | course | 0 |
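
Training then becomes a small logistic-regression problem per example: the model predicts the probability that the output word is a real neighbour of the input word, and only the sampled words are updated. A minimal sketch, assuming uniform negative sampling and illustrative names (word2vec actually draws negatives from a smoothed unigram distribution):

```python
import numpy as np

V, D, k, lr = 10_000, 300, 5, 0.025           # k = number of negative samples per pair
rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.01, size=(V, D))    # center word vectors
W_out = rng.normal(scale=0.01, size=(V, D))   # context word vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def negative_sampling_step(center_id, context_id):
    """Binary classification: is (center, word) a real neighbour pair?"""
    h = W_in[center_id]
    negative_ids = rng.integers(0, V, size=k)          # random words, target = 0
    samples = [(context_id, 1.0)] + [(n, 0.0) for n in negative_ids]

    grad_h = np.zeros_like(h)
    for word_id, target in samples:                    # only k + 1 output vectors touched
        p = sigmoid(W_out[word_id] @ h)                # predicted "is neighbour" probability
        g = p - target                                 # gradient of the logistic loss
        grad_h += g * W_out[word_id]
        W_out[word_id] -= lr * g * h                   # update only these output vectors
    W_in[center_id] -= lr * grad_h                     # update the center word vector
```

Compared with the naive step, each update touches k + 1 output vectors instead of all V, which is why negative sampling makes training tractable.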

