CBOW: Continuous Bag of Words

Here we encode all words present in the corpus to demonstrate the idea of CBOW. In the real world, we might want to remove certain words, such as "the".

We use the following quote by Ford in Westworld as an example.

I read a theory once that the human intellect is like peacock feathers. Just an extravagant display intended to attract a mate, just an elaborate mating ritual. But, of course, the peacock can barely fly. It lives in the dirt, pecking insects out of the muck, consoling itself with its great beauty.
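Before training, every distinct word is mapped to an integer index. As a rough illustration (not taken from the original post), the quote can be tokenized with a simple regular expression and turned into a vocabulary; the `tokenize` helper and its punctuation handling are assumptions of this sketch.

```python
import re

quote = (
    "I read a theory once that the human intellect is like peacock feathers. "
    "Just an extravagant display intended to attract a mate, just an elaborate "
    "mating ritual. But, of course, the peacock can barely fly. It lives in the "
    "dirt, pecking insects out of the muck, consoling itself with its great beauty."
)

def tokenize(text):
    """Lowercase and keep only runs of letters/apostrophes (a simplification)."""
    return re.findall(r"[a-z']+", text.lower())

tokens = tokenize(quote)
vocabulary = sorted(set(tokens))
word_to_index = {word: i for i, word in enumerate(vocabulary)}

print(len(vocabulary))            # number of distinct words kept
print(word_to_index["intended"])  # integer id standing in for the word
```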

The word intended is surrounded by extravagant display before it and to attract after it. The task is to predict the middle word intended from the context words extravagant, display, to, and attract.

  • Input: extravagant, display, to, attract
  • Output: intended

In the bag-of-words model, the order of the words extravagant, display, to, attract doesn’t matter, hence the name bag-of-words. [mikolov2013]
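One way to see why order does not matter: CBOW combines the context word vectors with a symmetric operation, typically a sum or average, before making the prediction. The sketch below uses a randomly initialized embedding matrix with made-up sizes, just to show that shuffling the context leaves the combined vector unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, embedding_dim = 50, 8            # toy sizes, not from the original post
embeddings = rng.normal(size=(vocab_size, embedding_dim))

def context_vector(context_ids):
    """Average the context word embeddings: the result ignores word order."""
    return embeddings[context_ids].mean(axis=0)

ids = [3, 17, 25, 41]        # stand-ins for extravagant, display, to, attract
shuffled = [41, 3, 25, 17]
print(np.allclose(context_vector(ids), context_vector(shuffled)))  # True
```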

This makes it easier to represent the dataset:

| Input (Context) | Output (Center Word) |
| --- | --- |
| extravagant | intended |
| display | intended |
| to | intended |
| attract | intended |

To create a real dataset, we “slide” this context window over all the words.

| Input (Context) | Output (Center Word) |
| --- | --- |
| read | I |
| a | I |
| I | read |
| a | read |
| theory | read |
| I | a |
| read | a |
| theory | a |
| once | a |
| read | theory |
| a | theory |
| once | theory |

We are not required to choose exactly two history words and two future words. The number of context words taken on each side of the center word is the window size.
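A sketch of this sliding-window construction, with `window_size` controlling how many words are taken on each side of the center word; the function name and the whitespace tokenization are assumptions for illustration.

```python
def cbow_pairs(tokens, window_size=2):
    """Return (context word, center word) pairs like the rows of the table above."""
    pairs = []
    for center_pos, center in enumerate(tokens):
        start = max(0, center_pos - window_size)
        end = min(len(tokens), center_pos + window_size + 1)
        for context_pos in range(start, end):
            if context_pos != center_pos:
                pairs.append((tokens[context_pos], center))
    return pairs

tokens = "I read a theory once that".split()
for context, center in cbow_pairs(tokens, window_size=2):
    print(context, "->", center)
# First rows: read -> I, a -> I, I -> read, a -> read, theory -> read, ...
```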

