
Book chapter

Learning Word Embeddings without Context Vectors

Zobnin A., Elistratova E.
Pp. 244–249.

Most word embedding algorithms, such as word2vec or fastText, construct two sorts of vectors: one for words and one for contexts. Naive use of vectors of only one sort leads to poor results. We suggest using an indefinite inner product in the skip-gram negative sampling (SGNS) algorithm. This allows us to use only one sort of vectors without loss of quality. Our “context-free” cf algorithm performs on par with SGNS on word similarity datasets.
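To illustrate the idea, here is a minimal NumPy sketch of the SGNS objective computed with a single embedding matrix and an indefinite inner product ⟨x, y⟩ = xᵀJy, where J is a diagonal signature matrix. The vocabulary size, dimension, and the half-and-half split of the signature are assumptions for illustration, not the paper's actual settings:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, dim = 100, 8

# A single embedding matrix: words and contexts share these vectors.
E = rng.normal(scale=0.1, size=(vocab_size, dim))

# Signature of the indefinite inner product <x, y> = x^T J y with
# J = diag(+1, ..., +1, -1, ..., -1). The split between +1s and -1s
# is a hyperparameter (assumed here: half and half).
signature = np.concatenate([np.ones(dim // 2), -np.ones(dim - dim // 2)])

def indefinite_dot(x, y):
    # Elementwise product weighted by the signature, summed over the
    # embedding dimension; reduces to the ordinary dot product if
    # signature were all +1.
    return np.sum(x * signature * y, axis=-1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sgns_loss(word, context, negatives):
    # Standard SGNS negative log-likelihood, but every vector is drawn
    # from the same matrix E and compared via the indefinite product.
    pos = np.log(sigmoid(indefinite_dot(E[word], E[context])))
    neg = np.sum(np.log(sigmoid(-indefinite_dot(E[word], E[negatives]))))
    return -(pos + neg)

loss = sgns_loss(word=3, context=7, negatives=np.array([11, 42, 5]))
```

Because the same matrix serves both roles, the similarity of any two words can be read directly from their shared vectors, avoiding the usual mismatch between word and context spaces.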

In book

Edited by: I. Augenstein, S. Gella, S. Ruder et al. Iss. W19-43. Association for Computational Linguistics, 2019.