Word Embedding Visualizer

Explore 2D word vectors and perform word arithmetic

Embedding Space

Legend: Regular Words · Arithmetic Result · Selected Word · Nearest Neighbor

Controls


Word Arithmetic

Perform vector arithmetic operations such as king - man + woman ≈ queen

How it works: Word embeddings represent words as vectors in a multi-dimensional space. Similar words are positioned close to each other. Vector arithmetic can capture semantic relationships: subtracting "man" from "king" and adding "woman" often results in a vector close to "queen".
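
A minimal sketch of that idea, assuming NumPy and a handful of toy 2D vectors (the words and coordinates below are invented for illustration; this is not the visualizer's actual code):

```python
import numpy as np

# Toy 2D embeddings, invented for illustration; real embeddings usually
# have hundreds of dimensions and are learned from large text corpora.
embeddings = {
    "king":  np.array([0.80, 0.60]),
    "queen": np.array([0.78, 0.20]),
    "man":   np.array([0.90, 0.70]),
    "woman": np.array([0.88, 0.30]),
    "apple": np.array([0.10, 0.95]),
}

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors (1.0 = same direction).
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# king - man + woman: the resulting vector should land near "queen".
result = embeddings["king"] - embeddings["man"] + embeddings["woman"]

# Nearest word to the result, excluding the words used in the arithmetic.
exclude = {"king", "man", "woman"}
nearest = max(
    (word for word in embeddings if word not in exclude),
    key=lambda word: cosine_similarity(result, embeddings[word]),
)
print(nearest)  # queen
```

With real, higher-dimensional embeddings the nearest word to the result is found the same way, just over a much larger vocabulary.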

Available Words

Click any word to highlight it in the visualization

Nearest Neighbors

About Word Embeddings

What are Word Embeddings?

Word embeddings are dense vector representations of words that capture semantic meaning. Words with similar meanings have similar vector representations.

Key Concepts:

  • Vector Space: Each word is represented as a point in a multi-dimensional space
  • Semantic Similarity: Similar words cluster together in the space
  • Analogies: Relationships between words can be captured through vector arithmetic
  • Cosine Similarity: Used to measure how similar two word vectors are (see the sketch after this list)
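
For the cosine-similarity point above, here is a minimal sketch with made-up 2D vectors rather than real embeddings:

```python
import numpy as np

def cosine_similarity(a, b):
    # 1.0 = same direction, 0.0 = orthogonal, -1.0 = opposite direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical 2D vectors: "cat" and "dog" point in similar directions,
# while "car" does not.
cat = np.array([0.9, 0.3])
dog = np.array([0.8, 0.4])
car = np.array([0.1, 0.9])

print(cosine_similarity(cat, dog))  # high (about 0.99)
print(cosine_similarity(cat, car))  # lower (about 0.42)
```

Cosine similarity compares direction rather than magnitude, which is why it is a common choice for comparing embedding vectors.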

Example Applications:

  • Natural language understanding
  • Text classification
  • Machine translation
  • Sentiment analysis
  • Information retrieval