
Embedding Space

Explore how words live in vector space and experiment with word arithmetic


// embedding_space_2d


Scroll to zoom, drag to pan. Click words to see connections.

// word_arithmetic


// try_these

// how_embeddings_work

Words as vectors: Language models represent words as high-dimensional vectors (384 dimensions for this model). Similar words end up close together in this space.
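As an illustration (not necessarily how this demo computes its vectors), here is a minimal Python sketch assuming a sentence-transformers model; all-MiniLM-L6-v2 is used only because it happens to output 384-dimensional embeddings.

```python
from sentence_transformers import SentenceTransformer

# Assumption: all-MiniLM-L6-v2 is chosen here only because it outputs
# 384-dimensional embeddings; the demo's actual model may differ.
model = SentenceTransformer("all-MiniLM-L6-v2")

words = ["happy", "joyful", "sad", "king", "queen"]
vectors = model.encode(words)

print(vectors.shape)  # (5, 384): one 384-dimensional vector per word
```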

Semantic Similarity

"happy" and "joyful" are close together, while "happy" and "sad" are far apart.

Analogical Reasoning

The vector from "man" to "king" is similar to the vector from "woman" to "queen".
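Put as arithmetic: king - man + woman should land near queen. A hedged sketch of that calculation, again assuming a sentence-transformers model (analogies of this kind hold more or less strongly depending on the model):

```python
from sentence_transformers import SentenceTransformer, util

# Assumption: model choice is illustrative; word2vec-style analogies
# hold to varying degrees for sentence-embedding models.
model = SentenceTransformer("all-MiniLM-L6-v2")

king, man, woman, queen = model.encode(["king", "man", "woman", "queen"])

# Vector arithmetic: the offset from "man" to "king" applied to "woman".
target = king - man + woman

print("target vs queen:", util.cos_sim(target, queen).item())
print("target vs king: ", util.cos_sim(target, king).item())
```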

Note: The 2D visualization uses PCA projection. Some relationships visible in 384D may not be perfectly preserved in 2D.
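For reference, a minimal sketch of such a projection using scikit-learn's PCA, under the same model assumption as above; the demo's own projection code may differ.

```python
from sentence_transformers import SentenceTransformer
from sklearn.decomposition import PCA

# Assumption: model choice is illustrative; any 384-dimensional embedding
# model is projected to 2D the same way.
model = SentenceTransformer("all-MiniLM-L6-v2")

words = ["happy", "joyful", "sad", "king", "queen", "man", "woman"]
vectors = model.encode(words)  # shape (7, 384)

# Keep the two directions of greatest variance; the remaining 382
# dimensions are discarded, which is why some relationships get distorted.
coords = PCA(n_components=2).fit_transform(vectors)  # shape (7, 2)

for word, (x, y) in zip(words, coords):
    print(f"{word:>7}: ({x:+.3f}, {y:+.3f})")
```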