Try the famous example: king - man + woman = queen
Word2Vec represents words as dense vectors (embeddings) in a high-dimensional space, typically 100 to 300 dimensions. In this demo, we're using 100-dimensional vectors.
These vectors capture semantic relationships between words: words with similar meanings end up with nearby vectors, and vector arithmetic can express analogies, as in king - man + woman ≈ queen.
The model used in this demo contains pre-computed vectors for approximately 1,000 common English words, trained on a large text corpus.
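The analogy arithmetic can be sketched in a few lines: subtract and add the vectors, then find the vocabulary word whose vector has the highest cosine similarity to the result, excluding the query words themselves. The tiny 2-D vectors below are hypothetical toy values chosen to make the example work; real Word2Vec embeddings are learned from a corpus and, as noted above, have 100+ dimensions.

```python
import numpy as np

# Toy 2-D embeddings (hypothetical, for illustration only):
# axis 0 roughly encodes "royalty", axis 1 roughly encodes "gender".
VECTORS = {
    "king":   np.array([1.0,  1.0]),
    "queen":  np.array([1.0, -1.0]),
    "man":    np.array([0.0,  1.0]),
    "woman":  np.array([0.0, -1.0]),
    "prince": np.array([0.9,  0.8]),
}

def analogy(a, b, c, vectors=VECTORS):
    """Return the word closest (by cosine similarity) to
    vectors[a] - vectors[b] + vectors[c], skipping the inputs."""
    target = vectors[a] - vectors[b] + vectors[c]
    best_word, best_sim = None, -2.0
    for word, vec in vectors.items():
        if word in (a, b, c):
            continue  # never return one of the query words
        sim = np.dot(target, vec) / (np.linalg.norm(target) * np.linalg.norm(vec))
        if sim > best_sim:
            best_word, best_sim = word, sim
    return best_word

print(analogy("king", "man", "woman"))
```

With these toy vectors, king - man + woman lands exactly on queen's vector; with real learned embeddings the result is only approximately equal, which is why the nearest-neighbor search (rather than an exact match) is the standard way to evaluate analogies.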