How AI works · meaning as numbers
Embedding
30-second gist
An embedding is how a computer turns words, sentences, or pictures into a long list of numbers it can compare. Things that mean similar things end up with similar numbers.
It's the quiet machinery behind a lot of useful magic: search that "gets" what you mean, "more like this" buttons, customer-support routing, and most modern recommendation systems.
If you want more
How does it work, in plain English?
Imagine a giant library where every book is placed on a shelf according to its meaning. Books about cooking sit near each other; books about cooking Italian food sit even closer; novels are in another wing entirely. An embedding is the address on the shelf, written as numbers.
To find "more like this", the computer doesn't search for matching keywords — it just looks for nearby addresses. That's why a search for "doctor in town" can find a result that says "GP in Auckland" without any of the same words appearing.
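If you're curious what "nearby addresses" means in practice, here is a minimal sketch using cosine similarity, a standard way to measure how close two vectors point. The three-number vectors below are invented for illustration — real embeddings have hundreds or thousands of numbers and come from a trained model, not by hand:

```python
import math

def cosine_similarity(a, b):
    # How close two "shelf addresses" are: near 1.0 means the vectors
    # point the same way (similar meaning), near 0.0 means unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy, hand-made vectors standing in for real embeddings.
DOCTOR_IN_TOWN = [0.90, 0.10, 0.20]   # "doctor in town"
GP_IN_AUCKLAND = [0.85, 0.15, 0.25]   # "GP in Auckland"
CAKE_RECIPE    = [0.05, 0.90, 0.30]   # "chocolate cake recipe"

# No shared keywords, but the meanings (and so the vectors) are close:
print(cosine_similarity(DOCTOR_IN_TOWN, GP_IN_AUCKLAND))  # close to 1.0
print(cosine_similarity(DOCTOR_IN_TOWN, CAKE_RECIPE))     # much lower
```

The search engine never compares words at all; it only compares numbers, which is why wording can differ completely as long as the meaning matches.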
Where you've seen it
Spotify's "Discover Weekly", Netflix's "Because you watched…", Amazon's "Customers also bought", and the search bar inside most modern apps all use embeddings under the hood. The same trick now powers most chatbots that need to find the right snippet of a long document before answering.
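That last trick — finding the right snippet of a document before answering — can be sketched in a few lines. Everything here is a simplified illustration: the snippets and vectors are made up, and a real system would get the vectors from an embedding model rather than a hand-written table:

```python
import math

# Hypothetical document snippets, each paired with an invented embedding.
# In a real pipeline an embedding model would produce these vectors.
CHUNKS = {
    "Our clinic has a GP available in Auckland on weekdays.": [0.90, 0.10, 0.10],
    "Shipping usually takes three to five business days.":    [0.10, 0.90, 0.20],
    "You can reset your password from the login page.":       [0.20, 0.10, 0.90],
}

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(x * x for x in b)))

def best_chunk(query_vec):
    # Pick the snippet whose "shelf address" is nearest the query's.
    return max(CHUNKS, key=lambda text: cosine_similarity(query_vec, CHUNKS[text]))

# A query like "doctor in town" would be embedded near the first snippet.
QUERY = [0.85, 0.15, 0.05]
print(best_chunk(QUERY))
```

A chatbot then pastes the winning snippet into its prompt and answers from it — the embedding step just decides which snippet wins.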