Vector embeddings
up: LLMs ⚐
A vector embedding is data mapped into a numerical representation: a vector of numbers.
Embedding models
First, text is split into tokens. This can be done by word, by subword, or by some other scheme.
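A toy sketch of the two splitting strategies mentioned above. Real models use learned subword vocabularies (e.g. BPE); the greedy longest-match and the tiny vocabulary here are invented purely for illustration.

```python
def word_tokenize(text):
    # Word-level: one token per whitespace-separated word.
    return text.lower().split()

def subword_tokenize(text, vocab):
    # Subword-level: greedily match the longest piece found in the
    # vocabulary; unknown characters fall back to single-char tokens.
    tokens = []
    for word in text.lower().split():
        i = 0
        while i < len(word):
            for j in range(len(word), i, -1):
                piece = word[i:j]
                if piece in vocab or j == i + 1:
                    tokens.append(piece)
                    i = j
                    break
    return tokens

vocab = {"embed", "ding", "vector", "s"}
print(word_tokenize("Vector embeddings"))            # ['vector', 'embeddings']
print(subword_tokenize("Vector embeddings", vocab))  # ['vector', 'embed', 'ding', 's']
```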
Models are trained to “know” which tokens are similar: they are fed large amounts of text and learn which words appear in context with one another. Based on that, the model translates each token into an embedding.