[–] halopend@alien.top 1 points 11 months ago

Word2vec is a method of transforming words into vectors. An embedding, in its most general sense, just means a transformed representation. In the case of AI, it's a representation that is (at its most basic level) math-friendly, and the most math-friendly format AI people love is the vector. Note that some embeddings are more complicated (e.g. the query/key/value projections in attention), but fundamentally it's still vector math, just shifted around a bit to add extra capabilities.
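Here's a minimal sketch of that idea using gensim's Word2Vec (assuming gensim ≥ 4 is installed); the toy corpus and parameters are purely illustrative:

```python
from gensim.models import Word2Vec

corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "pets"],
]

# Train tiny embeddings: each word becomes a 50-dimensional vector.
model = Word2Vec(sentences=corpus, vector_size=50, window=2, min_count=1, epochs=50)

print(model.wv["cat"])                    # the raw vector for "cat"
print(model.wv.similarity("cat", "dog"))  # cosine similarity between the two vectors
```

The point is just that once words are vectors, "similar meaning" becomes "small angle between vectors," which is plain math.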

RAG (retrieval-augmented generation) is basically the entire system for shoving the data you need into a prompt from a pool of data; a vector DB is the place that data gets stored for RAG.
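A minimal sketch of the retrieval step, assuming `embed()` is a hypothetical stand-in for whatever embedding model you actually use (word2vec, a sentence transformer, an API call, etc.):

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    # Hypothetical placeholder: a real system would call an embedding model here.
    # Random vectors keep the sketch runnable, but retrieval quality is meaningless.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(384)

documents = [
    "Paris is the capital of France.",
    "Word2vec maps words to vectors.",
    "RAG stuffs retrieved text into the prompt.",
]
doc_vectors = np.stack([embed(d) for d in documents])  # this pile of vectors is the "DB"

def retrieve(query: str, k: int = 1) -> list[str]:
    q = embed(query)
    # Cosine similarity between the query vector and every stored vector.
    sims = doc_vectors @ q / (np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q))
    return [documents[i] for i in np.argsort(sims)[::-1][:k]]

question = "What is word2vec?"
context = "\n".join(retrieve(question))
prompt = f"Answer using this context:\n{context}\n\nQuestion: {question}"
```

That's the whole trick: embed the question, grab the nearest stored chunks, paste them into the prompt.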

Note that a vector DB is kind of like storing data in a word2vec-style format so you can perform vector math to find similar things.
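Under the hood that "find similar things" step is just nearest-neighbor search over the stored vectors. A sketch using FAISS (assuming the faiss package is installed; the random vectors are stand-ins for real embeddings):

```python
import numpy as np
import faiss

dim = 64
stored = np.random.rand(1000, dim).astype("float32")  # embeddings already in the DB
index = faiss.IndexFlatL2(dim)                        # exact L2-distance index
index.add(stored)

query = np.random.rand(1, dim).astype("float32")
distances, ids = index.search(query, 5)               # 5 most similar stored vectors
print(ids[0])  # positions of the nearest neighbors in `stored`
```

Real vector DBs layer persistence, metadata filtering, and approximate indexes on top, but the core operation is exactly this lookup.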