• halopend@alien.top to LocalLLaMA@poweruser.forum · Rag vs Vector db · 10 months ago

Word2vec is a method of transforming words into vectors. An embedding, in its most general sense, just means a transformed representation. In the case of AI, it's a representation that is (again, at its most basic level) math-friendly. What's the most math-friendly format AI people love? Vectors. Note that some embeddings are more complicated (e.g. query/key/value), but fundamentally it's still vector math, just shifted around a bit to add extra capabilities.
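
For example, here's roughly what that looks like with gensim's word2vec implementation (a toy sketch, assuming gensim is installed; the corpus and parameter values are made up):

```python
from gensim.models import Word2Vec

# Tiny made-up corpus: a list of tokenized sentences.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "pets"],
]

model = Word2Vec(sentences=corpus, vector_size=50, window=3, min_count=1, seed=1)

vec = model.wv["cat"]                      # a plain numpy vector: the "embedding"
print(vec.shape)                           # (50,)
print(model.wv.similarity("cat", "dog"))   # cosine similarity between two word vectors
```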

RAG is the entire system for shoving the data you need into a prompt, pulled from a pool of data; the vector DB is the place that data gets stored for RAG.
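
The "entire system" part can be pretty small: embed the question, grab the closest chunks from the store, paste them into the prompt. A minimal sketch of that loop, where `embed()` is a stand-in for whatever embedding model you'd actually call (not a real library's API):

```python
import numpy as np

# Placeholder embedding function: in practice this would call an embedding
# model (sentence-transformers, an API, etc.). Random vectors here, so the
# retrieval is only structurally correct, not semantically meaningful.
def embed(text: str) -> np.ndarray:
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(384)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The "pool of data", pre-embedded and stored. This is what a vector DB holds.
chunks = [
    "Paris is the capital of France.",
    "The mitochondria is the powerhouse of the cell.",
    "RAG stands for retrieval-augmented generation.",
]
store = [(chunk, embed(chunk)) for chunk in chunks]

def rag_prompt(question: str, k: int = 2) -> str:
    q_vec = embed(question)
    top = sorted(store, key=lambda cv: cosine(q_vec, cv[1]), reverse=True)[:k]
    context = "\n".join(chunk for chunk, _ in top)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# The returned string is what actually gets sent to the LLM.
print(rag_prompt("What does RAG stand for?"))
```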

Note that a vector DB is kind of like storing data in a word2vec-style (embedding) format so you can perform vector math to find similar things.
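
In other words, the core trick is "store the embeddings, then do nearest-neighbour math over them." A rough sketch of that idea using faiss as a stand-in (a real vector DB adds persistence, metadata filtering, etc.; the dimensions and counts here are arbitrary, and faiss-cpu plus numpy are assumed to be installed):

```python
import numpy as np
import faiss

dim = 64
rng = np.random.default_rng(0)

# Pretend these 1000 vectors are the embeddings of your documents.
doc_vectors = rng.standard_normal((1000, dim)).astype("float32")

index = faiss.IndexFlatL2(dim)   # exact (brute-force) L2 search
index.add(doc_vectors)           # "insert" the embeddings into the store

# Embed a query the same way, then ask for its 5 nearest neighbours.
query = rng.standard_normal((1, dim)).astype("float32")
distances, ids = index.search(query, 5)
print(ids[0])        # indices of the most similar stored documents
print(distances[0])  # how close they are
```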