I am confused about these two. Sometimes people use them interchangeably. Is it because RAG is a method, and a vector DB is where you store the data? I remember that before LLMs there was word2vec. But isn't the hard part creating such a meaningful word2vec in the first place? By the way, is word2vec what people now call "embeddings"?

  • halopend@alien.top
    1 year ago

    Word2vec is a method for transforming words into vectors. An embedding, in its most general sense, just means a transformed representation. In the case of AI, it's a representation that is (again, at its most basic level) math-friendly. What's the most math-friendly format AI people love? Vectors. Note that some embeddings are more complicated (e.g., query/key/value projections), but fundamentally they still use vector math, just shifted around a bit to add extra capabilities.
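    To make that concrete, here's a toy sketch (the words and numbers are made up, not from a real model): an embedding is just a mapping from words to vectors, and once words are vectors you can do math on them.

    ```python
    # Toy illustration (not a real embedding model): an "embedding" is just
    # a mapping from words to vectors, which makes words math-friendly.
    embedding = {
        "cat": [0.9, 0.1, 0.0],
        "dog": [0.8, 0.2, 0.0],
        "car": [0.0, 0.1, 0.9],
    }

    def dot(a, b):
        """Dot product: the basic vector operation everything builds on."""
        return sum(x * y for x, y in zip(a, b))

    # Related words end up with a larger dot product than unrelated ones.
    print(dot(embedding["cat"], embedding["dog"]))  # larger
    print(dot(embedding["cat"], embedding["car"]))  # smaller
    ```

    A real word2vec model learns these numbers from text so that the geometry actually reflects meaning; that learning step is the hard part the question alludes to.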

    RAG is the entire system for shoving the data you need into a prompt, pulled from a pool of data; the vector DB is the place that data gets stored for a RAG system.

    Note that a vector DB is kind of like storing data in a word2vec-style format (as vectors), so you can perform vector math to find similar things.
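    That "find similar things" operation is the core of what a vector DB does: rank stored vectors by similarity to a query vector. A minimal sketch, with invented vectors standing in for real embeddings:

    ```python
    import math

    # The "store": item -> vector. Values are made up for illustration.
    store = {
        "kitten": [0.9, 0.2, 0.1],
        "puppy":  [0.8, 0.3, 0.1],
        "engine": [0.1, 0.1, 0.9],
    }

    def cosine(a, b):
        num = sum(x * y for x, y in zip(a, b))
        den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return num / den

    def nearest(query_vec, k=2):
        # Rank every stored item by cosine similarity to the query vector.
        ranked = sorted(store, key=lambda w: cosine(query_vec, store[w]), reverse=True)
        return ranked[:k]

    print(nearest([0.85, 0.25, 0.1]))  # a cat-like query vector
    ```

    Real vector DBs do the same thing conceptually but use approximate nearest-neighbor indexes so the search stays fast over millions of vectors, instead of this brute-force sort.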