I’ve always wanted to build a career in AI/ML infra. Usually, when we talk about AI infra in the tech industry, we mean training infra, serving infra, model deployment, etc.

Now, with the genAI/LLM wave, I see a lot of NLP-specific infrastructure such as semantic indexing and vector databases rising quickly. Do semantic indexing and vector databases also count as AI infra? And is it a promising field?

  • notllmchatbot@alien.topB · 1 year ago

    I see these as the equivalent of selling picks and shovels in a gold rush. The good thing is that you won’t need to bet on any particular vertical or application, which is always hard with novel technologies. The bad thing is that infra is usually not where most of the value generation/capture happens.

  • Mammoth-Doughnut-160@alien.topB · 1 year ago

    Yes, semantic indexing and vector databases are now part of the AI infra behind Retrieval-Augmented Generation (RAG), which links external knowledge sources to LLMs for information retrieval (LLMs on their own are not good at search); the sketch below shows the basic flow. To learn more about how to implement RAG in a GenAI context, check out LLMWare, which provides an integrated RAG platform so you can quickly level up on AI infra: https://github.com/llmware-ai/llmware
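
    A minimal sketch of that flow, under stated assumptions: embed() is a toy hashing embedder standing in for a real embedding model, the in-memory matrix stands in for a vector database, and the final LLM call is left as a stub. This is not LLMWare's API, just an illustration of the RAG pattern.

    ```python
    # Minimal RAG sketch: embed documents, index the vectors, retrieve the
    # nearest ones for a query, and paste them into the LLM prompt.
    # embed() is a toy bag-of-words hashing embedder standing in for a real
    # embedding model; the final LLM call is out of scope.
    import numpy as np

    DIM = 256  # toy embedding dimensionality

    def embed(text: str) -> np.ndarray:
        """Toy stand-in for a real embedding model (e.g. a sentence encoder)."""
        vec = np.zeros(DIM)
        for token in text.lower().split():
            vec[hash(token) % DIM] += 1.0
        norm = np.linalg.norm(vec)
        return vec / norm if norm > 0 else vec

    # "Vector database": here just an in-memory matrix of document embeddings.
    documents = [
        "The warranty covers parts and labor for two years.",
        "Returns are accepted within 30 days with a receipt.",
        "Our support line is open Monday through Friday.",
    ]
    index = np.stack([embed(d) for d in documents])

    def retrieve(query: str, k: int = 2) -> list[str]:
        """Semantic retrieval: rank documents by cosine similarity to the query."""
        scores = index @ embed(query)
        top = np.argsort(scores)[::-1][:k]
        return [documents[i] for i in top]

    query = "How long is the return window?"
    context = "\n".join(retrieve(query))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    # The prompt would now be sent to the LLM of your choice.
    print(prompt)
    ```

    A production setup swaps embed() for a real embedding model and the in-memory matrix for a vector database with approximate nearest-neighbor search, but the data flow stays the same.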

  • localhost80@alien.topB · 1 year ago

    No, it is not considered AI infra. A vector database consumes AI infra to create the embeddings, but vector search can exist without any AI component at all (see the sketch below). AI infra is about generating the output, not consuming it; if consuming the output counted, every piece of software that uses an LLM component would be considered AI infra.
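
    To illustrate the point that vector search itself has no AI in it, here is a sketch of nearest-neighbor lookup over plain hand-picked numeric features, with no embedding model anywhere; the product data is invented for illustration.

    ```python
    # Nearest-neighbor search over ordinary numeric features -- the core
    # operation of a vector database -- with no ML model involved anywhere.
    import numpy as np

    # Each row: [price in dollars, average rating, weight in kg]
    products = ["kettle", "toaster", "blender", "mixer"]
    features = np.array([
        [30.0, 4.5, 1.2],
        [25.0, 4.0, 1.5],
        [60.0, 4.8, 2.3],
        [55.0, 4.2, 2.0],
    ])

    # Normalize each column so no single feature dominates the distance.
    normalized = (features - features.mean(axis=0)) / features.std(axis=0)

    def nearest(query_idx: int, k: int = 2) -> list[str]:
        """Return the k products closest to the query product in feature space."""
        dists = np.linalg.norm(normalized - normalized[query_idx], axis=1)
        order = np.argsort(dists)[1:k + 1]  # skip the query itself at position 0
        return [products[i] for i in order]

    print(nearest(products.index("blender")))  # -> ['mixer', 'kettle']
    ```

    Swap the hand-picked features for model-generated embeddings and this becomes the RAG use case from the comments above; the database machinery itself does not change.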