The cofounder of Google’s AI division DeepMind says everybody will have their own AI-powered ‘chief of staff’ over the next five years::Mustafa Suleyman said AI will “intimately know your personal information” and be able to serve you 24-7.

  • Eager Eagle@lemmy.world · 1 year ago

    I do wonder why Cortana, Siri, Alexa, and Google Assistant are lagging so far behind these LLMs. I personally don’t use them for much beyond setting timers, but it’s annoying to even consider using them and then realize they’re not nearly as usable or helpful as ChatGPT.

    • Elohim@lemm.ee · 1 year ago

      The big thing holding Apple back with Siri is that they aim to have all their AI-driven functions processed on the user’s hardware, for security/privacy. So they not only need the software component, they also need individual phones with hardware capable of running it.

      • Eager Eagle@lemmy.world · 1 year ago

        eh… sounds like privacy theater to me. Only the audio transcription may be processed on the device.

        In all cases, transcripts of your interactions will be sent to Apple to process your requests

        src

        They might aim to have a full-blown LLM on the device, but it’ll never be as good as the others with these limitations.

        • GBU_28@lemm.ee · 1 year ago

          Many teams are currently working on striking the right balance between fine-tuning and model size. Most aren’t targeting phones yet, but rather PCs running off-network.

          It is entirely possible to run an LLM “closed loop” (fully on-device, with nothing sent over the network), but obviously Google and Apple want to be in that loop.
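          To make the “closed loop” idea concrete, here is a minimal sketch of routing a request either to an on-device model or a cloud API. The function names, the token threshold, and the stubbed model calls are all hypothetical; a real setup would swap the stubs for an actual local runtime (e.g. llama.cpp) and a real API client.

          ```python
          # Sketch: prefer an on-device ("closed loop") model, and only fall
          # back to a cloud API when the caller explicitly allows networking.
          # Both model calls are stubs for illustration.

          ON_DEVICE_MAX_PROMPT_TOKENS = 512  # hypothetical on-device limit

          def run_local_model(prompt: str) -> str:
              # Stub for an on-device model: nothing leaves the device.
              return f"[local] echo: {prompt}"

          def run_cloud_model(prompt: str) -> str:
              # Stub for a cloud API call: the prompt would be sent off-device.
              return f"[cloud] echo: {prompt}"

          def answer(prompt: str, allow_network: bool = False) -> str:
              """Route to the local model unless the prompt is too big AND networking is allowed."""
              tokens = len(prompt.split())  # crude whitespace token estimate
              if tokens <= ON_DEVICE_MAX_PROMPT_TOKENS or not allow_network:
                  return run_local_model(prompt)
              return run_cloud_model(prompt)
          ```

          With `allow_network=False` the loop stays fully closed, which is exactly the control point the platform vendors presumably want to keep for themselves.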

    • GBU_28@lemm.ee · 1 year ago

      For “impressive” general reasoning and conversation, these LLMs currently require pretty beefy hardware. You’re either lugging a GPU around or calling an API.
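      A rough back-of-the-envelope for why the hardware is beefy: weight memory is roughly parameter count times bytes per parameter. The numbers below are illustrative, not measurements of any particular model.

      ```python
      def model_memory_gb(n_params_billion: float, bytes_per_param: float) -> float:
          """Rough weight-only memory footprint of an LLM, in GiB."""
          return n_params_billion * 1e9 * bytes_per_param / 1024**3

      # A 7B-parameter model in fp16 (2 bytes/param) needs ~13 GiB just for
      # the weights, beyond a typical phone's RAM. 4-bit quantization
      # (~0.5 byte/param) shrinks that roughly 4x, to ~3.3 GiB.
      fp16_gb = model_memory_gb(7, 2.0)
      q4_gb = model_memory_gb(7, 0.5)
      ```

      And that is before the activation and KV-cache memory needed at inference time, which is why the practical choices are a dedicated GPU or an API call.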

      • elfin8er@lemm.ee · 1 year ago

        Aren’t these current personal assistants already relying on API calls for their responses?

        • GBU_28@lemm.ee · 1 year ago

          Like Siri? Yes, though my point pertained specifically to the hardware needed for an LLM.

    • WiseMoth@lemmy.world · 1 year ago

      I know Apple is developing its own LLM, which will hopefully be used in Siri. There’s no guarantee, but I can’t imagine it would be too hard to add Bard to Google Assistant. Cortana, on the other hand, was canceled by Microsoft and is being replaced by Bing Chat. I believe Amazon is also winding down Alexa development.

    • Terrasque@infosec.pub · 1 year ago

      They’re working on it, but it takes time, especially making it reliable.

      The current crop of LLMs will happily answer with nonsense, or even do dangerous things.