• 👁️👄👁️@lemm.ee · 1 year ago

    I get what you mean, but there’s currently nothing planned or in the works to run local AI on phones, and they’re still way too demanding. Phones can’t even handle a 7B model if we’re talking about RAM usage alone.
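    As a rough illustration of the RAM point (a back-of-the-envelope sketch, not a benchmark): the weights alone for a 7B-parameter model take roughly parameter count × bytes per parameter, before counting the KV cache, activations, or the OS itself.

    ```kotlin
    // Rough memory estimate for holding a 7B-parameter model's weights in RAM.
    // Illustrative only: real usage adds KV cache, activations, and OS overhead.
    fun weightMemoryGiB(params: Double, bytesPerParam: Double): Double =
        params * bytesPerParam / (1024.0 * 1024.0 * 1024.0)

    fun main() {
        val params = 7.0e9
        println("fp16  (2.0 B/param): %.1f GiB".format(weightMemoryGiB(params, 2.0)))
        println("int8  (1.0 B/param): %.1f GiB".format(weightMemoryGiB(params, 1.0)))
        println("4-bit (0.5 B/param): %.1f GiB".format(weightMemoryGiB(params, 0.5)))
    }
    ```

    Even the aggressively quantized 4-bit figure (~3.3 GiB) is a big slice of the RAM in a typical phone.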

    • Deiskos@lemmy.world · 1 year ago

      Futureproofing, I guess, but also I’m sure there are things other than language models that can benefit from a dedicated processing unit, like photo processing, smaller models, Lens, etc.
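      For example, an app can point a small on-device model at the phone’s accelerator through TensorFlow Lite’s NNAPI delegate. A minimal sketch, assuming the `org.tensorflow:tensorflow-lite` dependency and a hypothetical classifier-style `.tflite` model (the input shape and the 1000-class output are assumptions, not a real model):

      ```kotlin
      import org.tensorflow.lite.Interpreter
      import org.tensorflow.lite.nnapi.NnApiDelegate
      import java.io.File

      // Runs a small (hypothetical) classification model, asking TFLite to offload
      // supported ops to the phone's NPU/DSP via the NNAPI delegate.
      fun classify(modelFile: File, pixels: FloatArray): FloatArray {
          val nnapi = NnApiDelegate()
          val options = Interpreter.Options().addDelegate(nnapi)
          val interpreter = Interpreter(modelFile, options)

          val input = arrayOf(pixels)              // shape [1, N]; must match the model
          val output = arrayOf(FloatArray(1000))   // e.g. 1000 class scores; model-dependent
          interpreter.run(input, output)

          interpreter.close()
          nnapi.close()
          return output[0]
      }
      ```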

    • newIdentity@sh.itjust.works · 1 year ago (edited)

      Actually, speech-to-text is done locally on newer Pixel devices. So are audio recognition, the camera processing, and a lot of other AI features.

      AI isn’t just ChatGPT, and basically every device from the last 4 years has a dedicated AI chip.
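      On the speech-to-text point, Android exposes this directly: a minimal sketch (API 31+, `RECORD_AUDIO` permission assumed; the `onText` callback is a hypothetical helper) that uses the on-device recognizer, so recognition runs locally:

      ```kotlin
      import android.content.Context
      import android.content.Intent
      import android.os.Bundle
      import android.speech.RecognitionListener
      import android.speech.RecognizerIntent
      import android.speech.SpeechRecognizer

      // Starts local dictation if the device supports on-device recognition.
      fun startLocalDictation(context: Context, onText: (String) -> Unit) {
          if (!SpeechRecognizer.isOnDeviceRecognitionAvailable(context)) return

          val recognizer = SpeechRecognizer.createOnDeviceSpeechRecognizer(context)
          recognizer.setRecognitionListener(object : RecognitionListener {
              override fun onResults(results: Bundle) {
                  results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                      ?.firstOrNull()
                      ?.let(onText)
              }
              // Remaining callbacks left empty for brevity.
              override fun onReadyForSpeech(params: Bundle?) {}
              override fun onBeginningOfSpeech() {}
              override fun onRmsChanged(rmsdB: Float) {}
              override fun onBufferReceived(buffer: ByteArray?) {}
              override fun onEndOfSpeech() {}
              override fun onError(error: Int) {}
              override fun onPartialResults(partialResults: Bundle?) {}
              override fun onEvent(eventType: Int, params: Bundle?) {}
          })

          val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH)
              .putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                        RecognizerIntent.LANGUAGE_MODEL_FREE_FORM)
          recognizer.startListening(intent)
      }
      ```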