• IntolerantModerate@alien.topB · 1 year ago

    The GPU limitations/requirements to go to the next level may also put a practical ceiling on things. Could you even run a model that was 10x larger than GPT-4 without breaking the bank?

  • onyxengine@alien.topB · 1 year ago

    Every time someone makes this claim, AI leaps 30 years ahead of the consensus timeline of expected capabilities. AGI will probably be here before 2030.

  • lpds100122@alien.topB · 1 year ago

    With all my respect to Bill, I really don’t understand why we should listen to him. The guy has absolutely no vision of the future!

    He was ridiculously blind to the WWW, blockchain technologies, smartphones, etc. Just let him live in peace in his mansion. He is not a visionary and never was.

    As for me personally, if I needed a real hero, I would rather listen to Steve Wozniak.

    • MyLittlePIMO@alien.topB · 1 year ago

      His blindness to the WWW is overstated. Blockchain has not had a huge effect on the world. And he was no longer CEO of Microsoft by the time smartphones happened; that was Steve Ballmer.

      I’m not a Gates fan, but you’re exaggerating the critique.

  • 0x00410041@alien.topB · 1 year ago

    Yes, Bill, that’s why we are now innovating around LLMs and adding other functionality to what they can be. The LLM is just one component of what people mean when we discuss AGI, which will be a combination of hundreds of interacting systems, each of which may be extremely powerful and complex on its own.

  • learn-deeply@alien.topB · 1 year ago

    Who cares what Bill Gates thinks? He doesn’t do research or programming anymore, nor does he interact closely with people who do.

  • AGM_GM@alien.topB · 1 year ago

    I have no problem with a plateau for a while. GPT-4 is already very powerful, and its use cases are far from fully explored in all kinds of fields. A plateau that gives people, businesses, and institutions some time to get their heads properly around the implications of this tech before the next breakthrough would likely be for the best.

  • xbimba@alien.topB · 1 year ago

    Did he forget that GPT is a computerized AI, not a human? It doesn’t grow old; it keeps getting smarter and smarter.

    • bgighjigftuik@alien.topB · 1 year ago

      But he is a billionaire! Billionaires are always right… That’s why they are billionaires… Right?

  • aluode@alien.topB · 1 year ago

    The magic will come from the interactions of millions of people with super-smart AI. People who could not program before will be able to, nurses can be as efficient as doctors, people who have no clue about fixing things can fix them with augmented-reality glasses, and so on. If GPT-4-level AI were widely adopted (which it is not), that alone could change society fundamentally.

    We are now in the adoption phase, and AIs, as they get better, will move more and more to the center of our lives, as cellphones did. Will they get better? I think the models will: they will get faster, they will gain memory and the ability to connect with the world through senses, and AI chips will make them vastly faster. The magic would happen even if they had plateaued, which they have not.

    • AllowFreeSpeech@alien.topB · 1 year ago

      Indeed. The more Reddit gangs up against alternative modes of thought, the more Reddit is likely to be wrong.