
  • Alright, I’m tired of this AGI stuff going around. A bit of context: I have a master’s in generative AI and am currently pursuing a PhD in explainable NLP.

    ChatGPT, and LLMs in general, are not remotely close to being an AGI. The best they can do is construct a pseudo-representation of words’ meanings (which, if we consider words to be the main descriptor of our world, could count as a world representation).
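
    To make that concrete, here is a toy Python sketch of the idea. The words, vectors, and numbers are all made up for illustration; real models learn embeddings with thousands of dimensions from huge corpora, but the “similar meaning = similar vector” principle is the same.

    ```python
    import numpy as np

    # Toy 3-dimensional word vectors, invented for this example.
    # Real models learn thousands of dimensions from training data.
    embeddings = {
        "king":  np.array([0.9, 0.8, 0.1]),
        "queen": np.array([0.9, 0.9, 0.9]),
        "apple": np.array([0.1, 0.2, 0.8]),
    }

    def cosine_similarity(a, b):
        """Cosine of the angle between two vectors: closer to 1.0 = more similar."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Related words end up with similar vectors, unrelated ones don't.
    print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~0.86
    print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~0.33
    ```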

    They then use this word representation and try to find the closest ones that make sense together. It is essentially like counting from 1 to 5 and thinking: I see that the closest number after 1 is 2, and so on.

    Granted, they have a really good representation of our language, and that is what makes them so believable. But in reality they don’t “think”; they just compute distances in a really smart and complex manner.
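
    Here is a second toy sketch of what that “computing distances” can look like at the output step: score every word in the vocabulary against a context vector, softmax the scores into probabilities, and take the likeliest next word. Again, the context vector, the three-word vocabulary, and all the numbers are hypothetical; real models do this over tens of thousands of tokens with learned vectors.

    ```python
    import numpy as np

    # Hypothetical context vector the model might produce for "the cat sat on the".
    context = np.array([0.7, 0.2, 0.5])

    # Toy output embeddings for a tiny made-up vocabulary.
    vocab = {
        "mat":   np.array([0.8, 0.1, 0.6]),
        "moon":  np.array([0.1, 0.9, 0.2]),
        "table": np.array([0.6, 0.3, 0.4]),
    }

    # Score each candidate by how well it fits the context (dot product)...
    scores = {word: float(np.dot(context, vec)) for word, vec in vocab.items()}

    # ...then turn the scores into probabilities with a softmax.
    exps = {word: np.exp(s) for word, s in scores.items()}
    total = sum(exps.values())
    probs = {word: e / total for word, e in exps.items()}

    print(probs)                      # "mat" gets the highest probability
    print(max(probs, key=probs.get))  # -> "mat"
    ```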

    However, one philosophical aspect that resonates with LLMs is how we represent the world around us. Is it only through words? But then our representation is tied to language, which differs from person to person. How did we represent our world without words?