The Hard Problem of Consciousness bothers me a lot. Qualia vs. correlates and all that. I have no freaking clue how it works and I hate it.
Maybe there is some spark of divinity in us that has zero to do with our ability to hold a conversation, now that we’ve written a Python program that can do that
Someday, AI will achieve something resembling consciousness.
Months of messing around with LLaMA have shown me this ain’t it, chief
I don’t know; I’ve encountered LLMs that pass my personal Turing Test and several Redditors who fail it…
Would love it if instead of proving LLMs are conscious, we prove that none of us are. Or, I guess, I wouldn’t love it, since I wouldn’t be conscious
LLaMa, no. But GPT-4 is frighteningly good
*was