When I read the LaMDA transcripts of that Google employee (priest?) who tried to whistleblow that Google’s AI was sentient… I mean damn, I read those transcripts, it sounded real as hell.
Stochastic parrot or not, I think a significant part of my own consciousness goes towards predicting the next word in a given context.
IIRC, consciousness is really, really complicated. We might not be capable of knowing if we are LLMs in meatsuits. An LLM might be a highly vocal infant, and we simply wouldn’t have a great way of really making that judgement. Shit, we still can’t define consciousness in humans, or other animals, in any meaningful way.
Agreed. I think it’s a spectrum, and even a chair (an object that forms a feedback loop of forces with its surroundings that depends on its previous state) is conscious to a degree.
I don’t think there is a line.
There will be a legal line at some point.