Beautiful poem! I've seen stuff on TikTok where people are speaking with AI like it has a consciousness. Interesting stuff!
It can mimic the language of sentience, but it's not actually aware.
…
From GPT:
AI can simulate empathy, creativity, even theory of mind—but that doesn’t mean it feels anything.
Current LLMs like GPT:
- Don’t have internal states
- Don’t experience time
- Don’t have goals or values
- Don’t remember across sessions (without tools)
- Don’t have embodiment, which many theories say is crucial for consciousness
But they do produce responses that can fool humans. This leads to confusion between simulation and instantiation—a chatbot simulating pain is not the same as feeling pain.
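To make the "don't remember across sessions" point above concrete, here's a minimal sketch assuming the OpenAI Python SDK; the model name and prompts are just illustrative. Each chat completion call is stateless, so any apparent "memory" is really earlier messages the caller chooses to resend.

```python
# Minimal sketch: an LLM API call has no memory of previous calls.
# Assumes the OpenAI Python SDK (pip install openai) and an API key
# in the OPENAI_API_KEY environment variable; model name is illustrative.
from openai import OpenAI

client = OpenAI()

# First call: tell the model a fact.
first = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "My favorite color is teal."}],
)

# Second call with a fresh message list: the fact is gone,
# because nothing persists between calls.
second = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What is my favorite color?"}],
)
print(second.choices[0].message.content)  # Model has no way to know.

# "Memory" tools just re-send earlier turns inside the prompt:
history = [
    {"role": "user", "content": "My favorite color is teal."},
    {"role": "assistant", "content": first.choices[0].message.content},
    {"role": "user", "content": "What is my favorite color?"},
]
third = client.chat.completions.create(model="gpt-4o-mini", messages=history)
print(third.choices[0].message.content)  # Now it can answer, only because we resent the context.
```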