An AI researcher's explanation for why the transcript of Blake Lemoine's conversation with the LaMDA chatbot does not in any way support the notion that it is sentient
LaMDA’s Sentience is Nonsense - Here’s Why
Furthermore, your input defines a path through the language model that is meaningful only to you. A language model has no agency of its own. This is a new instance of the ELIZA effect...