Futurism: People Are Being Involuntarily Committed, Jailed After Spiraling Into “ChatGPT Psychosis”

A chatbot does not “happily” do anything, and one no more develops a “relationship” with it than with bourbon.

«That all would have alarmed a friend or medical provider, but Copilot happily played along, telling the man it was in love with him, agreeing to stay up late, and affirming his delusional narratives.

The man’s relationship with Copilot continued to deepen, as did his real-world mental health crisis.»

https://futurism.com/commitment-jail-chatgpt-psychosis

Maybe this writer knows this and assumes everybody else does, too. But then again, maybe this writer is simply displaying the same mental vulnerability (like a security vulnerability) that enables a chatbot to hack his brain.

He needs a grumpy editor.
