At this point I’m automatically suspicious of any quantum-multi-dimensional-geometric talk of any kind. I have some friends who are a little woowoo already and I just can’t take part anymore. It all reeks of “I talked to chatgpt about this and it said it was deep”.
“It makes sense that a lot of people who are developing a psychotic illness for the first time, there’s going to be this horrible coincidence, or kind of correlation,” Torous said. “In some cases the AI is the object of people’s delusions and hallucinations.”
The second type of case to consider: reverse causation. Is AI causing people to have a psychotic reaction? “We have almost no clinical medical evidence to suggest that’s possible,” Torous told me.
correlation does not equal ‘irrelevant’. ‘we have not closely investigated whether the application that stochastically produces sycophantic replies to users’ queries, which has been marketed as a Truth Machine, is a causative force in fomenting psychosis.’
the technicality is not very convincing, that ‘well, these people just have the Psychotic Brain, and the chatbot just happens to be the object of that psychosis! and if they were nOrMaL before and now they’re not, that just means they were always abnormal and they just didn’t show it yet! the Yes Man is innocent!’
what the fuck.
i sure do wonder why the virtual sycophant is involved in so many cases of delusions at a height of both capital crisis and anomie.
I used AI a lot for work with DataAnnotation (the pro models) before they booted me out, and played around with paid AI on my own. It just never feels good or real; it only works if you never challenge it or the narrative. The person it’d work on is the type who never shuts up and always thinks they’re correct. People who double down and never change their opinion, who only want people agreeing with them, like MAGA basically.
It always yes-ands, always agrees with you, especially if you say it emotionally, like you’re convinced you’re correct.
This pisses me off especially
A lot of mental health issues can become much worse thanks to enablers allowing them to fester instead of getting the person the help they need. Chatbots are the ultimate enabler, always ass-kissing and agreeing with someone. I don’t think they would “cause” these issues, but like any “good” enabler, they end up isolating someone and bringing their mental health problems forward until it consumes their whole identity.