Uses a tool the wrong way, despite it being public knowledge that it's bad for mental health
Was predisposed to mental health problems
Died, partly because they talked to a chatbot
"It's the chatbot creator's fault", despite the chatbot never being made to cause those problems, and efforts being made to fix those problems
...
Yea nah, it's just anti-ai people doing their thing again and not being objective.
Get a better fight, such as hating on pharmaceutical companies pushing the use of extremely addictive substances for profit, despite them knowing the immense risk they pose to consumers, and financing false ads to make them look safe.
If Sam Altman belongs in prison, it would be for either:
Because he's destroying the planet (ecologically)
Because he stole lots of content to train his models
What it’s doing is sorting reviews by language, under the idea that reviews written in the language you use are probably more relevant to you.
It already did that, but only when viewing reviews. The update just makes it so that the overall score (e.g. "Overwhelmingly Positive") is now based only on reviews in your language.
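The idea above can be sketched roughly like this. Note this is a hypothetical illustration, not Steam's actual code, and the label thresholds and data shape are assumptions for the sake of the example:

```python
# Hypothetical sketch of computing a review score from same-language reviews only.
# Field names and threshold values are illustrative assumptions, not Steam's real ones.

def review_score(reviews, user_language):
    """Summarize only the reviews written in the user's language."""
    relevant = [r for r in reviews if r["language"] == user_language]
    if not relevant:
        return "No reviews in your language"
    positive = sum(1 for r in relevant if r["recommended"])
    ratio = positive / len(relevant)
    # Assumed cutoffs, purely for illustration
    if ratio >= 0.95:
        return "Overwhelmingly Positive"
    if ratio >= 0.80:
        return "Very Positive"
    if ratio >= 0.70:
        return "Mostly Positive"
    if ratio >= 0.40:
        return "Mixed"
    return "Mostly Negative"

reviews = [
    {"language": "english", "recommended": True},
    {"language": "english", "recommended": True},
    {"language": "french", "recommended": False},
]
print(review_score(reviews, "english"))  # the French review is ignored
```

The point is just that the filtering step happens before the score is computed, so two users with different language settings can see different summary labels for the same game.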