A year-long commission has been launched by Mind to examine AI and mental health after a Guardian investigation exposed how Google’s AI Overviews, which are shown to 2 billion people each month, gave people “very dangerous” mental health advice.
When the 'health professionals' are pushing the 'trans child' concept and telling men 'Of course we can surgically make you into a realistic woman!' and vice versa, could it really be more dangerous?
“I set myself and my team of mental health information experts at Mind a task: 20 minutes searching using queries we know people with mental health problems tend to use. None of us needed 20.

“Within two minutes, Google had served AI Overviews that assured me starvation was healthy. It told a colleague mental health problems are caused by chemical imbalances in the brain. Another was told that her imagined stalker was real, and a fourth that 60% of benefit claims for mental health conditions are malingering. It should go without saying that none of the above are true.”
Well, you'd have to be mental to take as gospel anything you read on the Inter...
Oh, right!
“In each of these examples we are seeing how AI Overviews are flattening information about highly sensitive and nuanced areas into neat answers.”
Which is, after all, what the users want. They don't want to be told 'No, you can't have a functioning penis because you were born a girl!'...
“People need and deserve access to constructive, empathetic, careful and nuanced information at all times.”
Then start giving it to them, and stop lying to them, and they won't need to use AI.