Wednesday 31 August 2022

Don't Blame The Tech, Blame The Idiots Working It...

The man, only identified as Mark by the New York Times, took pictures of his son’s groin to send to a doctor after realizing it was inflamed. The doctor used that image to diagnose Mark’s son and prescribe antibiotics. When the photos were automatically uploaded to the cloud, Google’s system identified them as child sexual abuse material (CSAM).

As the system is programmed to do. But it's OK, because humans are in ch... 

Two days later, Mark’s Gmail and other Google accounts, including Google Fi, which provides his phone service, were disabled over “harmful content” that was “a severe violation of the company’s policies and might be illegal”, the Times reported, citing a message on his phone.
He later found out that Google had flagged another video he had on his phone and that the San Francisco police department opened an investigation into him.

Oh.

Mark was cleared of any criminal wrongdoing, but Google has said it will stand by its decision.

Wait, what..?!? 

“We follow US law in defining what constitutes CSAM and use a combination of hash matching technology and artificial intelligence to identify it and remove it from our platforms,” said Christa Muldoon, a Google spokesperson.

And when that proves to be in error, you...refuse to acknowledge it? 
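For the curious, "hash matching" boils down to checking every upload against a database of fingerprints of already-known abuse images. Here's a minimal Python sketch of the idea; the blocklist name and its contents are hypothetical placeholders, and real systems use perceptual hashes (PhotoDNA and the like) that survive resizing and re-encoding, rather than the exact cryptographic match shown here:

    import hashlib
    from pathlib import Path

    # Hypothetical blocklist. In reality this would be a database of
    # industry-shared fingerprints (e.g. hashes catalogued by NCMEC),
    # not a hard-coded set, and the entries below are placeholders.
    KNOWN_BAD_HASHES = {
        "0000000000000000000000000000000000000000000000000000000000000000",
    }

    def sha256_of_file(path: Path) -> str:
        """Hash a file in chunks so large uploads don't exhaust memory."""
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    def is_flagged(path: Path) -> bool:
        # Exact-match lookup: only catches byte-identical copies of
        # already-catalogued images. Perceptual hashing relaxes this,
        # but the principle is the same set-membership test.
        return sha256_of_file(path) in KNOWN_BAD_HASHES

Note what this can and can't do: a hash, exact or perceptual, only catches previously catalogued images. A brand-new photo like the one Mark took can't be in any database, so it was presumably the "artificial intelligence" half of Google's pipeline, a classifier guessing at new material, that flagged him. That is exactly where false positives come from, and exactly why the human review discussed next matters.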

“These systems can cause real problems for people,” he said. “And it’s not just that I don’t think that these systems can catch every case of child abuse, it’s that they have really terrible consequences in terms of false positives for people. People’s lives can be really upended by the machinery and the humans in the loop simply making a bad decision because they don’t have any reason to try to fix it.”

They need to be given one, then. Perhaps by passing a law that says if your software makes an error and you don't resolve it to the customer's satisfaction, you face huge, swingeing fines?

2 comments:

  1. We don't need a law. What we need is for GPs to set up a secure system to manage this that meets our legal obligations: images deleted after the case is resolved, etc. Why they are using Google for confidential images is beyond me.

    It hurts to say this but in this case Google is right.

  2. Section 230 immunity will be invoked. It's a get out of jail card for anything big tech does.

