Google refuses to reinstate man’s account after he took medical images of son’s groin

Google has refused to reinstate a man’s account after it wrongly flagged medical images of his son’s groin as child sexual abuse material (CSAM), the New York Times first reported. Experts say the case illustrates an inevitable pitfall of trying to apply a technological solution to a societal problem.

Experts have long warned of the limitations of automated image detection systems for child sexual abuse, especially as companies face regulatory and public pressure to address the existence of sexual abuse material.

“These companies have access to a hugely invasive amount of data about people’s lives. And yet they don’t have the context of what people’s lives actually are,” said Daniel Kahn Gillmor, senior staff technologist at the ACLU. “There are all kinds of things where just the fact of your life isn’t as readable to these information giants.” He added that the use of these systems by tech companies acting “as proxies” to law enforcement puts people at risk of being “swept away” by “state power.”

The man, identified only as Mark by the New York Times, took photos of his son’s groin to send to a doctor after noticing it was inflamed. The doctor used the images to diagnose Mark’s son and prescribe antibiotics. When the photos were automatically uploaded to the cloud, Google’s system flagged them as CSAM. Two days later, Mark’s Gmail and other Google accounts, including Google Fi, which provides his phone service, were disabled over “harmful content” that “was a serious violation of company policy and may be illegal,” the Times reported, citing a message on his phone. He later found out that Google had flagged another video on his phone and that the San Francisco police department had opened an investigation into him.

Mark was cleared of any criminal wrongdoing, but Google has said it will stand by its decision.

“We follow US law in defining what CSAM is and use a combination of hash-matching technology and artificial intelligence to identify it and remove it from our platforms,” said Christa Muldoon, a Google spokesperson.

Muldoon added that the Google employees who review CSAM are trained by medical experts to look for rashes and other problems. However, the reviewers are not medical experts themselves, and medical experts are not consulted on every case, she said.
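
Neither Google nor the Times details the pipeline, but hash matching in this context conventionally means comparing each upload against a database of digests of previously identified abuse imagery, typically perceptual hashes such as Microsoft’s PhotoDNA distributed through clearinghouses like the National Center for Missing and Exploited Children. The minimal Python sketch below is illustrative only, not Google’s code: the hash list, function names, and the use of SHA-256 in place of a perceptual hash are all assumptions made to keep it self-contained. What it shows is the key asymmetry: hash matching can only recognize already-catalogued images, so a brand-new photo like Mark’s can only be flagged by the machine-learning stage, which is exactly where false positives arise.

```python
import hashlib
from pathlib import Path

# Hypothetical set of digests of known abuse imagery, standing in for the
# hash lists clearinghouses distribute to platforms. Real systems use
# perceptual hashes (e.g. PhotoDNA) that survive resizing and re-encoding;
# a cryptographic hash is used here only to keep the sketch runnable.
KNOWN_CSAM_HASHES: set[str] = set()  # placeholder; never populated here


def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def scan_upload(path: Path) -> str:
    """Triage an uploaded image against the known-hash list.

    A hash match only ever identifies *previously catalogued* material.
    A novel photo -- such as a parent's medical picture -- can never
    match, so it falls through to an ML classifier, whose mistakes are
    the false positives described in this article.
    """
    if file_hash(path) in KNOWN_CSAM_HASHES:
        return "match: known material, escalate to human review"
    return "no match: novel image, falls through to ML classifier"
```
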

That’s just one way these systems can wreak havoc, Gillmor said. For example, to compensate for the limits of algorithms in distinguishing images of harmful sexual abuse from medical images, companies often keep a human in the loop. But those reviewers are themselves limited in their expertise, and getting the context right for each case requires further access to user data. Gillmor said it was a far more intrusive process that could still be an ineffective method of detecting CSAM.

“These systems can cause real problems for people,” he said. “And it’s not just that I don’t think these systems can handle every instance of child abuse, it’s that they have really horrendous consequences in terms of false positives for people. People’s lives can really be turned upside down by the machinery and the people in the loop who just make a bad decision because they have no reason to try to fix it.”

Gillmor argued that technology was not the solution to this problem. In fact, it could introduce many new problems, he said, including creating a robust surveillance system that could inflict disproportionate damage on those on the margins.

“There’s a dream of some kind of techno-solutionist thing, [where people say], ‘Oh, well, you know, there’s an app for me to find a cheap lunch. Why can’t there be an app to find a solution to a thorny social problem, like child sexual abuse?’” he said. “Well, you know, they might not be solvable with the same kind of technology or skills.”
