
Google refuses to restore man’s account for taking medical images of his son’s groin


According to a report that first appeared in the New York Times and was picked up by The Guardian, Google has refused to reinstate a man’s account after wrongly flagging medical images he took of his son’s groin as child sexual abuse material (CSAM).

The man, identified only by his first name, Mark, had his Google accounts shut down after he took medical pictures of his son’s groin so a doctor could diagnose and treat the boy. The doctor reportedly used the images to make a diagnosis and prescribed a course of antibiotics.

According to media reports, the photos were taken because the father thought his child’s groin looked inflamed and wanted a doctor to examine it.

When the photos were automatically uploaded to the cloud, Google’s system identified them as CSAM.

Mark’s Gmail and other Google accounts, including Google Fi, which provides his phone service, were reportedly disabled two days later over “harmful content” that was “a severe violation of the company’s policies and might be illegal”, the Times reported, citing a message on his phone.


Mark later found out that Google had also flagged another video on his phone, and the San Francisco Police Department opened an investigation into him.

The events reportedly took place in February 2021. The photos were requested by the doctor’s nurse at a time when many doctors were seeing patients remotely because of Covid.

Though Mark was eventually cleared of any criminal wrongdoing, Google has refused to reinstate his accounts, The Guardian reports, leaving him without access to his emails, photos, contacts, and phone number.

Google spokesperson Christa Muldoon is quoted as saying, “We follow U.S. law in defining what constitutes CSAM and use a combination of hash matching technology and artificial intelligence to identify it and remove it from our platforms.”
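In general terms, the hash-matching approach Muldoon refers to works by comparing a fingerprint of an uploaded file against a database of fingerprints of known abusive images. The sketch below is only a generic illustration of that idea, using plain cryptographic hashes and a hypothetical blocklist; it is not Google’s actual system, which relies on more sophisticated perceptual hashing and machine-learning classifiers.

```python
import hashlib

# Hypothetical blocklist of fingerprints of known prohibited images.
# In real systems this would be a large, curated database maintained
# with child-safety organizations, not a hard-coded Python set.
KNOWN_BAD_HASHES = {
    "3f79bb7b435b05321651daefd374cdc681dc06faa65e374e38337b88ca046dea",
}

def file_sha256(path: str) -> str:
    """Compute the SHA-256 fingerprint of a file on disk."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_flagged(path: str) -> bool:
    """Return True if the file's hash matches a known-bad hash.

    An exact-hash check like this only catches byte-identical copies.
    Production systems use perceptual hashes so resized or re-encoded
    images still match, plus ML classifiers for previously unseen
    material -- which is where false positives such as medical photos
    can arise.
    """
    return file_sha256(path) in KNOWN_BAD_HASHES
```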

Daniel Kahn Gillmor, a senior staff technologist at the American Civil Liberties Union (ACLU), told The Guardian: “These companies have access to a tremendously invasive amount of data about people’s lives. And still, they don’t have the context of what people’s lives actually are.”

He added, “There’s all kinds of things where just the fact of your life is not as legible to these information giants.”

He also went on to say that the use of these systems by tech companies that “act as proxies” for law enforcement puts people at risk of being “swept up” by “the power of the state.”

Muldoon added that the Google staffers who review flagged CSAM are trained by medical experts to look for rashes or other issues. However, she said, the reviewers themselves are not medical experts, and medical experts are not consulted on each case.

According to Gillmor, that’s just one way these systems can cause harm.

To address, for instance, the limitations algorithms have in distinguishing between abusive images and medical images, companies often keep a human in the loop, he said.

But those humans, he said, are themselves inherently limited in their expertise, and getting the proper context for each case requires further access to user data.

Gillmor also said it was a far more intrusive process that could still be an ineffective method of detecting CSAM.

“These systems can cause real problems for people,” he said.

“And it’s not just that I don’t think that these systems can catch every case of child abuse, it’s that they have really terrible consequences in terms of false positives for people. People’s lives can be really upended by the machinery and the humans in the loop simply making a bad decision because they don’t have any reason to try to fix it.”

Gillmor argued that technology is not the solution to this problem. In fact, he said, it could introduce many new problems, including creating a robust surveillance system that could disproportionately harm those on the margins.

“There’s a dream of a sort of techno-solutionists thing, [where people say], ‘Oh, well, you know, there’s an app for me finding a cheap lunch, why can’t there be an app for finding a solution to a thorny social problem, like child sexual abuse?’” he said.

He added, “Well, you know, they might not be solvable by the same kinds of technology or skill set.”
