12/08/2022 | Press release | Distributed by Public on 12/08/2022 03:56
Tech giant's convoluted systems retraumatising women and girls when they try to remove explicit content from internet
When reporting content, users must tick a box saying they recognise they can be punished if a submission is untrue
'Google must do more to prevent the spread of online gender-based violence - not just in Korea, but everywhere' - Jihyun Yoon
Survivors of online sexual abuse in South Korea have told Amnesty International that Google's slow and convoluted system for processing content takedown requests has added to their suffering and in some cases re-traumatised them.
Women and girls targeted by digital sex crimes said the difficult and lengthy process for reporting non-consensual explicit content on Google meant that videos of sexual abuse have ended up proliferating online.
Google says non-consensual explicit content can be removed on request, but survivors and activists told Amnesty the company's reporting categories and procedures were confusing, the appropriate forms were difficult to find, and the forms contained ambiguous categories about the type of content being reported.

Amnesty interviewed four survivors of online gender-based violence along with six activists who have been supporting them. All the survivors reported that their physical and mental health were damaged, including feeling they needed to withdraw from society to avoid stigma. Hyun-jin, who went to the police after a non-consensual video of her containing sexually explicit content appeared online, told Amnesty she wrongly assumed the video would soon be deleted:
"When you are victimised like this, you have no idea what to do. I was looking at my phone and googling my name the whole day. I could barely sleep an hour a day, spending most of my time searching. I had constant nightmares, but reality itself was more of a nightmare. In order to delete videos, images and search keywords, I had to take hundreds of screenshots and report these to Google.
"I couldn't ask someone else to do all this for me because I had to attach these harmful images depicting myself when I reported them. I had to face it all alone. It was so easy [for the perpetrator] to upload a video, but it took months to get it removed."
Even after targeted individuals have managed to submit a complaint, problems have continued, with Google failing to keep people informed about the progress of their claims. Amnesty surveyed 25 survivors and activists; all 11 of those who made complaints via Google said it was difficult to confirm whether their requests had been properly processed. Hyun-jin* waited more than a year between receiving a confirmation notice from Google and finally being informed of the outcome of a series of removal requests.
Google also appears not to take survivors' trauma into account when designing its takedown procedures. When reporting content, users must tick a box saying they recognise they can be punished if their submission is untrue, while Google says it will not process incomplete complaints. Hyun-jin said these guidelines heightened her anxiety:
"I submitted it with difficulty, but rather than being convinced that it would be deleted, I became more anxious because I thought that if it didn't work, it would be my responsibility."
Hyun-jin has since prepared a response template, which she has shared with other survivors to help them make removal requests.
One of Google's reporting forms also requires a photo ID to be attached when submitting a report, failing to take into account the fact that survivors who have had explicit material distributed without their consent fear sharing their image online. Dan, from the activist group Team Flame, said: "asking survivors to post photo IDs online, where videos of victims are circulating, is nothing short of traumatic".

Amnesty has today launched a global petition calling on Google to address the serious flaws in its reporting system. Jihyun Yoon, Amnesty International Korea's Director, said:
"As a wave of digital sex crimes in South Korea causes severe harm to the women and girls who have been targeted, Google's inadequate system for reporting non-consensual explicit content is making matters even worse."Google must do more to prevent the spread of online gender-based violence - not just in Korea, but everywhere.Survivors around the world are forced to use this same flawed reporting system when they try to get harmful content removed, so it is highly likely this issue extends way beyond Korea."These women have no choice but to put their faith in removal requests to tech companies, despite the painful process of having to repeatedly search for and collect the non-consensual explicit content they are featured in.
"When these requests are not processed quickly and the abusive content can be re-distributed at any time, survivors are exposed to prolonged physical and mental harm."Systems contributing to digital sex crime rise
In March 2020, a group of South Korean journalists exposed eight secret chat rooms on the messaging app Telegram, where thousands of videos of women and girls containing explicit non-consensual sexual content were being sold using cryptocurrency. Korean police said more than 60,000 people participated in the crimes by entering these rooms, collectively known as the "Nth Room" case. In October 2021, one of the chat operators was sentenced to 42 years in jail.
Recent criminal cases show that perpetrators have threatened survivors with existing video content to force them into producing more sexually abusive material. This shows that unless the content and survivors' personal information are deleted, women and girls will continue to be subjected to further harm or crimes even when the original perpetrators are punished.
Google response

Google's own human rights policy contains a commitment to "upholding the standards established in the United Nations Guiding Principles on Business and Human Rights", which require companies to respect human rights and to address any violations.

On 11 November, Amnesty wrote to Google requesting a response to its findings. Google did not provide an official response but stated in a private meeting that the issue is important and that the company wants to improve how it responds to it.
*All survivors' real identities have been protected at their request.