In early 2022, two Google policy staffers met with a trio of women victimized by a scam that resulted in explicit videos of them circulating online, including via Google search results. The women were among the hundreds of young adults who responded to ads seeking swimsuit models only to be coerced into performing in sex videos distributed by the website GirlsDoPorn. The site shut down in 2020, and a producer, a bookkeeper, and a cameraman subsequently pleaded guilty to sex trafficking, but the videos kept popping up on Google search faster than the women could request removals.
The women, joined by an attorney and a security expert, presented a bounty of ideas for how Google could keep the criminal and demeaning clips better hidden, according to five people who attended or were briefed on the virtual meeting. They wanted Google search to ban websites devoted to GirlsDoPorn and videos with its watermark. They suggested Google could borrow the 25-terabyte hard drive on which the women's cybersecurity consultant, Charles DeBarber, had saved every GirlsDoPorn episode, take a mathematical fingerprint, or "hash," of each clip, and block them from ever reappearing in search results.
The two Google staffers in the meeting hoped to use what they learned to win more resources from higher-ups. But the victims' attorney, Brian Holm, left feeling dubious. The policy team was in "a tough spot" and "didn't have authority to effect change within Google," he says.
His gut reaction was right. Two years later, none of the ideas raised in that meeting has been enacted, and the videos still come up in search.
WIRED has spoken with five former Google employees and 10 victims' advocates who have been in communication with the company. All say they appreciate that, because of recent changes Google has made, survivors of image-based sexual abuse such as the GirlsDoPorn scam can more easily and successfully remove unwanted search results. But they are frustrated that management at the search giant hasn't approved proposals, such as the hard drive idea, that they believe would more fully restore and preserve the privacy of millions of victims around the world, most of them women.
The sources describe previously unreported internal deliberations, including Google's rationale for not using an industry tool called StopNCII that shares information about nonconsensual intimate imagery (NCII) and the company's failure to demand that porn websites verify consent to qualify for search traffic. Google's own research team has published steps that tech companies can take against NCII, including using StopNCII.
The sources believe such efforts would better contain a problem that's growing, in part through widening access to AI tools that create explicit deepfakes, including ones of GirlsDoPorn survivors. Overall reports to the UK's Revenge Porn hotline more than doubled last year, to roughly 19,000, as did the number of cases involving synthetic content. In a recent survey of more than 2,000 Brits, half worried about being victimized by deepfakes. The White House in May urged swifter action by lawmakers and industry to curb NCII overall. In June, Google joined seven other companies and nine organizations in announcing a working group to coordinate responses.
Right now, victims can demand prosecution of abusers or pursue legal claims against websites hosting content, but neither of those routes is guaranteed, and both can be costly due to legal fees. Getting Google to remove results can be the most practical tactic and serves the ultimate goal of keeping violative content out of the eyes of friends, hiring managers, potential landlords, or dates, nearly all of whom are likely to turn to Google to look someone up.
A Google spokesperson, who requested anonymity to avoid harassment from perpetrators, declined to comment on the call with GirlsDoPorn victims. She says combating what the company refers to as nonconsensual explicit imagery (NCEI) remains a priority and that Google's actions go well beyond what is legally required. "Over the years, we've invested deeply in industry-leading policies and protections to help protect people affected by this harmful content," she says. "Teams across Google continue to work diligently to bolster our safeguards and thoughtfully address emerging challenges to better protect people."