- Google’s autocomplete and related search features showed the names of victims involved in sexual assault cases, even though they are granted anonymity under UK law, an investigation by The Times found.
- The flaw means that typing a defendant’s name, plus some common search terms, into Google’s search bar can also surface the accuser’s name via autocomplete or related searches.
- The Times found at least four examples, which Google investigated and removed.
- Business Insider found a fifth example that Google is investigating.
- Google said it doesn’t permit autocomplete or related search results which violate the law.
LONDON – Anyone who accuses someone of sexual assault is automatically covered by UK law, which protects alleged victims from having their identities revealed. Lifelong anonymity applies even if a court finds the accused not guilty.
But Google’s autocomplete and related search features are inadvertently revealing the identities of women involved in high-profile sexual assault cases.
Both features are intended to make it easier for a person to find content relevant to their search terms. Google said it bases these predictions on what other people are searching for.
The Times first reported that typing certain terms relating to assault cases into Google, such as the accused’s name, also surfaced details of the alleged victim, such as their name and hometown. The searches worked the other way around too, the newspaper found, with searches for a victim’s name also bringing up the alleged attacker. The newspaper discovered four examples of Google associating alleged abusers with their victims.
Business Insider discovered a fifth example, from a high-profile sexual assault case in recent years, which we can’t name because doing so could identify the accuser. When we typed in the defendant’s name and the name of a well-known website, the woman’s name also appeared. We were able to replicate the example on several devices whose search histories could not have influenced the results.
Google’s explanation of how autocomplete works shows how humans can manipulate seemingly neutral algorithms. People who are intimately familiar with the details of a sexual assault case, including some who may have obtained the victim’s name illegally, can alter Google’s predictions simply by typing that information into search. A user who is less familiar with the case and who then looks it up, using terms such as the defendant’s name, can stumble across identifying information without really trying.
Google told The Times it had deleted the four instances discovered by the newspaper. It told Business Insider it was investigating the fifth example.
A spokeswoman told Business Insider: “We don’t allow these kinds of autocomplete predictions or related searches that violate laws or our own policies and we remove examples when we’re made aware of them. We recently expanded our removals policy to cover predictions which disparage victims of violence and atrocities, and we encourage people to send us feedback about any sensitive or bad predictions.”
The company also said it cooperated with courts to try to prevent sensitive terms from appearing in search.