Autocomplete is that nifty little tool which, by predicting what you’re searching for, allows you to save a few keystrokes here and there. Of course, that’s when it’s functioning properly. Gone awry, it can alienate millions, expose the dark underbelly of society, and repulse users. Just ask Google.
Facebook is the latest tech giant to be bedeviled by an offensive autocomplete, though its case is more puzzling than most. Thursday night at 8:43 PM, @BennettJonah tweeted the following cryptic instructions:
Right now, go to your Facebook search bar and type: video of
and see what results show up.
— Jonah Bennett (@BennettJonah) March 16, 2018
At the time, doing so produced a list of some of the most grotesque titles imaginable. Here's a screenshot courtesy of CNN:
As you can see, they involve sex acts with, or violence against, "girls" or "lil girls." According to USA Today, the problem was fixed within a "couple of hours." But it didn't stop there. Bennett, being the hard-nosed journalist that he is, typed in the Spanish "video de…" only to find results suggesting "live sex videos." His tweets were picked up by all the major news outlets, including The New York Times.
By early the next morning, the problem seemed to have been fixed. In a statement to USA Today, Facebook reps expressed remorse, saying they were "very sorry this happened" and promising, "As soon as we became aware of these offensive predictions we removed them."
What's strange about the whole incident is how those results got there. According to Facebook, its autocomplete suggestions "are representative of what people may be searching for on Facebook." However, its reps also reiterated that the platform does "not allow sexually explicit imagery," meaning that whoever typed in these search terms either didn't understand how Facebook worked or was part of a large-scale plan to prank (for lack of a better word) the popular platform. The number of users it would take to push these search results to the top, along with the similarity of many of the terms (e.g., the repeated use of "girls"), suggests this was a coordinated effort. But why?