Lost in AI Translation: Facebook Flub of Palestinian’s Good Morning Post Leads to His Arrest
A flubbed translation by Facebook led to a Palestinian man’s arrest, after the Arabic for ‘good morning’ appeared to instead say, ‘attack them.’
Posting a simple “good morning” to Facebook got a Palestinian man arrested near Jerusalem, after the social media colossus’ artificial intelligence-driven translation service horrendously flubbed the meaning of the simple greeting, which accompanied a self-portrait of him leaning against a bulldozer at his construction job, rendering it as the altogether alarming “attack them.”
Israeli law enforcement waited until later in the day to arrest the publicly unnamed man over concerns he indeed intended to carry out an attack, reports Haaretz, but this wasn’t a simple arrest and release. Facebook’s feckless error turned the man’s wholly innocuous gesture into a harrowing ordeal of assumed guilt over innocence.
Israeli police “questioned him for several hours, suspicious he was planning to use the pictured bulldozer in a vehicle attack, before realising their mistake,” The Guardian reports. “At no point before his arrest did any Arabic-speaking officer read the actual post.”
Facebook managed a typically blithe apology and a promise to investigate what went wrong, asserting in a statement to Gizmodo,
“Unfortunately, our translation systems made an error last week that misinterpreted what this individual posted. Even though our translations are getting better each day, mistakes like these might happen from time to time and we’ve taken steps to address this particular issue. We apologize to him and his family for the mistake and the disruption this caused.”
Indeed, a mangled translation on social media would not ordinarily bring with it more than a chuckle, reddened cheeks, or, perhaps, in extreme incidents, a sincere apology for the error. But should the seriousness of this man’s predicament — intense interrogation, assumption of guilt, and an hours-long detention by authorities — truly be glibly written off as simple machine error?
According to the Guardian, the man’s caption for his pose against the bulldozer was the Arabic “yusbihuhum (يصبحهم),” which translates to “good morning.” But Facebook’s translation, courtesy of the artificial intelligence-powered service it implemented after breaking with Microsoft’s Bing last year, instead falsely rendered the message as “hurt them” in English, or “attack them” in Hebrew.
Authorities felt compelled to detain the innocent construction worker for interrogation on the premise that he was plotting to use the pictured heavy machinery to attack civilians in the West Bank settlement of Beitar Illit.
Worst of all, no one who speaks Arabic — police officer, ombudsman, or otherwise — bothered to read the original caption in question prior to the man’s arrest.
“English transliteration used by Facebook is not an actual word in Arabic but could look like the verb ‘to hurt,’” Haaretz explains, “even though any Arabic speaker could clearly see the transliteration did not match the translation.”
That makes Facebook’s mistake a matter of grave concern for people in similar situations, in which arguably hostile governments training a hawk’s eye on civilian social media posts, under the ostensible premise of ferreting out lone wolf and organized group threats, can and do arrest people who have yet to commit an actual crime.
In fact, the United States now actively reviews immigrant and visitor social media accounts as part of its notoriously strict entrance requirements. If Customs and Border Protection and Homeland Security also rely on Facebook’s artificial intelligence translator to perform the task accurately, it would be safe to imagine at least a few innocent people being turned away.
Admittedly, like its American ally, the Israel Defense Forces have been forthright about their continued perusal of Facebook and other social media in pursuit of potential ‘lone wolf’ attackers; thus, the man’s pose with heavy machinery similar to what has been used in past attacks set off alarm bells.
Except, it shouldn’t have.
An utterly banal post to Facebook nearly cost one man his freedom. The numbing simplicity of the misunderstanding is a knee-weakening reminder that government is always watching and poised to pounce, even when the absence of any transgression is lost in the absurdity of bad translation.
Typos, corrections and/or news tips? Email us at Contact@TheMindUnleashed.com