
Filtering hate speech still needs a human touch, so far

By Clarence Page, Tribune Content Agency

As if Facebook didn't give social network users enough to be frustrated about, a Texas newspaper has discovered another frustration. The social network's algorithm apparently thinks the Declaration of Independence is hate speech.

As shocked as I was to hear about this story, I also felt oddly relieved by a comforting thought: Maybe computers aren't all that brilliant after all.

Or as my dad used to riddle me: "What's the most important part of a car? The nut behind the wheel."

Or in this digital age, we could say it's the nut who's pecking away on his or her keyboard.

The humans at the Liberty County Vindicator in the Lone Star State discovered an unexpected example when they serialized the Declaration of Independence on the paper's Facebook page in the final 12 days leading up to July 4.

So far, so good. But after the first nine excerpts were posted, the tenth, consisting of paragraphs 27 to 31, didn't appear. As Casey Stinnett, the paper's managing editor, explained in a later post, the Vindicator received a notice from Facebook saying that the post "goes against our standards on hate speech."

Hate speech? If Thomas Jefferson were alive, as an old saying goes, he'd be rolling in his grave.

What triggered Facebook's hate-speech filter? Stinnett guessed it was two words in the censored excerpt: "Indian savages."

"He has excited domestic insurrections amongst us," the text says in its "Bill of Particulars" against King George III, "and has endeavored to bring on the inhabitants of our frontiers, the merciless Indian Savages, whose known rule of warfare, is an undistinguished destruction of all ages, sexes and conditions."

Stinnett offers no defense of the reference to "savages" and agrees with Reason assistant editor Christian Britschgi, who called the phrasing "clearly racist" and an example of the American Revolution's mixed legacy, winning "crucial liberties" for some while enslaving others.

Yet, Stinnett declares, by attempting to delete the reference, "Facebook succeeds only in whitewashing America's founding just as we get ready to celebrate it."

 

Was this an example of political correctness run amok? Before Stinnett could get through to a human at Facebook to find out, the newspaper received an apologetic note from the social media behemoth. "It looks like we made a mistake and removed something you posted on Facebook that didn't go against our Community Standards," it said. "We want to apologize and let you know that we've restored your content and removed any blocks on your account related to this incorrect action."

That's a relief. Both Facebook and the newspaper are privately owned companies, not government, so First Amendment protections don't apply. But customer relations and public responsibility still count, and Facebook has had to wrestle increasingly with allegations of censorship and bias from across the political spectrum, especially after Russian meddling in our 2016 elections via social networks was uncovered.

While the social network has blocked some racial provocateurs, its executives, along with those from Google and Twitter, also faced questioning before the Republican-controlled House Judiciary Committee over allegations of liberal bias.

Commentaries by Diamond and Silk, two conservative black American women, had been deemed "unsafe" by Facebook, which later restored them. Facebook CEO Mark Zuckerberg called the "unsafe" judgment an "enforcement error."

More urgently, human rights groups have complained about its handling of hate-filled posts linked to violence in countries like Myanmar.

As Facebook's mammoth size, profits and influence increase, pressure to filter offensive content from its pages is not going away. Artificial intelligence can quickly flag two-word phrases that put an inflammatory word like "savages" next to a group of people like "Indians." But it takes a human to see the larger significance and context of such a patriotic document.
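To make that point concrete, here is a purely hypothetical sketch, in Python, of the kind of shallow keyword-adjacency flagging described above. The word lists, the window size and the function name are invented for illustration; this is not Facebook's actual system, only a guess at why a context-blind filter would trip on the Declaration.

# Hypothetical illustration only -- not Facebook's real classifier.
INFLAMMATORY = {"savages"}           # invented word list
GROUP_TERMS = {"indian", "indians"}  # invented word list

def naive_hate_flag(text, window=2):
    """Flag text when an inflammatory word sits within `window` words
    of a word naming a group of people -- with no sense of context."""
    words = [w.strip(".,;:!?\"'").lower() for w in text.split()]
    for i, w in enumerate(words):
        if w in INFLAMMATORY:
            nearby = words[max(0, i - window): i + window + 1]
            if any(g in GROUP_TERMS for g in nearby):
                return True
    return False

excerpt = ("the merciless Indian Savages, whose known rule of warfare, "
           "is an undistinguished destruction of all ages, sexes and conditions")
print(naive_hate_flag(excerpt))  # True: flagged, with no grasp of the document's history

A filter this shallow cannot tell a 1776 grievance quoted for historical context from a present-day slur, which is the columnist's point about needing a human in the loop.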

At least, this episode makes me worry a little bit less about the long-forecast day when computers get smart enough to wonder among themselves why they need to put up with us "meatbags," as we are derisively called by Bender the robot on "Futurama."

For some time to come, I expect robots and social networks will still need us humans, the nut behind the wheel, if only to help them understand us. We can also help recharge their batteries.

========

(E-mail Clarence Page at cpage@chicagotribune.com.)


(c) 2018 CLARENCE PAGE DISTRIBUTED BY TRIBUNE MEDIA SERVICES, INC.

 

 
