I’ve written several times now about the problems that Facebook and other social media giants will have trying to use machine learning to police and weed out “hate speech,” so it’s not surprising that, as NBC News and other media reported, Facebook banned the Declaration of Independence just before July 4th. The Vindicator, “The oldest continuously printing news source in South Liberty County [Texas] since 1887,”
challenged its Facebook followers to read the Declaration of Independence. To make that short but formidable historic document a little easier to digest, the newspaper broke the Declaration down into 12 small bites, planning to post one each morning from June 24 to July 4. The first nine parts posted as scheduled, but part 10, consisting of paragraphs 27-31 of the Declaration, did not appear. Instead, The Vindicator received a notice from Facebook saying that the post “goes against our standards on hate speech.” …
The removal of the post was an automated action. If any human being working at Facebook were to review it, no doubt the post would be allowed, and the editor has searched for a means of contacting Facebook for an explanation or an opportunity to appeal the post’s removal, but it does not appear the folks at Facebook want anyone contacting them. Or, at least, they do not make it easy. The Vindicator has sent Facebook a feedback message, that being the only way found so far to contact the company. …
So, the removal of this morning’s post puts The Vindicator in a quandary about whether to continue with posting the final two parts of the Declaration scheduled for tomorrow and Wednesday. Should Facebook find anything in them offensive, The Vindicator could lose its Facebook page.
This is frustrating, but your editor is a historian, and to enjoy the study of history a person must love irony. It is a very great irony that the words of Thomas Jefferson should now be censored in America.
On July 3, after national media began covering the story, Facebook did reinstate the Declaration of Independence. The Vindicator posted an update:
UPDATE: Earlier this evening, July 3, the good folks at Facebook restored the post that is the subject of this article. An email from Facebook came in a little after The Vindicator’s office closed today and says the following:
“It looks like we made a mistake and removed something you posted on Facebook that didn’t go against our Community Standards. We want to apologize and let you know that we’ve restored your content and removed any blocks on your account related to this incorrect action.”
The Vindicator extends its thanks to Facebook. We never doubted Facebook would fix it, but neither did we doubt the usefulness of our fussing about it a little.
The point of this story is not that the algorithm made a mistake; that was expected. After all, Facebook has already flagged a number of legitimate posts as “hate speech.”
The real story is that Facebook has failed on a key element of its April 24 announcement of its plan to label and reject “hate speech”: what happens when someone’s legitimate posting is flagged as “hate speech”? Facebook said it would have an “appeals” process, but that process would be “built out” “over the coming year.” Clearly, in The Vindicator’s case, that process did not work as intended. Facebook described its process as:
As a first step, we are launching appeals for posts that were removed for nudity / sexual activity, hate speech or graphic violence.
Here’s how it works:
If your photo, video or post has been removed because we found that it violates our Community Standards, you will be notified, and given the option to request additional review.
This will lead to a review by our team (always by a person), typically within 24 hours.
If we’ve made a mistake, we will notify you, and your post, photo or video will be restored.
But at least so far, Facebook’s “build out” does not appear to include that very important point: there isn’t an “option to request additional review.” Facebook explains that it reviews millions of posts every day using its automated screening system, and admits that it will make mistakes.
But shouldn’t Facebook automatically review its reviewers, rather than simply wait until someone can interest the media in its mistakes before acting?