First Amendment on the Ropes at Facebook

Well, Zuckerberg has at least heard of the First Amendment (although it is not mentioned in the Wall Street Journal article below).

A worthy and shocking read. Here are a few thoughts from the article:

1. Millennials' feelings trump Trump? (Well, almost.)

2. 44% of Americans get their news from Facebook. (Uh oh.)

3. Thiel’s donation to Trump an “act of diversity”? (Is that what makes his private gift to a candidate okay? Happy Halloween.)

4. Monika Bickert at FB “strives to be neutral during election season” (What about Nov. 9 forward?)

5. Censorship is clearly not out of the question. The First Amendment is on the ropes at Facebook. Center-right orgs, beware.

http://www.wsj.com/articles/facebook-employees-pushed-to-remove-trump-posts-as-hate-speech-1477075392

By Deepa Seetharaman
Updated Oct. 21, 2016 3:43 p.m. ET

Some of Republican presidential candidate Donald Trump’s posts on Facebook have set off an intense debate inside the social media company over the past year, with some employees arguing certain posts about banning Muslims from entering the U.S. should be removed for violating the site’s rules on hate speech, according to people familiar with the matter.

The decision to allow Mr. Trump’s posts went all the way to Facebook Inc. Chief Executive Mark Zuckerberg, who ruled in December that it would be inappropriate to censor the candidate, according to the people familiar with the matter. That decision has prompted employees across the company to complain on Facebook’s internal messaging service and in person to Mr. Zuckerberg and other managers that it was bending the site’s rules for Mr. Trump, and some employees who work in a group charged with reviewing content on Facebook threatened to quit, the people said.

Mr. Trump’s campaign didn’t respond to requests for comment. In a statement provided Wednesday evening, a Facebook spokeswoman said its reviewers consider the context of a post when assessing whether to take it down. “That context can include the value of political discourse,” she said. “Many people are voicing opinions about this particular content and it has become an important part of the conversation around who the next U.S. president will be.”

On Friday, senior members of Facebook’s policy team posted more details on its policy. “In the weeks ahead, we’re going to begin allowing more items that people find newsworthy, significant, or important to the public interest—even if they might otherwise violate our standards,” they wrote.

The internal debates shed light on how Facebook has grappled with its position as one of the biggest sources of political information during a particularly contentious election cycle.

This week, a controversy bubbled up around Facebook director Peter Thiel, who recently pledged $1.25 million to support Mr. Trump. In an internal post to employees confirmed by the company, Mr. Zuckerberg urged tolerance of Mr. Thiel’s political activity, saying it was key to cultivating diversity. Facebook declined to comment further on the matter, and Mr. Thiel didn’t respond to a request for comment.

Facebook—which stands to collect an estimated $300 million from online political advertising this year, according to Nomura analysts—has strived to appear nonpartisan and neutral amid complaints that the company and key executives favor Democrats. A May report from tech blog Gizmodo alleged that Facebook contract workers manipulated its trending-topics feature for political purposes. Facebook denied bias but in August fired the contractors so that the feature could be run largely by software.

“They are confronting in a very real way for the first time the political dimensions of their platform,” said Anna Lauren Hoffmann, who teaches information ethics at the University of California, Berkeley.

About 44% of Americans get at least some of their news from Facebook, according to Pew Research.

The company insists it is a neutral platform for open debate. Yet it has strict rules around what users can post. The rules, which Facebook has tightened in recent years, ban discrimination toward people based on their race and religion. Facebook typically removes content that violates the rules.

Legal experts say Facebook isn’t bound by the Federal Communications Commission’s equal-time rules, which require radio stations and broadcast networks, with exceptions, to devote the same airtime to political candidates.

Issues around Mr. Trump’s posts emerged when he posted on Facebook a link to a Dec. 7 campaign statement “on preventing Muslim immigration.” The statement called for “a total and complete shutdown of Muslims entering the United States until our country’s representatives can figure out what is going on.” Mr. Trump has since backed away from an outright ban based on religion, saying his policies would target immigrants from countries with a record of terrorism.

Users flagged the December content as hate speech, a move that triggered a review by Facebook’s community-operations team, with hundreds of employees in several offices world-wide. Some Facebook employees said in internal chat rooms that the post broke Facebook’s rules on hate speech as detailed in its internal guidelines, according to people familiar with the matter.

Content reviewers were asked by their managers not to remove the post, according to some of the people familiar. Facebook’s head of global policy management, Monika Bickert, later explained in an internal post that the company wouldn’t take down any of Mr. Trump’s posts because it strives to be impartial in the election season, according to people who saw the post.

During one of Mr. Zuckerberg’s weekly town hall meetings in late January at the company’s Menlo Park, Calif., headquarters, a Muslim employee asked how the executive could condone Mr. Trump’s comments. Mr. Zuckerberg acknowledged that Mr. Trump’s call for a ban did qualify as hate speech, but said the implications of removing the posts were too drastic, according to two people who attended the meeting. Mr. Zuckerberg said he backed Ms. Bickert’s call, they said.

Many employees supported the decision. “Banning a U.S. presidential candidate is not something you do lightly,” said one person familiar with the decision.

But others, including some Muslim employees at Facebook, were upset that the platform would make an exception. In Dublin, where many of Facebook’s content reviewers work, more than a dozen Muslim employees met with their managers to discuss the policy, according to another person familiar with the matter. Some created internal Facebook groups protesting the decision, while others threatened to leave.

Employees continued to submit questions for Mr. Zuckerberg’s weekly town hall about Mr. Trump’s posts for months after, the person familiar said. But the internal-communications team responded that the question had been answered and the matter was decided, the person said.

