
Facebook would be in 'way better shape' adopting Twitter's disinformation response, ex-Facebook security chief says


Facebook’s Mark Zuckerberg continues to miss the mark when it comes to controlling political speech on the giant social media platform, former Facebook security chief Alex Stamos told CNBC on Wednesday.

Facebook’s concern should not be whether it is censoring the flow of information, accurate or not, from elected officials, but how much it amplifies misinformation, said Stamos, now director of the Stanford Internet Observatory at Stanford University.

Since his speech on free expression at Georgetown University last fall, Zuckerberg has been under intense scrutiny over Facebook’s approach to regulating speech and the kinds of content, particularly political ads, that can be posted to the platform.

The issue is “what kind of capability does Facebook provide people to amplify their speech well beyond what would have been possible 5, 10 years ago before everybody was on social media,” Stamos said on “Power Lunch.” “If [Zuckerberg] changed his view on that and applied a little more of a subtle model to this, I think he could do a lot better.”

In the October speech in the nation’s capital, Zuckerberg argued that social media has become a “fifth estate,” alongside the fourth estate of traditional news media, that lets the public air their thoughts and ideas without relying on gatekeepers. The event was his response to pressure from politicians on both sides of the aisle looking to address how Facebook and other social media regulate speech online, especially with the 2020 election getting closer.

Zuckerberg, the chief executive and controlling shareholder of Facebook, decided that the company would not fact-check ads by political candidates, though he admitted to considering banning political ads altogether.

“We think people should be able to see for themselves what politicians are saying,” he said in the speech. “I don’t think it’s right for a private company to censor politicians or the news in a democracy.”

Twitter, Facebook’s smaller rival, which has banned political ads, has set a standard on this front, Stamos said. The short-message platform is a favorite of President Donald Trump.

“And to be honest, this is actually a really hard problem,” explained Stamos, who also serves as an advisor to Zoom Video Communications. “There is a reason why in our country we don’t have laws around this, because we have decided that more political speech is generally better and that it’s very dangerous to allow centralized powerful organizations to control that speech.”

The two social platforms also took opposing approaches to the president’s racist “when the looting starts, the shooting starts” post. Zuckerberg ruled that it did not violate Facebook’s policies, while Twitter warned users of the “violent rhetoric” in the tweet.

In May, Twitter also attached warning labels to two Trump tweets about mail-in voting.

While it would be hard for Twitter to “deplatform” someone like Trump every time he breaks the rules, the company now limits how far election disinformation can spread by labeling misleading posts, Stamos said.

“They will use their own First Amendment right to say ‘we don’t agree with this’ and ‘we don’t think this is true,’ and they will limit the spread of that message” via a “middle way,” Stamos explained. “I think that’s the kind of middle way that, if Mark had adopted it a couple months ago, Facebook would be in way better shape right now.”

Stamos, who departed the company more than two years ago due to disagreements over its handling of disinformation in the 2016 election, said he thinks Facebook will continue to face challenges moving forward as long as Zuckerberg stands his ground on these issues.

Facebook is back in the news lately because of an advertising boycott, led by multiple civil rights groups, over hate speech on the platform. Additionally, a two-year audit commissioned by Facebook and released Wednesday concluded that some newly installed policies led to “significant setbacks for civil rights.”

“Because there’s really no legal framework here, this is up to Facebook themselves, and they are kind of vacillating back and forth as the political wind shifts and making these decisions, it seems, in a little bit of a haphazard manner,” Stamos said.


