World

Facebook Gearing Up to Fight Political Propaganda

Facebook security researchers and its chief security officer said the company will monitor the efforts of those who try to hurt "civic discourse" on its service

Conference workers speak in front of a demo booth at Facebook's annual F8 developer conference in San Jose, California. Photo: AP/Noah Berger

NEW YORK – Facebook is acknowledging that governments and other malicious actors are using its social network to influence political sentiment in ways that could affect national elections.

It’s a long way from CEO Mark Zuckerberg’s assertion back in November that it was “pretty crazy” to think that false news on Facebook influenced the U.S. presidential election. It’s also a major sign that the world’s biggest social network is continuing to grapple with its outsized role in how the world communicates, for better or for worse.

In a paper posted online on Thursday, Facebook security researchers and its chief security officer said the company will monitor the efforts of those who try to hurt “civic discourse” on its service, whether that’s governments or other groups. It is also looking to identify fake accounts, and says it will notify people if their accounts have been targeted by such cyberattackers.

“[We] have had to expand our security focus from traditional abusive behavior, such as account hacking, malware, spam and financial scams, to include more subtle and insidious forms of misuse, including attempts to manipulate civic discourse and deceive people,” the report states. It was written by researchers Jen Weedon and William Nuland and Facebook Chief Security Officer Alex Stamos, and titled “Information Operations and Facebook.”

ELECTION MEDDLING

The team defined “information operations” as any actions taken by governments or other actors to “distort domestic or foreign political sentiment” to achieve a strategic purpose. Such operations can include the dissemination of false news and disinformation and the use of fake-account networks aimed at manipulating public opinion through a variety of means.

Using the 2016 U.S. presidential election as an example, Facebook said it uncovered “several situations” where malicious actors used social media to “share information stolen from other sources, such as email accounts, with the intent of harming the reputation of specific political targets.”

The company did not name the actors or the victims, but it said its data “does not contradict” a January report by the U.S. Director of National Intelligence that Russia tried to meddle with the U.S. election.

MORE TO DO

Jonathan Albright, a professor who studies data journalism at Elon University in North Carolina, urged journalists and others back in February to look not just at the role of Facebook in spreading false or misleading information, but also at the sources of such information. That is, to attempt to identify both the producers of this material and those who spread it using social networks and other means.

Facebook’s paper addresses the amplifiers of such content — the fake accounts that “like” and share false news stories, for example. The company has also announced steps to support legitimate journalism and news literacy. But the paper does not delve into ideas about attacking false news and propaganda at the source, including by banning such content from the site.

Currently, Facebook users who want to share an article that has been debunked by outside fact-checkers, for example, are able to do so after they get a warning from Facebook. Facebook has long held that it does not want to be the arbiter of truth — that it wants its users to decide for themselves (within limits of its terms of service) what they want to read and post.

But balancing a desire not to censor with a desire to weed out state-sponsored propaganda has been a challenging exercise for the company.

BARBARA ORTUTAY
