Facebook says gov'ts exploited its platform to manipulate opinion
MANILA, Philippines – Facebook just took a step forward in the battle against misinformation, disinformation, fake news, and propaganda.
On Thursday, April 27, the social network admitted that organized government and non-state actors use its platform for "information operations."
Information operations were at the heart of a white paper authored by Facebook's security and intelligence teams.
The teams defined these operations as "actions taken by organized actors (governments or non-state actors) to distort domestic or foreign political sentiment, most frequently to achieve a strategic and/or geopolitical outcome." Facebook enumerated some of these actions: false news, disinformation, and the use of a network of fake accounts aimed at manipulating public opinion. (Read: Fake accounts, manufactured reality on social media)
The networks of fake accounts, which Facebook officially labeled "false amplifiers," can also be used to swarm those whose opinions are incongruent with the actors' messaging or propaganda. Facebook said this swarming is used in "discouraging specific parties from participating in discussion."
Anyone who's been on Facebook long enough is most likely aware that fake news is prevalent on the social network and that it is very easy to make a fake Facebook account. Facebook itself knows this too, and has regularly rolled out tools and algorithmic changes to remove false content, among other measures. The white paper, however, marks the first time that Facebook has directly linked such content to governments and other actors that may be supporting certain political groups.
Mounting information operations often carries little risk and little expense, said Facebook, which makes them a viable propaganda tool. That the network has massive, global reach wherein every user is a potential amplifier of messaging only adds to the appeal of Facebook as a way to sway opinion – by hook or by crook.
Major features of information operations
Facebook revealed the 3 major features of information operations that it believes have been attempted on the social network:
- Targeted data collection, with the goal of stealing, and often exposing, non-public information that can provide unique opportunities for controlling public discourse.
- Content creation, false or real, either directly by the information operator or by seeding stories to journalists and other third parties, including via fake online personas.
- False amplification, which we define as coordinated activity by fake accounts with the intent of manipulating political discussion (e.g., by discouraging specific parties from participating in discussion or amplifying sensationalistic voices over others).
In a nutshell, actors steal and harvest information from targets that can then be used to attack them. Information can be stolen not just from Facebook accounts but also from email and other avenues. This information can then be used to create content like memes and articles to sow disinformation.
Based on Facebook's statement, information operations can also target journalists, baiting them to create content that may shape opinion according to the actors' liking.
The content is then spread through these false amplifiers, which can be used to promote or denigrate a specific cause or issue, sow distrust in political institutions, or spread confusion.
Facebook also delineated two types of these false amplifiers:
"In some instances dedicated, professional groups attempt to influence political opinions on social media with large numbers of sparsely populated fake accounts that are used to share and engage with content at high volumes. In other cases, the networks may involve behavior by a smaller number of carefully curated accounts that exhibit authentic characteristics with well-developed online personas."
To detect false amplifier activity, Facebook said it looks at the "inauthenticity of the account and its behaviors and not the content the accounts are publishing."
Facebook proposed a "whole-of-society approach" to counter these information operations, which affect individuals, political parties, governments, civil society organizations, and media companies.
For individuals, enhanced media literacy is key, said Facebook, while urging users to activate two-factor authentication.
For political parties and candidates, the social network said it has political outreach teams that provide information on potential risks, and are working with government cybersecurity agencies. Facebook also urged candidates, campaigns and political parties to make use of cloud options instead of self-hosted services, saying that shared managed services are more secure than attempting to build one's own cybersecurity team.
The social network also put up The Facebook Journalism Project to help news organizations navigate the current landscape.
As for Facebook itself? It said it is increasing its protection against manually created fake accounts and is using AI to identify more types of abuse. It said its systems are now able to detect "repeated posting of the same content or aberrations in the volume of content creation." It cited its work in France – which recently held its elections – where its programming improvements have enabled it to take action against 30,000 fake accounts.
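The white paper does not disclose how these detection systems actually work, but the two signals Facebook names – repeated posting of identical content and aberrant posting volume – can be illustrated with a minimal sketch. The function name, thresholds, and data format below are hypothetical, chosen only to make the idea concrete; Facebook's real systems are far more sophisticated.

```python
from collections import Counter, defaultdict

def flag_suspicious_accounts(posts, dup_threshold=5, volume_threshold=50):
    """Flag accounts that repeatedly post identical content or post at an
    aberrant volume. `posts` is a list of (account_id, text) tuples.
    Thresholds are illustrative, not Facebook's actual values."""
    per_account = defaultdict(list)
    for account, text in posts:
        per_account[account].append(text)

    flagged = set()
    for account, texts in per_account.items():
        # Signal 1: repeated posting of the same content
        most_common_count = Counter(texts).most_common(1)[0][1]
        if most_common_count >= dup_threshold:
            flagged.add(account)
        # Signal 2: aberration in the volume of content creation
        if len(texts) >= volume_threshold:
            flagged.add(account)
    return flagged
```

Note that, consistent with Facebook's statement, this toy example looks only at account behavior (how often and how repetitively an account posts), not at what the content actually says.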
With confirmation from Facebook that governments and political figures can indeed exploit the social network for their own motives and gains, the company takes another step toward embracing its role as a media company. Its reporters are all of us who are on Facebook, and it acts as an editor, telling us what's happening on the platform and what's going wrong.
"We recognize that, in today's information environment, social media plays a sizable role in facilitating communications," Facebook said.
Read the full white paper here. – Rappler.com