
Facebook is facing its toughest challenge yet: an election complicated by a pandemic and by a deeply divided U.S. drawn to conspiracy theories and alternate versions of reality. Is it ready? Here are some of the biggest steps and missteps the company has taken in the fight against misinformation since 2016.

  • Nov. 10, 2016: Days after the election of U.S. President Donald Trump, Facebook chief executive Mark Zuckerberg calls the idea that “fake news” on Facebook had influenced the election “a pretty crazy idea.” He later walks back the comment.
  • December, 2016: Facebook says it will hire third-party fact-checkers to combat misinformation.
  • April 27, 2017: Facebook publicly acknowledges that governments or other malicious non-state actors are using its social network to influence national elections, in line with the U.S. government’s findings of Russian interference.
  • October, 2017: Facebook says ads linked to a Russian internet agency were seen by an estimated 10 million people before and after the 2016 U.S. election.
  • November, 2017: Ahead of congressional hearings on U.S. election interference, Facebook ups that estimate, saying Russian ads fomenting political division potentially reached as many as 126 million users.
  • Jan. 4, 2018: Mr. Zuckerberg declares his 2018 resolution is to “fix” Facebook.
  • March, 2018: Evidence grows that Facebook campaigns were used to steer Britain toward Brexit.
  • April, 2018: Mr. Zuckerberg testifies before U.S. Congress and apologizes for the company’s missteps, as well as fake news, hate speech, a lack of data privacy and foreign interference in the 2016 elections on his platform.
  • May, 2018: Democrats on the U.S. House intelligence committee release more than 3,500 Facebook ads created or promoted by a Russian internet agency before and after the 2016 U.S. election.
  • July, 2018: British lawmakers call for greater oversight of Facebook and other platforms.
  • July, 2018: After Facebook warns of skyrocketing expenses owing in part to beefing up security and hiring more moderators, its stock price suffers the worst drop in its history. Its shares don’t recover until January, 2020.
  • Sept. 5, 2018: Facebook and Twitter executives pledge before U.S. Congress to defend against foreign intrusion.
  • October, 2018: Facebook invites the press to tour a newly created “war room” for combatting election-related misinformation in what is largely seen as a public-relations move.
  • October-November, 2018: Ahead of the 2018 U.S. midterm election, Facebook removes hundreds of accounts, pages and groups for suspected links to foreign interference.
  • Feb. 18, 2019: In a scathing report, British lawmakers call for a mandatory code of ethics and independent overseers for social-media platforms, specifically calling out Facebook for technical design that seems to “conceal knowledge of and responsibility for specific decisions.”
  • May, 2019: Facebook declines to remove a video manipulated to show U.S. House Speaker Nancy Pelosi slurring her words. The altered clip is shared millions of times.
  • October, 2019: Facebook unveils new security systems designed to prevent foreign interference in elections.
  • November, 2019: Facebook opens a new misinformation “war room” ahead of British elections.
  • May-June 2020: Facebook declines to remove Trump posts that suggest protesters in Minneapolis could be shot. Mr. Zuckerberg defends his decision in a Facebook post. Facebook also declines to take action on two Trump posts spreading misinformation about voting by mail. Some Facebook employees resign in protest.
  • June, 2020: Facebook says it will add labels to all posts about voting that direct users to authoritative information from American state and local election officials. This includes posts by the President.
  • July 8, 2020: A quasi-independent civil-rights audit criticizes Facebook’s “vexing and heartbreaking decisions” with respect to civil rights and U.S. election misinformation, including Mr. Trump’s tweets on voting by mail.
  • August, 2020: After years of a hands-off approach, Facebook restricts the conspiracy movement QAnon, but doesn’t ban it outright.
  • Sept. 3, 2020: Facebook curbs political ads, although only for seven days before the U.S. election.
  • Oct. 6, 2020: Facebook bans all groups that support QAnon.
  • Oct. 7, 2020: Facebook further limits U.S. political ads, readies more labels for candidate posts that prematurely declare victory or contest official results, and bans the use of “militarized language” in connection with calls for poll watching.

