A 6,600-word internal memo from a fired Facebook data scientist details how the social network knew leaders of countries around the world were using their site to manipulate voters — and failed to act.
September 14, 2020

Facebook ignored or was slow to act on evidence that fake accounts on its platform have been undermining elections and political affairs around the world, according to an explosive memo sent by a recently fired Facebook employee and obtained by BuzzFeed News.
The 6,600-word memo, written by former Facebook data scientist Sophie Zhang, is filled with concrete examples of heads of government and political parties in Azerbaijan and Honduras using fake accounts or misrepresenting themselves to sway public opinion. In countries including India, Ukraine, Spain, Brazil, Bolivia, and Ecuador, she found evidence of coordinated campaigns of varying sizes to boost or hinder political candidates or outcomes, though she did not always conclude who was behind them.
“In the three years I’ve spent at Facebook, I’ve found multiple blatant attempts by foreign national governments to abuse our platform on vast scales to mislead their own citizenry, and caused international news on multiple occasions,” wrote Zhang, who declined to talk to BuzzFeed News. Her LinkedIn profile said she “worked as the data scientist for the Facebook Site Integrity fake engagement team” and dealt with “bots influencing elections and the like.”
“I have personally made decisions that affected national presidents without oversight, and taken action to enforce against so many prominent politicians globally that I’ve lost count,” she wrote.
The memo is a damning account of Facebook’s failures. It’s the story of Facebook abdicating responsibility for malign activities on its platform that could affect the political fate of nations outside the United States or Western Europe. It’s also the story of a junior employee wielding extraordinary moderation powers that affected millions of people without any real institutional support, and the personal torment that followed.
“I know that I have blood on my hands by now,” Zhang wrote.
These are some of the biggest revelations in Zhang’s memo:
- It took Facebook’s leaders nine months to act on a coordinated campaign “that used thousands of inauthentic assets to boost President Juan Orlando Hernandez of Honduras on a massive scale to mislead the Honduran people.” Two weeks after Facebook took action against the perpetrators in July, they returned, leading to a game of “whack-a-mole” between Zhang and the operatives behind the fake accounts, which are still active.
- In Azerbaijan, Zhang discovered the ruling political party “utilized thousands of inauthentic assets… to harass the opposition en masse.” Facebook began looking into the issue a year after Zhang reported it. The investigation is ongoing.
- Zhang and her colleagues removed “10.5 million fake reactions and fans from high-profile politicians in Brazil and the US in the 2018 elections.”
- In February 2019, a NATO researcher informed Facebook that “he’d obtained Russian inauthentic activity on a high-profile U.S. political figure that we didn’t catch.” Zhang removed the activity, “dousing the immediate fire,” she wrote.
- In Ukraine, Zhang “found inauthentic scripted activity” supporting both former prime minister Yulia Tymoshenko, a pro–European Union politician and former presidential candidate, as well as Volodymyr Groysman, a former prime minister and ally of former president Petro Poroshenko. “Volodymyr Zelensky and his faction was the only major group not affected,” Zhang said of the current Ukrainian president.
- Zhang discovered inauthentic activity — a Facebook term for engagement from bot accounts and coordinated manual accounts — in Bolivia and Ecuador but chose “not to prioritize it,” due to her workload. The amount of power she had as a mid-level employee to make decisions about a country’s political outcomes took a toll on her health.
- After becoming aware of coordinated manipulation on the Spanish Health Ministry’s Facebook page during the COVID-19 pandemic, Zhang helped find and remove 672,000 fake accounts “acting on similar targets globally” including in the US.
- In India, she worked to remove “a politically-sophisticated network of more than a thousand actors working to influence” the local elections taking place in Delhi in February. Facebook never publicly disclosed this network or that it had taken it down.
“We’ve built specialized teams, working with leading experts, to stop bad actors from abusing our systems, resulting in the removal of more than 100 networks for coordinated inauthentic behavior,” Facebook spokesperson Liz Bourgeois said in a statement. “It’s highly involved work that these teams do as their full-time remit. Working against coordinated inauthentic behavior is our priority, but we’re also addressing the problems of spam and fake engagement. We investigate each issue carefully, including those that Ms. Zhang raises, before we take action or go out and make claims publicly as a company.”
BuzzFeed News is not publishing Zhang’s full memo because it contains personal information. This story includes full excerpts when possible to provide appropriate context.
In her post, Zhang said she did not want it to go public for fear of disrupting Facebook’s efforts to prevent problems around the upcoming 2020 US presidential election, and due to concerns about her own safety. BuzzFeed News is publishing parts of her memo that are clearly in the public interest.
“I consider myself to have been put in an impossible spot – caught between my loyalties to the company and my loyalties to the world as a whole,” she said. “The last thing I want to do is distract from our efforts for the upcoming U.S. elections, yet I know this post will likely do so internally.”
Zhang said she turned down a $64,000 severance package from the company to avoid signing a nondisparagement agreement. Doing so allowed her to speak out internally, and she used that freedom to reckon with the power that she had to police political speech.
“There was so much violating behavior worldwide that it was left to my personal assessment of which cases to further investigate, to file tasks, and escalate for prioritization afterwards,” she wrote.
That power contrasted with what she said seemed to be a lack of desire from senior leadership to protect democratic processes in smaller countries. Facebook, Zhang said, prioritized regions including the US and Western Europe, and often only acted when she repeatedly pressed the issue publicly in comments on Workplace, the company’s internal, employee-only message board.
“With no oversight whatsoever, I was left in a situation where I was trusted with immense influence in my spare time,” she wrote. “A manager on Strategic Response mused to myself that most of the world outside the West was effectively the Wild West with myself as the part-time dictator – he meant the statement as a compliment, but it illustrated the immense pressures upon me.”
A former Facebook engineer who knew her told BuzzFeed News that Zhang was skilled at discovering fake account networks on the platform.
“She’s the only person in this entire field at Facebook that I ever trusted to be earnest about this work,” said the engineer, who had seen a copy of Zhang’s post and asked not to be named because they no longer work at the company.
“A lot of what I learned from that post was shocking even to me as someone who’s often been disappointed at how the company treats its best people,” they said.
Zhang’s memo said the lack of institutional support and heavy stakes left her unable to sleep. She often felt responsible when civil unrest took hold in places she didn’t prioritize for investigation and action.
“I have made countless decisions in this vein – from Iraq to Indonesia, from Italy to El Salvador,” she wrote. “Individually, the impact was likely small in each case, but the world is a vast place.”
Still, she did not believe that the failures she observed during her two and a half years at the company were the result of bad intent by Facebook’s employees or leadership. It was a lack of resources, Zhang wrote, and the company’s tendency to focus on global activity that posed public relations risks, as opposed to electoral or civic harm.
“Facebook projects an image of strength and competence to the outside world that can lend itself to such theories, but the reality is that many of our actions are slapdash and haphazard accidents,” she wrote.
“We simply didn’t care enough to stop them”
Zhang wrote that she was just six months into the job when she found coordinated inauthentic behavior — Facebook’s internal term for the use of multiple fake accounts to boost engagement or spread content — benefiting Honduran President Juan Orlando Hernández.
The connection to the Honduran leader was made, Zhang said, because an administrator for the president’s Facebook page had been “happily running hundreds of these fake assets without any obfuscation whatsoever in a show of extreme chutzpah.” The data scientist said she reported the operation, which involved thousands of fake accounts, to Facebook’s threat intelligence and policy review teams, both of which took months to act.
“Local policy teams confirmed that President JOH’s marketing team had openly admitted to organizing the activity on his behalf,” she wrote. “Yet despite the blatantly violating nature of this activity, it took me almost a year to take down his operation.”
That takedown was announced by Facebook in July 2019, but proved futile. The operation was soon back up and running, a fact Facebook has never disclosed.
“They had returned within two weeks of our takedown and were back in a similar volume of users,” Zhang wrote, adding that she did a final sweep for the fake accounts on her last day at Facebook. “A year after our takedown, the activity is still live and well.”
In Azerbaijan, she found a large network of inauthentic accounts used to attack opponents of President Ilham Aliyev and his ruling New Azerbaijan Party, which uses the acronym YAP. Facebook still has not disclosed the influence campaign, according to Zhang.
The operation detailed in the memo is reminiscent of those of Russia’s Internet Research Agency, a private troll farm that tried to influence the 2016 US elections, because it involved “dedicated employees who worked 9-6 Monday-Friday work weeks to create millions of comments” targeting members of the opposition and media reports seen as negative to Aliyev.
“Multiple official accounts for district-level divisions of the ruling YAP political party directly controlled numerous of these fake assets without any obfuscation whatsoever in another display of arrogance,” she wrote. “Perhaps they thought they were clever; the truth was, we simply didn’t care enough to stop them.”
Katy Pearce, an associate professor at the University of Washington who studies social media and communication technology in Azerbaijan, told BuzzFeed News that fake Facebook accounts have been used to undermine the opposition and independent media in the country for years.
“One of the big tools of authoritarian regimes is to humiliate the opposition in the mind of the public so that they’re not viewed as a credible or legitimate alternative,” she told BuzzFeed News. “There’s a chilling effect. Why would I post something if I know that I’m going to deal with thousands or hundreds of these comments, that I’m going to be targeted?”
Pearce said Zhang’s comment in the memo that Facebook “didn’t care enough to stop” the fake accounts and trolling aligns with her experience. “They have bigger fish to fry,” she said.
A person who managed social media accounts for news organizations in Azerbaijan told BuzzFeed News that their pages were inundated with inauthentic Facebook comments.
“We used to delete and ban them because we didn’t want people who came to our page to be discouraged and not react or comment,” said the person, who asked not to be named because they were not authorized to speak for their employer. “But since [the trolls] are employees, it’s easy for them to open new accounts.”
They said Facebook has at times made things worse by removing the accounts or pages of human rights activists and other people after trolls report them. “We tried to tell Facebook that this is a real person who does important work,” the person said, but it took weeks for the page to be restored.
Zhang wrote that a Facebook investigation into fake accounts and trolling in Azerbaijan is now underway, more than a year after she first reported the issue. On the day of her departure, she called it her “greatest unfinished business” to stop the fake behavior in the country.
“Many others would think nothing of myself devoting this attention to the United States, but are shocked to see myself fighting for these small countries,” she wrote. “To put it simply, my methodologies were systematic globally, and I fought for Honduras and Azerbaijan because that was where I saw the most ongoing harm.”
“I have blood on my hands”
In other examples, Zhang revealed new information about a large-scale fake account network used to amplify and manipulate information about COVID-19, as well as a political influence operation that used fake accounts to influence 2018 elections in the US and Brazil. Some of these details were not previously disclosed by Facebook, suggesting the company’s regular takedown announcements remain selective and incomplete.
Zhang said Facebook removed 672,000 “low-quality fake accounts” after press reports in April that some of the accounts had been engaging with COVID-19 content on the Spanish Health Ministry’s page. She said accounts in that network also engaged with content on US pages. Facebook did not disclose how many accounts it removed, or that those accounts engaged with content in other countries, including the US.
Zhang also shared new details about the scale of inauthentic activity during the 2018 midterm elections in the US, and from Brazilian politicians that same year. “We ended up removing 10.5 million fake reactions and fans from high-profile politicians in Brazil and the U.S. in the 2018 elections – major politicians of all persuasions in Brazil, and a number of lower-level politicians in the United States,” she wrote.
A September 2018 briefing about Facebook’s election work in the US and Brazil disclosed that it had acted against a network in Brazil that used “fake accounts to sow division and share disinformation,” as well as a set of groups, pages, and accounts that were “falsely amplifying engagement for financial gain.” It did not disclose the full extent of Zhang’s findings.
The scale of this activity — 672,000 fake accounts in one network, 10.5 million fake reactions and fans in others — indicates that fake accounts are a global problem, used to manipulate elections and public debate around the world.
As one of the few people looking for and identifying fake accounts impacting civic activity outside of “priority” regions, Zhang struggled with the power she had been handed.
“We focus upon harm and priority regions like the United States and Western Europe,” Zhang wrote, adding that “it became impossible to read the news and monitor world events without feeling the weight of my own responsibility.”
In Bolivia, Zhang said she found “inauthentic activity supporting the opposition presidential candidate in 2019” and chose not to prioritize it. Months later, Bolivian politics fell into turmoil, leading to the resignation of President Evo Morales and “mass protests leading to dozens of deaths.”
The same happened in Ecuador, according to Zhang, who “found inauthentic activity supporting the ruling government… and made the decision not to prioritize it.” The former Facebook employee later wondered whether her decision had downstream effects on how Ecuador’s government handled the COVID-19 pandemic — which has devastated the country — and whether things would have been different had she acted.
“I have made countless decisions in this vein – from Iraq to Indonesia, from Italy to El Salvador. Individually, the impact was likely small in each case, but the world is a vast place. Although I made the best decision I could based on the knowledge available at the time, ultimately I was the one who made the decision not to push more or prioritize further in each case, and I know that I have blood on my hands by now.”
Zhang also uncovered issues in India, Facebook’s largest market, in the lead-up to the local Delhi elections in February 2020. “I worked through sickness to take down a politically-sophisticated network of more than a thousand actors working to influence the election,” she wrote.
Last month, Facebook’s Indian operation came under scrutiny after reports in the Wall Street Journal revealed a top policy executive in the country had stopped local staffers from applying the company’s hate speech policies to ruling party politicians who posted anti-Muslim hate speech.
In her “spare time” in 2019, Zhang took on tasks usually reserved for product managers and investigators, hunting for inauthentic activity in countries including Ukraine, Turkey, India, Indonesia, the Philippines, Australia, the United Kingdom, Taiwan, “and many many more.”
Zhang said she found and took down “inauthentic scripted activity” in Ukraine that supported Yulia Tymoshenko, a complicated political figure who has been involved in controversial gas deals with Russia but taken a more pro-Western tack in her later career, as well as for former prime minister Volodymyr Groysman, an ally of former president Petro Poroshenko. “Volodymyr Zelensky and his faction was the only major group not affected,” she wrote.
In another part of her memo, Zhang said she wanted to push back on the idea that Facebook was run by malicious people hoping to achieve a particular outcome. That was not the case, she wrote, attributing actions to “slapdash and haphazard accidents.”
“Last year when we blocked users from naming the Ukraine whistleblower, we forgot to cover hashtags until I stepped in,” she wrote.
But she also remarked on Facebook’s habit of prioritizing public relations over real-world problems. “It’s an open secret within the civic integrity space that Facebook’s short-term decisions are largely motivated by PR and the potential for negative attention,” she wrote, noting that she was told directly at a 2020 summit that anything published in the New York Times or Washington Post would obtain elevated priority.
“It’s why I’ve seen priorities of escalations shoot up when others start threatening to go to the press, and why I was informed by a leader in my organization that my civic work was not impactful under the rationale that if the problems were meaningful they would have attracted attention, became a press fire, and convinced the company to devote more attention to the space.”
Zhang mentioned one example in February 2019, when a NATO strategic communications researcher reached out to Facebook, alerting the company that he’d “obtained” Russian inauthentic activity “on a high-profile U.S. political figure that we didn’t catch.” That researcher said they were planning on briefing Congress the next day.
“I quickly investigated the case, determined what was going on, and removed the activity, dousing the immediate fire,” Zhang wrote. “Perhaps motivated by the experience, the same researcher tried the same experiment within a month or two, waiting half a year afterwards before sending the report to the press and finally causing the PR fire.”
“Human resources are limited”
Beyond specific examples from around the world, Zhang provided insight into the inner workings at Facebook. She criticized her team’s focus on issues related to “99% of activity that’s essentially spam.”
“Overall, the focus of my organization – and most of Facebook – was on large-scale problems, an approach which fixated us on spam,” she said. “The civic aspect was discounted because of its small volume, its disproportionate impact ignored.”
Zhang outlined the political processes within Facebook itself. She said the best way for her to gain attention for her work was not to go through the proper reporting channels, but to post about the issues on Facebook’s internal employee message board to build pressure.
“In the office, I realized that my viewpoints weren’t respected unless I acted like an arrogant asshole,” Zhang said.
When she asked the company to do more in terms of finding and stopping malicious activity related to elections and political activity, she said she was told that “human resources are limited.” And when she was ordered to stop focusing on civic work, “I was told that Facebook would no longer have further need for my services if I refused.”
Zhang was fired this month, even after offering to stay on through the election as an unpaid volunteer, and posted her memo on her last day. In her goodbye, she encouraged her colleagues to remain at Facebook and to fix the company from within.
“But you don’t – and shouldn’t – need to do it alone,” she wrote. “Find others who share your convictions and values to work on it together. Facebook is too big of a project for any one person to fix.”