This week is off to a rough start for Facebook.
First, a whistleblower shared a trove of internal documents with regulators and journalists that provided proof of some stinging allegations: that the social media giant prioritizes profits over people; that its algorithms foster discord; that its photo app Instagram negatively affects the mental health of young girls; and that drug cartels and human traffickers openly conduct business on the platform. Then on Monday, Facebook, Instagram, and messaging apps WhatsApp and Messenger went offline for hours as a result of router configuration changes, according to the company’s VP of infrastructure.
The outage also took down some of Facebook’s internal systems, including one called Workplace, from which whistleblower Frances Haugen downloaded many of the documents she used to expose the company’s dark underbelly. Cue the conspiracy theorists.
Having recently written a series on corporate whistleblowers, I have some insights into their motivations and what companies should do when an employee chooses to embark on that path.
I think the whistleblower in the series whose motivation hewed most closely to Haugen’s was Aaron Westrick, a research director at Second Chance Body Armor who exposed how a material called Zylon, used in the panels of bulletproof vests, would degrade in normal heat and humidity. The vests became less effective as a result, and officers wearing them were injured or killed by bullets the vests should have stopped.
Westrick, a deputy sheriff who was once saved from injury when his Second Chance vest stopped a bullet that might have hit his heart, loved working for the company. That was why blowing the whistle was so hard, he said. Having exhausted all efforts to get the company to do the right thing, Westrick had no choice but to alert regulators.
After 18 years, he earned a $5.7 million payout for blowing the whistle. More importantly, stepping forward prevented thousands of other defective vests from hitting the market and potentially failing law enforcement officers in their time of need.
“I loved that company. I still do,” Westrick said. “It’s interesting: People say (whistleblowers) weren’t loyal. I was the most loyal person there! I really believed in what I was doing.”
Haugen had a similar motivation.
“I don’t hate Facebook,” Haugen said she wrote on Workplace on her last day with the company. “I love Facebook. I want to save it.”
While some whistleblowers might be motivated by a payout, that is not usually their primary driver. They are often outraged by illegal and unethical conduct occurring within an organization they believe in. That outrage turns to anger when their company does nothing to address it.
The lesson here for compliance officers is that many whistleblowers are forced to take their complaints outside the company because their attempts to address the problems internally are rebuffed or ignored.
From a purely practical perspective, the damage whistleblowing can wreak on a company’s reputation and bottom line is significantly greater if employees are forced to tell their tale to regulators and the public. When handling a whistleblower’s complaint, a company should evaluate all the risks presented by the substance of the complaint and then do everything in its power to address them.
Of note, regulators will likely weigh a company’s response to the initial complaint, whether as a mitigating or an aggravating factor, when assessing potential penalties and fines.
Corporate ethics programs need to be supported
While at Facebook, Haugen worked on the 200-person Civic Integrity team, which focused on issues around elections worldwide. But after the 2020 U.S. election (and importantly, before the insurrection at the U.S. Capitol on Jan. 6), the team was disbanded and its members assigned to other jobs.
Haugen told the Wall Street Journal that Facebook seemed unwilling to support the conclusions of investigators regarding safety initiatives, especially if implementing changes meant discouraging new users or dampening engagement. There were small teams of Facebook employees facing huge issues, she said, and the company was unwilling to provide them with more funding or even with a stamp of approval from the board or senior management.
If a company establishes an ethics program, it needs to support it. If an ethics team identifies problems, its conclusions need to be taken seriously. The team’s suggestions for improvements need to be discussed, evaluated, and—if changes would enhance a company’s culture regarding ethics and compliance—implemented.
Apparently, this was not happening at Facebook. Paying lip service to ethical initiatives while failing to support them in the face of evidence of unethical behavior can fuel a whistleblower’s sense of outrage.
Internal controls can be a liability shield
Whistleblowers often demand openness and accountability of their employer. Those are noble goals for any corporation. But there are many legitimate reasons why some documents should be kept confidential: legal reasons, competitive reasons, privacy reasons, and more.
A company should establish a strict internal control system that grants document access only to the employees who need it to do their jobs. Haugen said that once she decided to blow the whistle, she spent months poking around in Workplace, accessing internal documents she had no professional reason to access.
Facebook not only should have closed off access to some of those documents—it should have created an alert system to raise red flags when employees without the proper credentials attempted to view them. Knowing this kind of activity is occurring has benefits far beyond whistleblowing—it could provide early warning signs of corporate espionage, fraud, blackmail, stalking, and more.
Even large downloads of documents by an employee with the proper credentials could be a warning sign. Why would they need to download a large file if they have access to it already? It’s worth asking about, at the very least.
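The two warning signs described above, out-of-role document access and unusually large download volumes, can both be surfaced from an ordinary access log. The sketch below is illustrative only; the log format, role model, and threshold are all assumptions, not a description of Facebook’s actual systems.

```python
from collections import defaultdict

# Hypothetical access-log entries:
# (user, user_role, doc_id, roles_allowed_for_doc, bytes_downloaded)
ACCESS_LOG = [
    ("alice", "engineer", "doc-legal-1", {"legal"}, 2_000),
    ("bob", "legal", "doc-legal-1", {"legal"}, 1_500),
    ("alice", "engineer", "doc-eng-1", {"engineer"}, 50_000_000),
    ("alice", "engineer", "doc-eng-2", {"engineer"}, 60_000_000),
]

# Flag any user whose total downloads exceed ~100 MB (illustrative threshold).
DOWNLOAD_THRESHOLD = 100_000_000


def audit(log, threshold=DOWNLOAD_THRESHOLD):
    """Return (out-of-role accesses, users with bulk downloads)."""
    unauthorized = []          # accesses outside the user's role
    totals = defaultdict(int)  # total bytes downloaded per user

    for user, role, doc_id, doc_roles, size in log:
        if role not in doc_roles:
            unauthorized.append((user, doc_id))
        totals[user] += size

    bulk = [user for user, total in totals.items() if total > threshold]
    return unauthorized, bulk


unauthorized, bulk = audit(ACCESS_LOG)
print(unauthorized)  # alice viewed a legal document outside her role
print(bulk)          # alice also downloaded an unusually large volume
```

In practice such checks would run continuously against a real audit log and feed a security or compliance dashboard, but the core logic is as simple as this: compare each access against the viewer’s entitlements, and aggregate download volumes per user.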