WASHINGTON, Oct. 4 (Reuters) – Former Facebook (FB.O) employee and whistleblower Frances Haugen will urge the U.S. Congress on Tuesday to regulate the social media giant, which she plans to liken to tobacco companies that denied for decades that smoking was harmful to health, according to prepared testimony seen by Reuters.
“When we realized that tobacco companies were hiding the harms they caused, the government took action. When we realized that cars were safer with seat belts, the government took action,” Haugen said in written testimony to be delivered to a Senate commerce subcommittee. “I implore you to do the same here.”
Haugen will tell the panel that Facebook executives have consistently chosen profits over user safety.
“Company executives know how to make Facebook and Instagram safer and will not make the necessary changes because they put their huge profits ahead of people. Congressional action is needed,” she said. “As long as Facebook is operating in the dark, it will not be accountable to anyone. And it will continue to make choices that are against the common good.”
Senator Amy Klobuchar, who sits on the subcommittee, said she would question Haugen about the Jan. 6 attack on the U.S. Capitol by supporters of then-President Donald Trump.
“I’m also particularly interested in whether she thinks Facebook did enough to warn law enforcement and the public ahead of January 6, and whether Facebook removed safeguards against election misinformation because they were costing the company money,” Klobuchar said in an emailed comment.
The senator also said she wanted to discuss Facebook’s algorithms and whether they “promote harmful and divisive content.”
Haugen, who worked as a product manager on Facebook’s civic disinformation team, was the whistleblower who provided documents used in a Wall Street Journal investigation and a Senate hearing on Instagram’s harm to teenage girls.
Facebook owns Instagram as well as WhatsApp.
The company did not respond to a request for comment.
Haugen added that “Facebook’s closed design means it has no oversight, even from its own Oversight Board, which is as blind as the public.”
That makes it impossible for regulators to act as a check, she added.
“This inability to see into Facebook’s actual systems and confirm that they are working as the company says they are is like the Department of Transportation regulating cars by only watching them drive down the highway,” she said in the testimony. “Imagine if no regulator could ride in a car, pump up its wheels, crash-test a car, or even know that seat belts could exist.”
The Journal articles, based on internal Facebook presentations and emails, showed the company contributed to increased polarization online when it changed its content algorithm, failed to take steps to reduce vaccine hesitancy, and knew Instagram was harming the mental health of teenage girls.
Haugen said Facebook has done too little to prevent its platform from being used by people planning violence.
“The result has been a system that amplifies division, extremism and polarization – and undermines societies around the world. In some cases, this dangerous online talk has led to actual violence that harms and even kills people,” she said.
Facebook’s platform was used by people planning massacres in Myanmar and in the Jan. 6 assault by Trump supporters who refused to accept the 2020 election results.
Reporting by David Shepardson; additional reporting by Diane Bartz, Editing by Rosalba O’Brien, David Gregorio and Sonya Hepinstall
Our Standards: Thomson Reuters Trust Principles.