A U.S. House Committee held a virtual hearing last month to discuss the dividing effect of disinformation due to a lack of content moderation from giant social media companies.
The panel showed some bipartisan support for stricter regulations to increase transparency in companies’ moderation processes and accountability for the content on their platforms.
Rep. Mike Doyle (D-PA), chair of the Subcommittee on Communications and Technology, which convened the joint hearing with the Subcommittee on Consumer Protection and Commerce, believes the effect is heightened by the twin shocks of COVID-19 and the BLM protests.
COVID-19 and BLM
Amid the twin crises, a wave of misinformation is dividing the nation along partisan and racial lines, according to witnesses at the hearing.
“Dangerous disinformation often … is targeted at people of color,” wrote Frank Pallone, Chairman of the Energy and Commerce Committee, in his memorandum on the hearing. “Reports indicate that, collectively, this disinformation is exacerbating injury, death, and division among the American people.”
Witness Hany Farid, a professor of computer science and information at UC Berkeley, said that “COVID-19 provides fertile ground for disinformation and fake news, with people spending more time online… hungry and anxious for any kind of information.”
Common misinformation about the virus includes claims that wearing a mask can cause sickness, that the virus is a bioweapon created by the Chinese or U.S. government, and that it can be cured by drinking bleach, said Rep. George Kenneth Butterfield (D-NC).
In some cases, online misinformation can lead to real-life harm. Two men in Georgia, for example, were hospitalized after drinking cleaning products in an attempt to ward off the coronavirus, according to Fox News.
Similar disinformation has also been directed at the overwhelmingly peaceful BLM protests, said witness Brandi Collins-Dexter, director of Color of Change, a progressive civil rights organization.
Much of it exaggerates the violence of the protests, with some Republican leaders claiming that protestors are trying to “murder Republicans” and are “dragging the country into civil war.” The president himself called the protestors “thugs” and “terrorists” in his tweets.
Rep. Jan Schakowsky (D-IL) condemned President Trump’s endorsement of fake news. “[He] is using his position to kill speech,” she said.
“Misinformation disproportionately targets people of color,” Doyle said, harming Black Americans during the BLM unrest and Asian Americans during the COVID-19 pandemic.
Witness Spencer Overton, president of the Joint Center for Political and Economic Studies, supported the point with an example of scam accounts on Facebook suppressing Black voting rights by urging Black voters to “boycott elections” and give up their vote.
“When companies say they are not willing to remove certain things, what they are really saying is that addressing white nationalism, disinformation and anti-Blackness simply don’t rise to a level of urgency for them,” Collins-Dexter said.
Section 230: The Shield and the Sword
During the hearing, Doyle addressed the Communications Decency Act (CDA) of 1996 and whether it needs reform. Under Section 230 of the Act, social media companies are generally not legally liable for content that users publish on their platforms.
“[Section 230] gave platforms a content liability shield so they will wield a content moderation sword,” said witness Neil Fried, former chief counsel for communications and technology.
Schakowsky and Farid believe this shield removes the incentive for large companies such as Facebook and Twitter to improve their content moderation processes. As a result, hate speech, fake news, fraud, extremism, and illegal businesses flourish on these loosely regulated platforms, Schakowsky said. Rep. Brett Guthrie (R-KY) believes this leads to a tangible “increase in violence.”
Still, Guthrie is against repealing Section 230. There are many functioning parts of the legislation that “inspire innovation,” regulate the Internet, and “protect consumers,” he said. A complete repeal would be too “rash” and would likely result in “unintended consequences,” according to Guthrie.
Mark Zuckerberg, CEO of Facebook, welcomes government regulation but does not think private firms should dictate the rules for content moderation.
“We don’t want private companies making so many decisions about how to balance social equities without any more democratic process,” he said at the Munich Security Conference.
On May 28, 2020, Trump signed an executive order threatening to limit the scope of Section 230. The move came after Twitter and Facebook put warning labels on his posts, including one in which he threatened to shoot BLM protestors and another that used a Nazi symbol.
However, the order is legally difficult to enforce, according to experts.
A Problematic Business Model
Farid believes the recommendation algorithms of online platforms are major drivers of political and racial division.
Since social media companies profit from user engagement, Farid believes they are motivated to recommend whatever users are most likely to click on.
Farid explained this problematic business model in his testimony: as humans, we naturally seek information that confirms our beliefs. Platforms therefore recommend almost exclusively affirming views, pushing users down a rabbit hole of increasingly radical views while alienating them from opposing ones. And since fake news and extreme views are more eye-catching, the platforms also tend to recommend more radical, controversial stories.
“Ten percent of YouTube’s recommended videos are conspiratorial in nature,” he said.
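To make the dynamic Farid describes concrete, here is a minimal sketch of an engagement-maximizing feed ranker. Everything in it, the Post fields, the click-rate estimates, and the confirmation-bias boost, is a hypothetical illustration of the incentive he testified about, not any platform’s actual code.

```python
# A minimal sketch of engagement-driven ranking. All names and numbers
# here (Post, predicted_click_rate, the 1.5x boost) are hypothetical
# illustrations -- this is not any platform's actual code.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_click_rate: float  # platform's estimate of click likelihood
    confirms_user_beliefs: bool  # whether the post affirms the user's existing views

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order posts purely by expected engagement.

    Belief-confirming posts get a boost because users are more likely to
    click them -- the confirmation-bias loop Farid testified about.
    """
    def score(post: Post) -> float:
        boost = 1.5 if post.confirms_user_beliefs else 1.0  # hypothetical multiplier
        return post.predicted_click_rate * boost

    return sorted(posts, key=score, reverse=True)

feed = rank_feed([
    Post("Measured public-health explainer", predicted_click_rate=0.02, confirms_user_beliefs=False),
    Post("Sensational conspiracy claim", predicted_click_rate=0.10, confirms_user_beliefs=True),
])
print([post.title for post in feed])  # the sensational claim ranks first
```

Nothing in the scoring function weighs accuracy: optimizing for predicted clicks alone is enough to push the sensational, belief-confirming item to the top, which is the structural problem Farid identifies.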
The Path Forward: Transparency and Accountability
While the Republicans and Democrats on the panel agree that more transparency and accountability are needed, they do so for different reasons.
After the labeling of Trump’s posts, Republicans Guthrie and Rep. Susan Brooks (R-IN) are concerned that tech giants’ moderation processes are biased against conservatives.
They insist that companies be more transparent about their criteria for content moderation, emphasizing the importance of “political neutrality.”
“In our history, we’ve never had the power to regulate speech concentrated in so few hands in the private sector,” said Brooks. She is wary of companies such as Facebook and Twitter abusing their power to regulate speech for their own political ends.
“Don’t be fooled by made-up claims of bias against conservatives. Today it seems there is rather a bias for conservatives,” Schakowsky retorted. “As of June 19, nine of the ten top-performing political pages on the internet are conservative pages.”
Democrats Schakowsky, Butterfield, and Rep. Lisa Blunt Rochester (D-DE), on the other hand, worry about the extremism, terrorism, and hate speech rampant on these platforms.
“Social media companies have failed to prevent white nationalists, scammers, and other opportunists from using their platform to exacerbate [the BLM and COVID-19] crises,” said Blunt Rochester.
The Democrats believe that, with all the capital, data, and technology available to them, these tech giants have the means to do a far better job of content moderation, and they advocate stricter legislation to create incentives for it.
Under mounting pressure for better regulation, many social media platforms have already made changes.
Along with its AI algorithms, Facebook has “a team of 35,000 people” reviewing content and security on the platform, Zuckerberg told the BBC. He also said that “more than a million fake accounts are deleted every day.”
Zuckerberg hopes that clearer regulations from the government will improve Facebook’s products.
“Companies like mine also need better oversight when we make decisions,” wrote Zuckerberg in an article. “The internet is a powerful force for social and economic empowerment. Regulation that protects people and supports innovation can ensure it stays that way.”