In a sign of growing recognition that social media and technology play a role in fueling mass shootings, the U.S. Senate Commerce, Science, and Transportation Committee held a hearing with representatives from major tech firms to probe what the government could do.

“No matter how great the benefits to society, it is important to consider how [social media] can be used for evil, at home and abroad,” said Sen. Roger Wicker (R-Mississippi), chairman of the committee. Committee member Sen. Cory Gardner (R-Colorado) was absent from yesterday’s hearing.

He noted a chain of recent shootings connected by the role social media and technology played: Police say the gunman in the Aug. 3 El Paso shooting posted a manifesto on the “dark web” site 8chan; FBI investigators say the shooter at the Pulse nightclub in Orlando was radicalized online by Islamic extremists before his June 2016 attack; and in March, a mass shooter at a mosque in Christchurch, New Zealand, posted a live video of the massacre to Facebook.

Tech companies at the hearing pointed to a range of stepped-up monitoring systems to identify potential threats.

Monika Bickert, head of global policy management for Facebook, said the company’s AI system can identify policy violations on Facebook Live, where the Christchurch gunman broadcast the shooting, within 12 seconds. That’s 90 percent faster than it could a few months ago. At the time of the Christchurch attack, video of the shooting gained over a million views before it was taken down.

Nick Pickles, public policy director for Twitter, similarly touted improvements to the company’s monitoring technology. He said Twitter’s system observes patterns of behavior rather than isolated incidents to identify potential problems.

“For abuse, this strategy has allowed us to take three times the amount of enforcement action on abuse within 24 hours than this time last year,” Pickles said.

According to Derek Slater, Google’s global director of information policy, YouTube updated its hate speech policies in June, cracking down on extremist content. Since then, he said, “the number of individual video removals for hate speech saw a 5x spike to over 100,000, the number of channel terminations for hate speech also saw a 5x spike to 17,000, and the total comment removals nearly doubled in Q2 to over 500 million, due in part to a large increase in hate speech removals.”

There was broad agreement that quick and streamlined cooperation with government and law enforcement was important.

“If we can strengthen, as an industry, our cooperation with law enforcement, we can make sure the information sharing is as strong as it needs to be to support those interventions,” Pickles said. 

There was still little consensus at the hearing, however, on how the government can best intervene.

Pickles noted that blocking content can have the effect of driving online hate speech into less regulated and less closely monitored corners of the internet.

“It is important to recognize content removal online cannot alone solve these issues,” he said.

The committee’s ranking member, Sen. Maria Cantwell (D-Washington), agreed, voicing particular concern over the “dark web.”

“Adding technology tools to mainstream websites to stop the spread of these ‘dark web’ sites is a start, but there needs to be more of a comprehensive and coordinated effort to ensure people are not directed into these cesspools,” said Cantwell.

“We need to do more at the Department of Justice to shut down these ‘dark web’ sites, and social media companies need to work with us to make sure that we are doing this,” she continued.

Pickles also suggested that there are deeper societal problems driving the rise of mass shootings, beyond the scope of tech companies or the committee to address.

“Not every issue will be one where the underlying factors can be addressed by public policy interventions led by technology companies,” he said. 

Colorado’s Sen. Gardner, though absent from the hearing, has previously questioned the lack of response to warning signs displayed by Nikolas Cruz, the shooter at Marjory Stoneman Douglas High School in Parkland, Florida, in February 2018.

A user with the screen name “Nikolas Cruz” posted “Im going to be a professional school shooter” on YouTube in September 2017. The FBI never contacted Cruz about the comment.

“We need to understand why those reports weren’t investigated or further action wasn’t taken,” Gardner told The Denver Post at the time.

Gardner condemned white supremacy after the El Paso shooting but did not mention the 8chan manifesto. “The white supremacy voices should be condemned each and every time they raise their heads and their ugliness,” he said at the time.