Graham: Tech companies should 'earn' liability shield

Laura Castro Lindarte, CQ-Roll Call

WASHINGTON -- Changes may be coming to the provision in communications law that shields web platforms such as Facebook and Google from lawsuits over user content, if Senate Judiciary Chairman Lindsey Graham has his way.

Following a hearing before his committee Tuesday on protecting children from internet predators, Graham said he wants to hold big tech companies more accountable by making them "earn" liability protections. Those protections "were given to make sure the industry would flourish, mission accomplished. However, the liability protections now have to be modified so that you earn them," the South Carolina Republican said.

Graham said he wants to work with Silicon Valley giants and other expert groups to create a list of "best business practices" for protecting minors online, with periodic reports on whether technology companies are upholding them. Companies that fall short would no longer be able to rely on Section 230 of the 1996 Communications Decency Act to shield themselves from lawsuits over content they host.

"To me, that's a combination of letting private sector have input about what they should be doing but making sure they meet that test that they set for themselves," Graham said.

A new federal agency could be created to oversee the process, he added: "Maybe we create a new body because apparently these agencies are not doing very well."

He offered no timetable for bringing a bill forward, however.


Tuesday's hearing focused on protecting kids who use popular digital platforms like YouTube and Snapchat, and covered encryption technology, law enforcement, online pornography and the use of platforms to sexually exploit minors.

"I'm confident that they have in their know-how the ability to, within weeks, take care of many of the problems that we've spoken about here," said Christopher McKenna, founder of Protect Young Eyes, an organization focused on keeping kids safe online, who was among the experts to testify. "It's not until they're pushed or there's pressure or there's reputational damage, or something, that they seem to move."

The hearing came in the wake of a New York Times report detailing how YouTube's recommendation algorithm automatically suggested home videos of kids to people watching sexually themed videos.

YouTube responded by changing its algorithm, noting that it had already removed about 800,000 videos that violated safety rules in the first quarter of 2019 and had disabled comments on videos featuring minors. It did not stop recommending videos featuring kids, however.
