The U.S. Department of Justice issued a new set of legislative proposals on Wednesday aiming to curtail the broad legal protections that shield tech giants like Facebook, Twitter and YouTube from being held accountable for what their users post.
The proposals target Section 230 of the Communications Decency Act, a law that, since 1996, has also given tech platforms and websites leeway to moderate content as they see fit. Key changes recommended by the Justice Department include pulling immunity for sites that facilitate criminal activity, like selling drugs, and rules compelling platforms to take action against terrorism and child exploitation.
Platforms could also be sued if they remove content or ban users who did not break rules explicitly stated in their terms of service. Congress would need to pass the proposals for them to take effect.
Wednesday’s proposals come a few weeks after President Donald Trump signed an executive order “to defend free speech” against the content moderation policies of several major tech companies. Trump signed the executive order a few days after Twitter added fact-check notifications to a few of his tweets on mail-in voting. The executive order called on the Commerce Department to petition the Federal Communications Commission to review Section 230. If Section 230 were to be changed, it could become a major liability for Twitter, Facebook and Google-owned YouTube.
Still, the president’s executive order lacked “real teeth,” Fordham law professor Olivier Sylvain recently told TheWrap, in large part because the president does not have the authority to alter Section 230 on his own.
Trump has been critical of the “shield” Section 230 provides tech giants — something the Justice Department, led by Attorney General William Barr, appears to agree with.
That legal shield was put there for a reason, though, according to Jeff Kosseff, a law professor at the U.S. Naval Academy and author of “The Twenty-Six Words That Created the Internet,” a book on Section 230. Last year, Kosseff told TheWrap the law was adopted to give sites “breathing room” to moderate content.
“Congress passed it because they did not want the platforms to be hands-off,” Kosseff said. “They gave platforms the tremendous flexibility to moderate objectionable content. That was a policy choice that Congress made.”