Congress' tech plate is full, with little time at the table

By Gopal Ratnam, CQ-Roll Call


The legislation would give consumers rights over their data and also give them the right to sue companies for violations of privacy. Small businesses, not yet defined in the proposal, would be exempt from the bill’s provisions.

The legislation would preempt a patchwork of state laws and create a national standard, but state attorneys general would have the authority to enforce the law, as would the Federal Trade Commission. The House Energy and Commerce Committee plans to hold a subcommittee hearing this week on the proposal, but the Senate Commerce Committee has yet to schedule one.

A congressional aide said the Senate committee’s staff was working to drum up support.

Maryland measures

While Congress debates how to proceed on a federal privacy measure, one state passed legislation with tougher provisions than most others that have enacted laws of their own.

Maryland’s legislature earlier this month passed a measure that would limit companies to collecting only the data that is necessary for a transaction and would restrict the collection of sensitive data such as location details. If signed by Gov. Wes Moore, the law would go into effect in October 2025.

The data minimization requirements in Maryland’s law go further than legislation enacted in 14 other states and are similar to what Congress is considering in the Cantwell-Rodgers proposal.

Other states have required companies to get users' consent before collecting the data a business deems necessary. Maryland, by contrast, has avoided the consent approach, said Keir Lamont, director of U.S. legislation at the Future of Privacy Forum, an advocacy group focused on data privacy.

The legislation’s effect is to “deemphasize consent, generally, and instead put default limits on what data can be collected,” Lamont said in an interview. “This is a new approach to protecting privacy at the state level. If we continue to see other states adopt similar language, that may impact the federal conversation about privacy legislation.”


States’ efforts on artificial intelligence systems and kids’ online safety may impel federal legislation as well.

Four states — California, Connecticut, Louisiana and Vermont — have enacted laws to protect individuals from any unintended, yet foreseeable, impacts of unsafe or ineffective AI systems, according to the Council of State Governments, a nonpartisan group that promotes states’ efforts. Other states are requiring companies to disclose to users when they use AI systems in hiring or in other decision-making roles, according to the council.

Congress also has been debating tightening rules to ensure the safety and privacy of kids using online platforms, but those efforts have yet to yield enacted legislation. Two bipartisan bills addressing kids’ online safety were approved by the Senate Commerce Committee last year, but they have yet to get a floor vote.

States yet again are stepping into the breach, Lamont said.

Maryland’s legislature passed a separate measure earlier this month on what’s known as age-appropriate design that would require tech companies to craft their platforms with privacy as a default option and ensure that services are fit for the age of the users.

The measure is similar to a 2021 law enacted by California that has been challenged in courts by NetChoice, a tech industry trade group. Although a U.S. District Court halted parts of the law, California has appealed the ruling.

Vermont also is considering an age-appropriate design law.

“States are increasingly stepping up and legislating on topics that might otherwise be left to the federal government,” Lamont said.

©2024 CQ-Roll Call, Inc. Visit at rollcall.com. Distributed by Tribune Content Agency, LLC.

