Experts say balancing the First Amendment and data privacy is key to making social media safer for children

This session, Pennsylvania lawmakers have introduced a number of bills aimed at making the internet a safer place for children, with a particular focus on social media companies.

For example, a bill that passed the House earlier this month would require minors to get parental consent before opening a social media account and would also require social media companies to develop policies to deal with what the bill calls “hateful conduct.”

“It is long past time for us to act on behalf of children and parents and install at least some safeguards and guardrails to protect our children,” Rep. Brian Munroe (D-Bucks), the sponsor of House Bill 2017, told his colleagues.

While numerous studies have raised concerns about the impact of social media use on teenagers’ mental health, and political scientists have warned about social media’s potential to deepen social divisions, free speech advocates say bills like Munroe’s could conflict with the First Amendment rights of social media companies and their users.

Meanwhile, advocates of data privacy-focused tech policy warn that bills requiring age verification could lead social media companies to collect even more sensitive data from users.

Similar laws passed in other states, such as Ohio, have been challenged or blocked in court.

The same concerns apply to other bills that address perceived problems with social media. One bill would ban TikTok in Pennsylvania over concerns about Chinese government influence; a federal law that could ban the platform prompted a lawsuit from TikTok and its parent company earlier this month.

A bill similar to Munroe’s has been introduced in the Senate. Another would bar people under 18 from accessing pornographic websites, effectively requiring adult users to show ID to prove they are not minors, or, more troublingly, encouraging the collection of biometric data to estimate a user’s age.

Advocates who warn about the consequences of laws that require social media companies to set rules about what can and cannot be posted on their platforms, or that require age verification for users, say lawmakers have other ways to address their concerns.

Sophia Cope is a senior staff attorney at the Electronic Frontier Foundation, a nonpartisan advocacy group focused on digital civil liberties. Cope believes lawmakers have reason to be concerned about the impact of social media, particularly on young users, but says too many laws address the issue by limiting the free speech rights of social media companies and their users.

“I think there are some very well-intentioned bills out there, but they tend to be too broad,” Cope said. “Whenever you try to regulate anything that has to do with speech, you’re always going to run into the Constitution.”

Cope noted that many states have attempted to regulate content on social media platforms, often driven by the partisan leanings of their legislatures. Texas and Florida, for example, passed bills aimed at stopping what their Republican-majority legislatures saw as censorship of conservatives. New York, on the other hand, passed a bill that, like Munroe’s, would require companies to develop hate speech policies. New York’s law was blocked by a federal district court, and the Florida and Texas laws are being challenged in the U.S. Supreme Court.

Cope views all of these attempts as violations of companies’ First Amendment rights. She suggested that one way lawmakers could protect young users is to make digital literacy classes mandatory.

“How do you tell a fake website from one that’s not, how do you adjust your privacy settings, what are all the rules you need to follow to stay safe online,” Cope said. “Where is that training?”

Like Cope, Kate Ruane, director of the Free Expression Project at the Center for Democracy and Technology, is concerned about lawmakers’ attempts to directly address how social media companies moderate content.

“I’m particularly concerned about the government telling anyone what can and cannot be available on a social media platform,” Ruane said.

Ruane said lawmakers could instead focus efforts on tackling the incentive structure that can lead to the spread of harmful content.

“Comprehensive consumer privacy legislation can go a long way toward addressing some of these major concerns,” Ruane said. “People often point out that these companies are incentivized to keep you on the platform by showing you whatever will personally keep you there. If they no longer have that profit motive, or if that profit motive changes significantly, that should, in theory, change the incentive structure behind how the services are designed.”

Ruane pointed to Google as a possible example of how such regulations can lead to a better and safer product.

“They measure quality based on relevance and other factors that are supposed to lead to better search results because their goal is better search results,” Ruane said. “If the goal is solely engagement, you measure quality by engagement.”

One part of Munroe’s bill would directly address data privacy concerns by making it illegal to collect data on users under 18 for the purpose of serving targeted ads or selling it to third parties.

Ruane, Cope and representatives from the Pennsylvania chapter of the American Civil Liberties Union believe those parts of the bill could go further and could be separated from the provisions that raise data privacy and free speech concerns.

“Honestly, why are we limiting it to just children?” Liz Randol, legislative director of the ACLU of Pennsylvania, said in an interview earlier this month.

While federal lawmakers have failed to advance national data privacy laws, other states have led the way.

For example, Illinois has a law that prohibits companies from collecting users’ biometric data without their consent in most circumstances. The law led to a successful $650 million class-action lawsuit against Facebook (now Meta) after the company used image-matching technology to identify people in photos posted on its platform.

In 2018, California passed one of the nation’s broadest digital privacy laws, requiring websites to inform users about how their personal information is shared with third parties and to allow users to opt out of the sale of their data to third parties. The law applies not only to minors, but to all Internet users.

Ruane said lawmakers could also require social media companies to be more transparent about how they use algorithms to steer users toward content, and about how they moderate that content.

Amid growing concerns about the way social media algorithms shape political discourse and culture, some of the largest social media companies, such as Meta and X (formerly Twitter), have taken steps to limit researchers’ access to data about what users view and share on their sites.

Research has drawn mixed conclusions about how social media algorithms influence user behavior, particularly when it comes to political polarization, but advocates like Ruane argue that requiring companies to be transparent about decisions about what content to show users would allow for more informed use of their products.

The hope with algorithmic transparency laws is that if users are aware of how their attention is being directed, they will be able to make more informed decisions about where they spend their time online and think critically about what they see and why.

“Being able to closely examine how these content moderation systems work and how they impact users of social media services… helps us change the situation and make recommendations that can improve these services,” Ruane said. “But the government’s control over these services is deeply concerning.”
