An “eraser button”? Focused ideas could help bridle Big Tech

WASHINGTON — Break up Big Tech? Shrink the liability shield that protects tech companies when content they push to users causes harm? Create a new regulator to oversee the industry?

These ideas have drawn attention from officials in the U.S., Europe and the U.K. as scrutiny of Facebook, which renamed itself Meta on Thursday, and of other giants such as Google and Amazon has intensified. Whistleblower Frances Haugen's revelations of deep-seated problems at Facebook, backed by a trove of internal company documents, have given momentum to regulatory and legislative efforts.

While regulators may still pursue sweeping steps such as breaking up companies or blocking their acquisitions, narrower and more achievable measures are also on the table, changes that people might actually notice in their social media feeds.

Lawmakers, trying to get creative, have introduced a series of bills aimed at taking Big Tech down a notch. One proposes an “eraser” button that would let parents instantly delete all personal information collected about their children or teens. Another would ban specific features for children under 16, such as push alerts, “like” buttons, follower counts and video auto-play. A third would prohibit collecting personal data from 13- to 15-year-olds without their consent, and a new digital “bill of rights” for minors would similarly limit the collection of personal data from teenagers.

Personal data collected from users of all ages sits at the core of the social media platforms’ lucrative business model: harvesting information and using it to sell ads targeted at specific groups of customers. For a social network giant valued at roughly $1 trillion like Facebook (er, Meta), that data is its financial lifeblood. Nearly all of its revenue comes from advertising sales, which amounted to about $86 billion last year.

Legislation curbing the collection of personal data from young people could therefore hit social media companies where it hurts: the bottom line. At a recent congressional hearing on child safety, executives from Snapchat, TikTok and YouTube endorsed such measures in principle but stopped short of pledging support for any existing bills. Instead, in classic Washington lobbyist-speak, they said they looked forward to working with Congress. Translation: They want to shape the proposals to their liking.

Senators pressing the issue say they keep hearing stories of teens who obtained dangerous opioids online, or who died by suicide after their depression and self-hatred were magnified by social media.

Of Haugen’s many criticisms of Facebook, her disclosure of internal company research showing that the Instagram photo-sharing app was harming some teens appears to have resonated most with the public.

Children’s well-being is one issue that unites Republican and Democratic legislators, who are otherwise divided over how to handle hate speech and perceived political bias on social media. “‘Won’t someone please consider the children?’ is one thing that brings Republicans and Democrats together,” said Gautam Hans, a Vanderbilt University professor and technology lawyer. “It’s very appealing on a bipartisan level.”

The U.K. is moving toward tighter rules to protect social media users, particularly younger ones, and members of the U.K. Parliament sought Haugen’s advice on how to improve British online safety legislation. On Monday, she appeared before a parliamentary committee in London, warning members about the dangers of social media companies using artificial intelligence to push “engaging” content to users.

European Union competition and privacy regulators have been far more aggressive than their American counterparts in reining in the tech giants, hitting some companies with multibillion-dollar fines and adopting sweeping new rules in recent years. This spring, the U.K. created a new regulator for Facebook and Google.

U.S. regulators only recently kicked into gear, when the Federal Trade Commission fined Facebook $5 billion and YouTube $170 million in separate cases over privacy violations. Last year, the U.S. Justice Department filed a landmark antitrust lawsuit against Google over its dominance of online search, and the FTC, joined by a coalition of states, brought a parallel antitrust case against Facebook, alleging it had abused its market power to crush smaller competitors.

U.S. lawmakers from both parties have introduced a wide range of proposals to crack down on social media, to target anti-competitive practices at Big Tech companies (possibly forcing breakups), and to open up to scrutiny the algorithms the platforms use to decide what appears in users’ feeds.

All of these proposals face a long slog through Congress before any become law.

For example, senior House Democrats introduced the Justice Against Malicious Algorithms Act about a week after Haugen testified that social media algorithms push extreme content to users and stoke anger to boost “engagement.” The bill would hold social media companies accountable by stripping away their Section 230 liability shield for tailored recommendations to users that are deemed to cause harm.

Even some experts who favor stricter regulation of social media warn that the legislation could have unintended consequences. It doesn’t clearly specify which algorithmic behavior would trigger the loss of liability protection, they say, making it hard to see how the law would work in practice and prompting wide disagreement over what it might actually do.

Paul Barrett, a New York University journalism professor, says the bill is written so broadly that even its authors may not fully grasp its reach, and suggests it could end up eliminating the liability shield almost entirely. Jared Schroeder, a First Amendment scholar at Southern Methodist University, said the bill has a noble purpose but that constitutional free-speech protections would likely thwart attempts to sue social media platforms.

Meta, Facebook’s newly renamed parent company, declined on Friday to comment on the legislative proposals. The company said it has long advocated for updated regulations, without offering specifics.

Mark Zuckerberg, the company’s CEO, has proposed changes under which internet platforms would receive legal protection only if they can show that their systems for identifying illegal content are up to the task. That requirement could be harder for smaller tech companies and startups to meet, leading critics to argue it would ultimately work in Facebook’s favor.