Opinion | How Meta can — or be forced to — avoid addicting kids

When 41 state attorneys general and the District of Columbia sued Meta, Facebook’s parent company, more than a month ago, their complaint was redacted to the point of illegibility. Now, an unredacted version has emerged, and it’s well worth the read. The state AGs claim, essentially, that Meta is exploiting its younger users for profit, privately prioritizing growth over teens’ well-being, even as it claims publicly that safety is paramount: “The lifetime value of a 13 y/o teen is roughly $270,” one internal company email counsels. This mind-set, the AGs say, informed the very design of the company’s products. And it has led executives, counter to internal research, to back off proposals that would improve those products by discouraging “problematic use” — a jargony way of saying addiction. (Meta has disputed that characterization.)

Obviously, Meta is a business, and moneymaking is what businesses do. Current law does little to restrain social media services from luring users down the rabbit hole, and, for the most part, that’s how it should be. Yet there is leeway for the government to place restrictions on products that harm children’s health. The type of problematic use the complaint describes (hours spent scrolling) is precisely the kind that research shows damages minds not yet fully formed.

The state AGs may nevertheless struggle in the courts.

The complaint is on the firmest legal footing in its charges relating to the Children’s Online Privacy Protection Act (COPPA), which sets strong restrictions on companies’ relationships with users under the age of 13. COPPA both restricts the ways platforms can collect and process data on those users and requires parental consent to create accounts. Because these rules are onerous, platforms generally prohibit under-13s from signing up at all — but, as any parent well knows, that doesn’t actually stop them. (CEO Mark Zuckerberg, the complaint alleges, received a report in 2015 that about one-third of all 10- to 12-year-olds in the country were on Instagram.)

This means that platforms end up collecting all the data they’re not supposed to gather without any of the consent they’re supposed to obtain. The state AGs contend that Meta’s platforms teem with tweens because the company isn’t searching particularly hard for them. The evidence they present, including unanswered reports of accounts by underage users, appears compelling; Meta predictably responds that the complaint is stuffed with “selective quotes and cherry-picked documents.” A judge will now decide, and COPPA’s strong language gives the AGs a substantial argument.

Less likely to prevail are the AGs’ arguments about children 13 or over — the users whom that disturbing internal company email concerned and whom Meta allegedly addicted with its products. COPPA does not protect them as it does younger children. When it comes to teens, the AGs’ filing may prove more revealing than it is effective at persuading a judge to force change.

The state AGs may struggle in the courts, but the excessive social media use they describe deserves attention from legislatures — careful attention. There are plenty of bad ways to prevent platforms from creating a health crisis for the country’s children, and many have already been tried by overzealous lawmakers: Rules that mandate the removal or forbid the promotion of particular types of content run up against the First Amendment. Similarly, policies that expose platforms to liability for generally “causing” a kid to become addicted to social media are far too broad. The wiser route is to focus on design: the little features of a platform’s makeup that push teen users toward overuse — and those that could pull them away instead.

Perhaps this means automatic alerts after a time threshold has been exceeded. Perhaps, at that same marker, apps should switch to grayscale mode, stripping away all color and, according to studies, shaking the user out of senseless screen-staring. “Circuit breakers” that prevent certain viral items from being displayed to too many users too often could also cut down on the spread of sensationalistic or otherwise emotionally aggravating material — all while remaining content-neutral. That may help address another form of problematic use, related to posts that engender suicidal ideation or body-image issues.

Any of these “nudges” toward healthier behavior ought to be grounded in careful science. And there’s room for companies to go beyond what the law requires. Many of them already are — and Meta will readily tell you it’s one of them. If the state AGs manage to prove that these efforts are less than earnest, they may still fail to win over a judge. Winning over Congress, however, would be the real victory.
