As concerns grow about the harmful effects of social media on teens, platforms from Snapchat to TikTok to Instagram are rolling out new features they say will make their services safer and more age-appropriate. But the changes rarely address the elephant in the room: the endless content-pushing algorithms that can drag anyone, not just teenagers, down a harmful rabbit hole.

The tools offer some assistance, such as preventing strangers from sending messages to children. But they also share some deep flaws, starting with the fact that teens can get around age limits simply by lying about their age. The platforms also place the burden of enforcement on parents. And they do little or nothing to screen for the inappropriate and harmful content served up by algorithms that can affect the mental and physical well-being of teens.

“These platforms are aware that their algorithms can sometimes amplify harmful content, and they are not taking steps to stop it,” said Irene Lee, privacy counsel at the nonprofit Common Sense Media. The longer teens keep scrolling, the more engaged they are, and the more engaged they are, the more profitable the platforms become, she said. “I don’t think they have much incentive to change that.”

Take Snapchat, for example, which on Tuesday introduced new parental controls called “Family Center,” a tool that lets parents see who their teens are messaging, though not the content of the messages themselves. One catch: both parents and their children have to opt in to the service.

Snap’s director of platform policy and social impact, Nona Farahnik Yadgar, likens it to parents who want to know who their kids are hanging out with.

If kids are visiting a friend’s house or meeting up at the mall, she said, parents will usually ask, “Hey, who are you going to meet up with? How do you know them?” The new tool, she said, is meant to give parents “the insight into exactly how to have these conversations with their teens while preserving the teen’s privacy and autonomy.”

These conversations, experts agree, are important. In an ideal world, parents would regularly sit down with their children and have honest conversations about the dangers and pitfalls of social media and the online world.

But many kids use a variety of platforms, all of which are constantly evolving, and that stacks the odds against parents trying to monitor controls across each one, said Josh Golin, executive director of the children’s digital advocacy group Fairplay.

“Platforms need to be designed to be safer by default rather than adding to the workload of already overburdened parents,” he said.

The new controls, Golin said, also fail to address the myriad existing problems with Snapchat, from kids misrepresenting their ages to the “compulsive use” encouraged by the app’s Snapstreak feature to cyberbullying, which is made easier by the disappearing messages that still serve as Snapchat’s claim to fame.

Farahnik Yadgar said Snapchat has “strong measures” in place to stop kids from falsely claiming they’re over 13. Those caught lying about their age have their accounts removed immediately, she said. Teenagers over 13 who pretend to be even older get a chance to correct their age.

Detecting such lies is not foolproof, but the platform has several ways of getting at the truth. For example, if a user’s friends are mostly teens, it’s likely that the user is also a teenager, even if they said they were born in 1968 when they signed up. Companies also use artificial intelligence to spot age mismatches, and a person’s interests can reveal their real age. And, as Farahnik Yadgar pointed out, parents might even discover that their kids fibbed about their birth dates if they try to turn on parental controls only to find their teens ineligible.

Child safety and adolescent mental health are front and center in both Democratic and Republican criticism of tech companies. States, which have been more aggressive about regulating technology companies than the federal government, are also turning their attention to the matter. In March, several state attorneys general launched a nationwide investigation into TikTok and its potentially harmful effects on the mental health of young users.
