Meta is changing its security settings so that teenage users can no longer receive private messages from strangers by default, a policy change that comes as lawmakers and parents pressure social media companies over teenage mental health.
Facebook and Instagram’s parent company announced on Thursday that users 16 or younger will no longer be able to receive messages on Instagram from people they don’t know or be added to group chats by them. The change builds on a series of security and content-standard updates the company has rolled out to protect younger users, as Instagram and other platforms face growing scrutiny from lawmakers across the United States over their influence on teenagers.
“We’re taking additional steps to help protect teens from unwanted contact by turning off their ability to receive messages from anyone they don’t follow or aren’t connected to, by default,” Meta wrote in its blog post.
If a teenage user wants to message someone they don’t already follow, or to change these default settings, they will need approval from a parent through Meta’s parental supervision tools, provided those tools have been set up.
These features build on other parental controls the platform launched on Jan. 9. Users 16 or younger will be restricted by the Sensitive Content Control setting on Instagram and the Reduce setting on Facebook, which bar them from seeing or searching for content the platform deems harmful to them. For example, if a friend posted something about dieting, the post would be hidden from all adolescent users.
The change could affect a significant share of the time teenagers spend on social media: users 18 and younger spent an average of 65 minutes a day on Instagram in 2023, according to a study by the parental control software developer Qustodio.
The policy update was announced less than a week before several Big Tech CEOs, including Meta CEO Mark Zuckerberg, are scheduled to testify before the Senate Judiciary Committee about how their platforms handle teenage users.
More than 40 states sued Meta in the U.S. District Court for the Northern District of California in October, alleging that the company concealed the extent of the harm its apps caused teenagers by promoting addictive behavior and harmful content. New Mexico filed a separate suit against Meta in December, accusing it of hosting a “marketplace of predators” and failing to do enough to crack down on the sale of child sexual abuse material.
At least four states have attempted to restrict teenage access to social media by requiring platforms to verify a user’s age through copies of IDs or other means. The tech advocacy group NetChoice filed suits against age verification laws in California, Arkansas, Ohio, and Utah and obtained preliminary injunctions blocking the laws in all four states.