November 1, 2024
Facebook parent company Meta is calling on Congress to pass legislation that would make age restrictions on its rivals’ app stores much tighter to limit inappropriate content for teenagers.

The company released a blog post on Wednesday backing requirements that app stores obtain parental permission before users aged 13 to 15 can download apps. That policy is an alternative to the approach favored by many state and federal lawmakers, which would have individual tech companies screen users to verify their ages.

The “best way to help support parents and young people is a simple, industry-wide solution where all apps are held to the same, consistent standard,” wrote Antigone Davis, Meta’s global head of safety.

“With this solution, when a teen wants to download an app, app stores would be required to notify their parents, much like when parents are notified if their teen attempts to make a purchase,” Davis wrote. “Parents can decide if they want to approve the download.”

Age verification has become a commonly proposed tool for limiting the harm that parents and experts say social media causes to teenagers’ mental health. Most proposals would place the verification burden on individual companies rather than on third-party distributors such as the Apple App Store or Google Play.

If such a law were passed, app stores, rather than app makers, would act as the gatekeepers for teenagers on the platforms. Both Google and Apple are already facing scrutiny in court over their alleged monopolies in digital app distribution.

The proposed regulations could also serve as an alternative to more platform-restrictive legislation such as the Kids Online Safety Act. KOSA, proposed by Sens. Marsha Blackburn (R-TN) and Richard Blumenthal (D-CT), would require platforms to take steps to prevent a defined set of harms to minors, including the promotion of suicide, sexual exploitation, and drug and alcohol abuse. It would also require social media companies to implement controls for users, including options for limiting screen time, restricting addictive features, and limiting access to user profiles. These management tools would default to the strictest settings for users younger than 16.

Multiple states have passed laws requiring websites that host pornographic content to verify users’ ages with a government ID. However, Texas’s age verification law has hit a wall after a federal judge issued a preliminary injunction in a lawsuit alleging that it is unconstitutional.
