Meta is attempting to encourage Instagram’s younger users to put down their phones in favor of a solid night’s rest. According to a January 18 Meta blog post, the company will begin showing “new nighttime nudges” for 13-to-17-year-olds after they spend more than 10 minutes scrolling through Instagram sections such as Reels or Direct Messages “late at night.”
“Sleep is important, particularly for young people,” Meta states—which, fair enough.
A sample app screenshot provided in Meta’s newsroom post depicts a black screen asking “Time for a break?” alongside the ever-so-slightly passive-aggressive “It’s getting late. Consider closing Instagram for the night.”
In an email provided to TechCrunch on Thursday, a Meta spokesperson confirmed Instagram will enable the new reminders after 10pm local time for some users. Although teens can’t disable the feature, they can simply ignore the message and continue scrolling through their feeds into the wee hours of the morning.
Meta’s “nighttime nudges” are the latest in a string of recently introduced oversight features aimed specifically at addressing long-running criticisms regarding social media’s harmful psychological effects on users—particularly younger audiences. Last week, the company announced impending plans to enforce new, mandatory Instagram and Facebook content restrictions for teens and minors. Established “in line with expert guidance,” the guidelines will institute privacy safeguards meant to block content related to self-harm, graphic violence, and eating disorders. A staggered rollout of Instagram’s and Facebook’s respective “Sensitive Content Controls” and “Reduce” features is expected to finish “in the coming months,” according to Meta’s January 9 update.
Before that, Meta instituted a suite of parental supervision tools in the latter half of 2023, including the ability for parents to see their children’s time spent on Facebook, Messenger, and Instagram, an option to schedule breaks, and access to teens’ blocked contacts lists. Last December, Meta also finally made good on its years-long promise to establish default end-to-end encryption protocols for its over one billion global Messenger and Facebook users.
The belated slow-drip of new self-regulations may not be enough to prevent continued public and political pressure, not to mention potential legal consequences. Meta CEO Mark Zuckerberg, along with the heads of TikTok, Snap, Discord, and X, is currently scheduled to testify at a Senate hearing pertaining to online child safety on January 31. Meanwhile, major social media providers still face a number of high-profile lawsuits filed by multistate coalitions accusing them of wantonly ignoring their products’ adverse effects on adolescent users in favor of corporate profits.