Meta says it is reinforcing its digital safety commitment to teens by rolling out new protections on Facebook and Messenger, part of a sweeping update to its Teen Accounts framework that is now expanding globally, including to Africa.
The tech giant says the latest controls are “designed to foster safer online environments” for users under 18, by automatically limiting who can find or message them, filtering potentially sensitive content, and giving parents more visibility into their children’s digital habits.
“We want to make it easier for parents to have peace of mind when it comes to their teens’ experiences across Meta’s apps,” Meta says in a statement seen by Technology Times.

The expanded rollout is starting this week in the United States, United Kingdom, Canada, and Australia, with Meta confirming that parts of Africa are next in line to receive the update.
The move comes as Nigeria and other African nations are grappling with growing concerns over youth exposure to online exploitation, cyberbullying, and harmful digital content.
Under the new policy, teens are shielded from unwanted contact as Meta automatically restricts message requests and visibility from unknown users. Overnight notifications are also disabled, while usage prompts nudge teens to take screen-time breaks.
This development, the tech giant says, follows Meta’s 2024 revamp of Instagram’s teen experience, where the company introduced default privacy settings, restricted live video capabilities, and screen-time management tools.

According to Meta, more than 97% of users aged 13 to 15 on Instagram have kept the recommended safety settings in place—a signal that both teens and parents are embracing safer digital behaviours.
More than 54 million teenagers worldwide are now operating under these updated safety models, Meta says. With the latest update, Instagram is adding even more guardrails: teens under 16 will no longer be able to host live videos without parental consent, and the app will block the sending or receiving of suspected nude images in direct messages.
As a further moderation measure, a new feature will blur images flagged for nudity, and turning off that blur will require parental approval.
The safety enhancements reflect Meta's response to growing regulatory and societal pressure to protect young users online. As the changes begin their global rollout, African digital communities, particularly parents and educators, are being urged to take full advantage of the tools to help shape safer online experiences for teens.
Meta says it is positioning these updates as part of a broader vision where online platforms serve as empowering—not perilous—spaces for the continent’s growing base of digitally savvy youth.