Instagram, the popular social media platform owned by Mark Zuckerberg’s Meta, is set to launch a new safety feature aimed at protecting minors from receiving unwanted nude images in their direct messages.
The Verge reports that Zuckerberg’s Meta announced on Thursday that it will soon roll out a feature designed to blur images detected as containing nudity and to discourage users from sending such content to children. The feature, which will be enabled by default for teenage Instagram users based on their account’s birthday information, aims to create a safer environment for the platform’s youngest users. Adult users will also be encouraged to turn on the protection via a notification.
The announcement comes amidst…