Meta Platforms, Inc., the parent company of Instagram, is developing new features aimed at protecting underage users from explicit content and sextortion scams. One of the most prominent additions is a feature called "Nudity Protection in DMs," which automatically blurs images detected as containing nudity before they reach adolescent viewers.
The feature will automatically detect and obscure inappropriate images, prompt senders to reconsider before sharing such content, and point users to resources on handling sensitive images safely online. The nudity filter will be enabled by default for users under 18, while adults will receive a notification encouraging them to switch it on.
Meta has also expanded its data sharing with Lantern, a cross-platform child-safety program, to include additional signals related to sextortion. In sextortion scams, criminals coerce children or teenagers into sharing sexual images and then extort money from them by threatening to release the material.
Moreover, Meta will prompt young users to pause before sharing intimate images, reducing the chance that they become targets of fraudsters who exploit such images for manipulation. The company's stated aim is to make it harder for scammers to prey on teenagers. Meta is also developing technology to identify accounts potentially involved in sextortion and to restrict their ability to interact with young users. Users who have previously interacted with accounts implicated in sextortion will see pop-up messages directing them to relevant support resources.
The new protections will roll out gradually, with testing beginning in the coming weeks and global availability expected over the following months. Notably, Meta currently has no plans to extend these measures to its other platforms, such as Facebook Messenger or WhatsApp.