Meta, the parent company of Facebook and Instagram, has introduced new measures to protect its young users from harmful content and abuse, particularly sextortion scams. The measures focus on teen accounts, adding stricter controls on who can follow and message them.
To enhance safety, Meta will send notifications through Instagram Direct and Facebook Messenger when it detects suspicious conversations. The company is also restricting suspected scammers’ access to users’ follower lists and preventing them from taking screenshots of private messages.
Additionally, Meta is rolling out a nudity protection feature that uses on-device machine learning to analyze images sent in Instagram Direct messages. The feature blurs potentially nude images and warns teens before they send or share such content. In the U.S. and the U.K., Instagram will also show videos in teens’ feeds to help them recognize sextortion scams, which often involve individuals who act overly friendly, ask for photo exchanges, or try to move the conversation to other apps.
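For illustration only, the sketch below shows how an on-device check of this general kind might work in principle: a local classifier scores an image and, above a confidence threshold, the image is blurred and a warning is surfaced before anything is sent. The classifier stub, threshold value, and function names are hypothetical; Meta has not published its implementation.

```python
# Hypothetical sketch of an on-device nudity check, not Meta's actual code.
from PIL import Image, ImageFilter

NUDITY_THRESHOLD = 0.8  # assumed confidence cutoff, chosen for illustration


def nudity_score(image: Image.Image) -> float:
    """Placeholder for an on-device ML classifier.

    A real implementation would run a compact vision model locally and
    return the probability that the image contains nudity.
    """
    return 0.0  # stub value so the example runs without a trained model


def prepare_outgoing_image(path: str) -> Image.Image:
    """Blur the image and surface a warning if the classifier score is high.

    Returns the (possibly blurred) preview shown in a send-confirmation
    prompt; nothing is uploaded automatically.
    """
    image = Image.open(path)
    if nudity_score(image) >= NUDITY_THRESHOLD:
        print("Warning: this image may contain nudity. Send anyway?")
        return image.filter(ImageFilter.GaussianBlur(radius=25))
    return image
```

Running the check locally, as in this sketch, keeps the image analysis on the user’s device rather than sending photos to a server for screening.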
The FBI reports a significant increase in sextortion cases, especially among teenage boys, identifying about 12,600 victims between October 2021 and March 2023; some of these cases ended in suicide. Meta’s initiative comes as the company faces growing criticism over its platforms’ impact on youth mental health, including concerns about addiction, depression, and cyberbullying.