Instagram is rolling out new tools to help protect against sextortion and intimate image abuse. The social media platform will test new features designed to protect teenagers from nude image sharing, scams, and contact from criminals.
Instagram DMs are often used for innocuous purposes like sharing cute puppy videos or food snapshots. Unfortunately, some individuals have ulterior motives beyond typical flirtation.
Scammers exploit Instagram DMs to initiate contact and solicit or distribute intimate images, creating a concerning environment for users.
Instagram recently announced plans to combat this issue by testing a new nudity protection feature, aiming to address the misuse of its platform.
One of the key features, unveiled on April 11 via its website, involves automatically blurring images flagged for nudity, allowing recipients to decide whether to view them. This blur is applied using on-device machine learning, ensuring user privacy. Meta has no access to the actual photo.
“We’ll also show them a message encouraging them not to feel pressure to respond, with an option to block the sender and report the chat,” Instagram says.
Another feature will prompt users to reconsider before sending such photographs, with a reminder that the images can be unsent if the sender changes their mind.
Instagram adds: “Anyone who tries to forward a nude image they’ve received will see a message encouraging them to reconsider.”
Instagram aims to stop the exposure to "unwanted nudity in their DMs" while also safeguarding users from scammers who send nude images to coerce recipients into sharing their own in return.
They continue: “When sending or receiving these images, people will be directed to safety tips, developed with guidance from experts, about the potential risks involved. These tips include reminders that people may screenshot or forward images without your knowledge, that your relationship to the person may change in the future, and that you should review profiles carefully in case they’re not who they say they are.”
Instagram plans to connect users with support sites like StopNCII.org and Take It Down, catering to both adults and teens.
The new feature will be enabled by default for users under 18 worldwide, while adults will see a notification encouraging them to turn it on.
Additionally, Instagram is developing technology to detect accounts potentially engaging in sextortion scams, building on measures introduced in January that restrict messaging for users under 16, or under 18 in certain regions.
They add: “Now, we won’t show the ‘Message’ button on a teen’s profile to potential sextortion accounts, even if they’re already connected. We’re also testing hiding teens from these accounts in people’s follower, following and like lists, and making it harder for them to find teen accounts in Search results.”
Instagram is also sharing more sextortion-specific signals with Lantern, a program managed by the Tech Coalition that enables tech companies to exchange signals about accounts and behaviors that violate child safety policies. Lantern passes these signals to partner companies so they can act accordingly.
The new features will roll out in the coming weeks and months.