
Thursday 26/February/2026 – 04:18 PM
Instagram has launched a new feature that notifies parents when their teenage children repeatedly search, over a short period, for terms related to suicide or self-harm, a step aimed at strengthening digital protections for minors.
Digital protection system for minors
The application, owned by Meta Platforms, explained that alerts will be sent to parents who have enabled the optional supervision settings if their children attempt to access content addressing sensitive topics related to self-harm.
This step comes at a time when calls are mounting globally to impose stricter restrictions on teenagers' use of social media platforms, especially after Australia's decision to ban social media for those under 16 years of age.
The United Kingdom also announced that it is considering tightening controls to protect children online, while Spain, Greece and Slovenia are considering similar measures.
The platform confirmed that it maintains strict policies prohibiting content that promotes or glorifies suicide or self-harm, noting that the new feature is part of a broader package of tools aimed at creating a safer digital environment for teenagers and strengthening the family's role in early monitoring and prevention.