Instagram is getting significantly tougher on the content it allows its teenage users to see, implementing a new PG-13-inspired system that includes stricter rules and blocked searches. This move by Meta is part of a broader effort to address long-standing safety concerns.
The new “13+” setting, which will be the default for all users under 18, will automatically hide or demote a wider range of content. This includes posts with strong language, dangerous stunts that could lead to injury, and imagery that normalizes harmful activities like substance use.
A key part of this tougher stance is the introduction of search blocks. Instagram will now prevent users from searching for sensitive terms, even when they are misspelled, cutting off a primary method teens might use to deliberately seek out inappropriate content.
This crackdown comes in response to immense pressure from the public and regulators, fueled by reports claiming the platform is unsafe for children. A recent study involving a former Meta whistleblower found existing tools to be largely ineffective, adding to the urgency for more drastic measures.
The new system will launch in the US, UK, Canada, and Australia before rolling out globally. While some have welcomed the tougher rules, safety advocates insist that their true value will only be known once independent verification proves they are effective.