Instagram is implementing new safeguards for teenage users by restricting them to PG-13-rated content by default. Accounts belonging to users under 18 will automatically be placed in a restrictive mode that blocks content involving sex, drugs, dangerous stunts, and strong language, as well as content that could encourage potentially harmful behaviors such as marijuana use. Teens will need a parent's permission to opt out of these restrictions.
These changes come in response to ongoing criticism over the harms social media platforms, including Instagram, can inflict on children. While Meta has previously promised to filter out sensitive material such as self-harm or eating disorder content, recent reports indicate that teen accounts have nonetheless been recommended age-inappropriate sexual and otherwise harmful content. Meta has labeled such reports as misleading.
In addition to the default PG-13 setting, parents can opt for an even stricter "limited content" setting that further restricts what their children can see and interact with online. The new measures also include preventing teens from following accounts with inappropriate content or keywords, and expanding the filtering of search terms for sensitive topics. AI chatbots targeted at teens will also be programmed to provide age-appropriate responses.
Despite these announced changes, some child advocacy groups remain skeptical, viewing the announcements as a public relations effort.