Meta has announced enhanced safeguards aimed at protecting teen users on its social media platforms, Instagram and Facebook.
The company disclosed on Thursday its plans to strengthen defenses against unwanted direct messages for teen users. The move, which covers both Instagram and Facebook, follows Meta's recent commitment to hiding more content from teens, in line with regulatory calls to shield children from potentially harmful content.
Notable Changes Include:
- Restricted Direct Messages on Instagram:
  - By default, teens will no longer receive direct messages from people they don't follow or aren't connected to.
  - Parental approval will be required for certain changes to app settings, giving younger users a more controlled and monitored experience.
- Messenger Protocol for Users Under 16:
  - Messenger, Meta's messaging app, will apply stricter controls to users under 16, and under 18 in certain regions.
  - Messages will be limited to those from Facebook friends or people connected through phone contacts.
- Age-Appropriate Messaging on Facebook:
  - Adults aged 19 and above will be barred from messaging teens who do not follow them, reinforcing the boundary between adult users and teens.
The step comes amid heightened regulatory scrutiny triggered by a former Meta employee's testimony at a U.S. Senate hearing. The ex-employee alleged that Meta was aware of harassment and harm to teens on its platforms but failed to take appropriate action.
These adjustments reflect Meta's stated commitment to addressing concerns about teen safety and well-being on its platforms. By tightening controls over direct messaging and requiring parental consent for certain settings, Meta aims to provide a safer digital environment for its younger users.