Meta strengthens parental protection on Instagram and Facebook Messenger

Meta has announced a series of changes to Instagram and Facebook Messenger to better protect minors from unwanted contact online. The company said that, by default, children under the age of 16 (or 18 in some countries) will no longer be able to receive messages from, or be added to chat groups by, users they don’t follow or aren’t connected to.

These new measures add to a number of safeguards Meta has introduced in recent months in response to allegations that its platforms have facilitated encounters between sexual predators and minors. Unlike previous restrictions, which only prevented adults over the age of 19 from contacting minors without their consent, the new rules will apply to all users, regardless of age. Meta said Instagram users will be notified of the change via a message at the top of their feed. Minors with supervised accounts will need to request permission from the parent or guardian monitoring their account to change this setting.

Parental supervision tools on Instagram will also be expanded. Instead of simply being informed when their child changes security and privacy settings, parents will now be asked to approve or deny such requests, allowing them to prevent a child from switching from a private to a public profile, for example.

Meta also announced that it is developing a new feature designed to protect users from sending or receiving unwanted or inappropriate images in messages. The feature will also work in encrypted chats and is due to be released later this year.

These measures represent an important step forward in protecting children online and may encourage other social media platforms to follow Meta’s lead.