Meta rolls out new privacy updates in bid to keep teen users safe
Social media giant Meta has introduced new online safety precautions to protect teenage users.
Last year, Meta rolled out the first of these measures to keep teenagers away from potentially suspicious adults. For instance, the company prevents adults from messaging minors they are not connected to and from seeing teens in its People You May Know suggestions.
On Monday, Meta said it would add new safeguards to its social networking sites Facebook and Instagram to protect young users from harm online. In a blog post, the company announced that anyone under 16 (or under 18 in certain countries) will now default to more private account settings.
The company is also testing further protections that stop suspicious adults from contacting teens they are not connected to and keep those adults out of teens' People You May Know suggestions.
As an additional layer of security, the company is removing the message button from teen Instagram profiles entirely when suspicious adults view them. According to Meta, a suspicious account is an adult account that has recently been reported or blocked by a young person.
The updated tools come alongside stricter default privacy settings and new safety notifications. The privacy options let teens control who can view their friends list, who can see posts they're tagged in, and who can comment on their public posts.
Meta noted, “We’re working with the National Center for Missing and Exploited Children (NCMEC) to build a global platform for teens who are worried intimate images they created might be shared on public online platforms without their consent.”
Meta is also developing new tools and education to combat the online spread of self-generated intimate images. The company is collaborating with Thorn and its NoFiltr brand to create educational resources that help young people reduce the shame and stigma associated with intimate images.
Meta noted, “We found that more than 75% of people that we reported to NCMEC for sharing child exploitative content shared the content out of outrage, poor humor, or disgust, and with no apparent intention of harm. Sharing this content violates our policies, regardless of intent.
We’re planning to launch a new PSA campaign that encourages people to stop and think before resharing those images online and to report them to us instead.”
Meta has also built a variety of tools that let teens tell the platform when something in one of its apps makes them uncomfortable, and new notifications encourage teens to use them.
For instance, the company sends teens safety notices with guidance on handling abusive messages from adults, and prompts them to report a profile to the platform when they block it.
Over 100 million users viewed safety notifications on Messenger in a single month in 2021. The company has also made its reporting tools easier to find. As a result, minors submitted over 70% more reports on Instagram DMs and Facebook Messenger in Q1 2022 than in Q1 2021.