Meta Reinforces Safety Measures for Teens Amid Mounting Regulatory Pressure
In response to global regulatory scrutiny and mounting allegations of fostering a mental health crisis in youths, Meta Platforms announced a significant overhaul of content controls on its platforms.
On Tuesday, the social media giant revealed a set of safety measures to protect teenagers from harmful content on Instagram and Facebook, particularly content related to self-harm, suicide, and eating disorders. Teens will also be shielded from sensitive content when using features like Search and Explore on Instagram.
Meta's new policies include concealing "age-inappropriate content" from teenagers, and teens will be placed on the most restrictive news feed settings by default.
Meta will also prompt teen users to review their privacy settings, addressing concerns about adult strangers sending them messages. Together, these changes are intended to create a safer digital environment for teenagers across Meta's platforms.
Meta also announced a rollout timeframe, saying the new features will become available in the next few weeks as part of its commitment to a more "age-appropriate" experience on its platforms.
Meta's Response a Result of Mounting Pressure in US and Europe
Notably, Meta has been facing intensified regulatory pressure regarding content moderation for young users in both the US and Europe.
The allegations state that Meta's platforms are addictive to young minds and lead to mental health issues.
The EU has sought information from Meta about its mechanisms for protecting children from harmful and illegal content. In October, attorneys general from 33 U.S. states, including New York and California, filed a lawsuit accusing the company of habitually misleading the public about the harms associated with its platforms.
Meta's move also follows testimony from former employee Arturo Bejar before the U.S. Senate, in which he alleged that the company was aware of the harassment teens faced on its platforms but took no remedial action.
Speaking to the media, he stated, "This should be a conversation about goals and numbers, about harm as experienced by teens."
Meta has also been facing fierce competition from TikTok for young users. Its continued efforts to retain a young audience come as a 2023 Pew Research Center survey found that 63% of US teens used TikTok, 59% used Instagram, and just 33% used Facebook.
Series of Allegations Made Meta Buckle Under Pressure
Court documents further allege that Meta knowingly refused to shut down accounts belonging to children under 13 and failed to seek parental consent before collecting minors' personal information.
Meta faced another lawsuit in December from New Mexico's Attorney General, who accused it of nurturing a "breeding ground" for predators targeting children. This sustained pressure prompted Meta to restrict sensitive content for users under 18.