Meta Platforms Inc. has been ordered to pay $375 million in civil penalties by a New Mexico jury, which found the company failed to adequately protect children from sexual predators and misled users about the safety of its social media platforms. The verdict, reached Tuesday after six weeks of testimony, represents a significant legal setback for the tech giant amid growing concerns over online child safety.
Jury Finds Meta Violated New Mexico Consumer Protection Laws
The civil trial, initiated by New Mexico Attorney General Raúl Torrez, centered on allegations that Meta violated state consumer protection laws by failing to protect users of its family of apps, including Facebook and Instagram, from child predators. The jury concluded that Meta willfully violated the state's unfair practices act.
Torrez sued Meta in 2023 following an undercover operation that involved creating a fake social media profile of a 13-year-old girl. According to Torrez, this profile was quickly "inundated with images and targeted solicitations" from child abusers.
Allegations of Prioritizing Growth Over Child Safety
State attorneys argued that Meta did not enforce its stated minimum age requirement of 13 and failed to prevent harmful content and predatory behavior on Facebook and Instagram. The lawsuit claimed that the platforms' internal systems and recommendation algorithms made it easier for predators to find and contact minors.
“The safety issues presented in this case were not accidents,” state attorney Linda Singer told jurors during closing arguments. “They were the result of corporate decisions that prioritized growth and engagement over the safety of children.”
Undercover Operation Reveals Exploitation Risks
Investigators said the state conducted a sting operation using test accounts created to simulate young users. According to court filings, those accounts quickly received explicit messages, sexual images, and offers from adults, including proposals involving pornography and large payments. Authorities stated that the investigation led to several arrests connected to online exploitation cases.
Former Meta Employee Testifies on Platform Risks
During the trial, jurors heard testimony from former Meta safety researcher Arturo Béjar, who described how his teenage daughter received inappropriate messages shortly after opening an Instagram account. He testified that the platform's recommendation system could inadvertently connect minors with adults seeking to exploit them.
“The product is very effective at connecting people with similar interests,” Béjar said. “If someone’s interest is children, the system can connect them with children.”
Internal Warnings About Harmful Activity Ignored?
Court documents also revealed internal company communications warning executives about the scale of harmful activity on the platforms, including estimates suggesting hundreds of thousands of potential exploitation incidents daily across Facebook and Instagram. These revelations raise questions about Meta's responsiveness to the documented risks.
Meta Plans to Appeal the Verdict
Meta has stated its disagreement with the verdict and plans to appeal. In a statement, the company said it has invested heavily in safety tools and employs tens of thousands of workers dedicated to removing harmful content and protecting users.
“We work hard to keep people safe on our platforms,” a Meta spokesperson said. “We are clear about the challenges of identifying bad actors online, and we will continue improving our systems.”
Meta’s lawyers argued during the trial that the company has developed advanced automated detection tools and has committed significant resources to online safety.
Broader Implications for Social Media Regulation
The New Mexico case is one of several lawsuits across the United States seeking to hold major technology companies responsible for harmful content on social media platforms. Similar cases are ongoing in other states, including California, where Meta and Google’s YouTube face claims related to social media addiction and youth mental health.
Legal experts say the ruling could influence future cases involving Big Tech companies and may lead to stricter regulation of social media platforms, particularly regarding the protection of minors. The company’s appeal is expected to take months, and the final outcome could have wide-ranging implications for the technology industry.
Parental Guidance and Online Safety Resources in Negros Oriental
While the New Mexico case highlights the serious risks associated with social media platforms, it also underscores the importance of parental guidance and education regarding online safety. Parents and guardians in Negros Oriental can take proactive steps to protect children from online exploitation, including:
- Openly communicating with children about online safety risks and responsible social media use.
- Monitoring children's online activity and social media accounts.
- Utilizing parental control features available on social media platforms and devices.
- Educating children about the dangers of sharing personal information online and interacting with strangers.
- Reporting any instances of online exploitation or abuse to the appropriate authorities.
Several organizations and resources are available to provide support and guidance on online safety, including:
- The National Center for Missing and Exploited Children (NCMEC)
- The Internet Watch Foundation (IWF)
- Local child protection agencies in Negros Oriental
Community Awareness Programs Needed
The case serves as a reminder of the need for continued community awareness and education programs on online safety and child protection. Schools, local government units, and community organizations in Negros Oriental can play a crucial role in raising awareness among parents, children, and educators about the risks associated with social media platforms and the steps that can be taken to mitigate those risks.
Photo credit: Pixabay
