
Meta Had a 17-Strike Policy for Sex Trafficking Accounts
Meta's internal policies for user accounts involved in sex trafficking have come under scrutiny, revealing a troubling 17-strike policy that allowed accounts engaged in such activity to keep operating for an extended period before facing suspension.
Background of the Testimony
The allegations surfaced during a deposition by Vaishnavi Jayakumar, Meta's former head of safety and wellbeing. Her testimony was part of an unredacted court filing in a major lawsuit over child safety on social media platforms, brought by school districts across the United States seeking stronger protections against online exploitation.
The 17-Strike Policy
According to Jayakumar, Meta’s policy allowed accounts involved in the trafficking of humans for sex to incur up to 16 violations before facing suspension. She stated, “That means that you could incur 16 violations for prostitution and sexual solicitation, and upon the 17th violation, your account would be suspended.” Jayakumar characterized this threshold as “very high” compared to industry standards, raising significant concerns about Meta’s commitment to user safety.
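To make the arithmetic of that threshold concrete, the sketch below models it as a simple strike counter. It is a hypothetical illustration only: the Account type, record_violation function, and SUSPENSION_THRESHOLD constant are invented for this example and do not describe Meta's actual enforcement systems.

```python
# Hypothetical sketch of a strike-threshold rule like the one described
# in the testimony. All names here are invented for illustration; this
# is not Meta's actual code or policy engine.

from dataclasses import dataclass

SUSPENSION_THRESHOLD = 17  # per the testimony: suspension on the 17th violation


@dataclass
class Account:
    user_id: str
    strikes: int = 0
    suspended: bool = False


def record_violation(account: Account) -> None:
    """Count a violation; suspend only once the threshold is reached."""
    if account.suspended:
        return
    account.strikes += 1
    # Violations 1 through 16 carry no account-level penalty under this
    # rule; only the 17th strike triggers suspension.
    if account.strikes >= SUSPENSION_THRESHOLD:
        account.suspended = True


if __name__ == "__main__":
    account = Account(user_id="example")
    for _ in range(16):
        record_violation(account)
    print(account.suspended)  # False: 16 violations, still active
    record_violation(account)
    print(account.suspended)  # True: suspended only on the 17th
```

Under a rule like this, an account flagged 16 times remains fully active, which is why Jayakumar describes the threshold as "very high" relative to industry norms.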
Lawyers for the plaintiffs assert that internal documentation corroborates Jayakumar's account of the policy. The implications are profound: such a threshold raises questions about whether Meta prioritized engagement metrics over user safety. Critics argue that tolerating so many violations before acting on accounts tied to serious criminal activity points to a systemic failure to protect vulnerable users.
Concerns Over Reporting Mechanisms
In addition to the 17-strike policy, the unredacted filing revealed other alarming issues regarding Meta’s approach to child safety. Jayakumar noted that there was no specific mechanism for Instagram users to report child sexual abuse material (CSAM) on the platform. This lack of a clear reporting structure raises significant concerns about the efficacy of Meta’s efforts to combat child exploitation.
Internal Resistance to Improvements
When Jayakumar became aware of the absence of a reporting mechanism, she reportedly raised the issue multiple times. However, she was met with resistance, as company leadership deemed it “too much work” to implement a solution. This response illustrates a troubling pattern within Meta, where the prioritization of user engagement appears to overshadow the need for robust safety measures.
Legal and Regulatory Pressures
Meta is currently under increasing legal and regulatory scrutiny regarding child safety on its platforms. The unredacted filing is part of a larger lawsuit that also targets TikTok, Google, and Snapchat. The plaintiffs, which include dozens of school districts, attorneys general, and concerned parents, argue that these platforms contribute to a “mental health crisis” among youth by promoting addictive and potentially harmful content.
Meta’s Response to Allegations
In response to the allegations, Meta spokesperson Andy Stone issued a statement to The Verge, asserting that the company “strongly disagrees” with the claims made in the lawsuit. Stone characterized the allegations as relying on “cherry-picked quotes and misinformed opinions” that create a misleading narrative about Meta’s practices. He emphasized that the company has made significant efforts over the past decade to listen to parents and implement changes aimed at protecting teens.
Engagement vs. Safety: A Troubling Trade-off
The lawsuit highlights multiple instances where Meta is accused of downplaying the potential harms of its platforms in favor of maximizing user engagement. For example, in 2019, the company considered making all teen accounts private by default to prevent unwanted messages. However, this proposal was reportedly rejected after the growth team determined it would likely “smash engagement.” It wasn’t until last year that Meta began implementing private accounts for teens on Instagram, suggesting a reactive rather than proactive approach to user safety.
Impact of Engagement Metrics on Policy Decisions
Further allegations indicate that Meta's internal research has often been sidelined when it conflicts with engagement metrics. For instance, the company found that hiding like counts on posts left users "significantly less likely to feel worse about themselves." Despite this finding, plans to implement the change were reportedly abandoned after it was determined that it could negatively impact Facebook's metrics.
Additionally, the lawsuit claims that Meta reinstated beauty filters in 2020, despite research indicating that such features actively encourage body dysmorphia among young girls. The company allegedly stated that removing these filters could have a “negative growth impact,” as any restrictions might drive users away from the platform. This pattern of prioritizing engagement over user well-being raises serious ethical questions about Meta’s operational priorities.
Stakeholder Reactions
The revelations from the unredacted filing have prompted strong reactions from various stakeholders, including child safety advocates, parents, and mental health professionals. Many are calling for more stringent regulations on social media platforms, emphasizing the need for accountability in how these companies manage user safety.
Calls for Regulatory Action
Advocates for child safety argue that the current regulatory framework is inadequate to address the complexities of online platforms and their impact on youth. They are urging lawmakers to consider more comprehensive legislation that would impose stricter requirements on social media companies regarding user safety and data privacy. The ongoing lawsuit against Meta and other platforms may serve as a catalyst for such regulatory changes, as it highlights the urgent need for reform in how these companies operate.
Conclusion
The allegations against Meta regarding its 17-strike policy for accounts involved in sex trafficking, along with the lack of effective reporting mechanisms for child sexual abuse material, underscore a troubling trend in the tech industry. The prioritization of engagement over user safety raises ethical concerns that cannot be overlooked. As the lawsuit unfolds, it may lead to significant changes in how social media platforms are regulated and held accountable for their impact on vulnerable users.

