Newly unsealed documents in the national social media litigation reveal a troubling pattern across multiple major platforms, including Instagram, Facebook, TikTok, Snapchat, and YouTube. According to the filings, these companies repeatedly received internal warnings that their products were harming young users, yet consistently prioritized growth, engagement, and profit over the safety of children.
The allegations extend far beyond any one company or any single type of harm. Internal records suggest that major platforms continued to deploy design features known to keep kids online longer, even when their own researchers warned about the consequences. These dangers reportedly included excessive screen time, worsening body image, heightened anxiety and depression, and exposure to inappropriate or harmful content. In some instances, platforms allegedly tolerated rule violations or failed to remove repeat offenders, contributing to environments where exploitation and predatory behavior could occur.
One example cited in the filings describes a former Meta executive referencing a “17-strike policy” before certain violations resulted in an account being removed. That level of tolerance, if proven, would illustrate how systematically safety concerns may have been downplayed. Nor is this an isolated example; it reflects a broader, industry-wide problem: companies continuing to make engagement-driven product decisions even when internal research highlighted significant risks to young users.
These revelations matter deeply. Parents may believe that the greatest threats to their children exist outside the home, yet today some of the most serious dangers emerge through the screens in their pockets. When companies knowingly design features that foster compulsive use, amplify harmful content, or create unmonitored spaces where children can be targeted for exploitation, the consequences can be devastating and echo for years.
For those of us who represent survivors of exploitation, harassment, and other forms of online harm, these disclosures reflect a familiar pattern of powerful institutions ignoring clear warnings, minimizing foreseeable risks, and allowing preventable harm to continue.
The lawsuit against these tech companies may mark a pivotal shift in how society views the responsibilities of social media platforms. If successful, the litigation could force meaningful changes, such as stronger age-verification systems, safer product design, greater transparency, and better tools to detect and report harmful behavior. But even now, the case has sparked a necessary national conversation about how these platforms shape the emotional, psychological, and social development of young people.
At Singleton Schreiber, we stand unwavering in our commitment to holding powerful institutions accountable when their choices put children at risk. Representing survivors is not just our work; it is our mission. We believe that young people deserve digital spaces that protect their well-being, not exploit it.
Accountability cannot stop with this lawsuit alone. Tech companies must not be allowed to profit from the engagement of children while disregarding the harm their products may cause. With each new disclosure, it becomes increasingly clear that the legal system has a crucial role in ensuring these companies prioritize safety, not just growth.
If you or your child has been harmed through online exploitation, harmful platform design, or addictive social media use, you are not alone, and you have options. Our firm is here to help survivors navigate these complex cases with the compassion, support, and advocacy they deserve.
- Counsel
Meagan Verschueren leads Singleton Schreiber’s Sexual Assault and Sex Trafficking Practice Group, where she devotes her career exclusively to representing survivors of sexual violence and exploitation. Known for her ...
