Caroline Koziol once dominated Connecticut’s high school swim scene, crushing the 100-yard butterfly as a freshman. By her senior year, things had totally unraveled—just making it up a set of stairs left her dizzy and gasping, like her body was working against her.
Then came 2021. She straight-up collapsed in the middle of swim practice. Not because of some pulled muscle or sprained ankle, but because anorexia had sunk its claws in deep. Social media? It played a massive part, just this relentless barrage of “perfect” bodies and toxic advice that totally spun her out.
Her struggles ramped up during the pandemic. Stuck at home, Koziol started browsing Instagram and TikTok, just hoping for some healthy recipes or fitness inspiration.
Instead, her feeds became saturated with extreme diets, grueling workouts, and endless body-check videos from influencers. “One innocent search turned into this avalanche,” she recalled.
Now, as a college junior, Koziol is one of more than 1,800 plaintiffs taking Meta and TikTok’s parent companies to court. The lawsuit claims these tech giants didn’t just provide a platform for harmful material—they designed their platforms to be addictive and, for young users, potentially devastating.
And this isn’t some isolated legal battle. It’s a massive, multi-district lawsuit, with teens, parents, schools, and even 29 state attorneys general involved.
What was the core accusation made by Caroline Koziol?
Koziol and her fellow plaintiffs allege that these companies exploit vulnerable users by deploying engagement-chasing algorithms, even when it comes at the expense of mental health. Internal Meta documents reportedly acknowledge that Instagram worsened body image issues for about one in three teen girls.
Koziol doesn’t blame a single post or influencer. She points to the design of the platforms—the infinite scroll, the algorithms that keep serving up more of the same damaging content—that pulled her deeper into her eating disorder.
Caroline Koziol: That’s when it became obsessive
“If I saw one or two videos, it wouldn’t have made a difference,” she said. “But when it was all that I was seeing, that’s when it became obsessive.”
Her health deteriorated. She lost a dangerous amount of weight; her throat and hormones were affected; her memory faded. Even after therapy and inpatient treatment, the platforms’ algorithms kept surfacing harmful posts.
During group therapy, she learned that women from all backgrounds were dealing with the same flood of pro-anorexia content.
Her goal of swimming in college? Gone. She passed on the University of South Carolina and stayed closer to home to focus on recovery. The future she’d imagined for herself unraveled.
Do Meta and TikTok have content filters?
According to reports, Meta and TikTok say they have introduced protections: content filters, AI moderation, and parental controls. Critics call these measures inadequate. Age checks are easily bypassed, and harmful content still finds its way into teen feeds.
The lawsuit isn’t targeting the content itself, which Section 230 of the Communications Decency Act shields from liability. Instead, it accuses the companies of negligence and defective product design. There’s a push for compensation, but the broader aim is to force real changes in how these platforms operate.
Koziol is still working on her recovery and staying in school, but even now, the algorithms sometimes slip toxic posts into her feed. “The second I click on it, I know I’ll see more tomorrow,” she said.
She hopes this lawsuit will finally hold the tech companies accountable—and force meaningful change. “They knew they were harming girls like me,” she said. “And it ruined my life.”