In a free and self-governing nation like ours, the safety of our children should never be sacrificed for corporate profit. But court documents filed in California reveal that Meta, the parent company of Facebook and Instagram, may have done just that. According to these filings, Meta knowingly allowed its platforms to harm kids, ignored dangerous content, and hid these facts from parents.
This lawsuit, which involves more than 1,800 plaintiffs, paints a troubling picture. It claims that Meta not only failed to protect children from predators and harmful content, but also chose to put growth and profit above their safety. The filings say Meta knew that its platforms were addictive and damaging to teen mental health. It also knew that adults were using its platforms to contact minors, and that harmful content involving suicide, eating disorders, and even child sexual abuse was widespread. Yet, according to the evidence, the company did very little to stop it.
Vaishnavi Jayakumar, a former head of safety at Instagram, testified that Meta's policy for dealing with sex trafficking was shockingly weak: an account could rack up sixteen violations related to sexual exploitation before facing any real punishment. Imagine a company tolerating sixteen strikes of that kind before acting. That's not just careless. That's reckless.
She also said that Meta never told parents or the public how few offending accounts it actually removed, even accounts engaged in sex trafficking. Parents believed their children were safe online. But the truth, according to the lawsuit, was far different.
Even worse, the documents say Meta's own employees proposed ways to improve safety, only to have leadership shut them down. Why? Because stronger safety rules might reduce how many teens used the app. In other words, they chose profits over protection. That's not just bad business. It's morally wrong.
The lawyer for the plaintiffs, Previn Warren, compared Meta’s actions to what the tobacco companies once did—targeting kids even when they knew their product was harmful. Meta’s own data showed that changing privacy settings could have stopped over five million unwanted adult-to-teen interactions each day. But they didn’t act because they feared losing users.
This is a clear example of why our Founding Fathers believed in strong limits on power. Whether it’s government or a giant tech company, no one should be allowed to operate without accountability. The Constitution exists to protect our God-given rights—especially the rights of families to protect their children. When corporations like Meta put profit ahead of public safety, it is the duty of the courts and the states to step in.
It's also a reminder of why parents, not tech companies, must be in charge of their children's wellbeing. Big Tech doesn't know your child and doesn't care what happens to them. And as this case shows, it will sacrifice your child's safety if that helps the bottom line.
Meta’s former vice president of partnerships, Brian Boland, said it best: “They don’t meaningfully care about user safety.” That’s coming from someone who worked there for over a decade.
This lawsuit should wake up lawmakers and parents across the country. We must demand that tech companies be held responsible for the harm they cause. The states have every right—under the 10th Amendment—to protect their citizens, especially the youngest among them. Congress, too, must ensure that laws reflect our nation’s values, not Silicon Valley’s greed.
This is not just a legal issue. It’s a moral one. If we let companies like Meta continue unchecked, we risk raising a generation exposed to real danger without warning or protection.
The time to act is now. Let’s stand up for our kids, defend our families, and restore the founding principle that no one—no matter how rich or powerful—is above the law.
