A Parent’s Worst Fear Often Starts With a Screen

Most parents have had that moment.
Your child is sitting quietly on the couch, tablet in hand, headphones on. At first glance, everything seems harmless. Maybe they are watching videos, playing a game, or chatting with friends. But then a story breaks in the news about a child who was manipulated online, exposed to violent content, or targeted by predators on a platform that millions of kids use every day.
Suddenly the question hits you: Is my child really safe online?
For many families, that fear is not hypothetical. Parents across the country have watched their children suffer harm through social media platforms and online gaming environments. Some have discovered extremist communities targeting kids. Others have uncovered grooming, exploitation, or abuse happening in spaces that were supposed to be safe.
Now, a controversial bill moving through Congress could make it much harder to hold those tech platforms accountable.
Attorneys Alexandra Walsh and Patrick Huyett of Anapol Weiss recently traveled to Capitol Hill alongside families affected by online harm to speak out about proposed legislation called the Kids Internet and Digital Safety Act, or KIDS Act. Despite its name, many advocates believe the bill could actually make it harder to hold tech companies responsible when children are harmed online.
For parents trying to protect their kids in a digital world, that debate matters more than ever.
Child Online Safety: Why More Parents Are Asking If Their Kids Are Truly Protected
Children today grow up online.
They play games with friends across the country, build virtual worlds, watch livestreams, and communicate through apps that did not exist when most parents were growing up.
Many of these spaces are creative and exciting. However, they can also expose children to serious risks.
Investigations and lawsuits have revealed cases where predators allegedly used gaming platforms to contact minors, manipulate them through private messaging, and move conversations to other apps where abuse occurred.
One of the most widely discussed platforms in these cases is Roblox, an online gaming environment used by tens of millions of young players each day. In recent years, parents have filed lawsuits claiming predators used the platform to groom children and exploit them through online communication tools.
In one major case reported by ABC News, a California judge refused to allow Roblox to move a child exploitation lawsuit into private arbitration. The case involves a father who alleges his 13-year-old son was contacted by a predator through a game on the platform before the abuse escalated on another app. Instead of allowing the case to be handled behind closed doors, the judge ruled that it should remain in public court.
Alexandra Walsh, who represents this family and about a dozen others in the litigation, said the company should have to defend itself openly. As she told ABC News, "Roblox has the right to defend itself, but it should do so in the light of day so the public can see, and so that a jury made up of citizens of this country can decide if they are liable or not."
For many families bringing these cases, keeping the lawsuit in open court is about more than one child. They believe transparency is the only way to fully examine how children can be harmed on platforms used by millions of young players.
The KIDS Act: Could This Law Weaken Online Protections for Children?
At first glance, the Kids Internet and Digital Safety Act sounds like a law designed to strengthen protections for children online.
Section 3 of the bill, titled “Preventing Harm to Minors,” requires technology platforms to create and enforce policies aimed at addressing serious risks to young users. Specifically, the bill says covered platforms must maintain reasonable policies and procedures that address threats such as physical violence, sexual exploitation and abuse, drug or alcohol activity involving minors, and financial harm caused by deceptive practices.
The law also says those policies should be tailored to the size and complexity of the platform and what is technologically feasible. In other words, a massive global platform would not necessarily be expected to implement the same safety systems as a smaller website or service.
At the same time, the bill includes important limitations on what platforms are required to do. For example, it states that companies are not required to prevent minors from deliberately searching for specific content on their own. It also clarifies that platforms cannot be regulated based on the viewpoint of user speech protected by the First Amendment.
Supporters of the bill argue these provisions attempt to balance child safety with free speech rights and technological practicality.
However, critics say the language could leave significant gaps in accountability.
One concern involves how the law frames platform responsibilities. Instead of creating a clear legal standard requiring companies to protect children from foreseeable harm, the bill focuses on whether companies have internal policies that address certain risks. Because platforms create and enforce those policies themselves, critics argue the standard could be difficult to challenge even when harm occurs.
Advocates also point to other provisions in the bill that they believe could affect ongoing lawsuits and preempt stronger state-level protections designed to address online harm to minors.
For families already pursuing legal action against technology companies, those concerns are not abstract. Many of the cases currently moving through the courts rely on arguments that platforms failed to take reasonable steps to prevent foreseeable dangers to children using their services.
If federal law changes how those claims can be brought, it could reshape the legal landscape surrounding online safety for years to come.
When Online Platforms Fail Kids: The Real Stories Parents Want Lawmakers To Hear
For some families, these dangers have already become a painful reality.
Parents have shared devastating stories about how their children were pulled into harmful online communities.
In one widely reported case, a mother began warning other parents after her daughter became involved with an online extremist network that glorified school shooters. The young girl eventually died by suicide after interacting with individuals connected to that group.
Investigators say some of these networks operate across social platforms, gaming communities, and messaging apps, often targeting vulnerable children and encouraging dangerous behavior.
Law enforcement agencies have also warned about extremist online networks such as the group known as 764, which authorities say manipulates and exploits minors through online platforms and chat communities.
For parents who hear stories like these, the question becomes unavoidable: If a company creates a digital environment where millions of children interact, should it have a responsibility to keep that space reasonably safe?
Online Platform Accountability: How Anapol Weiss Is Taking The Fight To Court
Families across the country are turning to the courts to demand answers about how major technology platforms protect children online.
Anapol Weiss represents families in litigation involving Roblox, where parents allege the platform failed to implement safeguards that could have prevented predators from targeting minors.
Attorney Alexandra Walsh has been appointed Plaintiffs' Co-Lead Counsel in the federal multidistrict litigation involving Roblox, helping coordinate legal efforts as families pursue claims against the platform.
For many parents involved in these cases, the goal is not only accountability for what happened to their own child. They also want transparency about how these platforms operate and whether stronger protections could prevent similar harm to other families.
When cases move forward in court, companies may be required to disclose internal safety policies, reporting systems, and how they respond to warnings about dangerous activity involving minors. That process can reveal whether platforms took meaningful steps to reduce risks for young users.
The outcome of these cases may also influence how technology companies design safety features, monitor suspicious behavior, and respond to reports of abuse involving children.
Protecting Kids Online: Why Families Are Urging Lawmakers To Listen
The families visiting Capitol Hill this week are not technology experts or policy insiders.
They are parents who experienced firsthand how online platforms can expose children to serious harm.
Their message to lawmakers is simple: if Congress wants to pass a law about online child safety, it should strengthen protections rather than weaken them.
They believe technology companies should take real responsibility for the environments they create, especially when those environments are designed for children.
By bringing families to Washington, attorneys Alexandra Walsh and Patrick Huyett hope lawmakers will hear directly from the people most affected by these issues.
Internet Safety For Children: How Can You Help Protect Your Kids Right Now?
While lawmakers debate the future of online safety regulations, parents still face the day-to-day challenge of protecting their children online.
Experts recommend several practical steps:
- Talk regularly with your child about the apps and games they use
- Ask who they interact with online and whether strangers contact them
- Enable parental controls when possible
- Monitor chat functions and private messaging features
- Encourage children to report anything that makes them uncomfortable
Most importantly, parents should remind their children that not everyone online is who they claim to be.
Many predators rely on anonymity and false identities to gain trust before manipulating young users.
Open communication can make a major difference.
The KIDS Act Debate: Will Congress Protect Children Or Big Tech?
Technology has transformed childhood.
Games, social media, and digital communities allow kids to connect, learn, and create in ways previous generations never imagined.
But when those same platforms become tools for predators, extremists, or manipulators, families are left asking a difficult question.
Who is responsible for protecting children online?
The debate unfolding in Congress right now will help answer that question.
For many parents, the hope is simple. Laws designed to protect children should actually make the internet safer, not give technology companies a legal shield when harm occurs.
Because behind every headline about online exploitation or digital abuse is a family that once believed their child was safe behind a screen.
If your child was harmed on Roblox or another online platform, contact Anapol Weiss for a free and confidential consultation. Your family should not have to face this alone.
Disclaimer: This blog is intended for informational purposes only and does not establish an attorney-client relationship. It should not be considered legal advice. For personalized legal assistance, please consult our team directly.

