The internet should be a space where children can learn, play, and connect safely. But for too many families, it's become a place of unimaginable danger. A growing online network known as 764 is at the center of a sweeping FBI investigation — and has already victimized children across the country, many through popular platforms like Roblox and Discord.
As evidence of these online threats continues to mount, Anapol Weiss has filed lawsuits against Roblox and Discord on behalf of affected families, seeking to hold the platforms accountable for enabling this predatory abuse.
What Is “764”? A Dark Online Threat to Children
The FBI describes 764 as a loosely organized but highly dangerous group operating on the dark edges of the internet. Originating in 2019, the group promotes neo-Nazi, satanic, and extremist ideologies and is known for actively recruiting minors and encouraging them to engage in self-harm, animal abuse, and suicide.
What makes 764 particularly dangerous is how it infiltrates platforms where children are already active. According to the FBI's Internet Crime Complaint Center, more than 250 investigations have been launched into individuals and subgroups affiliated with 764 across all 50 states.
How 764 Operates on Roblox and Discord
Roblox, a massively popular gaming platform, and Discord, a voice and text chat app, are among the key digital spaces where 764 thrives. Here's how:
- On Roblox, predators have been known to lure children via in-game chat, grooming them under the guise of friendship or shared gaming interests. Some abusers create custom games or avatars that subtly mimic 764 symbolism to identify fellow group members or prey on vulnerable players.
- On Discord, 764-affiliated users set up private servers where minors are manipulated into joining through peer pressure, threats, and blackmail. These servers have hosted graphic content, shared coercion tactics, and coordinated efforts to target specific children.
A recent investigation in Florida found that several minors had been enticed to join Discord groups run by individuals claiming allegiance to 764. These children were then encouraged to self-harm on camera, with threats of public exposure or harm to loved ones if they resisted.
The FBI emphasizes that the group targets at-risk children, including those experiencing trauma, depression, or social isolation — making their tactics especially insidious.
Legal Accountability: Lawsuits Against Roblox and Discord
Anapol Weiss is representing multiple families whose children were harmed after interactions with predators on Roblox and Discord.
One ongoing lawsuit filed by the firm alleges that a 13-year-old girl in Texas was groomed, sexually abused, and emotionally tormented by a predator she met through these platforms. The complaint argues that Roblox and Discord failed to implement basic safety protocols despite promoting their platforms as safe for minors.
“These companies knew—or should have known—that predators were exploiting their platforms. Yet they chose profits over protection,” says Alexandra Walsh, shareholder at Anapol Weiss. “Our clients are demanding accountability, and so are we.”
Anapol Weiss is currently investigating hundreds of similar cases and encourages families to come forward if they suspect their child has been targeted.
What Parents Can Do: Prevention and Reporting
If your child is active on Roblox, Discord, or other online platforms, here are some steps the FBI and cybersecurity experts recommend:
- Talk to your children regularly about their online activity and the people they interact with.
- Review chat logs and friend lists frequently.
- Enable parental controls and privacy settings on all apps and devices.
- Be aware of signs of distress, secretive behavior, or changes in mood that may indicate online manipulation.
- Report suspicious behavior immediately to CyberTipline.org or call 1-800-THE-LOST.
Final Thoughts
The emergence of 764 shows that parental vigilance alone cannot safeguard children online; robust corporate accountability is essential. Despite their massive reach, platforms like Roblox and Discord have failed to implement safety measures commensurate with the risks their young users face. The lawsuits spearheaded by Anapol Weiss will test whether these companies finally prioritize child safety or continue to place profits above protection.
With Alexandra Walsh and the legal team at Anapol Weiss leading a determined charge in the courts, this moment could mark a seismic shift, pushing digital platforms to protect their youngest users or face legal consequences when they don't. In the meantime, families must stay informed, stay engaged, and know that committed advocates stand ready to fight for justice when platforms fall short.