Roblox is beloved by kids. These parents say predators are lurking.
Updated Dec. 18, 2025, 8:50 a.m. ET
SAN MATEO, Calif. — At first, the messages seemed normal. Innocent, even.
“Hi,” “How are u?” And then: “Do you want to make Robux?”
To millions of kids who play Roblox every day, those words might seem irresistible. They did to Amie’s then-13-year-old daughter.
Roblox has nearly 83 million daily active users, and 42% of its global player base is under the age of 13, meaning there are approximately 34.9 million children playing daily. Created in 2006, the online platform now hosts more than 40 million games, known as “experiences.” Robux, its in-game currency, powers these virtual realities, and players use this in-game cash to purchase new avatars and accessories, or to get premium in-game features in popular games like Grow a Garden or Dress to Impress.
Amie’s daughter struggled to make friends at school. When she asked her mom if she could download Roblox, Amie hesitated. She worried about too much screen time and didn’t see value in the games. But eventually, she gave in. After all, the rest of the kids in their neighborhood were already on it.

But last March, the stranger who messaged Amie’s daughter offering Robux directed her to move their conversation over to Discord, a messaging platform popular among gamers.
“Before we start could u verify for me? Take a pic of u holding up 2 fingers,” the predator instructed, taking the time to confirm the child’s identity before he told her to send sexually explicit videos and images.
USA TODAY is withholding Amie’s last name to protect her daughter’s privacy.
While Amie’s experience is disturbing, it’s not entirely unique. Families across the country describe similar incidents: predators using Roblox as an entry point for grooming, which in some cases escalated into sextortion, sexual assault or kidnapping. Some have already sued Roblox, and others plan to do so.
Roblox says it has made strides in recent years, and even in recent months, toward safer guardrails to protect against these kinds of incidents. USA TODAY spoke with company representatives at length about these efforts and visited their headquarters to see a demo of Roblox’s AI-powered age-verification technology. Currently, the feature is voluntary, but in January it will become mandatory for anyone who wants to use chat.
Still, some experts, and families whose children encountered predators, say the changes may be insufficient and are coming far too late.
‘A haven for adult sexual predators and pedophiles’
To Amie’s daughter, the abuse was framed as a game in which she received validation. Her learning disability, anxiety and ADHD had made it hard for her to connect with her peers, and the predator helped fill that gap.
“I’ve missed u” he wrote when she logged on after school. The predator showered her with affection: “I love u sm (so much),” he would tell her, often sending her sexually explicit content of himself. “I would never leave u bb (baby).”
If she went more than a few hours without messaging him, he would contact her relentlessly. One day, when she wrote she was in class, he suggested she go to the school bathroom to take sexually explicit photos.
When Amie discovered what was happening and messaged the predator on Discord, he didn’t give up. Amie discontinued her daughter’s use of both Roblox and Discord and reported the predator’s username to the FBI.
Her daughter never received any of the promised Robux.
Long before the Roblox abuse, Amie monitored her daughter’s text messages and used parental control apps like Net Nanny, Qustodio and Kaspersky Safe Kids. But she hadn’t thought of Roblox as the kind of place she needed to look.
“She got hooked in, emotionally. [She] thought this man loved her,” Amie says.
Currently, attorneys general around the country — in states like Louisiana, Kentucky, Texas, South Carolina and Florida — are scrutinizing Roblox. Simultaneously, there are nearly 80 active lawsuits against Roblox with claims tied to sexual predation.

A California man was arrested after authorities charged him with kidnapping and sexually assaulting a 10-year-old he initially communicated with on Roblox and Discord. An Oklahoma mother alleged that a man posing as a 15-year-old boy on the platform sexually extorted her 12-year-old daughter and threatened to hurt her family if she didn’t send more photos and videos.
When the family reported the exploitation to the police, officers told the mother the suspect was a known offender with a history of sexually inappropriate behavior. The complaint and similar ones filed by the firm Gould Grieco & Hensley call Roblox “a haven for adult sexual predators and pedophiles.”
“This is a fertile hunting ground for children,” says Matthew Dolman, managing partner of Dolman Law Group, which has received more than 3,600 inquiries from families about abuse on Roblox. “Roblox is materially misrepresenting the safety of their product, to both their shareholders and the general public.”
In Chicago, another mother tells USA TODAY she learned too late about two-way communication between Roblox players. In August of 2020, an adult man posing as a teenager started chatting with her 10-year-old daughter in a Roblox building game before moving the conversation to a text chain. He sent her sexually explicit photos and requested them in return.
After weeks of chatting, the predator convinced her child to leave home to meet up. In the middle of the night, her 10-year-old walked five blocks to meet him. He took her to a motel on Chicago’s south side, where he and four other men raped her.
“This was the ultimate breakage for my family,” says the mom, whose name we are withholding to protect her daughter’s privacy. “I feel like I’m in a movie and I cannot wake up.”
The mother sued Roblox in October “for recklessly and deceptively operating a business in a way that led directly to the sexual exploitation and abuse” of her daughter.
Roblox’s new age estimator software helps curb abuse
At Roblox, Matt Kaufman, the company’s Chief Child Safety Officer since 2023, is tasked with addressing concerns about kids’ safety.
The company has faced scrutiny over its removal of so-called vigilantes from the platform and its push for some legal cases to be resolved via arbitration instead of in public court, the latter of which Roblox declined to comment on.
“Roblox takes these things incredibly seriously,” Kaufman says. “When we find out about something or we detect something, we immediately investigate, and we have all kinds of systems in place to manage that.”
Roblox historically relied on self-reported ages at sign-up to keep users under 13 away from features like private chat messaging and voice chat. The company hopes a new AI-powered age estimation feature, announced Nov. 18, will be a game-changer and help address criticism it has faced for allowing child and adult users to interact.
The feature began rolling out in Australia, New Zealand and the Netherlands in December. On Jan. 7, the requirement extends worldwide to users who want to chat.

The photos used for the app’s one-time age estimation are immediately deleted, Kaufman says. But safety moderators continually monitor the types of games users play, who they add as friends and who they interact with on the platform, he says. If a user’s activity suggests the age estimation may be incorrect, moderators might prompt that user to reverify their age, to ensure the account isn’t being shared among multiple people and that the original scan wasn’t performed with someone else’s face.
“Practically every other application platform that’s out there uses self-declared age, either typing in a birthday or hitting a check box to assert that you’re over 13, and when you compare that process to what we’re doing, we’re dramatically more accurate,” Kaufman says.
Similar popular games, like Minecraft and Fortnite, rely on self-reported age checks and parental controls to monitor safety. But these apps haven’t experienced the same level of issues with child predators that Roblox has, something Roblox attributes in part to the sheer volume of users.
The age check is optional for now, but starting Jan. 7, users who skip it won’t be able to chat. After users complete the age-check process, they’re assigned to one of six age groups, from under 9 to 21+, which Kaufman says mirror the age groupings you would see in real life.

Kaufman says Roblox is proud to be one of the largest platforms in the world doing age estimation. But some safety advocates and impacted families argue this feature isn’t enough.
Becca Dallas, of San Diego, is one such critic. She lost her 15-year-old son Ethan to suicide following the trauma he experienced with a predator he met on Roblox.
The predator, a 37-year-old who posed as a teenage boy, gained Ethan’s trust when he was 12 years old. The man encouraged Ethan to chat on Discord and coerced him to send explicit photos. Dallas recalls she set up Ethan’s Roblox account with parental controls, but the predator showed her son how to work around them.
Ethan was autistic, and Roblox was a space where he felt accepted, she says. Dallas thought the app was safe.
“Ultimately, what he loved killed him,” she says. She’s suing Roblox and Discord, accusing them of wrongful death. “It’s not going to bring my son back, but you need to do what you can to save these other families.”
How to stay safe on gaming apps
Stranger danger and sexual abuse on the internet have been issues for decades, but safety experts say modern technology has exacerbated the problem. Now, predators have near-constant access to their victims via smartphones.
Ben Gillenwater, a cybersecurity expert known as the “Family IT Guy” on TikTok, says parents should open up conversations with their children about safe game play online, and that the best thing parents can do is turn off chat features manually.
Roblox’s Global Head of Parental Advocacy Dr. Elizabeth Milovidov emphasizes the importance of properly setting up parental controls on Roblox or any similar app. Roblox’s parental controls allow parents to manage privacy and communication settings, with stricter oversight for users under 13.
If users encounter a predator online, victims should report the person’s account but keep their own documentation of all messages, according to the National Center for Missing & Exploited Children. A paper trail with time stamps and messages can be vital for finding a criminal’s identity.
Victims should block the predator and report the abuse to authorities through NCMEC’s CyberTipline at report.cybertip.org or 1-800-843-5678, the FBI at tips.fbi.gov/home, or the Know2Protect Tipline at 833-591-5669.
Young victims of sexual abuse often experience feelings of shame, guilt or confusion over what’s happened to them.
When Amie’s daughter told her best friend about the abuse, the friend didn’t understand. She said she didn’t want Roblox to get taken away.
“It’s not Roblox’s fault,” the friend told her. “Roblox is my life.”
When Amie deleted the app, her daughter felt like she had lost a social outlet. Her other friends were on the app constantly.
“I know my mom’s heart was in the right place,” she says. “But [I was like] let me get the game back.”
Rachel Hale’s role covering Youth Mental Health at USA TODAY is supported by a partnership with Pivotal Ventures and Journalism Funding Partners. Funders do not provide editorial input. Reach her at rhale@usatoday.com and @rachelleighhale on X.