OKLAHOMA CITY (KFOR) — A mother and her 14-year-old son living in Oklahoma County have filed a lawsuit against Roblox, alleging sexual exploitation.
In the lawsuit, attorneys describe the popular gaming platform as a “hunting ground” for child sex predators.
“It’s really dangerous, it’s very scary. Parents are not getting the warnings of the very real and very persistent dangers that exist on the Roblox platform,” said attorney Sara Beller.
Beller, from Dolman Law Group, said the child was 12 years old at the time, and relied heavily on the app for entertainment and social interaction, making him a “prime target.”
“A lot of people in the past have been pretending to be a 10-year-old or a 12-year-old, and then invest a lot of time in developing a relationship with the child. Grooming, as they say, and they actually feel like best friends,” Ron Vaughn, a cybersecurity expert from Emsco Solutions, explained.
According to the complaint, the boy believed he was speaking with someone his age on Roblox’s “chat” feature.
The lawsuit describes the person’s behavior escalating, from sending graphic and sexually explicit messages and images to eventually manipulating the 12-year-old into sending inappropriate images and videos of himself in return.
More than 800 families have already sued the platform over alleged “sextortion.”
Vaughn told Nexstar’s KFOR that the biggest danger with Roblox is the chat feature.
“That’s how other people get access to the child,” Vaughn said.
Roblox sent KFOR a statement about the incident:
“We are deeply troubled by any incident that endangers our users. While we cannot comment on claims raised in litigation, protecting children is a top priority, which is why our policies are purposely stricter than those found on many other platforms. We recently announced our plans to require facial age checks for all users accessing chat features, making us the first online gaming or communication platform to do so. This innovation enables age-based chat and limits communication between minors and adults. We also limit chat for younger users, don’t allow the sharing of external images, and have filters designed to block the sharing of personal information.
“We dedicate substantial resources—including advanced technology and 24/7 human moderation—to help detect and prevent inappropriate content and behavior, including attempts to direct users off-platform where safety standards and moderation may be less stringent than ours. We understand that no system is perfect, which is why we are constantly working to improve our safety tools and platform restrictions. We have launched 145 new safety initiatives this year alone and recognize this is an industry-wide issue requiring collaborative standards and solutions.”
The spokesperson said the company encourages anyone to report content or behavior that may violate the app’s community standards, using the “report abuse” feature.
Vaughn wants to remind families and children that anything uploaded to the internet is never entirely safe, because you can’t know where it will end up.
“Even the most innocent picture can be altered with artificial intelligence to be an embarrassing picture, and that’s what you don’t want,” Vaughn said.
The attorney representing the family said their lives have been forever changed by the incident. A hearing is scheduled for December.
Last month, Florida’s attorney general issued Roblox a subpoena requesting information about the company’s age-verification requirements, chat rooms, and marketing toward children.
The Associated Press contributed to this report.