Indiana Becomes Latest State to Sue Roblox as Child Safety Lawsuits Grow
Indiana has become the latest state to sue Roblox over claims that the popular gaming platform failed to protect its young users from child predators.
The lawsuit, filed on Wednesday, argued that Roblox has “corralled children into their online communities, and then opened the gates wide to adult predators.” The lawsuit referenced multiple instances of adults being arrested and facing charges of exploiting children they met on Roblox.
The popular communication platform Discord was also named as a defendant in the lawsuit. Some cases claim that predators made initial contact with children on Roblox before moving communication to Discord.
“These companies, which cater to kids and young individuals, know full well that numerous predatory sex criminals have used these platforms to contact and lure their victims,” Indiana Attorney General Todd Rokita said in a statement. “And yet, they continue promoting themselves as safe for children.”
Roblox, which has hundreds of millions of users, has faced growing legal scrutiny in recent months over concerns that its platform has served as a channel for child predators to connect with, groom and exploit minors.
Nearly 150 Roblox lawsuits have been filed in federal court by individuals and families over these claims. Numerous states have also filed similar lawsuits or launched investigations into the platform’s safety.
The lawsuits argue that Roblox prioritized growth over user safety, failing to regulate its own platform and allowing predators to easily contact children across age groups.
As Legal Pressure Mounts, Roblox Implements New Safety Features
In the wake of the growing number of lawsuits, Roblox has announced several safety changes to its platform, some of which were spurred by state settlements.
The company launched an AI facial recognition tool earlier this year to estimate a user’s age and limit their ability to communicate with other age groups. It also plans to roll out age-based accounts that will limit content and communication options for younger users.
Those changes come as Roblox has agreed to multimillion-dollar settlements with different states over safety concerns.
In April 2026, the company agreed to pay a combined $35 million to Alabama, Nevada and West Virginia. That money is set to be spent on public safety campaigns, school resource officers and non-digital activities within those states.
As part of the settlements, Roblox also affirmed its commitment to improving safety features on its platform nationwide.
Roblox Lawsuits Consolidated in Federal Court
Even as Roblox has agreed to settlements with some states, litigation surrounding the platform’s safety is still growing.
In December 2025, the federal Roblox lawsuits were consolidated into a multidistrict litigation (MDL), a legal procedure used when many similar lawsuits have been filed or are expected.
An MDL places all of those cases before one judge, allowing them to move through streamlined, coordinated pretrial proceedings. Many Discord lawsuits are also included in the MDL.
More lawsuits are expected to be filed in the coming months.