[Image: Courtroom and Office of the Attorney General]

A lawsuit filed by the Nevada Attorney General’s Office is taking aim at a popular online messaging platform, alleging its design puts children at risk and fails to warn parents about those dangers.

The complaint, filed against Discord, claims the platform has created an environment where children can be contacted by adult strangers with few barriers in place.

According to the Attorney General’s Office, the lawsuit seeks to hold the company accountable for what it describes as unsafe design choices and a lack of transparency about risks to young users.

“My office’s investigation has revealed that Discord’s lack of age verification, hands-off approach to moderation and account banning, and refusal to limit online interactions between children and adult strangers has made Discord the go-to chat option for child abusers, including in Nevada,” said Attorney General Aaron D. Ford. “If a platform is marketed as a fun place for kids to game and chat with each other, it is the responsibility of that platform to not let adults pretend to be children and create an unsafe space for youth.”

The lawsuit alleges Discord has prioritized growth over safety and misled parents and the public about conditions on the platform. It also claims the company failed to enforce its own minimum age requirement, which the office says amounts to repeated violations of the Nevada Deceptive Trade Practices Act.

State officials say there are limited safeguards preventing adult strangers from finding and privately messaging children. The complaint states that users can create accounts without verifying their identity or age, allowing banned users to return under new accounts and bypass restrictions.

The Attorney General’s Office also points to several criminal cases in Nevada in which adults used the platform to contact children before being accused of sexual assault, grooming, or solicitation.

The lawsuit further alleges the company recently attempted to improve age verification measures but later reversed course following backlash from users.

This case is part of a broader effort by the state to address online safety concerns involving children. The Attorney General’s Office has filed similar lawsuits against other major platforms, including TikTok, Snap, Meta, YouTube, and Kik, alleging harmful design features and insufficient safety protections.

Trials in cases involving TikTok and Snap are currently scheduled for 2027.