TRENTON – Discord, the communication app popular with young people, is being sued by the state of New Jersey over allegations that it misled parents and kids, particularly about its safety settings for direct messages, officials said Thursday.
The Office of the Attorney General and Division of Consumer Affairs has filed a suit against Discord, Inc. for “deceptive and unconscionable” business practices that misled parents and obscured the risks children faced when using the application, exposing children to sexual and violent content, according to the complaint.
The lawsuit stems from a multi-year investigation that determined Discord's conduct violated New Jersey’s consumer protection laws, leaving children vulnerable to online predators lurking on the app, New Jersey Attorney General Matthew J. Platkin said.
"Discord knew its safety features and policies could not and did not protect its youthful user base, but refused to do better," the complaint says.
The complaint, filed partially under seal Thursday in the Superior Court of New Jersey, Chancery Division, Essex County, claims Discord engaged in multiple violations of the New Jersey Consumer Fraud Act, Platkin said.
“Discord markets itself as a safe space for children, despite being fully aware that the application’s misleading safety settings and lax oversight have made it a prime hunting ground for online predators seeking easy access to children,” Platkin said. “These deceptive claims regarding its safety settings have allowed Discord to attract a growing number of children to use its application, where they are at risk."
Based in San Francisco, Discord owns and manages an application that allows users to communicate through text, audio and video. Since its launch about a decade ago, the app has become one of the most popular online social platforms in the world, especially among children, who make up a significant portion of Discord’s user base.
Discord has represented its app as safe, relying in part on its policies barring underage use of the app and the circulation of explicit material, including child sexual abuse content, the complaint said. Most importantly, Discord has promoted its Safe Direct Messaging feature and its successors, which it claimed automatically scans and deletes private direct messages that contain explicit media content.
"But Discord’s promises fell, and continue to fall, flat," the complaint said.
News accounts and reports from prosecutors’ offices show that despite the app’s promises of child safety, predators use the app to stalk, contact and victimize children, Platkin said. These sources identify alarming cases where adults were charged and convicted of using Discord to contact children, often posing as children themselves, and transmitting and soliciting explicit images through the app, including through the use of sextortion, Platkin said.
In many criminal cases involving sexual exploitation of children on Discord, the children were under the age of 13, despite Discord’s claim to enforce its policy prohibiting children under 13 from using the app, Platkin said.
“Discord claims that safety is at the core of everything it does, but the truth is, the application is not safe for children," said Cari Fais, director of the New Jersey Division of Consumer Affairs. "Discord’s deliberate misrepresentation of the application’s safety settings has harmed – and continues to harm – New Jersey’s children and must stop.”
Discord’s platform is structured to encourage unchecked and unmoderated engagement among its users, designed to appeal to children's desire for personalization and play by offering custom emojis, stickers and soundboard effects, the complaint said.
All of these components are intended to make chats more engaging and kid-friendly, and the platform has created or facilitated “student hubs” as well as communities focused on popular kids’ games, like Roblox, the complaint said.
Discord encourages and facilitates free interaction and engagement between its users, and default settings allow users to receive friend requests from anyone on the app – and to receive private direct messages from friends and anyone using the same server or virtual “community” – enabling child users to connect easily and become “friends” with hundreds of other users, according to the complaint.
Then, because Discord’s default safety settings disable message scanning between “friends,” child users can be – and are – inundated with explicit content. This explicit content can include user-created child sexual abuse material, messages intended to sexually exploit or coerce a child to engage in self-harm, internet links to sexually explicit content, images, and videos depicting violence, and videos containing sexually explicit content.
The complaint also states that Discord misled users about the Safe Direct Messaging feature, which offered three options:
- Keep me safe. Scan direct messages from everyone.
- My friends are nice. Scan direct messages from everyone unless they are a friend.
- Do not scan. Direct messages will not be scanned for explicit content.
For most of the feature’s existence, Discord made the “My friends are nice” option the default setting for every new user on the app, Platkin said. This option only scanned incoming direct messages if the sender was not on the user’s friends list. For both the “Keep me safe” and “My friends are nice” settings, Discord said it would automatically scan and delete direct messages that contained explicit content.
"But this was not true,” according to Platkin. "Despite its claims, Discord knew that not all explicit content was being detected or deleted."
Combined with Discord’s deception about its Safe Direct Messaging features, Platkin said Discord’s other design choices worked together to virtually ensure that children were harmed or placed at risk of harm on its app. For example:
- By default, Discord allows users to exchange DMs if they belong to a common server. Therefore, a malicious user – adult or child – just needs to join a community server, which could contain over a million users, to exchange DMs with an unsuspecting child user, the complaint said.
- DMs among “friends” are even more dangerous, according to Platkin. Discord’s default settings not only allow any user to send a friend request to a child, they also then permit those users, once “friends,” to exchange unscanned DMs through the default “My friends are nice” setting. Children can receive and accept friend requests from users whom they do not know and with whom they have no connection and then engage privately on the platform without any oversight – all by design, the complaint said.
- Users may also create multiple accounts to hide their activities and circumvent being banned from servers, or from facing other repercussions. And even if users are banned from a server, or from Discord itself, Discord’s design allows them to simply re-engage using a new account, the complaint said.
Finally, the lawsuit alleges that Discord misrepresented that users under the age of 13 are not permitted to create accounts and are banned from Discord upon discovery.
"Simply put, Discord has promised parents safety while simultaneously making deliberate choices about its app’s design and default settings, including Safe Direct Messaging and age verification systems, that broke those promises," Platkin said. "As a result of Discord’s decisions, thousands of users were misled into signing up, believing they or their children would be safe, when they were really anything but."
The lawsuit seeks a number of remedies, including an injunction to stop Discord from violating the New Jersey Consumer Fraud Act, civil penalties, and the mandated repayment of any profits generated in New Jersey through this unlawful behavior, Platkin said.
This lawsuit against Discord is the latest action taken by the Office of the Attorney General to keep children safe online, Platkin said. Last fall, the office sued TikTok for unlawful conduct tied to features that keep children and teens online for ever-increasing amounts of time despite the harms that result.
A year prior, it sued Meta Platforms, the owner of Instagram and Facebook, for similar unlawful conduct, Platkin said. Both the Meta and TikTok complaints stemmed from the same national investigation, which was co-led by New Jersey.
Additionally, in recent years, the New Jersey Division of Criminal Justice has prosecuted numerous cases in which defendants allegedly used social media platforms and chat apps, including Discord, to prey on children and engage them in sexually explicit conversations as a means of obtaining child sexual abuse material.