In a significant legal action, the New Jersey Attorney General's office announced on Thursday that it has filed a lawsuit against Discord, a popular gaming-centric social messaging application. The lawsuit claims that the company has misled both parents and children regarding the effectiveness of its child safety features, a matter of growing concern as social media platforms increasingly face scrutiny over user safety.

The lawsuit was lodged in the New Jersey Superior Court by Attorney General Matthew Platkin alongside the state's Division of Consumer Affairs. It alleges that Discord has violated the state's consumer fraud laws, claiming that the company engaged in deceptive practices that could put minors at risk.

According to the legal complaint, Discord's misleading practices involved convincing New Jersey children and their parents that the platform was safe while obscuring the potential risks that users, particularly minors, could face. The complaint emphasizes that Discord has failed to adequately enforce its minimum age requirement, allowing children under the age of thirteen to easily bypass restrictions.

"Discord's strategy of employing difficult-to-navigate and ambiguous safety settings to lull parents and children into a false sense of safety, when Discord knew well that children on the application were being targeted and exploited, are unconscionable and/or abusive commercial acts or practices," the legal filing stated, underscoring the severity of the allegations.

A representative for Discord responded to the lawsuit in a public statement, asserting that the company disputes the claims made against it. The spokesperson expressed surprise at the announcement of the legal action, stating, "Given our engagement with the Attorney General's office, we are surprised by the announcement that New Jersey has filed an action against Discord today." The company highlighted its ongoing commitment to safety, pointing to its continuous investments in features and tools designed to enhance user security.

Central to the lawsuit is the allegation concerning Discord's age-verification process. The state argues that the process is significantly flawed, allowing children to easily falsify their ages to gain access to the platform. This raises serious questions about the effectiveness of safeguards intended to protect minors in an environment that can expose them to inappropriate content.

Moreover, the lawsuit claims that Discord misrepresented its Safe Direct Messaging feature, which parents were led to believe would automatically screen and delete all private messages containing explicit content. However, the complaint asserts that messages sent between users classified as friends were not scanned at all by default. Even when the Safe Direct Messaging filters were activated, the complaint argues, children remained vulnerable to exposure to child sexual abuse material, violent content, and other harmful material.

The New Jersey Attorney General is seeking unspecified civil penalties against Discord, reflecting the seriousness with which the state is treating these allegations. This lawsuit is not an isolated incident, as it forms part of a broader trend where various state attorneys general across the United States are taking action against social media companies. In 2023, for example, a bipartisan coalition of over 40 state attorneys general filed a lawsuit against Meta, alleging that the company knowingly designed addictive features within its platforms, such as Facebook and Instagram, that adversely affect the mental well-being of children and young adults.

Further highlighting this growing concern, the New Mexico Attorney General filed a lawsuit against Snap in September 2024, claiming that the design of Snapchat has made it easier for predators to target children through sextortion schemes. Additionally, a bipartisan group of over a dozen state attorneys general also pursued legal action against TikTok in October 2024, alleging that the app misleads users regarding its safety for children. In one notable lawsuit lodged by the District of Columbia's attorney general, it was claimed that TikTok's virtual currency and livestreaming feature could harm children financially.

In January 2024, executives from major social media companies, including Meta, TikTok, Snap, Discord, and X, faced tough questioning during a Senate hearing. Lawmakers scrutinized these executives over their companies' failure to adequately protect children using their respective platforms.

As the legal landscape continues to evolve, the outcome of the New Jersey lawsuit against Discord could have significant implications for the social media industry and its approach to child safety.