Discord is a platform with almost 100 million monthly active users[1], a platform that I used for over four years (from December 2015 to March 2020) to talk to friends and find groups, and a platform that an enormous share of people who play games use every single day. But just because a company is huge doesn't mean that it isn't nefarious.
Invasiveness
Discord has been around for a long time, and it has slowly become less and less respectful of its users' privacy. They collect and monetize all information that you directly provide to them and record just about every action you take on the platform: your username, email, the messages and images that you send through the platform, all of your calls, your IP address, device ID, “activities within [our] services”, your demographics, interests, behaviors, and more[2]. So remember that really sensitive message you sent or that call you had? Discord has it, unencrypted. Many users even willingly opt in to more extreme collection, censorship, and monitoring of their direct messages via Discord's “Safe Direct Messaging” feature. All of this data is collected, profiled, sold, traded, and used for legal compliance, marketing, and more. This is even more of an issue because they have said in emails that they do receive government requests for data.[3] They have also taken large investments from some of the biggest global tracking companies, including Tencent and Warner Media.[4]
Discord states in their privacy policy that they collect “information from any accounts that you may link”. This means that if you link any of your accounts from the myriad platforms that Discord supports (YouTube, Steam, Reddit, Twitter, Spotify, Facebook, and more), Discord is scouring through those accounts and scraping data from them. This could include posts, likes, follows, followers, replies, bio content, and more.
Jason Citron, the current CEO of Discord, ran another company before it: OpenFeint[5]. OpenFeint was a social networking application for mobile devices that provided leaderboards, forums, and extra achievements for many mobile games that were popular at the time. It was founded in 2009, six years before Discord's release in 2015. OpenFeint was hit with a class-action lawsuit[6] alleging violations of the Computer Fraud and Abuse Act, California's Computer Crime Law, the California Invasion of Privacy Act, unfair competition, breach of contract, bad faith, and more, including giving developers illegal access to user data through its API. This shows that Mr. Citron has a history of building invasive applications, and it stands to reason that Discord is hardly any better.
Discord forces many users to input their phone number or some other form of verification if they are “flagged”[7][8]; in reality this is a broad and mostly untargeted tactic to extract more data from users. The practice is especially prevalent for users who connect through a VPN or Tor. In those cases Discord will almost always serve a long string of reCAPTCHA challenges (sometimes as many as thirty, and sometimes ones that are impossible to complete because Google blacklists “high-traffic” or “suspicious” IPs, which in practice covers almost all Tor exit nodes and major VPN IPs that Google knows about) followed by a request for the user's phone number, thereby deanonymizing the person and making any previous privacy precautions worthless.[9]
On the topic of invasive identity verification, bot developers go through a similar process. Discord forces all developers whose bots are in over 100 servers to upload a scanned copy or photo of “[an] identity document like a driver's license, ID card, or passport.”[10]
You can't have a private conversation on Discord. All communications on the platform are unencrypted, and a ledger of everything said on the platform is stored on Discord's servers. This is in stark contrast to other growing chat applications, like the more privacy-focused Element.io, which has implemented full end-to-end encryption for direct messages and groups[11] and allows for p2p calling. This ensures not only that your conversations and calls are kept private, but also that call audio quality improves thanks to the (optional) p2p routing. It doesn't seem that Discord will ever add encryption to chats or groups; their repeated position is that it would “make moderation difficult”, despite the fact that it could be added as a simple toggle in the conversation's settings menu. This also isn't just some random person asking for a feature that nobody else wants: a user made a post on the Discord forums asking them to implement end-to-end encryption and it got 550 upvotes[12], so it's clear that many users want E2EE as well.
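To make clear what is being given up here, below is a minimal sketch of how end-to-end encrypting a chat message works in principle, using the PyNaCl library purely as an illustration. This is not Discord's or Element's actual code (Element's Matrix protocol layers key verification, ratcheting, and group sessions on top of primitives like these); the point is simply that with E2EE the server only ever handles ciphertext.

```python
# Minimal sketch of end-to-end encrypting a chat message with PyNaCl.
# Illustrative only; real messengers add key verification, ratcheting, etc.
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device; private keys never leave it.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"The server never sees this in plaintext.")

# The platform's servers only store and relay `ciphertext`; they cannot read it.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext).decode())
```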
When I've explained Discord's privacy problems to users of the platform, many of them say something along the lines of “Well, I changed the privacy controls in the settings”; unfortunately, this doesn't really do anything at all. In Discord's own words, “when you turn the flag off, the events are sent, but we tell our servers to not store these events.”[13] This means that all the same data is still sent to Discord's servers, and you just have to take their word that they won't do the exact same things with it. If they're not collecting any of the data, then why send it in the first place? If they really weren't storing it, sending it anyway and discarding it on the server side would just waste bandwidth and possibly increase latency in games, which is what the platform is supposedly for in the first place: gaming.
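To make the distinction concrete, here is a hypothetical sketch of the two approaches. The endpoint and function names are invented for illustration and are not taken from Discord's client; the point is the difference between the server being asked to discard an event and the client never transmitting it at all.

```python
# Hypothetical illustration; not Discord's actual client code or API.

def report_event_server_side_opt_out(event, opted_out, post):
    # What Discord describes: the event always leaves your machine,
    # and the server is merely asked not to store it.
    post("/telemetry", {"event": event, "store": not opted_out})

def report_event_client_side_opt_out(event, opted_out, post):
    # The privacy-respecting alternative: an opted-out client
    # never transmits the event in the first place.
    if opted_out:
        return
    post("/telemetry", {"event": event})
```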
Discord also refuses to honor Do Not Track (“DNT”) headers, and tracks users the same whether they send a DNT header or not. In their own words: “Our Services currently do not respond to “Do Not Track” (DNT) signals and operate as described [...] whether or not a DNT signal is received”[14]. Many services have chosen to respect DNT requests, but Discord still does not comply with them.
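For reference, honoring DNT is technically trivial: the browser simply sends a `DNT: 1` header with each request. A minimal sketch of a compliant service (using Flask as an arbitrary example framework, not Discord's backend) might look like this:

```python
# Minimal sketch of a service honoring the DNT request header (illustrative only).
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/page")
def page():
    # "DNT: 1" means the user has asked not to be tracked;
    # a compliant service would skip analytics for this request.
    tracking_enabled = request.headers.get("DNT") != "1"
    return jsonify({"tracking_enabled": tracking_enabled})
```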
You have to provide your real, legal name. Pseudonyms and the like are not allowed and strictly violate the platform's Terms of Service; your account can be closed or banned for breaking this rule, once more destroying anonymity.[15]
When you use the service, you grant Discord a license to all of the content you post on it. Discord's Terms of Service state: “By uploading, distributing, transmitting or otherwise using Your Content with the Service, you grant to us a perpetual, nonexclusive, transferable, royalty-free, sublicensable, and worldwide license to use, host, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, perform, and display Your Content in connection with operating and providing the Service.”[16]
Discord does not fully delete users' personal data after a deletion request is received. They clearly state that data may still be kept for “backup and business continuity purposes.” This is concerning for many reasons, not least because you can never be sure that Discord isn't keeping all of your data on their servers even after you submit an erasure request.[17]
If Discord were ever sold, merged, or involved in some other financial transaction, your data may be part of the assets transferred to the new owners.[18] This means that if Discord were bought out by, say, Tencent, all of the data they collect (which, as described at the top of this section, is a lot) would be handed over in that acquisition for the new owners to do with as they please. And as described above, a data erasure request doesn't protect you from this either, since there is no guarantee that Discord actually deleted all of your data rather than keeping backups, which would also be transferred.
Discord additionally contains a process logger that records your open programs, browser tabs, and usage habits. This spyware “feature” was confirmed by the CTO of Discord[19]. Because Discord is closed source, it is practically impossible to tell how much of this is sent back to Discord or why; the best information we have is their word that none of it is transmitted. This is what powers their “Game Status” feature, which lets other users see what game you're playing at that moment, or even what program you have open depending on how you've configured it, and that is concerning enough in and of itself.
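For context, enumerating locally running programs is straightforward; the sketch below uses the psutil library purely as an illustration of the kind of scan a game-detection feature relies on. It is not Discord's actual code, and it says nothing about what, if anything, is transmitted afterwards, which is exactly the part that can't be verified in a closed-source client.

```python
# Illustrative sketch of local process enumeration (not Discord's code).
import psutil

# Collect the names of all currently running processes on this machine.
running = {proc.info["name"] for proc in psutil.process_iter(["name"])}

# A client could match this set against a list of known game executables
# to display a "playing X" status.
print(sorted(name for name in running if name))
```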
Hypocrisy
I absolutely do not endorse any of the content or people who are involved in creating or distributing the content that I will talk about in this next section. The only reason that I am including this is because I believe that it is contextually important to the bigger story of Discord. Thank you for understanding.
Discord has time and time again taken a strict stance on eliminating servers, banning associated users, and taking down loli and shota content[20], and rightfully so; in my opinion, and the opinion of many others, content of that nature has absolutely no place on the platform and should be removed and dealt with the second it is discovered. The hypocrisy comes into play when you look at the moderation team's response to cub content. (If you aren't sure what “loli”, “shota”, or “cub” mean, search for their definitions at your own discretion; you may see content or find information that you didn't want to see or know.) Discord's trust and safety team have said in no uncertain terms that they “do not consider “cub” content to be a violation of our Terms or Guidelines”[21] and that they have no problem allowing content of this nature on the platform[22] (which is assumed to be a result of the moderation team supposedly being comprised primarily of furries). This is a direct contradiction: either you allow all of it, or you allow none of it. Banning users who posted loli content and the servers that contained it, but then explicitly allowing the same content just because the subjects are animals? Content of this nature, human or animal, has no place on any social platform, much less one that allows children as young as 13 to make an account and use the service[23], to say nothing of all the people under that age who regularly use it[24]. That last part is admittedly hard for Discord to control given the sheer scale of the platform and the fact that they have no age verification system in place, which is the irony: they have surveillance in all the places where it doesn't help, but not in the one place where it would.
Discord states that they don't check users' direct messages unless they have ample reason. That reason can come in a few forms, with mass reporting commonly cited as the most effective, since it can be a fast and mostly accurate way of identifying a problematic person. Yet Discord berates users who mass report and generally takes little to no action on a user or server that is mass reported, as opposed to one housing the same or similar content that the moderation team finds on its own. One Discord moderator, responding to a report of a user, stated “Please do not participate in mass reporting” before closing the thread and abstaining from any further action.
Discord is the go-to place for many illegal activities, including the sale of stolen accounts and passwords[25], child grooming, and more. While this is a common and perhaps inevitable consequence of the substantial growth of a pseudonymous platform like Discord, it shouldn't undermine the urgency of a real solution. The problem is that while many people who participate in or facilitate activities of that nature go free on the platform, many people who aren't knowingly breaking any guidelines get banned and are then ignored by Discord when they ask for any clarification or reason why[26]. The Discord Trust and Safety/moderation team is consistently more focused on banning people based on personal bias, rather than any substantial violation of Discord's ToS, than on banning pedophiles persistently enough to avoid being investigated by the FBI.[27]
Complacency
Discord, being a large company, you'd think would try to add features that the community has been asking for for a long time, right? Wrong.[28][29][30] Instead of adding encryption, custom themes, custom plugins, or anything else the community has requested for years, one of their most recent updates added the ability to link a GitHub account and revamped the “HypeSquad Quiz” (along with some other mostly meaningless “features”). Since their release in 2015, they haven't shipped many original or unique features that weren't already present in other large platforms. They arguably haven't progressed with genuinely new, exciting features since early 2017 with T.A.Y.N.E (Teleport Anywhere You Need Ever), which allowed users to navigate Discord on the desktop almost exclusively through the keyboard[31]. That added an almost Vim-like navigation scheme, which was unheard of for chat applications at the time; but while there are a few standout features like this, they're few and far between. Most other changes since the program's release have been almost exclusively performance changes, aesthetic tweaks, or bug fixes. They've become stagnant, not adding any of the groundbreaking features many would expect from such a juggernaut of a technology company: no encryption, community theming, plugins, badges in chat, raw message copying, friend popup notifications, direct image-to-clipboard copying, inline image zooming, and so much more that the community has asked for time and time again since almost the release of the program. And because none of it has been added, many have resorted to breaking the ToS[32] with client modifications such as BetterDiscord[33], which makes all of these things possible.
One user on Twitter, after reporting a server run by an “infamous” pedophile which, without going into detail, contained exactly the content you would expect, got a canned and fairly disturbing response from the Discord Trust and Safety team. Not only did the moderation team take no action of any kind against the server or user in question, they simply told the user to “leave the server if you do not like the content!”[34] This sets a dangerous precedent of complacency on the part of the moderation team: not only is there an alarming skew of priorities, but it's a pattern that seems to have persisted for as long as most of the modern moderation team has been in place.
In mid-2019, Discord rolled out a beta of their new (now indefinitely closed) games store to users on the beta (Canary) release channel. With tensions high from recent unjust bans of many average users[35] and large figures in the gaming and online community[36], people were worried that their games (and by extension their money) would be lost forever if they unluckily found themselves on the receiving end of one of these inequitable bans. As one Twitter user put it, “If you get banned from Discord at their political convenience, do you lose access to your library? Sounds like a terrible idea...” Many other users on Twitter and other social platforms (including on Discord itself) echoed the same concerns, and to all of these questions Discord had an answer. The Discord Trust and Safety team bluntly stated that they are focused on “rule-breaking behavior instead of targeting an individual”. This is Discord's attempted solution for ensuring people don't lose their games while still being able to ban users who break the rules, and a terrible solution it is. What it implies is that people who previously exhibited rule-breaking or even potentially law-breaking behavior could be let back onto the platform just because they say they won't do it again. It is, more or less by definition, an honor system. Multiple users have even reported seeing pedophiles and other genuine criminals let back onto the platform under this reasoning.
Censorship
By using Discord, you waive your right to sue them, and must instead go through an arbitrator. An arbitrator is a lawyer hired by the company who acts as a sort of negotiator; since the arbitrator is heavily biased towards the company for obvious reasons, it is extremely rare for the plaintiff to win. Discord also revokes your right to enter a class action lawsuit against them[37][38] (I wonder where they got that idea from). Now, Discord repeatedly tried to inform people that they “[had] the right to opt out [...] by sending an email to arbitration-opt-out[at]discord.com”[39]. Not only could you only do this within 30 days of the initial change (later extended to 90 days after immense backlash)[40], but what was conveniently left out is that this only opts you out of the arbitration clause and does not restore your ability to join a class action lawsuit against them, despite the intentionally misleading and vaguely phrased responses of multiple moderators alluding to such.
Many social media platforms have for years issued blanket removals of communities under the reasoning of “misinformation.” Over the years this has applied to Twitter, Facebook, and most other large social platforms, and it seems this behavior has finally rubbed off on the Discord moderation team. Ice Age Farmer was a Discord community describing itself as focused on “gardening, preps, seed saving, canning, alternative construction & greenhouses”, and on October 15 its fairly small invite-only server was disabled on the grounds of “spread of misinformation.”[41] For a platform that describes itself as “for anyone who could use a place to talk with their friends and communities”, this censorship seems at best contradictory, and at worst intentionally deceptive and harmful.
There are reports of users being banned from Discord for saying anything anti-furry; even a report of such a comment, true or not, can get someone banned. As a former bot API developer put it, “You can report anyone on discord and say something along the lines of “he said fuck furries” give no proof and they'll be banned.”[42]