Mastodon vs Discord
Side-by-side comparison of the Terms of Service and Privacy Policy of Mastodon and Discord.
Mastodon.social includes meaningful user protections: content export, account deletion, hashed passwords, SSL, optional 2FA, and a no-sale statement. The main concerns are the decentralized network design, which spreads content to other servers, plus a limited support scope and some moderation opacity.
Mastodon.social presents a relatively user-friendly privacy posture for a social platform: it offers account deletion, content export, clear security measures, and says it does not sell personal information. Key tradeoffs come from federation: public, followers-only, and direct-message content may be copied or delivered to other servers, reducing practical control once shared.
Points of interest
- negative ●●●●● privacy: Federated content sharing
Because Mastodon is federated, your public content can be downloaded by other servers, and even followers-only or direct messages may be delivered to other servers. That limits practical control over where your content ends up.
- negative ●●●●○ privacy: Direct messages leave server
Direct messages are sent to recipients' servers when they are on other servers. Users should not assume DMs stay solely under mastodon.social's control.
- positive ●●●●○ privacy: No sale of data
Mastodon.social says it does not sell or trade personally identifiable information. It may still share data with service providers or when legally required, which is common but worth noting.
- positive ●●●●○ privacy: Account deletion available
You can permanently delete your account at any time. This gives users a clear exit path, though copies of distributed content on other servers may persist in practice.
- positive ●●●●○ privacy: Strong account security
The policy says sessions and API traffic use SSL, passwords are strongly hashed, and two-factor authentication is available. These are meaningful baseline protections for account access.
- positive ●●●○○ privacy: Content export offered
Users can request and download an archive of their content. This supports portability and backup before leaving the service.
- neutral ●●○○○ privacy: Apps get broad access
Authorized apps may access substantial account data depending on the permissions you grant. One firm limit: apps cannot access your email address or password.
- neutral ●●○○○ privacy: Cookies for login/preferences
The service uses cookies to recognize your browser, connect it to your account, and save preferences. This appears functional rather than advertising-focused based on the provided text.
- neutral ●●○○○ terms: Moderation outcome opaque
Reports are usually handled quickly, but reporters are not told whether punishment occurred, and some enforcement is not visible publicly. This preserves moderation flexibility but reduces transparency for users who report abuse.
- positive ●●○○○ terms: Clear operator identity
The service identifies the operating company and provides corporate registration details and contact information. That improves accountability compared with anonymous operators.
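The "strongly hashed" password claim above can be made concrete. Mastodon's policy does not name its hashing algorithm, so the sketch below uses scrypt from Python's standard library purely as an illustrative stand-in, not as Mastodon's actual implementation:

```python
# Illustration of salted, memory-hard password hashing -- the kind of
# scheme "strongly hashed" implies. Algorithm choice (scrypt) and
# parameters are assumptions for demonstration only.
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest); a fresh random salt makes each hash unique."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the digest and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)
```

Because the salt is random, the same password stored by two users produces different digests, and the server never needs to keep the plaintext, which is why hashing matters if a database ever leaks.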
Discord provides notable privacy controls, says it does not sell personal information, offers deletion/access tools, and gives notice of major privacy-policy changes. However, it also uses broad categories of data for personalization, ads, and service improvement, allows extensive sharing with vendors and some advertising partners, and includes strong legal protections for itself such as mandatory arbitration, class-action waiver, liability caps, and broad termination rights.
Discord’s terms and privacy policy are relatively transparent and offer meaningful user controls like data access, deletion, and some limits on personalization. But the service also collects broad usage and content data, shares data with vendors/advertisers, reserves broad moderation and termination rights, and imposes arbitration, class-action waiver, liability limits, and indemnity obligations on many users.
Points of interest
- negative ●●●●● terms: Mandatory arbitration and class waiver
U.S. and Canada users generally must resolve disputes through individual arbitration, not court, and waive jury trials and class actions. This can make it harder and sometimes more expensive to pursue claims.
- negative ●●●●○ terms: Liability capped at $100
If Discord harms you, its financial responsibility is capped at the greater of what you paid in the prior three months or $100. That can sharply reduce practical remedies for outages, data loss, or other service issues.
- negative ●●●●○ terms: Broad indemnity obligation
You may have to cover Discord’s legal costs and liabilities for claims related to your use, content, violations, or misconduct. This shifts significant risk onto users, especially creators or server operators.
- positive ●●●●○ privacy: No sale of personal data
Discord expressly says it does not sell personal information and that its business is funded by subscriptions, paid products, and sponsored content instead. That is a meaningful privacy-positive commitment.
- positive ●●●●○ privacy: Strong account deletion tools
Users can disable or delete their account from settings, and Discord says deletion permanently removes identifying information and anonymizes other data. This gives users a clear exit path, though some retention exceptions remain.
- positive ●●●●○ privacy: Data access and portability
You can request a copy of your data in settings, and Discord says it provides the data in common digital formats such as JSON. This supports transparency and portability if you want to review or move your information.
- negative ●●●○○ terms: Broad content license
You keep ownership of what you post, but Discord gets a worldwide, transferable, sublicensable license to use and adapt it for operating and improving the service. That is common, but still a broad grant users should understand.
- negative ●●●○○ privacy: Extensive data collection
Discord collects account details, messages and uploads, device and usage data, purchase data, and information from advertisers and other third parties. This supports personalization, safety, analytics, and advertising of Discord itself.
- negative ●●●○○ privacy: Content used for moderation models
Public or widely available content and some reported material may be used to build automated safety and moderation systems. Users should know their content may help train detection systems, not just be displayed to recipients.
- negative ●●●○○ terms: Can suspend or terminate broadly
Discord can suspend or terminate accounts for violations, legal demands, safety concerns, risk to others, or even over two years of inactivity. It may do so with or without notice, subject to law.
- positive ●●●○○ privacy: Privacy controls in settings
Discord offers settings to limit personalization and some data use for service improvement, plus controls for visibility and safety features. These controls do not eliminate collection entirely, but they give users meaningful choices.
- positive ●●●○○ privacy: Notice for major privacy changes
Discord says it will date updates and provide more prominent notice when privacy-policy changes are significant, such as email or in-app highlighting where required. This is better than silent policy changes.
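The data-portability point above hinges on the export arriving in a common format such as JSON, which any standard tooling can read. The sketch below loads a small export-like snippet with Python's json module; the field names and layout here are hypothetical, not Discord's actual package schema:

```python
# Minimal sketch of inspecting a JSON data export. The structure
# ("user", "messages", "channel", "content") is an invented example;
# a real Discord export will be organized differently.
import json

export = json.loads("""
{
  "user": {"username": "example"},
  "messages": [
    {"channel": "general", "content": "hello"},
    {"channel": "general", "content": "world"}
  ]
}
""")

# Once parsed, the data is ordinary Python objects you can audit or move.
channels = {m["channel"] for m in export["messages"]}
print(f"{len(export['messages'])} messages across {len(channels)} channel(s)")
```

This is the practical value of a common-format export: no proprietary reader is needed to review, back up, or migrate what the service holds about you.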
This comparison is based on each service's published Terms of Service and Privacy Policy. Read those source documents before relying on any specific clause.