

Note: The graph relating to “Proactive removals” was updated in April 2021. The proactive server deletions were initially overreported. Corrected totals are provided in the new graphic.

Hi again! A lot has happened on Discord since our last transparency report. We’d like to spend some time catching you up on the biggest changes and how we’ve responded to them.

In May, we refreshed our Community Guidelines. This new document spells out the kind of behavior and content that is - and isn’t - allowed on Discord, and clarifies some of our existing policies so that they more clearly reflect what we stand for. We wanted it to sum up all the values we hold near and dear: treating each other with respect, treating your communities with respect, and keeping Discord a safe and friendly place for everyone.

Later in June, we rolled out our new Safety Center. This is a great resource for everyone, with tips on how to keep your account safe, articles explaining Discord for parents and educators, and a clear breakdown of how we enforce our policies. In the future, we plan to add more how-to articles that address common issues and how to resolve them. We’re also looking to add more case studies, policy documentation, and resources on community moderation.

Finally, the biggest change since our last transparency report has been COVID-19 and the growth it has created for our service. With more people staying home, many new people joined Discord to talk to their friends and seek out communities. In June we announced that Discord has grown to more than 100 million monthly active users. As you might expect, as we welcomed more and more people to Discord, there was a concomitant rise in reports created. This graph shows the month-over-month increase in reports received by Trust & Safety in the first half of 2020. The reports we received almost doubled in the last few months.

As Discord grows, one of our top priorities is to help everyone continue to have a safe experience. For that reason, the number of employees on our Trust & Safety team almost doubled in the last six months, and it continues to be one of the biggest teams in the company. We’re working hard to make sure that anyone at any time can write in, have their issue looked at, and get the help they need.

All the numbers (and announcing a new category)

In keeping with our previous reports, we’ll detail the breakdown of reports by category, the rates at which we action violations of our Terms or Guidelines, the total number of actions taken by Trust & Safety, and the rate at which we restore accounts. One of the new things we want to spend some time on in this report is user warnings and their effectiveness. Warnings are useful in situations not involving a serious threat of harm because they present an opportunity for user education without us taking permanent action on an account. They’re a good step towards increasing the variety of tools in our toolset, and as Discord continues to grow, we’ll continue to explore new ways of tackling abuse.

We want to make each new transparency report more useful and accessible than the last, so we’ve updated how we’re presenting information this time around. The first change is the category breakdown: we’ve standardized the way we categorize violations. In each graph, you can expect to see the same 11 categories. Broadly speaking, these are representative of all the different kinds of reports that come in to Trust & Safety, and we feel they paint a clear picture of the behavior and content we’re working to keep off Discord. To that end, we may update our categories in future reports to better reflect what we see and what users report to us.

For this report, we’ve introduced a new category: cybercrime. Cybercrime covers social engineering, fraud, scams, and things like distributed denial of service (DDoS) attacks. Our previous hacks and cheats category is now exclusive to malicious behavior in online games, such as account cracking and distribution of cheats.
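To make the split concrete, here is a minimal, hypothetical Python sketch of how reports that previously fell under hacks and cheats might be mapped onto the new scheme. The two category names and their example sub-reasons come from this report; the function name, the keyword sets, and the fallback behavior are assumptions for illustration only, not a description of Discord’s actual tooling.

```python
# Hypothetical sketch only: Discord's real systems are not public.
# Category names come from the report; everything else is invented.

# Sub-reasons the report assigns to the new "cybercrime" category.
CYBERCRIME_REASONS = {"social engineering", "fraud", "scam", "ddos"}

# Sub-reasons that stay in "hacks and cheats", now exclusive to
# malicious behavior in online games.
HACKS_AND_CHEATS_REASONS = {"account cracking", "cheat distribution"}

def recategorize(sub_reason: str) -> str:
    """Map an old 'hacks and cheats' report onto the new split."""
    reason = sub_reason.strip().lower()
    if reason in CYBERCRIME_REASONS:
        return "cybercrime"
    if reason in HACKS_AND_CHEATS_REASONS:
        return "hacks and cheats"
    # Anything unrecognized would presumably need human review.
    return "needs manual review"

if __name__ == "__main__":
    for reason in ("fraud", "DDoS", "cheat distribution"):
        print(f"{reason!r} -> {recategorize(reason)}")
```

The point of a fixed mapping like this is comparability: once every report lands in one of the same 11 buckets, the category breakdowns in successive transparency reports can be compared directly.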
