Reporting users

Comments

10 comments

  • darkline1

    Well, I often say illegal-sounding stuff, but please note that it's all in roleplay. We aren't planning anything in real life, only in the roleplay. How would you react to that? Will you only see the reported message, or also what happened before it? I mean, I could get arrested just for saying "Hey, you wanna rob the local bank?" as my roleplay character. Please clarify this for me.

  • Angel

    I can't seem to find this button in my DMs. Please help.

  • Spite

    Same. I can't find the report button in any non-friend direct message, so is this a feature that will be implemented in the next update?
    If so, that's great; it's something I've needed for a while, since I'm dealing with massive bot spam in direct messages, all advertising the same server, Space Cadets (ID: 302367211224432640), from different bot accounts.
    Looking forward to that feature.

  • Timowh

    How can I report users in a server? This doesn't work in my DMs either.

  • fatpigsarefat

    There is no report button for me either.

  • Al

    No report button for me, even in Developer Mode.

  • Shentino

    I'd like to suggest that evasion of a server ban or contact block be added to the list of reportable offenses.

  • Azurezero

    My MEGA account got hacked, and I received this message:

    Hi.

    Discord: https://discord.gg/My6XSng (Lewis#8521)

    I hope you had a backup of all of your files.

    If not, you can contact me on discord, and pay a small bitcoin fee for them all back. :)

    He's not on my friends list, and yet I can't report him. Can you IP ban him?

  • Phoenix

    I'm a mod on a mental health community with about 280 users on Discord. The consensus in our group is that self-harm reporting is a bad thing, and we're really worried about the feature. We help people by creating a space where they can safely talk about the darker things in their heads without fear of real-life consequences: somewhere people can talk about PTSD, histories of abuse, and quite often suicide. Talking about these issues openly, and finding people who relate and truly empathise, is almost always enough to de-escalate a situation and help them find a bit of hope to get them through an especially bad week. Creating this self-harm reporting feature means anyone on our server could anonymously report any other user for reaching out and trying to find that curative empathy, potentially exposing them to abusive police or psychiatric imprisonment. I know you want to believe there's someone you can call who can intervene and help, but in most places there just isn't. In most areas, the 'care' options available are terrible and often retraumatising and abusive.

    If we are to continue helping people effectively on Discord, we need to be able to protect our users from threats like self-harm reporting. At the very least, you should log incidents in the Audit Log so we can hold people accountable and exclude them from the community if they use reporting maliciously, something you won't be able to judge from a few messages in a moment, without our deep knowledge of our community and the social dynamics between the people in it. My strong preference would be for you to remove the self-harm reporting feature entirely, as it is likely to hurt many more people than it helps, or at least to let us disable it on our server in the same way NSFW reporting is disabled in NSFW channels.

    If you would like a better option, Tumblr is a fantastic model. When you search for or post about self-harm-related tags, like cutting, suicide, or anorexia, Tumblr pops up a prompt asking if you're okay and offering some options for getting support and help if you would like it. The help consists of links to hotlines and chat services that provide support in safe, anonymous ways, along with other resources in that vein. The user can decline and continue to post or interact with that content if they like. Tumblr's solution doesn't threaten users' safety; instead it empowers them and demonstrates compassion. A couple of years ago, when I was suicidal, it surprised me and warmed my heart. It actually made my day better.

    As a side note, a common refrain in chat right now is "make sure you don't get Nitro, because if you do, Discord might have your address from payment processing, and that could make you vulnerable to this". At least in our space, you have terrified some of your most loyal users and ensured no one else will be signing up for Nitro any time soon.

  • Kuki

    For a few weeks there was a report button for me, and I reported several people on servers for posting lolicon. Now the report button has disappeared and the paedophiles are still happily posting lolicon.
    ¯\_(ツ)_/¯
