Safety Processes on Social Media
If you are worried about someone on social media, you can contact the platform's safety team, which will reach out to connect the person with the help they need. (Note: Tumblr no longer directly responds to reports of suicide or self-harm.)
To report suicidal content on Discord, right-click the message or tap and hold on mobile, select Report Message, choose Self-harm, and submit so Discord’s Trust & Safety team can review and connect the person with support.
Reporting Concerning Content or Behavior on Discord
Mental Health Resources on Discord
If you ever come across suicidal content on Facebook and are concerned about someone’s safety, you can use the link below to share your concerns and get them support. Meta also offers a Safety Center filled with suicide prevention tools, resources, and guidance to help you know what steps to take.
Report Suicidal Content on Facebook
Visit Meta’s Safety Center for Suicide Prevention Tools & Resources
To report posts related to suicide or self-harm on Instagram, tap the “…” in the upper-right corner of the post. Select Report, then choose Suicide, self-injury, or eating disorders. From there, tap Suicide or self-injury and submit your report.
Visit Meta’s Safety Center for Suicide Prevention Tools & Resources
If you come across a Pin or comment on Pinterest that concerns you, learn how to report that content below. You can also explore Pinterest's resource page, which lets you search for support options by country and state.
How to Report Something Concerning on Pinterest
Suicide, Self-harm, and Domestic Violence Prevention Resources
If you’re worried about someone on Reddit, you can report their post or comment to get them support. Click or tap the three dots to the right of the post, select Report, then choose Self-harm or suicide. Select Next, then Yes, and Reddit will confidentially reach out to connect the person with trained counselors from the Crisis Text Line.
To report suicidal content on Snapchat, press and hold the Snap or Story, tap Report, select Suicide & self-harm (or the most relevant option), follow the prompts, and submit so Snapchat can review and connect the person with support.
To report suicide or self-harm on TikTok, tap the arrow in the bottom right-hand corner of the video. Tap Report, select Suicide and self-harm, and submit.
What to Do If You See a User That Needs Support on TikTok
Visit TikTok’s Safety Center
To report suicidal content on Twitch, click the three dots next to the stream, clip, or chat message, select Report, choose Suicide or self-harm, and follow the prompts so Twitch can review the content and connect the person with support. You can also explore their resource page, which lets you search for support options by country and state.
To report suicidal content on X (formerly known as Twitter), tap on the three dots in the top right corner of the post, select Report post, choose Suicide or self-harm, and follow the prompts so X can review and connect the person with support.
You can also click below to report messages about suicide or self-harm to X. X will send the user a direct message with the 988 Lifeline’s number.
To report suicide or self-harm on YouTube, click “…” in the bottom left-hand corner of the video, then click Report in the drop-down menu. Next, select Suicide, self-harm, and eating disorders; YouTube will review the video and may send the uploader a message with the 988 Lifeline number.
How to Report Concerning Content on YouTube
Suicide Prevention Resources on YouTube
How to Engage on Social Media
The free “Support for Suicidal Individuals on Social and Digital Media” toolkit was developed by the 988 Suicide & Crisis Lifeline to help digital community managers and social media platforms establish safety policies for supporting individuals in suicidal crisis. While we recommend downloading the full kit, we have shared some excerpts below.
One of the first hurdles to cross in establishing a process for supporting suicidal community members is identification. How do you know if someone may be in suicidal crisis? Here are examples of community posts from people who may be at risk:
“Hi, I really need some help, can someone please contact me.”
“My daughter has fibromyalgia and the treatment alone costs too much for us to keep up with everything else. It’s become a full-time job to take care of her and I don’t know how I can keep going on like this. I feel hopeless with all of this and don’t know how I can keep going.”
“My 15-year-old son has been texting one of his friends and he has been having what appears to be thoughts of suicide. What should I do?”
“I’ve been really depressed lately and I don’t know how to deal with this. I have been thinking about suicide lately, my grandfather committed suicide 10 years ago. I’m so scared about all of this.”
If you have identified an individual who is at risk of suicide or in suicidal crisis but does not appear to be at imminent risk, research suggests that the community moderator reach out to that individual directly, through a set of clear processes established by, and best suited to the needs of, your platform or community.
While we encourage active moderation and response online, we do not encourage community managers to take on the role of mental health care professionals. All engagement with an at-risk individual should be designed to provide appropriate support while connecting that individual to mental health or crisis resources like the 988 Lifeline, your local crisis center, or other local mental health providers.
If, while engaging with an at-risk individual, you believe that the person may actually be at imminent risk of suicide, call 911 or other local emergency services for immediate assistance. Local emergency services are the fastest way to help a person who is at imminent risk. Other resources or protocols may be inappropriate in this situation and should not be applied.
Download the Social Media Toolkit
For digital community managers and organizations.
