Instagram has finally responded to mental health concerns with a new reporting feature on the app.
Instagram rolled out a new mental health flag feature last week, Seventeen reported Oct. 17. The feature allows users to report anything concerning about an Instagram post. Instead of relying on an algorithm to find potentially concerning posts, the company has placed reporting responsibility in the hands of users.
"We listen to mental health experts when they tell us that outreach from a loved one can make a real difference for those who may be in distress. At the same time, we understand friends and family often want to offer support but don't know how best to reach out," Instagram chief operating officer Marne Levine said to Seventeen. "These tools are designed to let you know that you are surrounded by a community that cares about you, at a moment when you might most need that reminder."
The popular social media platform is home to users of all backgrounds, locations and personalities. There are foodie, fitness and fashion accounts. But with that large user base come elevated levels of depression; according to a study conducted by the University of Pittsburgh, the more young adults use social media, the more likely they are to be depressed — and more than a quarter of the people the researchers polled showed high indicators of depression.
Instagram's new feature gives users an option to anonymously report posts to help combat mental health issues on the app. If someone posts about self-harm and gets reported, a message pops up: "Someone saw one of your posts and thinks you might be going through a difficult time. If you need support, we'd like to help." The app will then offer a few options: talk to a helpline, talk to a friend, or get tips and support.
The same support will be offered if a user searches for a hashtag associated with self-harm or self-injury. Some hashtags have been eliminated entirely; #thinspo (associated with eating disorders) shows zero results when searched.
Because some signs of self-harm indicate more immediate danger than others, Instagram has assembled teams to review reports and sort them by degree of urgency.
“We have teams working around the world, 24/7, who review these reports,” Instagram spokeswoman Marni Tomljanovic told the Wall Street Journal. “They prioritize the most serious reports and respond quickly. If someone on Instagram sees a direct threat of suicide or self-injury, we encourage them to contact local emergency services immediately.”
News on Open Source is free and unlimited. Access to the rest of 512tech.com comes with an American-Statesman digital subscription, which also includes myStatesman.com and the ePaper edition. Subscribe at statesman.com/subscribe.