Anonymous social media apps aimed at youngsters are kind of like their teenage target market: they like to think of themselves as edgy.
It’s all fun and games, until somebody gets hurt.
Or, in the case of the (recently reformed) cyberbully playground After School, it was all fun and games until some anonymous schoolchild threatened to bring a gun to school.
After School isn’t the first app to let kids post anonymously: Yik Yak got there first, and was also first to be banned from schools for providing a forum for toxic chat.
But until recently, After School prided itself on its offensiveness, telling users that if the content was too much, they should bug off and go look at cat pictures instead.
According to the Daily Dot, this is what its FAQ page used to say, at least before the iOS app got kicked off Apple’s App Store (for the second time):
What should I do if I see objectionable content?
Report it to us. We believe in free speech and the ability for people to express themselves. If you find the majority of the content too offensive, consider using your phone to instead look at cat pictures or browse a less cutting-edge social network like Facebook. However, we are always able to remove posts and block users who are actually abusing the system.
Now, the app is back on iOS, having attended something like reform school for three months.
Like many apps that have recently entered an anti-bullying maturation phase, After School is now bent on ridding itself of bullies and threatening posts, and has dropped the self-aggrandizing language about objectionable content.
Similarly to Facebook’s recent move to help both people experiencing suicidal or self-harm inclinations and the frantic friends who spot their messages, the app makers have partnered with organisations that provide around-the-clock support, enabling moderators to respond quickly to troubling messages.
Cory Levy, cofounder of After School’s parent company One, told the Daily Dot that such outreach is, unfortunately, needed all too often:
If someone writes a post that says, "I’m so depressed right now, I’m thinking about killing or cutting myself," which unfortunately we see often on our service, we immediately detect that message within a second and offer live support. Our mascot comes down and says, "Hey, would you like to talk to someone anonymously?"
In fact, the team has spent the past three months figuring out how to improve safety, including creating a “Safety Board,” made up of psychologists, educators, and nonprofit leaders dedicated to making the internet a safer place for young people.
The application also now has moderators who review every post before it goes live to the social network, and a “mature content” filter that keeps anyone under the age of 17 from seeing mature posts, with a driver’s license scan required as proof of age when signing up.
Whether kids will be able to slip past these safeguards remains to be seen. Yik Yak also had an age limit, but parents complained that there was nothing in place to actually stop underage people from signing up.
The After School team’s newfound desire to take safety seriously is reflective of a larger push to quash abusive online behaviour.
We’re seeing it in new laws that criminalise revenge porn.
We’re seeing it in companies admitting that they suck at dealing with trolls and then actually doing something about it, like Twitter making it easier and quicker to report abuse or flag content.
Twitter’s actually been on something of a crusade with its anti-bullying efforts, which have included tools such as a filter for threats and abusive language.
Maturation in handling online safety has also included replacing user moderation, an approach that hasn’t worked well in Yik Yak’s case, with dedicated staff content moderators.
Teenagers eventually grow up. Most grow out of that cocky, in-your-face arrogance. Most turn into civic-minded, responsible individuals.
So, too, do apps, apparently.
It’s a welcome sign.