Facebook was red-faced this week after admitting to a loophole in its child-focused Messenger Kids system.
The company emailed apologies to parents after a hole in the supposedly closed-loop messaging system allowed children to join group chats with people their parents hadn’t approved.
Launched in December 2017, Messenger Kids is an Android and iOS app designed for users under 13. The service, which doesn’t allow ads, must be installed by parents, who must approve the child’s contacts. It is not possible to search for individual children on the service. Individuals can video chat and message with children using the regular Messenger app, but only if the child’s parent approves them.
On its website, Messenger Kids says:
Kids can only connect with parent-approved contacts, which creates a more controlled environment
Except when it doesn’t.
The Verge discovered that Facebook had made an embarrassing slip-up. The social media giant had been sending messages to parents informing them of a “technical error”. A child’s friend could create a group chat and invite people from that friend’s own list of parent-approved contacts, even if the first child’s parents had never approved those people.
Facebook’s email, seen by The Verge, read:
We found a technical error that allowed [CHILD]’s friend [FRIEND] to create a group chat with [CHILD] and one or more of [FRIEND]’s parent-approved friends. We want you to know that we’ve turned off this group chat and are making sure that group chats like this won’t be allowed in the future. If you have questions about Messenger Kids and online safety, please visit our Help Center and Messenger Kids parental controls. We’d also appreciate your feedback.
This was likely a design flaw in the app: judging by Facebook’s description, group chat invitations appear to have been checked only against the inviter’s parent-approved contacts, not against those of every child already in the chat. The oversight allowed a child to talk with a friend of a friend whom the child’s parent hadn’t explicitly trusted and may not even know.
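To make the nature of the bug concrete, here is a minimal sketch in Python of what such a permission check could look like. All names and structures here are hypothetical illustrations, not Facebook’s actual code; the sketch assumes the error worked as described in Facebook’s email.

```python
# Illustrative sketch only -- names and structure are hypothetical,
# not Facebook's actual implementation.

class Child:
    def __init__(self, name):
        self.name = name
        self.approved = set()  # contact names approved by this child's parent

def can_invite_buggy(inviter, invitee, chat_members):
    """Flawed check: validates the invitee only against the *inviter's*
    parent-approved list, ignoring the other children in the chat."""
    return invitee.name in inviter.approved

def can_invite_fixed(inviter, invitee, chat_members):
    """Correct check: every child already in the chat must have parental
    approval for the invitee, and the invitee for each of them."""
    return all(
        invitee.name in member.approved and member.name in invitee.approved
        for member in chat_members
    )

# Alice's parent approved only Bob; Bob's parent approved Alice and Carol.
alice, bob, carol = Child("Alice"), Child("Bob"), Child("Carol")
alice.approved = {"Bob"}
bob.approved = {"Alice", "Carol"}
carol.approved = {"Bob"}

chat = [bob, alice]  # Bob started a group chat with Alice
print(can_invite_buggy(bob, carol, chat))  # True  -- Carol slips in
print(can_invite_fixed(bob, carol, chat))  # False -- Alice's parent never approved Carol
```

Under the flawed check, Carol joins the chat because Bob’s parent approved her, even though Alice’s parent never did; the corrected check requires mutual approval for every participant.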
It’s good to see Facebook taking action, but it may be too late to repair an already damaged reputation when it comes to child safety online. The Information Commissioner’s Office (ICO), in conjunction with Ofcom, surveyed 2,000 people aged 16 and over and 1,000 children aged between 12 and 15 earlier this year about their perception of online harm.
Youngsters worried by unwelcome friends
Almost one in four (24%) of those in the younger age bracket said that they had experienced potential harm on Facebook, twice the share reported for the second-place platform, Facebook-owned Instagram. Snapchat came third, at 8%. An even larger proportion of children, 38%, had encountered unwelcome friends or followers on instant messaging platforms, making this the single most common type of perceived harm on such platforms among young users.
The news will also be disappointing for parents seeking a safe avenue for their children to message friends and family. Messenger Kids was supposed to be a beacon of safety for these families, and the alternatives don’t seem much better: the ICO is investigating video-sharing app TikTok over its use of children’s personal data and its adherence to the GDPR, after the app received a multimillion-dollar fine from the US Federal Trade Commission for violating child privacy.
How to keep your children safe on their phones
Are you concerned about what your children are able to access on their smartphones? Matt Boddy explains how you can restrict what they can and can’t access.