Need more reason to fear the privacy invasion of facial recognition? Here’s one, by way of Russia:
Porn actresses and sex workers are being outed to friends and family by people using a Russian facial recognition service to strip them of anonymity.
As reported by the Russian publication TJournal, users of an imageboard called Dvach in early April began to use the FindFace service to match explicit photos with images posted to the Russian version of Facebook: the social network VK (formerly known as Vkontakte).
The imageboard Dvach – Russian for 2chan – is called the Russian version of 4chan.
Well, that makes sense. Tormenting workers in the porn industry fits in with some 4chan users’ prior themes of bullying, sextortion and child abuse imagery.
FindFace's popularity exploded after photographer Egor Tsvetkov showed how easy it was to deanonymize people by taking photos of riders on the metro and matching the images with their VK social network profiles.
His project was called “Your Face Is Big Data.”
Seeking to illustrate how invasive facial recognition can be, Tsvetkov photographed people sitting in front of him on the subway, then used FindFace to look for them in social networks.
The FindFace app launched in February. Users can feed it an image – whether taken on the metro or in whatever stalking situation presents itself – and the service will match it against publicly accessible images and profile details on VK.
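In principle, services like this work by converting each face to a numeric "embedding" vector and matching a query face against the stored vector it most resembles. The sketch below illustrates that matching step only; the vectors, profile IDs, and threshold are made-up assumptions for illustration, and a real service would compute embeddings with a trained face-recognition model.

```python
# Illustrative sketch of embedding-based face matching (not FindFace's
# actual code). Embedding values here are invented for the example.
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical database of profile-photo embeddings, keyed by profile ID.
profiles = {
    "vk_user_1": np.array([0.9, 0.1, 0.3]),
    "vk_user_2": np.array([0.2, 0.8, 0.5]),
}

def best_match(query, db, threshold=0.9):
    """Return (profile_id, score) for the closest embedding, or (None, score)
    if nothing clears the similarity threshold."""
    best_id, best_score = None, -1.0
    for profile_id, emb in db.items():
        score = cosine_similarity(query, emb)
        if score > best_score:
            best_id, best_score = profile_id, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)

# A photo snapped on the metro, reduced to an (invented) embedding:
query_face = np.array([0.88, 0.12, 0.31])
print(best_match(query_face, profiles))  # matches "vk_user_1"
```

The privacy problem is visible even in this toy version: once profile photos are indexed as vectors, any stranger's snapshot becomes a lookup key.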
This is all legal, of course, given that the content in question isn’t marked private.
Tsvetkov’s project successfully matched the slumped-over, bundled-up people whose images he captured on the subway with their much shinier, occasionally salacious social media selves.
For his part, Tsvetkov told news outlets he was trying to highlight the dangers and/or invasiveness of FindFace:
In theory, this service could be used by a serial killer or a collector trying to hunt down a debtor.
(In actuality, he himself surreptitiously captured and posted images without permission. Nor did he make any effort to obscure the identities of his unwitting subjects. Thus, we chose not to link to his project’s website.)
Dvach users took FindFace one step further than Tsvetkov had.
According to Global Voices, within three days of media coverage of Tsvetkov’s art project, on 9 April, Dvach users launched a campaign to deanonymize actresses who appear in pornography.
After using FindFace to identify the women, Dvach users shared archived copies of their VK pages and spammed their victims’ friends and family to tell them about their discovery.
Dvach users are also targeting women registered to the Russian website “Intimcity,” which advertises prostitution and escort services.
For a short time, the doxing campaign included a community on VK that was created by Dvach users to upload links to files containing copies of women’s social media pages. It was intended to preserve information in the case of women deleting accounts or altering their privacy settings, according to Global Voices.
The social network quickly banned the Dvach group following a complaint by an anti-sexist community. The page now displays a message that translates to:
This community has been blocked for organizing an attack on Vkontakte pages or communities.
FindFace founder Maxim Perlin told TJournal that several of Dvach’s victims have complained, but there’s no way to prevent the facial recognition service from being used in this way:
We make every effort to protect all network users from potential malicious actions and are ready, if necessary, to provide any information needed to find the users responsible for this harassment.
Another reason to keep your photos private
If you’re looking for yet another reason to set your Facebook photos and posts to private, this is a rock-solid one.
FindFace is a Russian service, and it has indexed a predominantly Russian social network. But there’s no reason why it or another facial recognition service couldn’t put up an English version and turn its focus to Facebook instead – or to photos posted on any other social media, for that matter.
If those photos aren’t marked “private,” they’re fair game – at least legally, if not morally.
This is just the latest, facial recognition-enabled twist on the concept of using content found in the public domain to embarrass or even endanger people. We saw it in 2012, when the site “We know what you’re doing” surfaced people’s truly embarrassing Facebook status updates. Prior to that, it was the Please Rob Me site that showed the dangers of exposing our current locations, packaged as they are with the tip-off that we aren’t at home.
Egor Tsvetkov was right about one thing: our faces are big data.
We’re used to seeing that couched in terms of government surveillance and marketing snoopery. As it is, 74% of retailers admit to tracking customers from the moment they enter a store: tracking visitor behavior; recording gender, age and how much time shoppers spend in the store; or recognizing how many times they’ve visited.
31% use big data, 27% use facial recognition, and 65% track customers for security purposes: for example, they’re using facial recognition to spot shoplifters and to give store security a heads-up about who to keep an eye on.
All that’s being done with no rules to protect customers’ privacy.
As Tsvetkov has shown and Dvach’s campaign has made glaringly obvious, that privacy can be invaded just as easily by individuals as by big retail chains, given how easy access to facial recognition technology has become.
It all adds up to a great argument for setting photos to private.
Image of facial recognition software courtesy of Shutterstock.com