Two search giants, Google and Microsoft, have agreed on measures that should make it harder to find child abuse images on the open internet, while Google has made a groundbreaking move to identify and ferret out videos made by paedophiles on its YouTube service.
YouTube engineers have created new technology to identify videos made by paedophiles, according to Google Executive Chairman Eric Schmidt, whose letter about the changes was published in the Mail Online on Sunday.
As it is, Schmidt wrote, there’s “no quick technical fix” that would let search engines detect child sexual abuse imagery, given that computers can’t reliably distinguish innocent pictures of children at bath time from genuine abuse imagery.
That means Google has to rely on humans to review images. Those that are determined to be illegal are given a unique digital fingerprint.
Armed with those unique fingerprints, Google can automatically identify other copies of the same illegal pictures.
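In effect, that fingerprinting system works like a lookup against a curated blocklist of hashes. Here is a minimal sketch in Python, assuming a plain cryptographic hash for simplicity; real systems use perceptual hashes (Microsoft’s PhotoDNA is the best-known example) so that resized or re-encoded copies of an image still match:

    import hashlib

    # Hypothetical blocklist: fingerprints of images that human
    # reviewers have already confirmed as illegal.
    KNOWN_ILLEGAL_FINGERPRINTS = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def fingerprint(image_bytes):
        # SHA-256 only matches byte-for-byte identical files; a
        # production system would use a perceptual hash instead.
        return hashlib.sha256(image_bytes).hexdigest()

    def is_known_illegal(image_bytes):
        # Automatic identification is then just a set lookup.
        return fingerprint(image_bytes) in KNOWN_ILLEGAL_FINGERPRINTS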
But paedophiles are increasingly filming their crimes, Schmidt said.
To address that source of child abuse imagery, Google is now testing new technology to identify such videos, and hopes to make it available to other internet companies and child safety organisations in the new year, he said.
But the work doesn’t stop at the open internet.
The YouTube announcement came the day before a Downing Street summit at which UK Prime Minister David Cameron was scheduled to announce that British and US law enforcement agencies will jointly target online child abuse by monitoring those who operate on the hidden internet.
A transatlantic taskforce will identify ways of targeting criminals and paedophiles who use secret encrypted networks to distribute abuse imagery.
Google and Microsoft also announced that “up to 100,000 search terms will now return no results that find illegal material”, the BBC reports.
Such searches will also trigger warnings that child abuse images are illegal.
Both companies have introduced new algorithms that will prevent Google Search and Microsoft’s Bing from delivering this type of illegal result.
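Conceptually, the search-side change is a simple gate in front of the index: a query matching the blocked-term list returns no results and a warning instead. A rough sketch in Python, with an entirely hypothetical term list and index interface (the real lists are curated with child-protection bodies and are not public):

    # Hypothetical blocked-term list; the real lists are private.
    BLOCKED_TERMS = {"example blocked query"}

    WARNING = "Child abuse images are illegal."

    def search(query, index):
        # Gate the query before it reaches the index: blocked terms
        # get an empty result list plus a warning message.
        if query.strip().lower() in BLOCKED_TERMS:
            return [], WARNING
        return index.lookup(query), None

    class DemoIndex:
        # Stand-in for a real search index.
        def lookup(self, query):
            return ["result for " + repr(query)]

    results, warning = search("weather today", DemoIndex())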
According to The Guardian, Google’s Schmidt announced that a team of 200 had worked to clean Google Search of search terms that can lead to child sexual abuse images.
The restrictions will be launched first in the UK, then introduced in other English-speaking countries and in 158 other languages over the next six months.
Google is also displaying warnings at the top of search results for 13,000 queries.
UK Prime Minister David Cameron welcomed the move.
In a speech in July, the PM had announced new measures to protect children and challenged outfits such as Google, Yahoo, and Microsoft to do their part by, for one thing, adopting a blacklist of “abhorrent” search queries that leave no doubt that a searcher’s intent is malevolent.
Google communications director Peter Barron said that the changes would make it “much, much more difficult to find this content online.”
More of what Barron said, from the BBC’s coverage:
We're agreed that child sexual imagery is a case apart, it's illegal everywhere in the world, there's a consensus on that. It's absolutely right that we identify this stuff, we remove it and we report it to the authorities.
Unfortunately, Google’s and Microsoft’s efforts to strip results from child abuse-related search terms are well-meaning but ultimately might amount, at best, to a waste of time, effort and money and, at worst, to censorship.
It is not on the open internet that paedophiles search for, and find, the images they’re after. Rather, it is on the so-called dark or hidden web where the trafficking in such images mostly occurs.
As pointed out in a recent University of Massachusetts Amherst research paper on measuring and analysing child abuse material on P2P networks, such networks are the most popular mechanism for acquiring and distributing such imagery.
It is here that such images are exchanged, mostly via P2P, largely in encrypted format.
A recent report from the Child Exploitation and Online Protection Centre (CEOP) in the UK backed this up:
The commercial distribution of IIOC [indecent images of children] on the open internet is estimated to account for a very small percentage of the transactions taking place. This low level is likely to be a result of the large volume of IIOC in free circulation, particularly over P2P, and widespread awareness of the traceability of conventional payment methods.
The tendency of paedophiles to use the dark web is increasing, according to the CEOP:
The use of the hidden internet by IIOC offenders remained a key threat during 2012 with the number of UK daily users connecting to it increasing by two-thirds during the year. This represents one of the largest annual increases globally, in a non-oppressive regime.
Technologies designed to scour the dark web for active paedophiles are likely to yield far better results than anything Google and Microsoft are doing with search on the open internet.
One such technology automatically searches P2P networks for query terms commonly associated with child abuse content.
A tool of this type was used to collect evidence against three US defendants, who tried to get the evidence dismissed, arguing that the automated computer search amounted to a warrantless search and thereby violated their Fourth Amendment rights against unreasonable search.
A US federal court rejected that claim last week, saying that once the alleged paedophiles had posted abuse images to a P2P network, they surrendered their rights to claim those images were private files.
It is here, in the dark web, that technological advances and court decisions such as last week’s stand the best chance of battling child abuse.
Paedophiles live in the dark web. That’s where the battle must be waged.
Image of camera lens courtesy of Shutterstock.
Paedophile, in the above article, is a misspelling of pedophile.
Thanks for pointing that out, but it’s the English spelling – paedophile.
The first part of the word derives from the Greek παῖς (pais = child); therefore it rightly has an “a” in it.
Sophos is based in the United Kingdom. In this context, British spelling is appropriate.
Making it almost impossible to find images of child abuse through the Google search engine is just a first step. But that will not stop the circulation of those images through the “dark” web.
The actual abuse of the children is the main crime, although circulation of the imagery may be part of the motive, especially when money changes hands.
David Cameron, on page 4 of today’s Daily Telegraph, compares the spies tackling online paedophiles to the Enigma codebreakers of World War Two. He wants to use GCHQ to “decrypt” illegal messages, in order to find and prosecute paedophiles making and sharing illegal images.
This noble objective overrides personal privacy concerns, in my opinion.
It is about time too. Why has it taken so long to get these big organisations together in aid of a common cause? Well done, Microsoft and Google.
Even if it is only a small percentage, any abuse material on the open internet is too much. Yes, law enforcement and others should focus on the dark net, but Google and Microsoft making an effort to reduce access to this material where they operate (the open net) is still extremely important. Describing it as a “waste of time, effort and money” seems odd to me. We should be applauding them for spending their own money and time on this, regardless of the effect it will have on the overall availability of such material.
I don’t see any wasted effort here by Google and/or Microsoft. Even though the bulk of the nasty stuff is on the dark web, the ones who seek it weren’t born knowing how to access it there. They had to have started out looking on the open web, and somehow made connections to learn where and how to find the real filth. Stopping their searches early on makes sense to me.
Year 2017… four years after this article, and Google search still shows images of nude children when I search porn-related subjects. It is not just a problem, it is a big problem. Can’t a big company like Google ban pictures like that? It still shows those pictures even when the user turns on the safe search option. As an adult, I feel like, what the hell is going on?! And it is not just Google; other search engines are the same, bloody useless.
Please report your findings to the Internet Watch Foundation https://report.iwf.org.uk/en
Hi, the pictures are not abuse as such, just nude photos, but I don’t think they are decent modelling pictures. Thanks for the link, but this website asks for child abuse pictures or videos to report. Shall I still go ahead and report? Thx