Dark web is mostly illegal, say researchers

Law enforcement and governments are demanding the ability to decrypt communications.

They want backdoors that can get past encryption, and they’re trying to legislate them into existence. For example, in the US, New York and California recently proposed bills that would require either backdoors to bypass phone encryption or fines of $2,500 per unit sold in the respective states.

Encryption is often, as Motherboard’s Joseph Cox succinctly puts it, a polemic issue: an argument that can boil down to a simplistic battle between the advocates of innocent individuals’ privacy rights (and of security that isn’t weakened via backdoors) vs. those who point to encryption that shields criminals.

This debate – a renewal of the decades-old Crypto Wars – needs more nuance, according to two researchers from King’s College London.

Specifically, it needs to be fleshed out with data showing, as accurately as possible, what encryption is actually hiding, be it journalists protecting their sources, political dissidents, pedophiles selling photos of their own children, or murderers for hire.

To get at that nuance, researchers Daniel Moore and Thomas Rid have carried out an in-depth scan of hidden-services websites within the Tor network.

Hidden services, also known as dark web sites or .onion sites, use the sophisticated, multilayered encryption of the Tor network to hide themselves and the source of their traffic.


The researchers’ conclusion: dark web sites are, in fact, most commonly used for crime.

From their paper, Cryptopolitik and the Darknet, which explores the relationship between privacy and security:

The results suggest that the most common uses for websites on Tor hidden services are criminal, including drugs, illicit finance and pornography involving violence, children and animals.

To conduct their research, the pair scraped Tor hidden services with a website crawler that automatically hopped from site to site via links, the same as a human site visitor would do.

They had manually come up with a taxonomy for classifying websites into categories that included, among other things: drugs; extremism (including militant how-to guides and support for terrorist violence); finance (including money laundering, counterfeiting, and trade in stolen credit cards or accounts); hacking (including distribution of malware); and "illegitimate" porn (i.e., porn that's nonconsensual or that involves children, violence, or animals).

Over the course of five weeks, Moore and Rid found 5,205 live websites, 2,723 of which they were able to classify by such content categories.

Of those, some 57% – 1,547 – hosted illicit material.

The researchers point out that there’s no way to index every existing hidden site: it’s a shifting landscape, where sites constantly pop up, only to quickly disappear as they switch addresses and server locations without notice.

To make it more difficult still, Tor addresses are just long strings of characters with a .onion domain at the end.

In order to get a proper sample of all the hidden services on the dark web, the pair built a Python script that crawled the dark web, starting with the popular dark web search engines Onion City and Ahmia.

Moore and Rid said that their methodology, though it’s bound to have missed a “small number” of closely held sites, manages to map a number of sites that roughly matches the estimated number of Tor websites at any given moment.

This is not a full survey: the Tor Project has estimated that some 30,000 hidden services are active and announce themselves to the Tor network every day.

The crawler scraped the content from each page and uploaded it for analysis. It did the same when it found a link: it hopped to the linked site and scraped it, too.
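The scrape-and-hop loop described above boils down to a breadth-first crawl: fetch a page, pull out any .onion links, and queue them up. A minimal sketch might look like the following. This is an illustration only, not the researchers' actual code: the function and variable names are invented, and real crawling of hidden services would require routing requests through Tor, which is abstracted here as a `fetch` callback.

```python
import re
from collections import deque

# Classic v2 onion addresses: 16 base32 characters followed by ".onion".
ONION_LINK = re.compile(r'https?://[a-z2-7]{16}\.onion\S*')

def crawl(seeds, fetch, max_pages=100):
    """Breadth-first crawl over hidden services.

    `fetch(url)` returns a page's HTML, or None on failure; in a real
    crawler it would route the request through a Tor proxy.
    Returns a dict mapping each hidden service visited to its scraped HTML.
    """
    seen, queue, pages = set(), deque(seeds), {}
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        # Visit each hidden service once, keyed by its host part.
        site = url.split('/')[2] if '//' in url else url
        if site in seen:
            continue
        seen.add(site)
        html = fetch(url)
        if html is None:
            continue
        pages[site] = html
        # Hop to every linked .onion site, just as a human visitor would.
        for link in ONION_LINK.findall(html):
            queue.append(link)
    return pages
```

Seeding the queue with pages from search engines such as Onion City and Ahmia, as the article describes, would then let the crawl fan out from well-linked sites.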

Next, an algorithm processed the collected content and automatically sorted it into the categories that Moore and Rid had set up. They spot-checked the automatic categorizing and found it to be, overall, very accurate.
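The article doesn't detail how the sorting algorithm worked, so purely as an illustration, here is a naive keyword-scoring stand-in: count how many category keywords appear in a page's text and pick the best-scoring category. The keyword lists below are hypothetical, and the researchers' real classifier was certainly more sophisticated than this.

```python
# Hypothetical keyword lists for a few of the taxonomy's categories.
CATEGORY_KEYWORDS = {
    "drugs":   {"cannabis", "mdma", "pharmacy"},
    "finance": {"counterfeit", "carding", "laundering"},
    "hacking": {"malware", "exploit", "botnet"},
}

def classify(text):
    """Score each category by keyword hits in the page text.

    Returns the best-matching category, or None when no keyword
    matches (i.e., the page can't be classified this way).
    """
    words = set(text.lower().split())
    scores = {cat: len(words & kws) for cat, kws in CATEGORY_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None
```

The spot-checking step the article mentions corresponds to manually reviewing a sample of these automatic labels against the actual pages.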

What the researchers captured was a representative sample, they said. Moore told Motherboard that it’s similar to what Tor users encounter:

We don’t make any statements about the entire contents of Tor. We just looked at what is the reasonable offering of hidden services to most users.

We went for what a user can actually see and interact with.

When Motherboard asked The Tor Project to comment on Moore and Rid’s methodology, the nonprofit behind the Tor network declined.

But Kate Krauss, a spokesperson for the project, told Motherboard in an email that the pair seem to have overreached their research:

The researchers seem to make conclusory statements about the value of onion services that lie outside the scope of their research results.

Onion services are a tool with unique security properties used for a wide range of purposes: They are self authenticated, end-to-end encrypted, and offer NAT punching and the advantage of a limited surface area.

Moore and Rid conclude their essay by saying that hidden services have already damaged Tor, as well as trust in the internet as a whole, with the political risk they bring to cryptography.

Should Tor do away with hidden services?

It’s not like hidden services haven’t been considered for the chopping block already.

In 2014, an anonymous user asked Roger Dingledine, one of the original developers of Tor, “Why not scrap hidden services?”

His answer:

We do think about that option periodically.

From the researchers’ conclusion:

Tor’s ugly example should loom large in technology debates. Refusing to confront tough, inevitable political choices is simply irresponsible. The line between utopia and dystopia can be disturbingly thin.

Users, what’s your take: are hidden services worth the political firestorm they generate? Are they worth criminals escaping justice?

Please let us know your thoughts below.

Image of Dark Web courtesy of Shutterstock.com