Back in July 2013, four computers were stolen from a large health care provider in Illinois, USA.
At first blush, it doesn’t sound like “Crime of the Century,” but according to reports, those missing computers have become a huge thorn in the side of Illinois-based Advocate Health Care.
That’s because the computers contained Personally Identifiable Information (PII) of patients going right back to the 1990s – four million of them, in fact.
The computers were password protected, whatever that means, but the data on their hard disks was not encrypted.
In theory, then, if you were to put the hard disks into another computer, or boot the “protected” computers from a CD or USB key, you would almost certainly be able to copy off any or all of those four million records.
The stolen data is said to have contained at least names, addresses, dates of birth and Social Security numbers (SSNs).
SSNs are the closest thing that the US has to a national identity number, giving them an influence in identity and identification that they don’t really deserve.
With your address, date of birth and SSN, an identity crook has a pretty good shot at committing fraud in your name.
So, Advocate has apparently already been hit with the expense (and hassle) of contacting the affected patients, and of offering them a year of free credit monitoring.
Credit monitoring services aim to keep their eye on financial transactions carried out in your name, helping you to spot fraudulent activity on your existing accounts, as well as attempts to open new accounts that you might otherwise know nothing about.
Now, things have just got a whole lot more onerous, with the filing of a class action suit that could end up pitting millions of individuals against Advocate in court:
This is a consumer class action lawsuit brought by Plaintiffs, individually and on behalf of all other similarly situated persons (i.e. the Class Members), whose unencrypted personally identifiable information and personal health information — names, addresses, dates of birth, Social Security numbers, treating physician and/or departments for each individual, their medical diagnoses, medical record numbers, medical service codes, and health insurance information (collectively referred to as "PII/PHI") — entrusted to Advocate was stolen by a thief or thieves while in the possession, custody, and control of Advocate.
(You have to love lawyerly English. Why not use three words when none would have done? The data wasn’t just stolen from Advocate, it was stolen from the company’s possession, custody and control.)
Class actions of this sort can end up expensive for the defendant (and lucrative for the lawyers, I must add, which may help to explain their propensity for pleonasm).
Facebook, for example, recently paid out a settlement for attaching its users’ names and photos to online ads without permission; the bill for that, which involved just over 600,000 eligible claimants, came to $20 million.
The chief lawyer of the company that has taken on the class action against Advocate said:
In this age of advanced technology, Advocate had to realize that its unorthodox methodology for maintaining important and private data posed a risk to the safety and security of their patients.
I don’t mean to excuse Advocate’s lapse, and I don’t disagree that the company should have realised the risk it was taking, but (for all the wrong reasons) I’m not so sure about the word “unorthodox.”
In my experience, encryption is still a technique more honoured in the breach than in the observance, with an awful lot of the world’s PII stored in plaintext.
At the end of 2011, for example, we bought a stash of USB keys from an Australian train company’s lost property auction, interested to see what we might find.
We ended up with 50 USB keys containing 4443 directly readable files, ranging from movies and images, through tax records and software source code, to the minutes of an activists’ meeting.
The number of encrypted files we found?
Zero.
We need to change the world so that storing data unencrypted really is unorthodox.
Image of crook half-inching laptop courtesy of Shutterstock.
The encryption would be good – I'm a huge proponent of FDE – but the real problem is that PII/PHI was on the endpoints to begin with.
I'm at a loss to think of any legitimate reason why the data would need to be downloaded. Patient record maintenance should be handled by data entry into the back-end servers. Ditto any mailings (electronic or postal), to ensure proper case records are kept.
And even if there is a reason to download the data, that reason won't include pulling the SSN. They already have a patient number for uniqueness/tracking.
My wife has been treated at Advocate; I'm waiting to see if we'll get a letter.
Apart from the alliterative entertainment inherent in the phrase "propensity for pleonasm" (which perfectly characterizes lawyerly linguistics), this article completely nails the problem in defining the LACK of encryption as the orthodox condition.
Here's what I mean. How many of the people who work for Advocate — from the CEO down to the hourly paid clerks at the bottom of the pay scale — actually encrypt their own electronic correspondence? I'd feel safe in betting that it's well below 1%. For that matter, how many of the people reading this post actually use encryption? It would make an interesting poll.
I don't understand it. To me, it's like, "Well OF COURSE you'd want your communications and information to be secure. Who wouldn't?" That's the problem; most everyone wouldn't. More precisely, they don't care. It's completely off their radar. The people at Advocate don't think about it on the job because they don't think about it even for themselves. The people at banks, and hospitals, and accounting firms don't think about it. Almost no one thinks about it. The ignorance of good security practices is worse than epidemic; it's ubiquitous. It's a cultural deficiency.
Nothing short of a massive education program will fix the deficiency. People have to WANT to secure their information and communications. Who will educate them? The ISPs seem to be the most logical choice, but they're too blind. They can't even see how encryption (and better yet, identity-trusted signatures) could be a part of the solution to the massive amounts of spam and phish-mail choking their bandwidth and filling their servers. They could be a part of the solution. But they're not even thinking about it.
Here's how bad the problem is: I don't even have a way of corresponding securely with the folks at Sophos…at least, none of the messages I've ever received from Sophos was signed with a public encryption key. And Sophos is among the best of the best. They're certainly doing more than most to educate folks about security. But alas, there’s still a very long way to go.
Working at a large financial institution, which required you to change your password every thirty days, I found that:
I could get into almost any account: if the name was Fred, they would use FredNov, then FredDec, and so on. The machine only checked passwords against the last year's, so every year they knew what their password would be; sometimes it was even easier, such as Fred11 or Fred12. So forcing password changes does not seem to work very well. Maybe it would if their job depended on it, but then someone would have to be looking at the passwords, and ones like "This job sucks" wouldn't be appropriate.
Most of these machines were not accessible from outside without something like a hardware token: a card with an internal timebase that generated a number synced with the mainframe. But even then, if you got hold of a card and could get in, I don't know how much damage you could do.
Security is a problem, and until it hits us, the people, we probably won't do anything about it. I'm relatively lax about it myself, even though I know the risks, but I also don't keep that kind of important information on my machines or phones. If I lost my iPhone, my biggest concern would be restoring a new one, not what I'd lost, since I keep numbers and passwords elsewhere.
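The rotation pattern described in the comment above is trivially enumerable: an attacker who knows the naming habit needs only a dozen or so guesses per account per year. A quick sketch (the name and years are illustrative, not from the original story):

```go
package main

import "fmt"

// monthlyGuesses builds the candidate list an attacker would try against a
// 30-day rotation policy: the user's name plus each month abbreviation
// (FredJan, FredFeb, ...), plus name-plus-two-digit-year variants (Fred11).
func monthlyGuesses(name string, years []int) []string {
	months := []string{"Jan", "Feb", "Mar", "Apr", "May", "Jun",
		"Jul", "Aug", "Sep", "Oct", "Nov", "Dec"}
	var guesses []string
	for _, m := range months {
		guesses = append(guesses, name+m)
	}
	for _, y := range years {
		guesses = append(guesses, fmt.Sprintf("%s%02d", name, y%100))
	}
	return guesses
}

func main() {
	// 14 candidates cover a whole year of "rotated" passwords.
	for _, g := range monthlyGuesses("Fred", []int{2011, 2012}) {
		fmt.Println(g)
	}
}
```

Fourteen guesses per account is well within anyone's reach, which is the commenter's point: forced rotation without complexity and history checks mostly rearranges the same password.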
Are the lawyers going to walk off with $50 billion while the people get 5 bucks each for their efforts? At the very least, medical organisations should know to encrypt.
They are offering only a year of credit monitoring. It should be more like 10 years.