Man sues Uber after privacy flaws ‘led to his divorce’

Ridesharing firm Uber is going to be spending some time in court. A French businessman based on the Côte d’Azur has sued the company for divulging private ridesharing information that led his wife to divorce him.

The plaintiff said in his claim that his wife was notified of his Uber trips whenever he took a ride, even though he had logged out of his account after using it on her smartphone. Ongoing notifications from Uber showed her his pickup points, his destinations, and when each ride took place. According to reports, she left him after suspecting, based on his ride data, that he was having an affair.

Let’s get the obvious question out of the way first. Does it matter if the Frenchman was cheating on his wife? Did he deserve privacy, or did he get the privacy he deserved?

The answer is that he deserved complete privacy. Privacy can’t be linked to moral judgements, because they change with the times. Yesterday’s crimes are today’s everyday, perfectly acceptable habits. Similarly, you may be doing something considered appropriate today, only for a future government to decide that it is immoral and therefore illegal. Privacy as a concept must stand above such things, otherwise it can’t be guaranteed, and therefore isn’t privacy.

So, what happened to land him in hot water?

The problem reportedly stemmed from a bug in Uber’s iPhone app. Once an account had been signed in on the phone, the digital authorization token used to notify the device about trip details doesn’t seem to have been revoked, meaning that the phone would continue to receive notifications even after the account was logged out.
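The fix is conceptually simple: treat logout as a revocation event for any tokens tied to that device. As a minimal sketch (all names here are hypothetical, not Uber’s actual code), a server-side registry of device push tokens might look like this:

```python
# Hypothetical sketch: a server-side registry mapping accounts to the
# device push tokens allowed to receive trip notifications. The reported
# bug amounts to skipping the revoke step at logout.

class PushTokenRegistry:
    """Tracks which device tokens may receive a user's trip updates."""

    def __init__(self):
        self._tokens = {}  # user_id -> set of device push tokens

    def register(self, user_id, device_token):
        # Called at login: the device may now receive notifications.
        self._tokens.setdefault(user_id, set()).add(device_token)

    def revoke(self, user_id, device_token):
        # Called at logout. Omitting this is the failure mode described
        # above: the old device keeps receiving trip notifications.
        self._tokens.get(user_id, set()).discard(device_token)

    def recipients(self, user_id):
        # The set of devices that would be notified about a trip.
        return set(self._tokens.get(user_id, set()))


def logout(registry, user_id, device_token):
    # Logout must clear server-side state, not just the local session.
    registry.revoke(user_id, device_token)
```

In this sketch, logging out on someone else’s phone removes that phone from the notification list; if only the local session is cleared, the server happily keeps pushing trip details to it.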

The bug was reportedly fixed after December 16 last year, but that doesn’t help the now-divorced user.

A history of privacy problems at Uber

None of this helps Uber’s reputation for poor privacy practices. In December, chief security officer John Flynn sent a letter to staff reminding them of their privacy obligations and warning them to follow its rules about access to user data.

The letter referenced a feature article from the Centre For Investigative Reporting, which alleged poor privacy practices at Uber, even though the company had promised in the past to clean up its act. The CFIR report was based heavily on another lawsuit against the company, this time from a former employee, Ward Spangenberg. The former Uber forensic investigator is suing the company for discrimination after it fired him; he had been pushing for data privacy reform within the business for months beforehand.

In the CFIR report, others joined Spangenberg in alleging weak protections for private data at Uber. Employees could look up rider data with little justification, said former workers. Uber has admitted that it fired employees for improper access.

In the past, journalists have reported that Uber tracked their movements without permission, and added that executives had suggested digging up dirt on the press. This led to an investigation by Uber of its top executive in New York. The Electronic Privacy Information Center complained to the New York attorney-general about that infraction, along with another relating to the breach of 50,000 Uber drivers’ data. The result was a state investigation and a January 2016 settlement that required Uber to improve its data security.

Some of Uber’s past privacy infractions have been jaw-dropping. Consider Rides of Glory, its data analysis of morning rideshares, designed to highlight who was on their way home from a one-night stand. It took down the original blogpost after an outcry, but there’s a copy here.

Normally you could pooh-pooh concerns over such stunts by claiming that the data was anonymized, but we know for certain that Uber employees had access to comprehensive data on each ride sharer on a per-ride basis when this blog post was written. As Uber says at the end of the post: “You people are fascinating.” Feeling creeped out yet?

EPIC also warned in mid-2015 that revised privacy terms and conditions would enable Uber to collect location data even after riders had finished their trips, via its app running in the background on their phones. Uber said at the time that it might add such features in the future. EPIC has since accused the company of doing just that.

We asked Uber about this, and it confirmed that it had followed through on these plans in fall last year.

Specifically, for people who choose to use location services with the Uber app, we are only collecting their location from the time they request a trip until five minutes after the trip has finished.  This helps us improve ETAs, pick-ups, efficiency on Pool, and passenger safety. The collection and use of this data is explicitly communicated to users when they download the latest version of the app as well as in the device-level permissions on your phone, which you can change at any time.

The company emphasises the idea of always putting the user in control. Uber said:
Additionally, you can still use the Uber app even without location services. This option is available for anyone who prefers to manually type in an address instead of using location services.

Zero-sum games

These stories highlight a key point about modern mobile and internet-based services: there’s a trade-off between convenience and privacy. Uber, like many apps, puts highly useful services at your fingertips. It’s ridiculously convenient, and the convenience is tied to you leaving services such as location tracking on.

Users who want the convenience of such services should pay close attention to their privacy terms and conditions. That’s harder than it sounds in a world where attention is limited, and where people don’t always understand the legalese in such agreements.

Is there a way to have both convenience and privacy? Privacy by design, a set of seven design principles developed in Ontario in the mid-1990s, promises to help solve these problems. Its author, then-provincial privacy commissioner Ann Cavoukian, argues that it doesn’t need to be a trade-off; that privacy and security needn’t be a zero-sum game. If we can design data architectures and applications to be privacy-conscious from the outset, the theory goes, then we can have our cake and eat it. Hint: that probably means not having a God Mode in your system.

Privacy by design is mandated in the forthcoming General Data Protection Regulation (GDPR), which takes effect in Europe in May 2018. Based on it, we could make the case for an easily digestible litmus test for privacy – a kind of gold, silver, bronze affair that could be used to rate an organization’s privacy practices.

If independently verified by an authorized external (and ideally federal) agency, such a rating might go some way towards raising awareness of an organization’s privacy stance among customers, many of whom might not be aware that privacy is an issue at all.

Beyond that, though, there are bigger problems. Such principles are only as good as the measures taken to monitor and enforce them, and rooting out potential privacy issues is a difficult process. Sometimes they only come out by accident, as our French businessman will tell you.

Even if a company does have the best of intentions for your data, there are flawed internal controls and rogue staff to contend with. There are buggy programs that reveal your business to other people – and governments – by accident. What happens to those good intentions then?