Most of us likely wouldn’t want Apple to store a copy of our DNA or our fingerprints, but that’s pretty much what it’s doing with another one of our biometric identifiers: namely, our voices.
On Thursday, David Talbot, writing for MIT’s Technology Review, reported that researchers are concerned that Apple’s digital assistant Siri is taking far too intimate an imprint of our biometrics and storing far too much of that data on Apple servers.
Voice recordings of users asking questions – which can be personal and/or revealing – are transmitted over the network and stored on Apple’s servers.
As Talbot wrote, the data contained in those voice recordings differs from other data pumped out by smartphones and computers in that it’s distinct for each individual.
Thus, if the transmission were hacked or subpoenaed, or if a disgruntled employee got hold of it, a given recording could be linked to an individual, aided by increasingly sophisticated voice recognition technologies.
(How sophisticated has voice recognition gotten, you may ask? Recent research has shown it’s possible to detect Parkinson’s symptoms by using algorithms to detect changes in voice recordings.)
Researchers are suggesting that voiceprints could be kept more private by having part of the data processing carried out right on the phone.
Prem Natarajan, executive vice president at Raytheon BBN Technologies, told Technology Review that one approach to better privacy would be to keep anything that identifies a user on the phone itself, with Apple transmitting derived speech features rather than the actual speech.
Stripped-down features would be harder to link to individuals, he said.
Granted, this approach could strain the phone’s processor and battery, but speech recognition wouldn’t be stymied, Natarajan said:
"I think it is safe to say that not having access to the [full voice] signal does not impose any meaningful penalty."
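To make the idea concrete, here's a minimal sketch of what on-device feature extraction could look like. This is purely illustrative – it is not Apple's or BBN's actual pipeline, and real systems use richer acoustic features (such as cepstral coefficients) – but the principle is the same: the phone reduces the waveform to a compact feature matrix, and only that, never the raw recording, would leave the device.

```python
import numpy as np

def extract_features(signal, frame_len=400, hop=160):
    """Compute simple per-frame features (log energy, zero-crossing rate).

    Illustrative only: real recognizers send richer features, but the
    privacy idea is identical -- the raw waveform stays on the phone.
    """
    feats = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        energy = np.log(np.sum(frame ** 2) + 1e-10)   # frame log-energy
        zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2  # zero-crossing rate
        feats.append((energy, zcr))
    return np.array(feats)

# Synthetic one-second "utterance" at 16 kHz, standing in for a real recording
rng = np.random.default_rng(0)
signal = rng.standard_normal(16000)

features = extract_features(signal)
# Only this small feature matrix would be transmitted to the server,
# not the 16,000-sample waveform itself.
print(signal.size, features.shape)
```

The trade-off Natarajan describes is visible even in this toy version: the phone spends CPU cycles on the framing and feature math, but the server never sees a signal it could replay or voiceprint-match against an individual.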
This approach has already been implemented on Microsoft’s Kinect, for one, Talbot notes.
Back in 2010, Microsoft assured customers that its new device would keep all its data to itself instead of sending it to Microsoft.
That must have been comforting to consumers, given the advanced sensor array on an Internet-connected device capable of capturing voice, recognizing faces and tracking body movement, plunked down in the middle of millions of living rooms across the land.
Researchers aren’t the only ones concerned about Siri’s privacy implications.
As Technology Review reported last month, IBM has silenced Siri on employees’ iPhones.
The Siri lockdown is part of a broader move to close the Pandora’s box of security vulnerabilities opened when IBM adopted a bring your own device (BYOD) policy in 2010.
Rather than saving IBM the money it would have spent buying BlackBerrys for its staff, the move has actually cost the company, IBM CIO Jeanette Horan told Technology Review.
The problems come from hundreds of employees who were “blissfully unaware” that they were using devices stuffed with popular apps that could carry security risks, all beyond IBM’s control, Horan said:
"We found a tremendous lack of awareness as to what constitutes a risk. … [So now] we're trying to make people aware."
Since rolling out the BYOD policy, IBM has set guidelines on which apps employees can use and which they should avoid.
Here are some more of IBM’s moves to tighten security around employee devices, according to Technology Review:
- Public file-transfer services such as Dropbox, which Horan said could allow confidential information to leak out, have been banned.
- The company is working to increase awareness about forwarding IBM email to public Web mail services, which violates established protocol.
- IBM’s also working to increase awareness about the dangers of using smartphones to create open WiFi hotspots, which can enable snoops to intercept data.
- The IT department is configuring every device before it’s allowed to access IBM networks, so that it can be remotely wiped in case of loss or theft.
- IT also disables public file-transfer programs such as Apple’s iCloud on devices, swapping them for an IBM-hosted version called MyMobileHub.
That sounds like a good security template for any organization dealing with BYOD.
Let’s hope that if enough large organizations ban Siri, and if enough researchers outline more-secure ways for Siri to process our biometric voiceprints, Apple will act to zip Siri’s lip a bit more tightly.
3 comments on “Apple’s Siri voiceprints raise privacy concerns”
Was this article about Siri or BYOD?
BYOD and the security challenges it presents to organisations like IBM is one aspect of the privacy/security concerns of Siri. Sorry if I didn't make that clear…
Don't Google and Skype already do this as well?