Congress looks to take the wheel on autonomous vehicles

What is the US Congress doing to enable the upsides of the coming autonomous vehicle (AV) world and to protect everybody from the downsides?

Well, they’re discussing it. In some cases they’ve proposed legislation about it. What will actually get done and when is not yet clear.

What is clear is that so far, while cybersecurity and privacy are components of both discussion and legislation, the language is a long way from airtight, which could be a problem.

Autonomous vehicles, otherwise known as self-driving cars, are automatic targets. Given the massive amount of data collection and connectivity necessary to make such a system function, how could they not be?

The gleam in the AV industry’s eye is for hundreds of millions of “devices” to be collecting and sharing data – through V2V (vehicle to vehicle) communication – that will identify drivers and perhaps their passengers, track their location, speed, driving habits and more.

Besides the obvious privacy implications, that offers opportunities for hackers to take control of critical systems – brakes, steering, accelerator, locks and more – or to demand a ransom to leave them alone. In short, AVs are a “target-rich environment”.

And, of course, multiple giants of both the auto industry and the internet – Ford, GM, Toyota, Google, Apple, Tesla, Uber, Lyft and more – are racing to get their models on the road.

So what is Congress doing?

The House Energy and Commerce Committee, in a rare display of bipartisanship late last week, unanimously approved the SELF DRIVE Act, which contains sections on both cybersecurity and privacy. The bill will now move to a vote in the full House.

Among other things, it would require manufacturers of any “highly automated vehicles” to have a cybersecurity plan that includes “a process for identifying, assessing, and mitigating reasonably foreseeable vulnerabilities from cyber attacks or unauthorized intrusions, including false and spurious messages and malicious vehicle control commands.”

Its privacy provisions closely track the “Privacy Principles” issued in 2014 by the Alliance of Automobile Manufacturers and the Association of Global Automakers, which call for vehicle owners to be given “clear, meaningful notice” about the collection and use of driver data and “certain choices” about how it is collected, used and shared, along with other provisions about data security, minimization, de-identification and retention.

And Senator Edward Markey (D-Mass.) introduced a bill in March titled the Security and Privacy in Your Car (SPY Car) Act that would direct the National Highway Traffic Safety Administration (NHTSA) to “conduct a rulemaking” to protect against “unauthorized access” to the vehicle’s electronic controls or driver data.

Elsewhere in the Senate, autonomous vehicle cybersecurity got some lip service prior to a hearing titled “Paving the Way for Self-Driving Vehicles” before the Senate Committee on Commerce, Science, and Transportation about six weeks ago.

Committee chairman Senator John Thune (R-S.D.), along with ranking minority member Bill Nelson (D-Fla.) and Gary Peters (D-Mich.), issued a list of bipartisan “principles”, the last of which declared that “cybersecurity should be a top priority for manufacturers of self-driving vehicles and it must be an integral feature of self-driving vehicles from the very beginning of their development”.

But Thune, in his opening statement at the hearing, didn’t even mention cybersecurity or privacy, and the witnesses didn’t include a cybersecurity expert or a privacy advocate.

The only witness who even brought those topics up was John M Maddox, president and CEO of the American Center for Mobility.

There is no word yet on when the committee will file legislation – Thune’s office had not responded to questions at the time of this post.

But, as numerous experts note, whatever the intent, legislative language frequently leaves a lot of, uh, wiggle room.

Who is going to define “reasonably foreseeable vulnerabilities” mentioned in the SELF DRIVE Act; or “reasonable measures to protect against hacking attacks” in the SPY Car Act?

Lee Tien, senior staff attorney at the Electronic Frontier Foundation, said “reasonable” can depend on what levels of security and privacy are “practically possible”.

“Cars have lots of parts,” he said. “They’re not all made by Ford or GM or whomever. There’s a lot of assembling of parts – hardware, software, firmware – made by other companies. Who has vetted those parts and their code? Who knows, at a deep level, what that code does? How much modeling of how the systems work together has been done?”

And regarding privacy, he noted that the Markey bill has a huge exception for “driving data stored as part of the EDR [event data recorder] system or other safety systems onboard … that are required for post-incident investigations, emissions history checks, crash avoidance or mitigation, or other regulatory compliance programs”.

“That could swallow the rule, frankly,” he said.

Some experts say Congress should stay out of it – that this is a problem for the private sector to solve.

Gary McGraw, vice-president of security technology at Synopsys, contends that “the government won’t ever figure it out. They can’t figure out health care, so why would we think they can figure this out?”

He said manufacturers are “paying a lot more attention to it. They’re working on it.” And he said he has some faith in the market as well. “This [security] could be a real differentiator for customers,” he said.

Indeed, security appears to be more of a “top priority” for the industry than it does for Congress.