Good idea? Bad idea? Would you want one?
But as the momentum for self-driving cars grows, one question is getting little attention: should they even be legal? And if they are, how will the laws of driving have to adapt? All of our rules about driving, from who pays for a speeding ticket to who is liable for a crash, are based on having a human behind the wheel. That is going to have to change.
There are some compelling reasons to support self-driving cars. Regular cars are inefficient: the average commuter spends 250 hours a year behind the wheel. They are also dangerous: car crashes are the leading cause of death for Americans aged 4 to 34, and they cost some $300 billion a year. Google and other supporters believe that self-driven cars can make driving more efficient and safer by eliminating distracted driving and other human error. Google's self-driving cars have cameras on top to look around them and computers to do the driving. Their safety record is impressive so far. In the first 300,000 miles, Google reported that its cars had not had a single accident. Last August, one got into a minor fender-bender, but Google said it occurred while someone was driving it manually.
After heavy lobbying and campaign contributions, Google has persuaded California and Nevada to enact laws legalizing self-driven cars. The California law breezed through the state legislature (it passed 37-0 in the Senate and 74-2 in the Assembly), and other states could soon follow. The Alliance of Automobile Manufacturers, which represents big carmakers like GM and Toyota, opposed the California law, fearing it would make it too easy for carmakers and individuals to modify cars to self-drive without the careful protections built in by Google.
That is a reasonable concern. If we are going to have self-driven cars, the technical specifications should be quite precise. Just because your neighbor Jeb is able to jury-rig his car to drive itself using an old PC and some fishing tackle does not mean he should be allowed to.
As self-driven cars become more common, there will be a flood of new legal questions. If a self-driving car gets into an accident, the human who is "co-piloting" may not be fully at fault; he may even be an injured party. Whom should someone hit by a self-driving car be able to sue: the human in the self-driving car or the car's manufacturer? New laws will have to be written to sort all of this out.
How involved, and how careful, are we going to expect the human "co-pilot" to be? As a Stanford Law School report asks, "Must the 'drivers' remain vigilant, their hands on the wheel and their eyes on the road? If not, what are they allowed to do inside, or outside, the vehicle?" Can the human in the car drink? Text message? Read a book? Not surprisingly, the insurance industry is particularly concerned and would like things to move slowly. Insurance companies say all of the rules of car insurance may need to be rewritten, with less of the liability put on those operating cars and more on those who manufacture them.
At the signing ceremony for California's self-driving car law, Governor Jerry Brown was asked who is responsible when a self-driving car runs a red light. He answered: "I don't know; whoever owns the car, I would think. But we will work that out. That will be the easiest thing to work out." Google's Sergey Brin joked, "Self-driving cars don't run red lights."
Neither answer is sufficient. Self-driving cars should be legal, and they are likely to start showing up faster and in greater numbers than people expect. But if that is the case, we need to start thinking about the legal questions now. Given the high stakes involved in putting self-guided, self-propelled, high-speed vehicles on the road, "we will work that out" is not good enough.
Read more: Should Self-Driven Cars Be Legal? | TIME.com