But there are other questions to be sorted out as well, like what happens to cabdrivers and whether such vehicles will create sprawl.
And there is an existential question:
Who dies when the car is forced into a no-win situation?
"There will be crashes," said Van Lindberg, an attorney in the Dykema law firm's San Antonio office who specializes in autonomous vehicle issues. "Unusual things will happen. Trees will fall. Animals, kids will dart out." Even as self-driving cars save thousands of lives, he said, "anyone who gets the short end of that stick is going to be pretty unhappy about it."
Few people seem to be in a hurry to take on these questions, at least publicly.
The question goes unaddressed, for example, in legislation moving through Congress that could result in tens of thousands of autonomous vehicles being put on the roads. In new guidance for automakers from the U.S. Department of Transportation, it is consigned to a footnote that says only that ethical considerations are "important" and links to a brief acknowledgment that "no consensus around acceptable ethical decision-making" has been reached.
There is evidence that people are worried about the choices self-driving cars will be programmed to make.
Last year, for example, a Daimler executive was quoted as saying its autonomous vehicles would prioritize the lives of their passengers over anyone outside the car. The company later said he had been misquoted, because it would be illegal "to make a decision in favor of one person and against another."
Last month, Sebastian Thrun, who founded Google's self-driving car initiative, told Bloomberg that the cars will be designed to avoid accidents, but that "if it happens where there is a situation where a car couldn't escape, it'll go for the smaller thing."
But what if the smaller thing is a child?