The great-grandson of Henry Ford called on the auto industry and public institutions to address ethical issues emerging in a world where robot cars will make life-and-death decisions in roadway crashes – and to do it soon.
“These cars will have the ability to process data and make decisions much faster than we will as humans,” said Bill Ford, executive chairman of Ford Motor Co., which has promised to have robot taxis on the road by 2021. “No individual company is going to program these vehicles with a set of ethics that isn’t bought into by society at large.”
The discussion to set robot-car ethics must include the auto industry, government, universities and ethicists, said Ford, who commented to reporters Tuesday after a speech at the company’s headquarters in Dearborn, Michigan. With self-driving cars set to hit the road over the next five years, the need for this discussion is urgent, he said.
“How do you want these vehicles to behave?” Ford asked during his speech. “Whose lives are they going to save?”
Ford, 59, was among the first automaker executives to sound the alarm about the changes that would roil the industry as 60 percent of the world’s population migrates to large urban centers over the next 15 years, causing congestion, pollution and mobility challenges. Since warning of “global gridlock” in a 2011 TED talk, Ford has pushed his company to embrace new methods of mobility, including ride-sharing and driverless cars.
Chief Executive Officer Mark Fields said yesterday that the automaker will begin selling self-driving cars to consumers by 2025. The company also plans to offer robot taxis by 2021 and is working with global cities to ease congestion by offering bike sharing and commuter vans for ride-hailing.
Automakers and tech giants are bedeviled by the ethical challenges of programming robot cars to make life-or-death decisions. The promise of self-driving cars is that they’ll anticipate and avoid collisions, dramatically reducing deaths on U.S. highways, which rose 7.2 percent to 35,092 last year.
But accidents will still happen. And in those moments, a robot car may have to choose the lesser of two evils – for example, swerving onto a crowded sidewalk to avoid being rear-ended by a speeding truck or staying put and placing its occupants in mortal danger.
“There are a lot of ethical issues as a society that we have to work through before we have widespread adoption of autonomy,” Bill Ford said. “And that’s not getting talked about enough.”
The issue needs “very deep and meaningful conversations as a society,” he said. “Hopefully, you’ll get thoughtful people together to have this discourse. It has to happen.”