The 10 Dangers of Self-Driving Cars & How Engineers Are Solving Them
Designers, engineers, investors, and technology experts all agree that self-driving cars are the next step in the natural progression of automotive technology toward better, safer vehicles.
That doesn’t mean, however, that the transition will be seamless. Here are the ten challenges that builders of self-driving cars are still grappling with:
- Cars are weapons. Whether used deliberately, as in the Bastille Day truck attack in Nice, France, or accidentally, cars can kill. While the number of fatal crashes per mile driven in the United States has fallen steadily since the early 1900s, road traffic accidents still claim around 30,000 lives each year. That cars can be so dangerous is the central problem driving self-driving car development.
- Other cars aren’t automated. Ideally, self-driving cars will eventually sync with one another to coordinate maneuvers like merging onto a busy freeway and to redirect traffic patterns to minimize congestion. The problem right now is that almost all of the other cars on the road aren’t self-driving, so they behave in ways self-driving cars can’t predict. Programmers must, for example, “teach” self-driving cars how to anticipate and react to being suddenly cut off by another driver.
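To make the “teaching” concrete, here is a deliberately simplified sketch of one way a reaction rule for a cut-off could be encoded. The function name, thresholds, and the naive time-to-collision formula are all invented for illustration, not drawn from any real vehicle stack:

```python
# Hypothetical sketch of a "cut-off" reaction rule: if a vehicle merges
# into our lane with too small a gap, brake in proportion to how soon a
# collision would occur. The physics and thresholds are simplified.

def reaction(gap_m, closing_speed_mps, min_time_to_collision_s=2.0):
    if closing_speed_mps <= 0:
        return "maintain"            # the gap is opening; nothing to do
    ttc = gap_m / closing_speed_mps  # naive time-to-collision estimate
    if ttc < min_time_to_collision_s / 2:
        return "brake_hard"
    if ttc < min_time_to_collision_s:
        return "brake"
    return "maintain"

print(reaction(gap_m=30, closing_speed_mps=5))  # ttc = 6.0 -> maintain
print(reaction(gap_m=8, closing_speed_mps=5))   # ttc = 1.6 -> brake
print(reaction(gap_m=4, closing_speed_mps=5))   # ttc = 0.8 -> brake_hard
```

Real systems must also handle the harder part this sketch skips: predicting that the cut-off is about to happen before the gap closes.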
- Self-driving cars don’t listen to the police. They don’t really get the concept of “the police.” According to a piece in the MIT Technology Review, self-driving cars still can’t tell a police officer redirecting traffic from any other pedestrian on the road. Although a comical photo of a Google self-driving car prototype getting pulled over by police circulated on the internet in 2015, engineers are still figuring out how to get their cars to respond appropriately to police and emergency vehicles.
- The future has no room for crumbling infrastructure. Potholes? Self-driving cars can’t handle them. Puddles? Self-driving cars will plow right through them – pedestrians beware. In the MIT Technology Review, Lee Gomes writes, “The car’s sensors can’t tell if a road obstacle is a rock or a crumpled piece of paper, so the car will try to drive around either. Urmson [director of the Google car team] also says the car can’t detect potholes or spot an uncovered manhole if it isn’t coned off.”
- When the maps aren’t accurate, self-driving cars are lost. Self-driving cars rely more heavily on map data than on sensors that read the immediate environment. Where there is no map, there is no way. A car using an inaccurate or outdated map will make mistakes, so verifying map accuracy is crucial for developers. Off-roading, for now, is out of the question.
- Self-driving cars can’t adapt to road and weather conditions. Extreme weather – and some not-so-extreme weather, like a light snowfall that obscures the centerline on a road – throws self-driving cars for a loop. Engineers are still working on sensors that can figure out where the road is, and how to drive on it, in climates with snow and ice.
- Construction confuses self-driving cars. Just as self-driving cars can’t recognize police officers, they can’t recognize construction workers. They may be able to read the stop sign a construction worker is holding, but they can’t navigate an unfinished road or take detours.
- Battery charge is key. The battery-charge challenge is by no means limited to self-driving cars: manually driven electric cars need charging too, and even conventional cars need battery charges or replacements now and then. In February, Google revealed that it was testing ways to charge its self-driving prototypes wirelessly. Google proposes that, eventually, wireless chargers could be embedded in roads to charge self-driving cars while they operate.
- System failures leave cars driverless, but not without passengers. It’s the stuff of my recurrent nightmares: a car that’s out of control, that doesn’t respond the way I expect to my attempts to drive it, that has taken its passenger – me – hostage. Because electronics control essentially everything in self-driving cars, there is a higher potential for complete system failure. Google promised us cars without steering wheels, but all smart cars should have some way to manually override the system in an emergency. According to the UK Code of Practice for self-driving cars, for example, any car tested on public roads must have a manual override option.
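A manual override like the one the UK Code of Practice requires can be pictured as a simple fallback rule in the control loop. The sketch below is hypothetical: the class, the fault signals, and the mode names are invented for illustration, and real vehicle software is vastly more involved:

```python
# Hypothetical sketch: a control loop that hands control back to the human
# whenever the autonomous system reports a fault, or whenever the driver
# asks for it. All names here are invented for illustration.

class VehicleController:
    def __init__(self):
        self.mode = "autonomous"

    def heartbeat_ok(self, sensor_faults, planner_alive):
        # Treat any sensor fault, or a dead planner process, as a failure.
        return not sensor_faults and planner_alive

    def step(self, sensor_faults, planner_alive, driver_requests_control):
        if driver_requests_control or not self.heartbeat_ok(sensor_faults, planner_alive):
            self.mode = "manual"  # fall back to the human driver
        return self.mode

controller = VehicleController()
print(controller.step([], True, False))         # healthy -> "autonomous"
print(controller.step(["lidar"], True, False))  # lidar fault -> "manual"
```

The one-way transition here (manual mode is sticky) reflects a common safety intuition: once the system has failed, it should not silently take control back.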
- Machines like self-driving cars don’t have moral compasses. As machine automation takes over manual operation, the ethics of automating a dangerous tool like a military drone or, indeed, a car have sparked debate among engineers and ethicists. In real life, drivers sometimes act in their own interest, consciously or unconsciously sacrificing others to walk away from an accident unscathed, and sometimes act against their own interest, veering into barriers to avoid hitting other people or animals. Today’s engineers get to decide what kind of moral creatures self-driving cars will be.
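One way to see how engineers end up making that decision: somewhere in the software, the tradeoffs get numbers. The cost weights below are entirely invented; the point of the sketch is only that *someone* has to choose them:

```python
# Hypothetical illustration: an engineer-chosen cost function ranking
# candidate maneuvers by expected harm. Every weight here is invented;
# choosing them IS the ethical decision the article describes.

COSTS = {
    "occupant_injury": 10.0,
    "pedestrian_injury": 10.0,  # equal to occupants? lower? higher? an engineer decides
    "property_damage": 1.0,
}

def maneuver_cost(outcome):
    # outcome maps risk categories to estimated probabilities (0..1)
    return sum(COSTS[k] * p for k, p in outcome.items())

swerve = {"occupant_injury": 0.3, "property_damage": 0.9}   # cost 3.9
brake  = {"pedestrian_injury": 0.2, "property_damage": 0.1}  # cost 2.1

best = min([("swerve", swerve), ("brake", brake)], key=lambda m: maneuver_cost(m[1]))
print(best[0])  # -> "brake" under these particular weights
```

Change the pedestrian weight and the chosen maneuver can flip, which is exactly why these design choices have drawn ethicists into the debate.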
Subscribe to Rhoonet to have my next article delivered right to your inbox: “How the AI Systems in Self-Driving Cars Can Teach Themselves a Simple Form of Encryption Protocol”
Please continue to add to this conversation and share with friends and colleagues.