Autonomous Vehicles Aren’t Ready for Prime Time Yet

Autonomous vehicles are the future; there's no stopping the march of technology. Before I dive into the topic, I'd like to clear up some confusion around the terms, because an autonomous vehicle isn't the same thing as a self-driving vehicle. An autonomous vehicle can navigate and transport goods and/or people with no input from a human driver or operator. A self-driving vehicle automates parts of the driving experience but still requires input from a human operator.

A fully autonomous car isn't yet available to the public, though companies such as Google and Uber have made headlines testing prototypes. Meanwhile, the cars we can buy have gained automated features at a rapid pace. Ford, for example, now offers adaptive cruise control (the car adjusts its speed if it gets too close to the vehicle ahead), blind spot monitoring (an audible alarm sounds if the driver signals a turn while another vehicle is in the blind spot), and even Lane Keep Assist (the car steers to keep itself within its lane) as available options across its entire passenger car lineup; similar features will likely show up on its next generation of commercial vehicles. Other manufacturers such as Kia, Lexus and Subaru have advertised automated emergency braking (the car hits the brakes for you in case you aren't paying attention). These are all great features, but they only inch us closer to truly autonomous vehicles, which will radically change the way we use transportation. I believe autonomous vehicles are the future, with great potential to solve a number of serious problems in transportation as it exists today.

However, I'm skeptical of the current practice of testing self-driving cars on public roads, and I worry about how the autonomous-vehicle revolution will affect overall public safety, including cyclists and pedestrians.



Firstly, autonomous vehicles are only safer than human drivers in ideal conditions. I believe autonomous vehicles can be safer than humans, but only once there are no human drivers left on the road, because they are programmed to obey the law exactly as it's written. To quote the '80s film Short Circuit: "It's a machine, Schroeder. It doesn't get pissed off. It doesn't get happy, it doesn't get sad, it doesn't laugh at your jokes: It just runs programs." Machines and computers are designed to do specific jobs with great repeatability, while humans are far less consistent. We, as humans, are the wild card.

Uber tested an autonomous vehicle in Pittsburgh late last year, and a journalist named Ingraham, writing for Engadget, rode in it. He described the trip as mostly uneventful, except when it came to a four-way stop: "The car also had a tough time dealing with a four-way intersection -- while an autonomous car will obey the letter of the law, humans don't. So, with safety as a top priority, the car sat at the stop sign, waiting for crossing cars to come to a complete stop before it would enter the intersection. But most people out there don't come to a complete stop at a stop sign, so we just sat and waited while multiple cars crossed in front of us, glancing curiously at the strange bed of sensors on top of the vehicle." There was a human driver to take over, so Ingraham isn't still sitting in that intersection. I think this is just one of numerous examples of why autonomous transportation simply isn't ready for public consumption. Elsewhere in his article, Ingraham writes, "The times when my safety driver had to take control were less about the car doing something unsafe and more about it being confused about what its many sensors and cameras were recording. For example, the car didn't know how to deal very well with a truck that was double-parked. It read the truck as a vehicle stopped in the road, but it didn't have the context to know that it wasn't going to move anytime soon, so we just sat behind it until the driver pulled around it." A large part of operating any vehicle is knowing how to react not just to individual circumstances (such as avoiding road debris) but to combinations of them: deciding how to avoid road debris when you're in the far-left lane of the interstate, in the rain at night, with a semi behind you illegally overloaded at 40,000 lbs., and a car to your right whose driver is trying to find their favorite playlist on Spotify.
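The four-way-stop deadlock is easy to see in a toy sketch (my own illustration, not Uber's actual logic): if the car's rule is "enter only once every crossing car has come to a complete stop," and human drivers only roll through at a few miles per hour, the condition never becomes true until the intersection is completely empty.

```python
# Toy illustration (NOT any real vehicle's logic): an AV that strictly
# follows the letter of the law at a four-way stop.

def av_may_proceed(crossing_speeds_mph):
    """Letter-of-the-law rule: proceed only if every crossing car
    has come to a complete stop (speed == 0)."""
    return all(speed == 0 for speed in crossing_speeds_mph)

# Human drivers tend to "roll" stop signs at a few mph instead of stopping,
# so the AV's condition is never satisfied while traffic keeps coming.
traffic_by_tick = [
    [3, 0],   # one car rolling through at 3 mph, one stopped
    [2],      # another rolling stop
    [4, 1],   # two more cars, neither fully stopped
    [],       # intersection finally clear
]

for tick, speeds in enumerate(traffic_by_tick):
    action = "proceed" if av_may_proceed(speeds) else "wait"
    print(f"tick {tick}: {action}")
```

The car "proceeds" only on the final tick, when no crossing traffic remains at all, which is exactly the behavior Ingraham describes: legally correct, and socially deadlocked.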

Additionally, there's a big debate over how the law and the programming of autonomous vehicles will affect public safety. As it stands now, in an auto accident the drivers of the vehicles are liable, and the driver who caused the accident is at fault. How does that liability shift when the cars are driving themselves? Many believe the legal liability lands with the manufacturer. State and federal traffic laws have always assumed there is a driver or operator of the vehicle, so our legal system simply isn't ready for the immense changes needed to regulate autonomous vehicles. In early May of 2016, an Ohio man named Joshua Brown was killed while using Tesla's Autopilot feature in his Tesla Model S sedan. Brown was traveling along a divided highway in Florida when a large tractor trailer made an unsafe left-hand turn in front of the Tesla. The Tesla's cameras could not distinguish the side of the white, unmarked trailer from the bright sky, so the Tesla proceeded, colliding with the trailer and killing Brown. Tesla issued a public statement after learning of the accident, explaining that its Autopilot feature is not a truly autonomous system that would allow a driver to completely relinquish control. Given the name alone, however, many consumers believe that's exactly what it is. Tesla was not deemed liable, but shortly afterward it had a falling out with Mobileye, the supplier of Autopilot's camera-based vision system, following this and another similar accident involving the same model car.

Another tough discussion surrounding autonomous vehicles involves the programming of the cars' artificial intelligence (AI). Manufacturers are trying to work out the morality of autonomous vehicles. As a human driver, if you're in a situation where you must either run into a group of pedestrians or collide head-on with a semi, what do you do? Assume, for the sake of the question, that hitting the pedestrians will kill three people while colliding with the truck will kill you and one passenger. The car would likely calculate that hitting the truck results in the lowest loss of life, and is therefore the best and most moral decision. However, a car programmed to sacrifice its passengers probably won't sell very well. Gill Pratt, chief executive of the Toyota Research Institute, told lawmakers at a House subcommittee hearing on Capitol Hill in early 2017: "The artificial intelligence systems on which autonomous vehicle technology will depend are presently and unavoidably imperfect. So, the question is 'how safe is safe enough' for this technology to be deployed."
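To make the dilemma concrete, here is a minimal sketch of a purely utilitarian crash-decision rule. This is my own illustration with made-up numbers; no manufacturer has published logic like this. The point is that a rule which simply minimizes expected deaths will, by construction, sacrifice its own occupants whenever the arithmetic says so.

```python
# Hypothetical illustration of a purely utilitarian crash-decision rule.
# The options and casualty estimates are invented for this example.

def choose_maneuver(options):
    """Pick the option with the fewest expected fatalities,
    regardless of whether the victims are inside or outside the car."""
    return min(options, key=lambda opt: opt["expected_deaths"])

scenario = [
    {"name": "swerve into pedestrians", "expected_deaths": 3},
    {"name": "collide with semi",       "expected_deaths": 2},  # driver + passenger
]

decision = choose_maneuver(scenario)
print(decision["name"])  # prints "collide with semi"
```

The rule picks the head-on collision because two deaths is fewer than three: morally defensible on utilitarian grounds, but a hard sell to the two people in the showroom deciding whether to buy the car.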

Lastly, the general public has mixed feelings about autonomous vehicles, at best. One contributing factor is that people generally hold machines to much higher standards than they hold other people. In a 2014 study, researchers at the University of Pennsylvania asked participants to listen to weather predictions made by humans and by computers. They found that "people more quickly lose confidence in algorithmic than human forecasters after seeing them make the same mistake." This makes perfect sense: most people say things like "I'm only human" when they make mistakes, yet grow frustrated when technology fails them (such as when a website won't load on their smartphone). Another factor is that connected technologies give vehicles a much larger attack surface for hackers. At the DEF CON 24 security conference, three Chinese researchers demonstrated how easily the obstacle-detection sensors used by Tesla, Audi, Volkswagen, and Ford could be fooled. There have also been a number of similar demonstrations on current cars in which hackers remotely controlled the braking and speed of a moving vehicle. In Hosansky's "Future of Cars" article, he writes, "Vehicle-to-vehicle communications could make traffic dangerously susceptible to technological failure or to deliberate sabotage." With so many possible downsides, this creates a chicken-and-egg situation of sorts: consumer confidence has to be in place to allow greater testing of autonomous vehicles on public roads, but autonomous vehicles have a lot of wrinkles to iron out before consumers feel comfortable using them or sharing a road with them.

Many proponents of autonomous vehicles are excited about the problems they can solve. In the article "Self-Driving Cars," the (unnamed) author writes, "Researchers believe that autonomous cars will make roads safer for all drivers. According to researcher Ryan Hagemann, if all the vehicles on the roads were fully autonomous, there would be a tremendous reduction in crashes caused by human error, with the potential for a 99 percent reduction in fatalities resulting from traffic accidents." If every car on the road were autonomous, I believe accidents would be greatly reduced. But does that mean we should ban everyone with a non-autonomous car from driving? Millions of people drive on public roads, and many of them love cars and love driving. Though some could question the relevance of western culture's love affair with the automobile (particularly among my generation), car culture is still very much alive and well. I don't see the general public giving up their cars, for reasons including economics (new cars aren't cheap), culture (as a country, we love cars), and a distrust of machines (as evidenced by everything from behavioral studies to popular culture such as Terminator 2, I, Robot, etc.).

In his article "The Big Moral Dilemma Facing Self-Driving Cars," Steven Overly writes, "More than 35,000 people were killed in car collisions in the United States in 2015, according to the National Highway Traffic Safety Administration. The agency estimates 94 percent of those wrecks were the result of human error and poor decision-making, including speeding and impaired driving. Self-driving enthusiasts assert that the technology could make those deaths a misfortune of the past. But humans are not entirely rational when it comes to fear-based decision-making. It's the reason people are afraid of shark attacks or plane crashes, when the odds of either event are exceptionally low." While human rationality certainly isn't perfect, a number of researchers studying autonomous transportation aren't so quick to agree. Steve Shladover, director of the University of California's Partners for Advanced Transportation Technology (PATH) program, states, "Regular driving involves having to react instantly to darkness, heavy rain, snow, kids following balls out into streets, cyclists and things suddenly falling off of trucks. There's nothing even remotely approaching the ability to do that. Even the most sophisticated of those test vehicles is far inferior to a novice driver."

Google is one of the most enthusiastic companies behind autonomous vehicles. In a 2010 announcement about its autonomous vehicle testing, the company explained, "Our goal is to help prevent traffic accidents, free up people's time and reduce carbon emissions by fundamentally changing car use." Hosansky further elaborates, "The company has stressed that self-driving cars would benefit some of the most vulnerable members of society such as elderly people who can no longer drive. In a YouTube video, a Google engineer chatted with a blind man, Steve Mahan, as he used a self-driving car to run errands. 'I love it,' said Mahan. 'Where this could change my life is to give me the independence and flexibility to go [to] the places I both want to go and need to go'."

I don't outright disagree with these points, but I believe autonomous vehicles need a lot more discussion before we allow companies, automotive and technology alike, to test them on public roads. I recently discussed some of these issues with friends on Facebook and got a wide variety of responses. Some were wary of the idea, while many loved the thought of a (somewhat) distant future where nobody owns cars and everyone simply subscribes to a car-as-a-service company operating a fleet of autonomous vehicles hailed via smartphone. Some suggested allowing autonomous vehicles only in certain areas or on certain types of roads (e.g., the interstate, as opposed to downtown or a college campus). But one of the great promises of autonomous vehicles is giving the elderly and disabled the ability to travel independently, and if autonomous vehicles could only go certain places and/or at certain times, I don't see how they would differ from other public transit options such as a bus, train or subway.

As you can see, there's a lot more to think about regarding autonomous vehicles than whether tech companies such as Uber and Google should be building cars, or how far the technology has come. There are decisions about public safety, legality, morality, economics, government regulation, environmental impact, and city planning still to be made, and I'm afraid the march of technology isn't waiting for them. These decisions affect not just drivers but cyclists, pedestrians, and entire communities. As a technology enthusiast, I'm excited by what the future holds for the transportation industry. As an adult who has seen how dangerous cars can be, I'm very nervous about the seemingly reckless rollout of autonomous transportation.
