Just outside of Phoenix, Arizona, in the town of Chandler, Google’s self-driving car subsidiary Waymo has been running on-road tests of its vehicles for the past two years. However, local residents are not enthusiastic about the project; in fact, some have become openly hostile toward the self-driving vehicles.
Police reports indicate that some local residents have pelted Waymo cars with rocks in addition to attempting to run them off the road. While the vehicles are autonomous, a human driver sits inside for safety purposes. In multiple cases, the human Waymo drivers have been threatened. Police have documented the drivers being threatened with things like PVC pipes and knives. In one of the more extreme cases, a local resident even brandished a gun.
Last month, AZCentral reported that 69-year-old Roy Leonard Haselton was arrested for waving his gun at a Waymo driver. Haselton later told reporters he was concerned about the safety threat posed by the vehicles, citing the death of a woman killed by an autonomous Uber vehicle in nearby Tempe.
In response to the Haselton case, Waymo suggested it was an isolated incident and that the community is supportive of the project.
“This is a rare circumstance involving a man with reported illness, and it doesn’t reflect the positive community response we’ve received from people who are excited and curious about self-driving technology,” Waymo said in a statement after the incident.
There have been at least 21 attacks on Waymo vehicles in the town of Chandler since they began operating.
Waymo has reportedly declined to press charges in any of the incidents, likely in an attempt to avoid bad publicity. However, Waymo has denied that its decision not to press charges was meant to avoid controversy.
“Safety is the core of everything we do, which means that keeping our drivers, our riders, and the public safe is our top priority. Over the past two years, we’ve found Arizonans to be welcoming and excited by the potential of this technology to make our roads safer. We report incidents we deem to pose a danger and we have provided photos and videos to local law enforcement when reporting these acts of vandalism or assault. We support our drivers and engage in cases where an act of vandalism has been perpetrated against us,” a statement from the company read.
According to Waymo, these incidents are a tiny fraction of what occurs over the roughly 25,000 miles the vehicles travel each day in the area. However, only the most extreme cases, in which people are willing to resort to violence, produce the police reports to which Waymo has responded. It is entirely possible that many other residents in town disapprove of the project but have no urge to act violently.
The self-driving cars are guided by complex algorithms that account for everything from weather conditions to red lights to evasive maneuvers in situations that could result in an accident.
Strange ethical questions have arisen in the design of these algorithms. For example, is it ethical for a car to be programmed to kill you if doing so saves the lives of many other people?
That very question is one that researchers at the University of Alabama at Birmingham (UAB) are currently considering.
UAB researcher Ameen Barghi said:
“Imagine you are in charge of the switch on a trolley track. The express is due any minute; but as you glance down the line you see a school bus, filled with children, stalled at the level crossing. No problem; that’s why you have this switch. But on the alternate track there’s more trouble: Your child, who has come to work with you, has fallen down on the rails and can’t get up. That switch can save your child or a bus-full of others, but not both. What do you do?”
“Utilitarianism tells us that we should always do what will produce the greatest happiness for the greatest number of people,” he explained. “In other words, if it comes down to a choice between sending you into a concrete wall or swerving into the path of an oncoming bus, your car should be programmed to do the former. Deontology, on the other hand, argues that some values are simply categorically always true. For example, murder is always wrong, and we should never do it. Even if shifting the trolley will save five lives, we shouldn’t do it because we would be actively killing one.”
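The utilitarian rule Barghi describes can be stated very compactly in code. The sketch below is purely illustrative: the scenario, option names, and casualty numbers are hypothetical, and no real autonomous-vehicle system reduces its decisions to anything this simple.

```python
# Toy illustration of a utilitarian decision rule: among the available
# actions, choose the one with the lowest expected casualty count.
# All options and numbers here are hypothetical.

def utilitarian_choice(options):
    """Return the option that minimizes expected casualties."""
    return min(options, key=lambda o: o["expected_casualties"])

options = [
    {"action": "swerve into wall", "expected_casualties": 1},   # the occupant
    {"action": "hit oncoming bus", "expected_casualties": 30},  # bus passengers
]

print(utilitarian_choice(options)["action"])  # -> swerve into wall
```

A deontological rule, by contrast, would not compare the counts at all: it would forbid actively choosing to harm anyone, regardless of how many lives the alternative spares, which is exactly the tension the UAB researchers describe.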
Autonomous vehicles were in the news again this week, after a self-driving Tesla ran over a humanoid robot named Promobot at the CES technology convention in Las Vegas. Many have suggested the incident was staged as a PR stunt, which is plausible considering Tesla CEO Elon Musk’s well-known perspective on artificial intelligence.
However, according to Promobot’s Development Director Oleg Kivokurtsev, the team had nothing to do with the accident.
“Of course we are vexed, we brought this robot here from Philadelphia to participate at CES. Now it can neither participate in the event nor be recovered. We will conduct an internal investigation and find out why the robot went to the roadway,” Kivokurtsev said.