Bad incentives will make autonomous vehicles unsafe

August 19, 2023

Occasionally, friends have been surprised I don’t support expansion of autonomous vehicles in San Francisco. Even though they know I support changing our transportation system to remove car dominance, some think robotaxis will be a lesser evil than cars driven by humans because they’ll be safer.

Recent news makes it harder to defend this AV industry talking point. There have been increasing reports of erratic behavior and, recently, a serious crash in which a Cruise robotaxi ignored a siren and entered the path of an emergency vehicle. At least in its current state, the technology doesn’t seem especially safe.

And since so much has been written about the robotaxis at this point, I usually point to an existing explanation of the problems with robotaxis, like Safe Street Rebel’s statement, instead of winging it myself. The SSR statement highlights the labor, surveillance, accessibility, and car-trip-generating aspects of autonomous vehicles, as does the SF Bicycle Coalition’s statement.

But there’s a more subtle safety issue I want to draw out. It comes down to incentives and human responsibility.

The autonomous vehicle companies have decided to operate as urban taxi services[1], which means they have to offer fast trips in the city to be competitive. There’s an inherent tension between that and pedestrian safety.

Let’s say you’re a driver waiting to turn right on red, but there’s a pedestrian crossing in front of you. If, instead of completely stopping, you creep toward that pedestrian like you’re going to hit them, there’s a good chance they’ll get nervous and increase their pace, even breaking into a run.

Human drivers know they can save time by behaving aggressively like this. I’ve had a near miss where I had to scream at one who was creeping at me without looking. If you’re attuned to it, you’ll see this every day, all over the city.

Lo and behold, robotaxis have learned this technique:

…the group [SSR] has noticed increasingly aggressive behavior from the driverless cars. Whereas the cars used to come to complete stops before crosswalks if a pedestrian was detected, they now do “the creeping slow charge thing,” the member said, and “kind of inch into the crosswalk” to bully people out of the way. “It’s kind of intimidating and menacing,” the member said, and a potential harbinger of how the cars will behave if they become more prevalent and accepted as part of the city streets.

Robotaxis also roll stop signs, as I observed one afternoon sitting outside Robin’s Cafe while empty Cruise cars made ten passes in the time it took me to sip a single coffee. And, just like SF’s most inattentive or aggressive human drivers, they’ve been documented stopping past the stop line and blocking crosswalks at red lights.

It’s impossible to say for sure whether these behaviors are glitches or intentional aggressiveness, but here’s my point: even if the software driving these cars were technically perfect (which it obviously isn’t!), there will always be a strong incentive to program in as much aggressiveness as Cruise, Waymo and Zoox think they can get away with. They are businesses that have to return profits to their investors, and once the novelty wears off, no one will use them if they’re slower than Uber, Lyft and traditional taxis.

And they can get away with a lot of aggressiveness! Robotaxis have less accountability than humans driving cars. They can’t be cited for moving violations in California. At all. Ever.

Now, I’m not one to promote policing as the solution for car crashes; that raises equity issues and can end very badly for Black and brown drivers, and the emphasis should be on street design. But SFPD did write 780 tickets for moving violations in June, and was not authorized to write a single one for a robotaxi. At least in theory, a human driver who repeatedly endangers others can lose their license and be taken off the streets. Cruise and Waymo can’t. No matter what they do, we have no recourse except to petition the state DMV and CPUC.[2]

Another way autonomous vehicles evade accountability is by hiding who’s responsible for the safety-versus-speed tradeoffs they make. If a human-driven car hits you while creeping into a crosswalk, as almost happened to me, there’s a person right there who’s obviously responsible for the decision to drive that way.

If a robotaxi does the same, there’ll still be a human who made the decision that the car would behave that way. But who? A middle manager who ordered the company’s engineers to reduce dwell time at intersections to hit their KPIs that quarter? A reluctant engineer, themself a bicyclist and transit rider, who’d joined the company hoping their perspective would make AVs safer, who had a sick feeling in their stomach as they implemented the more aggressive behavior, but did as they were told because they feared layoffs?

Whoever it is, that person will probably never be identified and connected with the crash they caused. Instead, the behavior of the car may be falsely viewed as objective and correct, since it followed its programming exactly and can’t get distracted. It must have been the pedestrian’s fault. By concealing human agency like this, and backing it with an advertising and lobbying campaign, robotaxi companies could change cultural and legal norms around the use of streets and make it our fault if we do anything that slows down their cars.

It wouldn’t be the first time technology reprogrammed the streets. It’s how we ended up with the so-called crime of jaywalking in the United States. People used to walk freely anywhere on the roads, but that posed problems for the nascent car industry: it made their product too slow to be attractive for city trips and got cars vilified as a hazard to other road users. Sound familiar?

If robocars are to be allowed on public streets at all (and I can’t stress enough that this is a choice we make; it’s not inevitable we’ll consent to it), then strong accountability mechanisms will be needed to protect against these bad incentives and tendencies. That, on top of the fact that they generate car trips, erode support for public transit, disempower workers, aren’t accessible, surveil us, and are currently very glitchy, is why I support efforts to rescind the blank check the CPUC just issued to Cruise and Waymo.

  1. Waymo shut down its trucking division, which is a bit surprising since one might think the lack of pedestrians on controlled-access highways would give autonomous vehicles a greater advantage in that context. Apparently technical limitations of Lidar make operating at freeway speeds challenging. 

  2. Or engage in direct action. See also Tom Humberstone’s brilliant cartoon urging us to reassess the original Luddites.
