Tesla’s Autonomous Vehicle Testing Makes Everyone a Crash Test Dummy

Photo by Roberto Nickson | Unsplash

 

 

Every time you leave your house, you’re an unwitting human test subject in the grand experiment of autonomous vehicle (AV) deployment, even if you never ride in one.

 

The AV industry, spearheaded by companies like Tesla, promises us improved road safety as well as a path to transportation equity. But those promises remain aspirational for the time being. There is no credible data documenting that a truly driverless car will be safer. While AVs won’t drive drunk, they show an utter lack of common sense in novel situations, causing problems that range from minor fender-benders to fatal crashes. Despite recent advances, AV technology only barely works without constant human supervision, and only in very restricted settings. Hope remains high, but it will take many years – perhaps decades – to deliver on today’s promises.

 

Meanwhile, AV testing presents very real risks to other road users. The use of immature vehicle automation technology on public roads has resulted in reckless driving, numerous injuries, and even some deaths. Yet regulators sit on the sidelines while the industry disdains its own safety standards. The recent rollout of Tesla Full Self-Driving (FSD) public road testing has increased the urgency of addressing this problem.

 

How Did We Get Here?

The early days of AVs involved university experiments, such as a pair of Carnegie Mellon University (CMU) researchers crossing the country 98% hands off the steering wheel in 1995. About a decade later, DARPA held high-profile contests, beginning with the Grand Challenge and culminating in the 2007 Urban Challenge. Corporate research projects followed, most famously at Google (now Waymo), and later at Uber Advanced Technologies Group (UATG) as well as a number of competitors.

 

Despite massive amounts of money thrown at the technology, AVs have been slow to materialize. In 2017, at least nine companies claimed they’d have one by the end of 2021. The reality has turned out quite differently. Today, no consumer-purchased vehicle can operate safely without constant driver attention. Commercially operated AVs are scarce, limited to very specific geographic areas, require significant support staff, and amount to money-losing demonstrations. In other words, despite all the hype, AVs remain a research project – often run by graduates of those same university experiments – except now with tremendously higher stakes.

 

The limiting factor for AV technology is that the real world is a messy, unpredictable place. There are, for practical purposes, an infinite number of novel things that can happen on real-world roads. However, the core machine learning technology used for AVs is only good at handling things it has seen before and, well, actually learned.

 

The first time an AV sees something novel, it is prone to making incredibly stupid mistakes – mistakes a human driver is unlikely to make. Mistakes like not recognizing a pedestrian because that person is wearing yellow and the AV’s training data didn’t include enough people wearing yellow. Not fun if you happen to be a construction worker directing traffic or a school crossing guard in the rain – and this is a real example.

 

Training on rare situations creates an insatiable demand for AV training data. Mind-boggling amounts of data. Not a few samples, but thousands of examples of anything the vehicle needs to interpret. This includes uncommon things such as wheelchair riders, Halloween costumes, flooded roadways, and animal encounters. It takes a long time to encounter rare events in the wild, so the industry initially went for a brute-force approach, ramping up spending to get as many miles of road testing as possible. After all, if you’re chasing a trillion-dollar payday, what’s a few hundred test vehicles?

 

The Uber Pedestrian Fatality

UATG was a poster child for the large-scale quest for road data. Its data collection involved putting poorly behaving AV test vehicles on public roads and telling everyone a “safety driver” would prevent crashes. Pressure for more and more test miles mounted as the technology failed to improve as quickly as expected. Safety complacency combined with data-collection urgency resulted in cutting corners on operational safety.

 

On March 18, 2018, a UATG test vehicle struck and killed a pedestrian in Tempe, Arizona. The root cause was not a pedestrian crossing mid-block at night. Nor was it a technical defect in the vehicle, or an inattentive test driver. The fundamental cause of the fatality was a broken safety culture that placed collecting more test data above public safety.

 

UATG has since been sold off, but the lessons learned from that fatal crash were incorporated into industry safety guidance, most notably the SAE J3018 standard, which codifies how to test self-driving prototypes safely.

 

The rest of the industry paused, increased its use of less-risky simulation to complement road testing, and then resumed. Some companies claim to be doing safe testing, but it is difficult to know who is actually safe and who is practicing safety theater. None of them will commit to acting on the lessons learned from the UATG tragedy by pledging conformance to the SAE J3018 testing safety standard.

 

More concerning, the AV industry is pushing back – hard – against any form of regulation, including any requirement to follow its own safety standards. This is bizarre, because engineering is built on standards, and every other life-critical industry follows its own safety standards. That includes aircraft, trains, chemical process plants, and so on.

 

But not AVs. Nobody outside the AV companies knows how safe their testing is, nor how safe they plan to be when they deploy on public roads.

 

Reckless Tesla Road Testing

Tesla has followed a somewhat different path than Uber, using its customers to collect data instead of hiring and training professional testers.

 

Tesla initially deployed “Autopilot” software that controls both steering and speed in limited situations. We’ve known since the late 1990s that if you do this, you’ll get “driver dropout,” in which the driver stops paying attention. Humans are horrible at monitoring boring automation, so effective driver monitoring to ensure alertness is essential. But Tesla’s driver monitoring is clearly inadequate.

 

Tesla exhorts drivers to pay attention, but social media videos of inevitable misuse and abuse abound. There have been multiple deaths, all blamed on drivers. The highly respected National Transportation Safety Board (NTSB) has weighed in with safety recommendations that Tesla ignores. Elon Musk even hung up on the Chair of the NTSB during a phone call. The National Highway Traffic Safety Administration (NHTSA), the US government’s road safety agency, has also failed to act substantively on NTSB recommendations, indicating deep systemic trouble in AV regulatory mechanisms.

 

While Autopilot is positioned as a driver assistance system, Tesla makes it clear that it is really working toward full AV capability, promising a robo-taxi fleet. Recently Tesla started deploying its “Full Self-Driving” (FSD) capability to retail customers, software specifically intended to navigate city streets. Over 10,000 customer vehicles are part of an FSD “beta test” on public roads across the country.

 

The Tesla FSD track record has been, in a word, scary. Videos show FSD test vehicles running red lights, running stop signs, and lunging across centerlines into oncoming traffic. These actions constitute reckless driving that endangers all road users. At the same time, Tesla plays coy with regulators, claiming that the presence of a driver – one not trained for testing – makes its cars unregulated “Level 2” vehicles, when there is no functional distinction between FSD testing and other AV testing except for Tesla’s use of civilian drivers instead of trained testers.

 

Tesla fans recite unconvincing talking points to try to defend this approach, ranging from the morally questionable to the absurd – such as saying that it is not possible to develop such technology without taking these risks. The core issue is not whether Tesla’s approach will eventually produce a viable technical result. The issue is that Tesla FSD testing flouts regulations, safety standards, and industry safety norms, needlessly putting other road users at risk – including us.

 

What Happens Next?

While all the companies – including Tesla – say some version of “safety is #1,” their actions speak louder than their words. The general industry strategy is to remain unregulated and avoid accountability for following industry safety standards. Tesla is simply more aggressive than the rest in executing this strategy.

 

A significant cause of the problem is a dysfunctional blending of Silicon Valley’s “move fast and break things,” the automotive industry’s “it’s the driver’s fault,” robotics engineering’s “it’s all about the cool demo,” and NHTSA’s “non-regulatory” policy on AV safety. Mixing those cultures creates a toxic brew of safety risks for all of us, one that can only be resolved through fundamental cultural change.

 

To be clear, I’d love to see this technology succeed. Perhaps this year will be the year the industry really delivers something scalable. But I’m betting it will continue to take longer than advertised. Meanwhile, if nothing changes, avoiding further testing fatalities will require nothing short of a miracle.

 

Unfortunately, the public can’t really opt out of crash test dummy status (unless you just want to stay home). But what you can do is contact your municipal, state, and federal government representatives and tell them it is time for regulators, especially NHTSA, to rein in needlessly risky testing on public roads. Public road testing should not involve reckless driving. Tesla and other AV companies should conform to their own industry’s safety standards. Making roads safer does not require further needless deaths.

 


 

Philip Koopman

Philip Koopman is a professor at Carnegie Mellon University in Pittsburgh, Pennsylvania, who has worked on autonomous vehicle safety for 25 years.

