Tesla Autopilot repeats fatal crash - do they learn from past mistakes?

Tesla Model 3 with top sheared off by crash with truck

Tesla has had another fatal Autopilot crash in Florida, with almost exactly the same pattern as their first fatal crash there: the car again drove into the broadside of a crossing 18-wheeler. What's disturbing is that Tesla's training process should have put making sure "this can't happen again" at the top of their priority list, and it should have been easy to correct.

Yet it happened again. I have more analysis of the accident in the article on Forbes.com: Tesla Autopilot repeats fatal crash - do they learn from past mistakes?

An update I have been unable to make to the Forbes post, based on commentary from a reader:

Tesla Autopilot on most rural highways is limited to 60 mph. The car was going 68 mph. This could suggest the driver was pressing the accelerator. (Don't call it the gas pedal.) In this case, the Tesla warns that TACC will not brake. It is unknown whether it will still sound the collision warning -- I can think of no reason it should not; in fact, I think it should still emergency brake. We need more information, but Tesla can't provide it due to NTSB rules unless they want to get kicked out of the investigation, again.

Comments

Per https://www.theguardian.com/technology/2016/jun/30/tesla-autopilot-death-self-driving-car-elon-musk, the 2016 Mobileye accident was attributed to “failing to distinguish a white tractor-trailer crossing the highway against a bright sky”.

Recently, my neighbor got a seatbelt violation ticket while wearing a seatbelt over his jacket. The seatbelt and the jacket were very similar colors. The officer claimed he failed to see the seatbelt while driving past him twice.

Brad Templeton, this raises the question: how does state-of-the-art image analysis of moving objects compare with the performance of the bare human eye?

Joshua Brown was definitely on Autopilot at 74 mph on the exact same type of Florida highway.

The Guardian article from 2016 was just repeating garbage that Musk said. Mobileye was very clear that their system at that time did not attempt to detect cross traffic.

Was Autopilot previously capped at 60 mph on non-freeways? Or are the roads in question not capped?

How? There are certainly ways to correct it, but I'm not sure they'd be easy. It depends on how TACC is implemented.

Obviously it'll need to be fixed before the car can be more than level 2. But that's by no means an easy task. I would have thought they would have fixed it by now, but that's making some assumptions about exactly what hardware and software was being used, and it also makes some assumptions on how much code is shared between autopilot and FSD. As I understand it, FSD requires newer hardware than autopilot, so it's possible, probably even likely, that things are implemented quite differently under FSD.

After this incident, might it be fair to say that radar is a crutch?

(By the way, there's no such thing as "vehicle code fault" in Florida. If by that you mean violating the vehicle code, I'm sure I can find one that says to pay attention to the road and/or don't crash into vehicles in the intersection.)

Fault goes to the party who failed to correctly yield right of way under the vehicle code, which sounds like the truck. If the truck had the right of way, then the driver, who was responsible for supervising Autopilot, would be at fault, but he also took almost all of the hurt.

As to how to fix it, you train your neural network to recognize trucks (and everything else) crossing the road. You use your fleet to gather thousands of images of cross traffic, you train the network, and then it gets very good at spotting that sort of thing and the car brakes for them.
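
For what it's worth, here is a minimal sketch of that kind of fine-tuning loop, using an off-the-shelf torchvision detector rather than anything Tesla actually runs; the fleet-image dataset and the class labels are hypothetical stand-ins.

```python
# Hedged sketch: fine-tune a generic detector on (hypothetical) fleet images
# of cross traffic.  This is not Tesla's pipeline, just an illustration of
# the training loop described above.
import torch
from torch.utils.data import DataLoader
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 3  # assumed labels: background, crossing-truck broadside, other vehicle

def build_model():
    # Start from a COCO-pretrained detector and swap in a box predictor
    # sized for our (assumed) classes.
    model = fasterrcnn_resnet50_fpn(pretrained=True)
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)
    return model

def train(model, dataset, epochs=10, lr=5e-3):
    # dataset yields (image_tensor, {"boxes": [N,4], "labels": [N]}) pairs.
    loader = DataLoader(dataset, batch_size=4, shuffle=True,
                        collate_fn=lambda batch: tuple(zip(*batch)))
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    model.train()
    for _ in range(epochs):
        for images, targets in loader:
            losses = model(list(images), list(targets))  # dict of loss terms
            loss = sum(losses.values())
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return model

# Usage (hypothetical dataset class):
# model = train(build_model(), CrossTrafficDataset("fleet_images/"))
```

The loop itself is routine; the real work is in collecting and labeling the cross-traffic images and validating that the trained network actually catches the hard cases.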

I'm not sure where you're getting your definition of "fault." Potentially, both drivers may have failed to correctly yield right of way under the vehicle code. Potentially, both drivers and Tesla may have been at fault.

With the truck driver, the main question will probably be how far the Tesla was from the intersection when the truck pulled into it. The farther away, the less likely the truck driver was at fault.

With the Tesla driver, there's a whole host of questions, and we probably won't ever have a sure answer to them, though we can speculate. If the driver was intentionally not paying attention to the road, they were at least partially at fault. I'd want more data before concluding that, though, and I'm not sure you can ever prove it completely.

With Tesla, the biggest question will be how easy it was to fix the problem. The easier the fix, the more likely Tesla was at fault.

"you train your neural network to recognize trucks (and everything else) crossing the road" is not easy. I'm sure they're working on it, but maybe not in the system that handles TACC.

All that said, we still don't have enough details here to know exactly what happened.

"While drivers are expected to pay attention, Autopilot is now a product which is promised to very soon reach a level where that is not necessary"

No, that's not correct. FSD, not Autopilot, is a product that Elon Musk has claimed will very soon reach a level where that is not necessary.

"and it is now by this standard that it will be judged."

Even if we were talking about the same product (Autopilot is not FSD), I don't think it's fair to judge a product by the standard of the claim of what a future version of the product will be.

You don't judge Windows 10 based on the promised features of Windows 11. Especially not when Windows 11 is actually called something completely different and requires different hardware.

I want to see the details of what happened. If the crash happened because the car tried and failed to recognize this situation, and there's no advantage that would have been provided by the updated hardware, I'll be very disappointed with the progress Tesla has been making. But that seems unlikely. What seems more likely is that the TACC system simply doesn't attempt to handle this situation, because TACC is not FSD.

When police issue a ticket in a traffic incident, it will be based on the vehicle code, in particular failure to yield right of way. Police will not issue a ticket to "Tesla" or the car. Fault for them will be a matter of tort or non-traffic law.

I think Autopilot and FSD are very closely linked products, indeed Autopilot is almost a subset of FSD, most of the work on Autopilot will go into FSD. (Especially FSD as Tesla is now attempting to redefine that term.)

I am not judging Autopilot based on the promised features of FSD, though. I am judging FSD and the claimed release schedule for FSD capabilities. If Tesla develops better TACC and lanekeep functions as part of their FSD work, and those functions can fit on the existing processor, then they should be going into Autopilot. And I don't even view this as a backport, but rather that Tesla is training CNNs to identify targets on the road, and those CNNs go into both Autopilot and FSD to the extent they can fit. (Originally, of course, Tesla claimed FSD would run on the existing hardware, though that was not true, and everybody else seemed to know that.)

But perhaps they have two completely disjoint development forks for TACC and FSD, though I doubt it. Even if they do, this is a special case. For Tesla, hitting the broadside of a truck should by all rights be at the very top of their trouble ticket list for AEB/FCW/TACC. It should have been at or near the top of the list since the first crash. As well as for FSD, if they are completely different.

As I write, Autopilot needs supervising and is incomplete. The driver is at fault here if he failed to supervise. (The Huang lawsuit will answer questions about how much, possibly.)

I criticize Autopilot here because, while I will forgive it almost all unexpected errors since it is an incomplete system, I am not as forgiving of known, fatality-causing errors, not when they keep demonstrating how good they should be at training networks for things they are consciously looking for.

If there is probable cause that the truck driver violated the vehicle code, he might receive a traffic ticket. It'll be up to the homicide investigation team. The Florida police generally don't issue a ticket at the scene when there is a fatality. That doesn't have much to do with fault, though. The Tesla driver might also have received a traffic ticket, were it not for the fact that he's deceased.

Tesla probably won't get a traffic ticket in this case.

I don't know that the feature of cruise control stopping for this type of cross traffic should be at the top of the list (and the lack of it is not necessarily even a bug).

If a fatality happened where someone ran through a red light while on Autopilot, would that also be on the top of the list to "fix"? It's just not something that Autopilot is traditionally supposed to handle.

The fact that it is an expected error is why I am more likely to excuse it. People know, or should know, that Autopilot doesn't handle this, unless something has changed.

We'll likely know more about exactly how Autopilot vs. FSD work once the report is issued. If Tesla's plan is to use the TACC code fairly unaltered in the FSD code, they're making a huge mistake. ADAS and self-driving are very different things. I thought you had even argued that yourself.

One huge difference, which may be relevant here, is that an ADAS is going to be tuned to reduce false positives, at least in situations like this one, while a self-driving system is going to err on the side of being more cautious. In the case of a situation that you can easily just teach people the ADAS doesn't handle, it may even be best not to try to handle it at all until you can handle it near-perfectly. Handling it intermittently just creates a dangerous situation where the driver doesn't know if s/he needs to intervene or not.
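
To make the trade-off concrete, here is a toy sketch of how the same detection confidence can produce opposite braking decisions depending on tuning; the thresholds and function name are made up for illustration, not drawn from Tesla.

```python
# Illustrative only: identical perception output, different tuning goals.
ADAS_BRAKE_THRESHOLD = 0.95          # ADAS: brake only when very sure (minimize false alarms)
SELF_DRIVING_BRAKE_THRESHOLD = 0.50  # self-driving: brake when in doubt (minimize misses)

def should_brake(detection_confidence: float, threshold: float) -> bool:
    """Return True if the system would brake for a detected obstacle."""
    return detection_confidence >= threshold

# A marginal detection (say, a white trailer against a bright sky):
confidence = 0.7
print(should_brake(confidence, ADAS_BRAKE_THRESHOLD))          # False: ADAS lets it pass
print(should_brake(confidence, SELF_DRIVING_BRAKE_THRESHOLD))  # True: a robocar would brake
```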

But the quality and capabilities of the TACC/AEB/FCW systems tell us what the qualities of the FSD system will be. The speed of progress on those systems tells us about Tesla's ability to progress on the bigger project.

Why would it be on the top of the ticket list? For the reason I name in the title of this article. You don't want to make the same mistake twice. Overall quality is important, but appearance of quality is actually quite important too because public confidence is necessary.

The fact that FSD requires different hardware means, I think, that there's very little we can tell without more information. Tesla doesn't have enough processing power to implement everything that they've written the software for.

It's true that you don't want to make the same mistake twice. But it's not clear this was a mistake. Especially not the first time.

As I said, if the people had died by running through a red light instead, would you feel the same way?

I want to see the investigation before I decide if this was a bug or an unimplemented feature.

FSD will require their new processor -- and probably more than that -- but that processor does the same thing as the current one -- dot products to run CNNs -- it just does a lot more of them. The systems are the same (though the networks are bigger, of course). But they are not starting from scratch for FSD. They appear to be taking everything they have done for Autopilot and working to improve it, not replace it.
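
To illustrate the "it's all dot products" point (a toy NumPy example, nothing to do with Tesla's actual stack): a 2-D convolution can be rewritten as one matrix multiplication over unrolled image patches, which is the kind of multiply-accumulate work a neural-network accelerator is built to do much more of per second.

```python
# Toy demonstration that a convolution layer reduces to dot products (im2col).
import numpy as np

def conv2d_as_matmul(image, kernel):
    """Valid 2-D convolution (cross-correlation) via patch unrolling + matmul."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    oh, ow = ih - kh + 1, iw - kw + 1
    # Unroll every kernel-sized patch of the image into one row of a matrix.
    patches = np.empty((oh * ow, kh * kw))
    for r in range(oh):
        for c in range(ow):
            patches[r * ow + c] = image[r:r + kh, c:c + kw].ravel()
    # One matrix-vector product = many dot products at once.
    return (patches @ kernel.ravel()).reshape(oh, ow)

image = np.random.rand(8, 8)
kernel = np.random.rand(3, 3)
# Check against a direct sliding-window computation (output is 6x6 here).
reference = np.array([[np.sum(image[r:r + 3, c:c + 3] * kernel) for c in range(6)]
                      for r in range(6)])
assert np.allclose(conv2d_as_matmul(image, kernel), reference)
```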

Autopilot does not stop for red lights (though it now detects them and warns you), but a crash of that form today would be in line with what Autopilot is supposed to do. That crash would change the rules, though, and they would be foolish not to bump up the priority of good red-light identification to avoid it happening again, because letting it happen again would make the public lose all confidence. (There is press speculation that the recent drop in Tesla's stock price is partly due to this accident.)

I expect robocars to make mistakes. I don't expect them to make the same one twice. Autopilot is not a robocar, but it's trying to be one.

Autopilot is not trying to be a robocar.

The Forbes link ../05/17/tesla-autopilot-repeats-fatal-crash-do-they-learn-from-past-mistakes/ redirects to your newer article ../05/20/elon-musk-declares-precision-maps-a-really-bad-idea-heres-why-others-disagree/#1004a4bb2a33

Can you get that fixed, please?

Been trying for a few days. They changed their software systems.

This was a good article, very informative and well written, but I had a thought while reading it. There are thousands of deaths a year in the US due to auto accidents, and many are probably similar to this one, where a vehicle pulled out onto a major road and the other driver could not stop in time. The reasons could be numerous, such as driver distraction or not enough time to react. Yet, other than a notice in a local newspaper, how many of these accidents are written about and their causes discussed? In a similar vein, how many of these accidents has Forbes written about in the past year? Probably only the extremely sensational ones. Yet here we are, dissecting the cause of an autonomous vehicle crash, but had it not been autonomous, we never would have read about it.

The point I’m making is this: “Dog bites man, it’s not news. Man bites dog, it’s news.” Regular gas-powered, human-driven vehicles kill or maim ~50,000 people a year, it’s not news. Autonomous vehicles kill or maim ~3 people a year, it’s news. But we’re missing the big picture: there are far fewer accidents and deaths with self-driving or autonomous cars at this time. I’m definitely not saying these vehicles, their manufacturers, or their “drivers” should get a free pass - they should be accountable for their actions and their manufacturing and software practices. But at the same time, we should be hearing a lot more about how these vehicles are actually preventing accidents and deaths.

I’m by no means an autonomous vehicle apologist, but when I have coworkers tell me that they could never trust a self-driving car, I just ask them, “Who’s driving now? Every time I look around and see into other people’s cars, I see people texting (yep, even those of you who hold the phone down, thinking no one will notice), playing with the radio, eating, or doing their hair or makeup. There are very few people actually driving today! At least with autonomous vehicles, someone is driving.”

If we could switch 100% to autonomous vehicles using the current software (i.e., software that can’t detect a truck entering the roadway), I suspect the accident numbers would drop into the hundreds or even lower. Even those numbers would most likely be attributable to human error, such as someone running across a road or stepping in front of a vehicle that was traveling too fast to stop in time.

I suspect in the future, the news will reverse, and we’ll be reading how that crazy human tried to drive his or her own car and got in an accident while texting or eating, instead of letting the car drive itself.

Though we have yet to see the real data. The issue here is not what their record is; it's why they appear to have made the same mistake twice. That is not how it should be.

It comes down to who accepts the liability. When level 3 comes out and you don't need to pay attention at times, then during those times someone accepts liability for an accident. If the vehicle makes a mistake, it's a real problem for the vehicle manufacturer, especially if they specifically say that you don't need to pay attention. Of course, Tesla currently says you need to pay attention, but if they are promising that Teslas on the road today will become robotaxis very soon, then why is Autopilot repeating the same, rather simple, mistake and leading to another fatality? It's incongruous with our expectations if those expectations are fed purely by Tesla.

I think that's where the problem comes in. When Tesla releases FSD, who is to say this type of stuff won't still happen? Does Tesla accept responsibility? Should FSD be released to begin with? Or will Tesla ever be able to release FSD because they'll later find out that a computer vision solution alone is insufficient?

There's just a lot of questions in my mind that come out because of this incident.
