Man dies while driven by Tesla Autopilot
A Tesla blog post describes the first fatality involving a self-driving system. A Tesla was driving on Autopilot down a divided highway when a truck made a left turn across the Tesla's lanes. A white truck body against a bright sky is not something the MobilEye camera system in the Tesla perceives well, and the system is not designed for cross traffic.
The truck trailer was also high, so when the Tesla did not stop it went "under" the trailer, and the windshield was the first part of the Tesla to hit the truck body, with fatal consequences for the "driver." Tesla notes that the Autopilot system has driven 130 million miles, while human drivers in the USA have a fatality about every 94 million miles (though the interval is longer on the highway). The Tesla is a "supervised" system where the driver is required to agree they are monitoring the system and will take control in the event of any problem, but this driver, a major Tesla fan named Joshua Brown, did not hit the brakes. As such, the fault for this accident will presumably reside with Brown, or perhaps the truck driver -- the accident report claims the truck failed to yield to oncoming traffic, but as yet the driver has not been cited for this. (Tesla also notes that had the front of the car hit the truck, the crumple zones and other safety systems would probably have saved the driver -- hitting a high target is the worst-case situation.)
Our condolences for the tragic loss https://t.co/zI2100zEGL
— Elon Musk (@elonmusk) June 30, 2016
As noted, Brown was a major Tesla fan; in fact, he is the person in a video that circulated in April claiming that the Autopilot saved him from a collision with a smaller truck that cut into his lane.
Any commentary here is preliminary until more facts are established, but here are my initial impressions:
- There has been much speculation about whether Tesla took too much risk by releasing the Autopilot so early, and that speculation will only grow after this.
- In particular, a core issue is that the Autopilot works too well, and I have seen reports from many Tesla drivers who trust it far more than they should. The Autopilot is fine if used as Tesla directs, but the better it gets, the more it encourages people to over-trust it.
- Both Tesla stock and MobilEye stock were up today, with a bit of a downturn after-hours. The market may not have absorbed this yet. MobilEye makes the vision sensor the Tesla uses to power the Autopilot, and the failure to detect the truck in this situation is a not-unexpected result for that sensor.
- For years, I have frequently heard it said that "the first fatality with this technology will end it all, or set the industry back many years." My estimation is that this will not happen.
- One report suggests the truck was making a left turn, which is a more expected situation, though a truck that turns across oncoming traffic would be at fault.
- Another report suggests that "friends" claim that the driver often used his laptop while driving, and some sources claim that a Harry Potter movie was playing in the car. (A portable DVD player was found in the wreckage.)
- Tesla's claim of 130M miles is a bit misleading, because most of those miles were actually supervised by humans. That's like reporting the record of student drivers who always have a driving instructor there to take over. And indeed there are reports of many, many people taking over for the Tesla Autopilot, as Tesla says they should. So at best Tesla can claim that the supervised Autopilot has a record similar to human drivers, i.e., it is no better than the humans on their own. Though one incident does not a driving record make -- see the rough sketch after this list.
- Whatever we judge about this, the ability of ordinary users to test systems, if they are well informed and understand what they are doing, is a useful one that will advance the field and give us better and safer cars, faster. Just how to do this may require more discussion, but the idea is worthwhile.
- MobilEye issued a statement reminding people that their system is not designed to do well on cross traffic at present, but that their 2018 product will be. It is also worth noting that the camera they use sees only red and grey intensity, not full colour, which makes a white truck against a bright sky even harder to pick out. The sun was not a factor; it was high in the sky.
- The truck driver claims the Tesla changed lanes before hitting him, an odd thing to happen with the Autopilot, particularly if the driver was not paying attention. The lack of braking suggests the driver was not paying attention.
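As a rough illustration of why one incident tells us so little, here is a sketch using my own simplifying assumptions (the 130M and 94M figures are the ones quoted above; the interval is the standard exact-Poisson calculation, treating fatalities as a Poisson process):

```python
# Rough sketch: what one fatality in ~130M supervised-Autopilot miles can and
# cannot tell us, compared with the quoted ~94M miles per fatality for US drivers.
# (Illustrative only; assumes fatalities behave like a Poisson process.)
from scipy.stats import chi2

autopilot_miles = 130e6
human_miles_per_fatality = 94e6
observed_fatalities = 1

# Exact Poisson 95% interval on the expected fatality count, given one observed.
lower = 0.5 * chi2.ppf(0.025, 2 * observed_fatalities)        # ~0.025 events
upper = 0.5 * chi2.ppf(0.975, 2 * observed_fatalities + 2)    # ~5.57 events

print(f"Point estimate: one fatality per {autopilot_miles / 1e6:.0f}M miles")
print(f"95% range:      one per {autopilot_miles / upper / 1e6:.0f}M "
      f"to one per {autopilot_miles / lower / 1e6:.0f}M miles")
print(f"Human baseline: one per {human_miles_per_fatality / 1e6:.0f}M miles")
```

With a single event, the data are consistent with the supervised Autopilot being anywhere from several times worse to many times better than the human baseline, which is exactly why one incident does not a driving record make.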
Camera vs. LIDAR, and maps
I have often written about the big question of cameras vs. LIDAR. Elon Musk is famously on record as being against LIDAR, while almost all robocar projects in the world rely on it. Current LIDARs are too expensive for production automobiles, but many companies, including Quanergy (where I am an advisor), are promising very low-cost LIDARs for future generations of vehicles.
Here there is a clear situation where LIDAR would have detected the truck. A white truck against the sky would be no issue at all for a self-driving-capable LIDAR; it would see it very well. In fact, a big white target like that would be detected beyond the normal range of a typical LIDAR. That range is an issue here -- most LIDARs would only detect other cars about 100m out, but a big white truck would be detected a fair bit further, perhaps even 200m. 100m is not quite far enough to stop in time for an obstacle like this at highway speeds; however, such a car would brake enough to make the impact vastly less severe, and a clever car might even have had time to swerve, or to aim for the wheels of the truck rather than slide underneath the body.
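To put that 100m figure in perspective, here is a back-of-the-envelope stopping-distance sketch. The numbers are my own assumptions (roughly 0.7g of braking on dry pavement and about one second of sensing and decision latency); real vehicles and conditions vary:

```python
# Rough stopping-distance estimate: latency distance plus braking distance.
# Assumptions (illustrative): constant 0.7 g deceleration, 1.0 s of system latency.
G = 9.81  # m/s^2

def stopping_distance_m(speed_mph, decel_g=0.7, latency_s=1.0):
    v = speed_mph * 0.44704                   # mph -> m/s
    latency_dist = v * latency_s              # distance covered before braking begins
    braking_dist = v * v / (2 * decel_g * G)  # v^2 / (2a)
    return latency_dist + braking_dist

for mph in (65, 75, 85):
    print(f"{mph} mph: ~{stopping_distance_m(mph):.0f} m to stop")
```

Under those assumptions, a car at the 65 mph limit would only just stop within about 100m, and anything faster would not -- though even partial braking greatly reduces the impact energy.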
Another sensor that is problematic here is radar. Radar would have seen this truck no problem, but since the truck was perpendicular to the travel of the car, it was not moving away from or towards the car, and thus had the doppler speed signature of a stopped object. Radar is great because it tracks the speed of obstacles, but because there are so many stationary objects, most radars have to simply disregard such signals -- they can't tell a stalled vehicle from a sign, bridge or berm. One remedy is a map of where all the fixed radar reflection sources are located. If you get a sudden bright radar return from a truck or car somewhere the map says no big object is known to be, that's an immediate sign of trouble. (At the same time, it means that you don't easily detect a stalled vehicle next to a bridge or sign.)
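To make the idea concrete, here is a minimal sketch of map-assisted filtering of radar returns. All names, coordinates and thresholds are illustrative assumptions, not any production system's pipeline:

```python
# Minimal sketch of using a map of known fixed reflectors to decide whether a
# stationary-looking radar return is worth worrying about.
# (Illustrative only: names, thresholds and coordinates are made up.)
from dataclasses import dataclass

@dataclass
class RadarReturn:
    x: float        # position of the return in map coordinates (metres)
    y: float
    doppler: float  # radial speed relative to our car (m/s); ~0 for cross traffic
    strength: float # return intensity

def is_suspicious(ret, static_reflector_map, doppler_floor=0.5, match_radius=3.0):
    """Flag returns that look stopped but that the map cannot explain."""
    if abs(ret.doppler) > doppler_floor:
        return True  # clearly approaching or receding -- always worth tracking
    # Stationary-looking return: only alarming if no known sign/bridge/berm is nearby.
    near_known_object = any(
        (ret.x - mx) ** 2 + (ret.y - my) ** 2 <= match_radius ** 2
        for mx, my in static_reflector_map
    )
    return not near_known_object

# Usage: a strong return in the travel lane, where the map lists no fixed
# reflector, gets flagged even though its doppler signature says "stopped".
static_map = [(120.0, 8.5), (240.0, -6.0)]        # e.g. an overhead sign, a guard rail end
crossing_truck = RadarReturn(x=150.0, y=0.0, doppler=0.1, strength=0.9)
print(is_suspicious(crossing_truck, static_map))  # True
```

The sketch also shows the downside mentioned above: a stalled car sitting right beside a mapped sign or bridge would be "explained away" by the map and not flagged.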
One solution to the range and doppler problems is longer-range LIDAR or higher-resolution radar. Google has said it has developed longer-range LIDAR. In this case, even regular-range LIDAR, or radar combined with a good map, would likely have noticed the truck.
You may want to read my analysis of what Tesla should do to prevent this, and all the different ways we could validate our systems.
Comments
Russell de Silva
Fri, 2016-07-01 01:38
Why not multiple sensor types?
I know it's too obvious not to have been thought of already, but surely a combination of sensor types is more effective?
If you have enough knowledge about which is more trustworthy over a broad range of conditions, then it shouldn't be too hard to produce a system that is right far more often than either independently?
brad
Fri, 2016-07-01 08:11
Sensor types
Yes, most cars, including the Tesla, use multiple sensors. That's usually good -- but of course more sensors mean more cost, and also more complexity in the software (which means you don't use every sensor you can think of).
Tesla is making a production car and cares about cost at today's prices. Most other teams are making prototype cars and only care about the cost a few years from now, so they mostly use LIDAR as the core sensor.
Paul Evans
Sat, 2016-07-02 10:16
Speed?
Facts gathered from video reports of the scene and eyewitness accounts of the accident:
1) The posted speed limit was 65 mph
2) Another motorist witnessed the Model S travelling well in excess of 85 mph.
3) The intersection was over a slight crest in the direction the Model S was travelling.
While it was a long straight section of road, travelling at very high speed on a road with a high number of level intersections increases the risk of a collision significantly (hence the relatively low speed limit). The slight crest would also have hindered the visual range for both the Tesla and the truck driver.
It's quite reasonable to expect that the investigation will conclude the main cause of the accident was excessive speed by the Tesla driver.
But as Mobileye have publicly stated, their current system is not designed for Lateral Turn Across Path (LTAP) detection, so the simple fact is that Tesla's Autopilot needs to be restricted to use on roads without intersections (e.g. expressways) until 2018, when LTAP capability becomes available.
brad
Sun, 2016-07-03 16:31
Speed
65 mph is a normal highway speed limit, not a reduced one. And many travel 80 mph on such roads; 85 is a bit fast, especially on a non-limited-access road.
There is a slight crest. I am still not sure how the rules work if a truck makes a left turn while there is oncoming traffic. The truck driver reports the car drifted over a lane or two before hitting him -- presumably to suggest it was clear of the car's original lane, but that would still not excuse the left turn. And a Tesla Autopilot would not normally drift over lanes.
Victor.Lew
Wed, 2016-07-13 06:32
this article fails to mention
This article fails to mention: a) the field of view (FOV), operational distance range, or maximum speed range of the sensors involved in the accident, which is basic reporting; b) the estimated speed of the vehicle from the police report; c) hard facts from the authors referenced in the article; not to mention many more omissions.
The online Technical Webinar Series from the Editors of SAE -- the Advanced Driver Assistance Systems presentation by Andrew Whydell -- does an excellent job of presenting basic facts this article fails to, regarding the sensors Tesla incorporated in this specific model and model year. Look at the generations of cameras, when they entered production, what capability they have, and the big changes between the generations. Mobileye is on record that they are developing, in partnership with Sony (the #1 camera manufacturer), a 7.2-megapixel camera with unique properties for testing in the 2019-2020 timeframe. The SAE presentations are never referenced, which is a shame, as they explain the relevant details.
Why the Insurance Institute for Highway Safety (IIHS) never rates any Tesla models for safety is a curious point not mentioned. See http://www.iihs.org/iihs/ratings
brad
Wed, 2016-07-13 13:44
Fails to mention?
First of all, this is an old article, written just after the crash was revealed -- I and many others have written more detailed articles since, so this is a strange comment.
However, in the end, the capabilities of the sensors are not relevant to this crash. They failed to observe the obstacle, as was expected and extensively warned about. While everyone will want to improve the sensors, fixing the issues in this crash does not significantly alter the debate on supervised autopilot -- a common mistake.
Victor.Lew
Wed, 2016-07-13 19:22
re: sensors
Omitting what you stated in your reply -- "However, in the end, the capabilities of the sensors are not relevant to this crash" -- from the original story is what skews the story. It needed to be stated point blank in the original story.
Tam
Tue, 2016-07-05 02:14
Infrared Cameras
Is there any role for Infrared Cameras in autonomous driving?
brad
Tue, 2016-07-05 10:19
LWIR
If you mean deep infrared, people have done some experiments with it, but I do not think it is deployed in any vehicle. The advantage is that such cameras see emitted light rather than reflected light, so they particularly highlight people, animals and car tires, and do so day and night. The downside is that if the background temperature gets into the 90s (Fahrenheit) they no longer see people or animals well; they are also very expensive, and must be placed outside the vehicle (they can't go behind the windshield), so they need their own way to clear rain, etc.
Tam
Fri, 2016-07-08 01:55
Tesla Autopilot=NHTSA automation level 3?
Hi Brad:
Thanks for the very insightful answer about deep infrared.
On the subject of NHTSA level 3 for the Tesla Autopilot: I know the answer is "no" because it is still in beta, but I would like to think through the NHTSA language to see if it fits the current Tesla Autopilot capability (leaving aside what Tesla warns you not to do: no napping, no distractions, no lateral white tall trucks...).
"...under certain traffic or environmental conditions" that means there should be no lateral white tall truck with light background sky.
A driver was caught napping, having in effect ceded "full control of all safety-critical functions..."
As long as there was predictable stop-and-go slow traffic, the driver was napping comfortably, relying "in those conditions ... heavily on the vehicle to monitor for changes in those conditions requiring transition back to driver control."
A warning sound could wake the driver up, because "The driver is expected to be available for occasional control, but with sufficiently comfortable transition time."
Although Tesla says you have to be attentive all the time with your hands on the wheel, functionally the car seems to perform braking, acceleration and lane keeping very well in known, predictable situations -- well enough that the driver gave up all control to the system for a while.
Again, that sounds like it fits the NHTSA definition of level 3.
On the other hand, Tesla can only fit SAE level 2, because no matter how good the Autopilot is in certain conditions, it cannot be trusted, and the criterion is: the driver needs to "Monitor Driving Environment" at all times and cannot shift responsibility to the system for not registering a tall white truck.
Thus, it sounds like the SAE classification is easier to understand than NHTSA's.
http://www.nhtsa.gov/About+NHTSA/Press+Releases/U.S.+Department+of+Transportation+Releases+Policy+on+Automated+Vehicle+Development
Limited Self-Driving Automation (Level 3): Vehicles at this level of automation enable the driver to cede full control of all safety-critical functions under certain traffic or environmental conditions and in those conditions to rely heavily on the vehicle to monitor for changes in those conditions requiring transition back to driver control. The driver is expected to be available for occasional control, but with sufficiently comfortable transition time. The Google car is an example of limited self-driving automation.
brad
Fri, 2016-07-08 11:13
Not yet, and unknown when
A supervised car (so-called "level" 2) is a very different animal from a standby driver (so-called "level" 3) and that's one of the reasons they are not really levels.
A supervised car can have thousands of things it doesn't handle, and they can come up by surprise. A standby driver car has to have ZERO things it doesn't handle on the class of roads and conditions it is rated for. That's because it must never get into a situation of saying "emergency: take the wheel now!" A supervised car is allowed to do that frequently.
And there is a huge difference between frequently and never. They are even treated differently in the law in the states that have laws. The standby-driver car allows you to read a book. It promises you that when it comes upon road conditions it can't handle, it will know 10 or more seconds in advance and say, "Please put your book down and take the wheel in 30 seconds."
Today, you can't make a standby car with a camera. Some day you will. Not today.
This is one reason that plane automation has been here for a while. Except when landing, in a plane, if something goes wrong, you usually have minutes to fix it. Even when it was a surprise. So it was always possible to make aircraft autopilots that let the pilot pull out her charts or even go to the bathroom.
The closest these come to being levels is the plan to make a supervised car and keep improving it and improving it, until one day you notice, "Hey, on the freeway it's been 300,000 miles since the last time it needed an intervention." Then you could consider using it for standby-driver operation on that freeway.
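As a trivial sketch of that bookkeeping (the 300,000-mile threshold and all names here are illustrative, not an actual certification policy; a real programme would use far richer statistics than a single counter):

```python
# Toy sketch of "miles since last intervention" bookkeeping per road class.
from collections import defaultdict

class InterventionLog:
    def __init__(self):
        self.miles_since = defaultdict(float)   # road class -> miles since last takeover

    def log_miles(self, road_class, miles):
        self.miles_since[road_class] += miles

    def log_intervention(self, road_class):
        self.miles_since[road_class] = 0.0      # a human takeover resets the counter

    def ready_for_standby(self, road_class, threshold_miles=300_000):
        return self.miles_since[road_class] >= threshold_miles

log = InterventionLog()
log.log_miles("freeway", 350_000)
print(log.ready_for_standby("freeway"))   # True: candidate for standby operation there
log.log_intervention("freeway")
print(log.ready_for_standby("freeway"))   # False: the counter starts over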
RobN
Sat, 2016-07-09 03:47
Tesla's use of the term
Tesla's use of the term "AutoPilot" is problematic. Something like "Supervised Mode" would be better because it would remind Tesla drivers of their responsibilities. It's human nature to read the manual once after a new purchase, then never read it again.
RalfLippold
Sun, 2016-07-10 14:35
Autonomous driving - needs more than just a media outcry
Brad -- thanks a lot for your fully detailed description of the current state of autonomous driving in the context of the most recent (first) fatality in connection with Tesla's "Autopilot".
Having experienced several autonomous approaches two weeks ago during a tour in Saxony (Tesla's "Autopilot"; a semi-autonomous railroad system at the Erzgebirgsbahn called "Ecotrain"; and a system by IAV, who presented at CES together with Valeo this year and last), the least one can say is that progress is accelerating.
In view of the recent media outcry (also in Germany, where the automotive industry is working hard to catch up on eMobility as well as autonomous driving), it is reassuring to read a thorough description of the topic.