NHTSA forces Tesla to turn off rolling stop option in FSD prototype -- that's a very bad new power for NHTSA

The feds (NHTSA) are forcing Tesla to do a recall (really a software update) to disable the FSD prototype's ability to do rolling stops at empty intersections. That turns out to be a surprisingly bold exercise of regulatory power, and probably a terrible idea, no matter how bad Tesla is. (Almost.) Full details are in a new column on the situation, but there's a ton of nuance to this.

Read about it at NHTSA forces Tesla to turn off rolling stop option in FSD prototype -- that's a very bad new power for NHTSA on the Forbes site.

Nuro

I also released a second story today about the progress at Nuro, the road-based delivery robot company. They have a new robot and are ready to scale, they say.

Read about that at Nuro’s 3rd Generation Delivery Robot Is Ready For Manufacture

Comments

can you articulate what safe is?

Well, you will get 100 answers from 90 people, but broadly I view it as an attempt to enumerate the level of risk, which is in turn the probability of various adverse events over a large amount of driving. You want to keep that level of risk down below a certain level. Most people talk about making it superior to the level for humans.

Rand Corp defines 'safety' as not just a measurement and a threshold but also a process. Perhaps Tesla was premature in assessing the capabilities of FSD as sufficient to implement action based on a threshold. Perhaps, according to the charter of NHTSA, the recall was within the scope of that charter?

As I said, many definitions. But I would say safety is a goal, and one should not confuse the means to a goal with the goal itself. A safety mindset is a means.

This is also a philosophical question (which our most hated problem, the Trolley problem, actually was created to illustrate.)

Some would look at levels of risk. Others would say that no level of fatal risk is acceptable. Humans tend to be all over the map, and even hold multiple viewpoints at the same time.

Your definition of safety will affect how you measure safety, and measuring safety is probably necessary to attain safety, or at least to know you have attained it.

I don't think they should be involved, for another reason. Rolling stops at deserted intersections are safe. If you look at the Netherlands and other places like it, rolling stops are not unsafe even at non-deserted intersections. In fact, much of the world prefers roundabouts, where you don't even slow down if the intersection is deserted.

Regulations generally only "provide a floor for the protection of safety".

Rolling stops began Oct 20, 2020. Fifteen months later, Tesla agreed to discontinue the current implementation.

"Processes (other than compliance with regulation) are less likely to be discussed with the general public by government and safety advocates. This is because process information likely comes unilaterally from an AV company — unlike measures.

So although government and advocates might discuss processes with companies, these two groups are unlikely to make statements about process to the general public beyond making public what an AV company has said."
-- Rand Corp

You are missing the real issue with NHTSA: purpose. If most of the vehicles on the road are self-driving and "near perfect", then the government agency has a much smaller use in the future, will get defunded, and people will get let go.

Much like how the President asks the DEA whether they should legalize cannabis -- in effect, "we should remove 50% of your purpose to exist and hand it over to the ATF (ATCannabisF)" -- it's not surprising the DEA always says no. They should instead be asking the FDA whether it is any worse than alcohol, tobacco, caffeine, or sugar.

The first rule of government bureaucracy is to expand your bureaucracy and get more funding. In NHTSA's eyes, if it can drive on a "highway", they can get involved.


Cruise opened to public yesterday in SF so expect comparisons of Cruise automation technology experience to Tesla FSD.

I'll disagree w/r/t rolling stops. The words rolling and stop are antithetical. You can't both be rolling and stopped at the same time. With extremely rare exception, I always come to a complete stop. The only time I don't is in those rare instances in which I think, "You know, that person behind you is an idiot who does not know what stop means."

But many countries almost never use stop signs, and instead have either yield signs (i.e. roll through, but stop if somebody else has right-of-way) or roundabouts (go through at nearly full speed if clear).

Their safety records are superior to the USA's.

So it is stop signs that are bad and rolling stops that are good according to some data.

Federal Register / Vol. 86, No. 247 / Wednesday, December 29, 2021 / Notices - 15 pages

Information Collection Request (ICR)

For 2021 (5 months of reporting), now based on compiled data: the agency has received incident reports for the past five months.

For both ADS Level 4 and ADAS Level 2 reporting, divided into:

Incident Reporting for Automated Driving Systems (ADS) and Level 2 Advanced Driver Assistance Systems (ADAS)
AGENCY: NHTSA

The list of test participants (20) included with the General Order includes several ‘‘vehicle suppliers,’’ companies that supply various components that are then integrated into completed vehicles, ADS, or Level 2 ADAS, by other vehicle or equipment manufacturers.

2022 estimates for the total year:
Affected Public: Vehicle and equipment manufacturers and operators of ADS or Level 2 ADAS equipped vehicles.
Estimated Number of Respondents: 110.
Frequency: Monthly and on occasion.
Estimated Reports: 1150 (1-day and 5-day reports)

ADS and Level 2 ADAS are new technologies that fundamentally alter the task of driving a motor vehicle. Crashes involving vehicles equipped with these technologies have resulted in multiple fatalities and serious injuries, and NHTSA anticipates that the number of these crashes will continue to grow in the near future given the increased number of these vehicles on the road and the increased number of vehicle and equipment manufacturers in the market. The General Order provides the agency with critical and timely crash data, which assists the agency in identifying potential safety issues resulting from the operation of advanced technologies on public roads. Access to this crash data may show whether there are common patterns in vehicle crashes or systematic problems with specific vehicles or systems, any of which may reflect a potential safety defect.

OMB Control Number: 2127–0754

NHTSA estimates that there will be 90 ADS manufacturers and operators and 20 manufacturers of Level 2 ADAS vehicles each year (including manufacturers that produce both Level 2 ADAS vehicles and ADS vehicles).

ADS, as defined by SAE International and as used in this document, refers to driving automation Levels 3-5. SAE International J3016_201806 Taxonomy and Definitions for Terms Related to Driving Automation Systems for On Road Motor Vehicles. Previous notices issued by NHTSA focused on driving automation Levels 4 and 5, due to the unique vehicle designs expected for vehicles intended to operate without necessary human intervention, and thus, potentially designed without traditional manual controls.

How many ADS vehicles will be on roads in 2022?
The NHTSA list contains ~120 companies with test vehicles/robotaxis required to report.
Mercedes Drive Pilot cars are likely on US roads in 2022 as well.

1000 total incidents for ADAS in 2022 according to this document?

I cannot make the math work after browsing total vehicles in operation (VIOs) for the last 10 years with L1/L2 adoption (e.g., Roland Berger estimates).

Definitions below are from NHTSA:

"Advanced Driver Assistance System" means a Level 1 or Level 2 system.

A Level 1 system is a driver support feature on the vehicle that can assist the human driver with either steering or braking/accelerating, but not both simultaneously. The human driver must remain fully and continuously engaged in the driving task.

A Level 2 system is a driver support feature on the vehicle that can control both steering and braking/accelerating simultaneously under some circumstances. The human driver must remain fully and continuously engaged in the driving task.

Reporting is required only if the company is notified, which is the ambiguity: what is the definition of "notified"?

Is the estimate 1150 reports? Per year? For 110 manufacturers? Across L2, L3, and L4?
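As a back-of-the-envelope check on those Federal Register estimates (a sketch: the 1,150-report and 110-respondent figures come from the notice quoted above; the per-respondent average is my own arithmetic, not a number NHTSA states):

```python
# Rough arithmetic on NHTSA's Paperwork Reduction Act estimates.
# Figures are from the Federal Register notice quoted above; the
# per-respondent average is an illustrative derivation, not NHTSA's.

total_reports = 1150   # estimated 1-day and 5-day reports per year
respondents = 110      # estimated ADS + Level 2 ADAS respondents
ads_makers = 90        # estimated ADS manufacturers and operators
adas_makers = 20       # estimated Level 2 ADAS manufacturers

per_respondent = total_reports / respondents
print(f"{per_respondent:.1f} reports per respondent per year")  # ~10.5

# The 90 + 20 split is consistent with the 110 respondent total.
assert ads_makers + adas_makers == respondents
```

So the estimates imply only about ten reports per company per year, averaged across all respondents, which may be why the totals are hard to reconcile with fleet-wide VIO figures.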

Here is more on why a supplier must report:

(1) An ADS or Level 2 ADAS equipped vehicle for which it supplied components that were incorporated into the motor vehicle was involved in a crash; (2) the ADS or Level 2 ADAS was engaged during the period thirty seconds prior to and through the end of the crash; and (3) the crash involved a fatality, hospital-treated injury, air bag deployment, vehicle tow-away, or vulnerable road user.

Could you review the research you had undertaken showing that NHTSA did not have enforcement authority over the rolling-stop programming in FSD?

It's not a settled question. They have not traditionally had this authority and have not used it if they had it. It's a new thing. Their authority is over whether the car is safe to sell. They have never said, or wanted to say, "This car can go 100mph, that's in violation of the law and unsafe, so we will not allow it to be sold" or even "This car has a cruise control you can set at 100mph."

Certainly never, "This car will go through a stop sign if you don't press the brake." Tesla has made a system which will press the brake for you if it sees a stop sign. They are calling it out for not pressing the brake all the way, even though every other car made will just drive through the stop sign.

A rolling stop on black ice: a 5 mph roll can slide 10x the distance.

If a vehicle is "not safe to sell", is self-certification to blame?

Several reports from the press are of note.

A. NHTSA has been reported to have been "frowning upon rolling stops several months after the feature was rolled out". The original rollout began in October of 2020 -- quite some time ago, 15 months if correct.

B. The safety analyst Mr. P. Koopman noted on a website - "They are removing agency from the human driver in deciding when to break the law. People selecting a "rolling stop" mode had no idea what they were really signing up for (true behaviors were only disclosed in recall)-- but were going to bear the full burden of any liability for hitting a bicyclist or child at an intersection. That is really wrong. Deciding to break the law on behalf of a driver in an opaque way without involving driver approval every time it is done is inherently a problem".

"Only disclosed in recall" sounds unusual if not troubling though I cannot verify accuracy of statement.

Did Tesla pro-actively approach NHTSA with details of the feature for FSD ahead of time - "Tesla could have and should have worked this out with NHTSA in advance. Then lobbied for regulatory change if needed."

The self-certification component to this enforcement process may be as important as the legal ramifications that surround a stop sign.

I will agree that many users of the prototype were not aware it could roll. But then, there are vastly more dangerous things it does than a rolling stop. I mean really vastly. I don't think you can say that people today were unaware of what they were signing up for, since they know, and agree, that it can do thousands of illegal and dangerous things. The rolling stop was just the only one that was deliberate.

A rolling stop is not going to cause you to hit somebody at an intersection. You can literally stop in 1.1 feet at the speed it goes to. If you are not paying attention, you could hit somebody if the car's perception totally fails, but that's true everywhere you drive this thing.
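The 1.1-foot figure is consistent with basic braking physics. A minimal sketch, assuming a rolling-stop speed of about 5.6 mph (the figure reported in the recall) and near-full braking at roughly 0.9 g on dry pavement; the deceleration value is my assumption for illustration:

```python
# Stopping distance under constant deceleration: d = v^2 / (2a).
# Assumed inputs: 5.6 mph rolling-stop speed, ~0.9 g braking on
# dry pavement (an assumed friction level, not from the article).

MPH_TO_MS = 0.44704
M_TO_FT = 3.28084

v = 5.6 * MPH_TO_MS        # speed in m/s (~2.5 m/s)
a = 0.9 * 9.81             # deceleration in m/s^2 (~8.8)

d_m = v ** 2 / (2 * a)     # stopping distance in meters
d_ft = d_m * M_TO_FT
print(f"{d_ft:.1f} ft")    # ~1.2 ft, in line with the claim
```

At that speed, even with a modest reaction allowance, the car halts within a stride or two.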

"FSD Appears to Ask Drivers Whether It Should Break the Law"

elon musk 11:44 AM · Oct 24, 2021·Twitter

Please note, this is to be expected with beta software. It is impossible to test all hardware configs in all conditions with internal QA, hence public beta.

Design-based and performance-based.

So does Tesla pay the tickets that drivers get for rolling through stop signs?

That's a $100 ticket if you're in front of a camera in DC.

One camera has averaged roughly 115 tickets per day.

That makes no sense.

So a stop sign has the word "stop" on it, so the vehicle should stop. Our infrastructure is built this way, and we have laws that state we are required to stop. If it were a yield sign it could roll it, but it may still have to stop depending on the situation. The software in those vehicles should direct the vehicle to stop at stop signs. If the vehicle you are in does not stop, then you are in violation of the law. The vehicle is already driving you; why was it programmed to be too lazy to fully stop and then safely proceed? This article is arguing a stupid premise.
