Daimler Makes Risky Bet Pulling Back From Robotaxi Business

Daimler's CEO has said they plan to "scale back" and "rightsize" their robotaxi efforts and focus on trucking. Trucking is a good field for them, but this is a big bet.

Bet right and the company avoids wasting some money on being too early to the self-driving game. Bet wrong and there may be no Daimler.

Read about it at Daimler Makes Risky Bet Pulling Back From Robotaxi Business

Comments

Waymo's bet is that society is open to sharing roads with their robotaxis. For that they must not only be capable of self-driving; they also have to drive well. If their cars clog up roads because it takes them too long to make a turn or they drive too slowly, they will not be allowed in large numbers. Watch some videos on YouTube to see how they're doing. They might be getting quite close to self-driving, but they're still quite far from being considered good drivers.

Daimler, on the other hand, says it wants to introduce level 3 self-driving in its S-Class next year. That means you can read a newspaper while the car drives itself on a divided highway, and Daimler will take full responsibility if anything happens. I can definitely see a use case there.

Waymo knows this, and has known it since I was there many years ago. And of course their earliest deployments are not as good as their full commercial deployments will be. All seems to be going according to plan, though a little slower than they wanted, but not a lot slower.

As you may know, Google, before it was Waymo, judged that so-called "level 3" was not a wise idea. But it will be good if Mercedes can prove them wrong.

If robotaxis take over the entire market for cars, and that's a very big if, it will take at least a decade, quite possibly longer. In the meantime, many companies will have learned how to make a robotaxi, and not just the ones that have already started development today. The vast majority of what companies trying to make a robotaxi are spending their time on today is learning how not to make a robotaxi. This is important work, but it's also work that will very quickly become non-proprietary. Sure, you won't be able to steal Waymo's code (if Waymo comes out with a self-driving car first). But many, many people will know what works and what doesn't, and much of that knowledge will be impossible to protect. Furthermore, hardware capability is improving rapidly, and better hardware makes the software problem much, much easier.

Furthermore, the hypothesis that robotaxis are going to start taking over the market for cars any time soon is, I think, an unlikely one. Robotaxis are only even remotely close to becoming a reality in very limited areas. Even there, they don't really offer a whole lot of benefit, because they're not truly autonomous. There's either a "safety driver" in the car, or someone in a control center monitoring things. For long-haul trucking, such remote-controlled vehicles can save significantly on costs. For taxi service, where you're competing against a gig worker making less than minimum wage, maybe not.

You say that Google judged that level 3 was not a good idea. But how is Waymo's robotaxi system, which sometimes asks humans in a control center to make driving decisions, any different from level 3?

I can see why Google previously judged that level 3 was not a good idea. I think over the several years since they made that judgment things have become clearer. Is Audi's level 3 Traffic Jam Pilot not a good idea? It seems to be fine, in terms of design (I don't know if it is sufficiently bug-free). As long as it works as designed, it's relatively non-risky, and the only real risk is when a driver is being extremely negligent. I think at the time when Google judged that level 3 was not a good idea, the picture we had of level 3 was fairly different from Audi's Traffic Jam Pilot.

I think Level 3 can work as long as things are safe while waiting 10-15 seconds for the driver to take over, and relatively safe even if the driver never takes over at all. And I think most of the really difficult problems of self-driving can fit into that, because they are usually situations where a human would be more aggressive than a car designed to follow the rules.

It will indeed be a while before it takes over the entire market for cars. But even the fact that that market will go into decline spells doom for today's car companies. Not instantly, but the stock market sees it, and that happens quickly.

Google's system is completely different from level 3; I am not even sure what the similarity is. The reason level 3 doesn't work is that the human in the driver's seat can't be trusted to come and take over reliably within the 10 seconds that is generally desired. Nobody ever expected remote operators to take over at all, certainly not within 10 seconds. I guess there are people today who imagine the remote operator taking over, but that's not how Waymo's system works. The remote operator just solves strategic problems, mostly for cars that stopped not knowing what to do. This is, ideally, rare. Level 3 (Standby, as I call it) is for highway, with live takeover at speed.
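
To illustrate the distinction I'm drawing, here is a toy sketch in Python (my own framing, not Waymo's architecture and not SAE language). A "standby" (level 3) handoff is a live takeover at speed with a hard deadline; a remote-assist request comes from a vehicle that has already stopped safely and is in no hurry.

```python
from dataclasses import dataclass
from enum import Enum, auto

class InterventionKind(Enum):
    LIVE_TAKEOVER = auto()   # "level 3"/standby: human in the car resumes control at speed
    REMOTE_ASSIST = auto()   # remote-ops style: car has stopped, operator answers a strategic question

@dataclass
class InterventionRequest:
    kind: InterventionKind
    vehicle_speed_mps: float
    deadline_s: float | None = None  # a hard deadline only exists for a live takeover

def describe(req: InterventionRequest) -> str:
    if req.kind is InterventionKind.LIVE_TAKEOVER:
        # The in-car human must take the wheel within the deadline
        # (roughly 10 seconds in the discussion above) while the car is still moving.
        return f"take over within {req.deadline_s:.0f} s at {req.vehicle_speed_mps:.0f} m/s"
    # The remote operator is not driving; the vehicle has already stopped safely
    # and waits for strategic guidance, with no takeover deadline at all.
    return "vehicle stopped; awaiting strategic guidance (no deadline)"

print(describe(InterventionRequest(InterventionKind.LIVE_TAKEOVER, 30.0, deadline_s=10.0)))
print(describe(InterventionRequest(InterventionKind.REMOTE_ASSIST, 0.0)))
```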

When the change comes to robotaxis, it will be much faster than we imagine today. Look at how quickly it went from "Electric cars are a distraction" to "Tesla is outselling all the cars in its price range, combined."

I think the only way you can say that Google's system is completely different from level 3 is if you say it is also completely different from level 4. I think if you had to give it a level, it would have to be level 3, because it requires humans to make driving decisions at unpredictable times when it is not properly parked.

I think that's reasonable, because I don't think that the "level 3" we are going to see is anything like the "level 3" that some people envisioned just a few years ago. There certainly will be no need to trust the driver to take over reliably within 10 seconds. In fact, the vast majority of "takeovers" will consist of a human solving strategic problems when the car is stopped. Maybe that's not "level 3." If the car is stopped in traffic where it isn't supposed to be stopped, I wouldn't call it level 4 either, though. If what you're saying is that the very concept of the levels is poorly defined, I agree.

(Do you really think it's going to be acceptable for Waymo vehicles to stop in the middle of the road for significantly more than 10 seconds? You complain about your Tesla stopping and blocking traffic in a parking lot for just a few seconds.)

When (I mean if) the change comes to robotaxis, it will be much slower than you imagine today. It's not at all clear that the market for cars will go into decline for many many years, and even once it does, that's not going to spell doom for all of today's car companies, just the ones that can't adapt. Someone is going to have to build the robotaxis, and it's not going to be Waymo. (It's not clear that Waymo will play any role, in fact. They're good at R&D, but other companies will be better at production, and other companies will be better at logistics, and other companies will be better at operations. The car companies (and even the ridesharing companies like Uber and Lyft) will be much better at collecting data to feed into neural networks. Waymo is the Xerox PARC of the self-driving car world. Their best hope is probably that they can get bought out before their technologies become ubiquitous.)

As you surely know, I don't believe in the levels. L5 is aspirational. Levels 1-2 are just ADAS. Level 3 is just Level 4 (a robocar) where a human may have to take control while the car is moving. So it exists, in a sense, but is really just a special subclass of level 4. As such there is really only level 4 and thus no levels.

And as noted, some at Google believe that asking a human to take control while moving is too risky to be part of a general plan.

Yes, it will be acceptable for Waymo taxis to stop for more than 10 seconds on very rare occasions. Rare enough that only a subset of the population ever sees it.

Thanks for the clarification on how you define level 3. I think if Waymo is going to wait until human interventions are needed so rarely that only a subset of the population ever sees it, they will be waiting a very long time.

And that's probably the biggest risk of eschewing levels 0-3. Once people have their nice highly advanced level 2 or level 3 cars that they are used to, it's going to be very hard to get them to abandon them for robotaxis. And if asking humans to take control only when stopped is level 4, there might be level 4 cars out much sooner than there are robotaxis. The hardest-to-tackle issue with robocars is going to be when they want to play it safe and stop when humans want them to go.

(What if you ask a human to take control while moving, but don't require it? For instance, say the car won't go into unknown parking lots without a human in control, but it'll happily circle the block or park in a known parking lot until the human takes control. Is that level 3 or 4?)

Actually, I think my "definition" of level 3 is just an alternate expression of SAE's definition. To them, a Level 3 car is one that may need to call upon a driver to take over, with a decent period of warning. But if you are in a situation where stopping is easy, you don't have to call upon a driver, you can just stop. The only time you need to call on a driver is a place where you are going from an ODD (operational design domain) the car is able to handle to one that it can't, and you must make the transition at speed. For example, on a highway, when heading to an off-ramp or construction zone, it is not suitable to stop to do the handoff.

On the other hand, if there is a place to pull off the road just before the boundary, you don't need a handoff. You can just pull off, and ask the driver not to take over at speed but to start driving from a stop. Everybody agrees humans can start driving safely. What is not clear is if they can take over.

This even applies if you would like the driver to take over for convenience. Level 3 is where the driver must take over, where you can't just pull over if the driver refuses to respond. Lack of response from a driver is a very rare thing in practice, so it's OK to do something rare, like pulling over.
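
Put as a toy decision rule (again my own sketch, in Python, not SAE's definitions or anyone's production logic), the question at an ODD boundary looks something like this: if the car can pull over or stop safely, no takeover at speed is needed; only when the transition must be made at speed is a true level 3 handoff required.

```python
def action_at_odd_boundary(can_pull_over: bool,
                           can_stop_safely: bool,
                           warning_time_s: float = 12.0) -> str:
    """Toy decision rule for approaching the edge of the operational design
    domain (ODD), following the reasoning above. Illustrative only."""
    if can_pull_over:
        # No takeover at speed: pull off and ask the human to *start* driving,
        # which everybody agrees humans can do safely.
        return "pull over and wait for the human to start driving"
    if can_stop_safely:
        # Stopping in place is fine where it is safe (not a live highway lane
        # heading into a construction zone).
        return "stop and wait"
    # Only here is a true level 3 handoff required: the transition must be made
    # at speed, with perhaps 10-15 seconds of warning to the driver.
    return f"request live takeover within {warning_time_s:.0f} s"

print(action_at_odd_boundary(can_pull_over=False, can_stop_safely=False))
```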

The other issue is a driver who does take over, but is not really in the mental state to do so -- drowsy, distracted etc. That's another issue with "level 3."

As for how long we have to wait, I disagree. I think it is totally and completely acceptable that robocars be "annoying" in their test phase. You don't want them to be annoying for long, or too annoying, but it is well worth it that the public put up with this if it helps get the vehicles deployed sooner. As they get more mature, and go into major production, that's where you want to be at a level where annoying driving patterns are rare.

The exceptions to debate are the types of annoying conservative driving that cause other drivers to have accidents that are legally their own fault. That's an unresolved question. It is true that drivers will, in violation of the law, hit cars which don't drive as aggressively as expected. Again, I don't want that long term, but we might want to encourage it short term, at the cost of some fender benders.

"if you are in a situation where stopping is easy, you don't have to call upon a driver, you can just stop"

As you correctly recognize, stopping is not always the right decision. In fact, it is sometimes a very dangerous decision. (Otherwise, this self-driving car stuff would be easy.)

"For example, on a highway, when heading to an off-ramp or construction zone, it is not suitable to stop to do the handoff."

Those are two good examples of bad times to use level 3, and of how a naive version of level 3 is a bad idea. However, it's clear, I think, that these aren't the types of scenarios where level 3 handoffs are going to be necessary. As I've said before, I think most handoffs will be needed at very low speeds or when stopped. These situations happen somewhat frequently -- usually multiple times a week -- and stopping for significantly longer than 10 seconds would be safe, but it would be very annoying to people both in the car and outside of it.

"Level 3 is where the driver must take over."

Then forget what I've said about level 3. Level 4 is coming soon, though not for robotaxis.

"You don't want them to be annoying for long, or too annoying, but it is well worth it that the public put up with this if it helps get the vehicles deployed sooner."

Why? Not to save lives, because level 4 (by your definition) personally owned vehicles will save just as many lives.

Daimler has a nice foothold in the trucking industry, and there is a lot of untapped potential to provide level 3 or 4 capability for truckers. If you could find a way to save costs for huge distribution networks, you could tap into a very large industry. Googling the size of the trucking industry gives estimates of around $700 billion, and trucking moves over 70% of all freight in the US. Perhaps the hype around robotaxis is actually a red herring. The amount of research and development required for robotaxis, in a future market that may be saturated by players like Waymo, Uber, and (maybe) Tesla, seems to be a wrong move for a company like Daimler.

Furthermore, Mercedes-Benz has a lot more to offer to customers than simply performance. There will still be a place for luxury vehicles in robotaxis.
