California Bill Bans Self-Driving Trucks. San Francisco Supervisors Block Waymo. Does Luddism Reign?

Votes were nearly unanimous to rein in autonomous vehicle companies in their birthplace.

Read more at Forbes.com in California Bill Bans Self-Driving Trucks. San Francisco Supervisors Block Waymo. Does Luddism Reign?

Comments

How much of this SF resistance could be a hangover from when Uber ran roughshod over all the rules? And Airbnb to some extent. Seems SF officials had their toes stepped on and are trying to keep that from happening again. So it's less about Luddism and more about power, authority, and control.

I’m currently trying to get the State of Oregon legislature to utilize ‘existing’ FSD technology in specific areas. I was almost killed when a car hit me and injured my optic nerve, resulting in loss of peripheral vision on my right side. But I drove 2,500 miles last year by changing the direction of my head and eyes often. Teslas should excel at peripheral-related skills. Just look at the Tesla crash data where, for example, crashes were caused by lane changes, t-bone accidents, bicycles, pedestrians, etc. I need someone at Tesla to work on data analysis. I think an FSD Tesla is superior to a human’s peripheral vision and should be considered my prosthetic device, since I am capable of providing good hands-on capability. I need some technical help, but I’m a computer engineer and willing to work

Brad, could you take the opposite side of the argument for a second, just for kicks?

I'm someone who has "decided just how many teething problems I can tolerate" and it's just about zero. So how does a citizen like me lobby my government for this without people like you coming and calling me a Luddite? I'm skeptical of self-driving technology, and I'm disgusted by the current inability of the government to hold Cruise responsible for the messes they are making (https://sfstandard.com/business/san-francisco-wants-robotaxis-to-get-tickets-for-moving-violations/), but I'm not going around smashing looms (or cars). So where's the middle ground between corporations usurping our streets and killing our dogs (https://www.sacbee.com/news/california/article276329166.html) and being a Luddite?

If you want to argue it's "just about zero," you must make a case for why it's so close to zero, and paint a story of how the technology can be developed and tested in this zero-tolerance world you describe. If you paint a world where it's not possible to deploy innovation, particularly innovation with the promise of saving lives and improving things, that's Luddism.

Offer an option that works. If you don't like the path being taken, describe a better one. But one where innovation and life saving are still readily doable.

Ok, got it. Fair enough.

So my problem with this is the public testing. Both Cruise and Waymo are regularly inconveniencing other road users, and putting some of them in danger. That's what I have zero tolerance for. I'm especially irritated that Cruise seems to suffer no consequences for blocking public spaces, and SF says they can't even ticket them. Autonomous vehicles should be towed and impounded every single time they block traffic, and the fines to Cruise to recover their abandoned property should be raised to the point where it is financially unviable to test in public. Then I'll be satisfied, because the autonomous vehicles will be behind fences.

I know from reading your writing for a while that you may now tell me, "there is no alternative, this technology has to be tested in public in order to save lives." Then I think we need to go look at the risk mitigation used in other life-saving technologies that can only be tested in public. I tried to come up with a list, but all I've got is vaccines, and though I'm a big believer in and consumer of vaccines, I wouldn't use their development history as a good model for how to get acceptance from society!

I wrote a term paper in high school, more than 30 years ago, about how cool self driving would be. And now finally when I see it, and I see the poor interactions between capitalism and democracy that we live under, I am now skeptical... Not of the technology so much, but of the companies who monopolize every other aspect of my life and now have the audacity to come abuse public spaces for their own profit.

Thanks for getting me to clarify my thoughts. I'm still anti-robot, but I know better why!

Indeed, what alternative is there to public testing? Just deploy without ever having tested fully? Or demand perfection or near-perfection before public testing?

My standard is, "are they putting the public at significant risk?" Is the risk worse than the risk of other driving? How much worse? Traffic disruptions are a different thing from safety risk; there is a good argument for being much more tolerant of those, since they are done in the name of reducing safety risk, and you don't want to discourage that.

Yes, it's messier than you expected 30 years ago, but it doesn't seem that messy. You must not forget that you hear about every little mistake they make, whereas you don't hear about all the other cars that had crashes or blocked traffic every day. This biases your opinion. We want objective statistics, not anecdotal stories.

> We want objective statistics, not anecdotal stories.

Nah, I figured out what I want and it's accountability for corporations. Every damn time they get away with an anecdote of stealing from the public commons, I want to hear about it and get angry. Sorry that puts us in opposition on autonomous vehicles testing. But that's ok, it takes lots of different opinions to make a society.

PS: Speaking of accountability, it would be really interesting to hear the inside story of how Uber's project got the death penalty as a result of committing murder in Arizona. We need more of that and less of Cruise's messes.

I have proposed that, and I think the vendors would love it. A defined, and high, liability for crashes or other problems. Perhaps double or triple the typical liability when a human driver causes a problem. With a sunset, though, so the liability comes to match that for human drivers after a few years when things are more mature. I bet the companies would sign up for this in a heartbeat -- the uncertain liability is worse than a high liability.
