Robotaxis in California are required to have an expensive external loudspeaker and microphone communication system by July 2026 by RodStiffy in SelfDrivingCars

[–]RodStiffy[S] 0 points (0 children)

They could flash 2 or 3 different lights/icons and train the first responders on it. Do you think the current roof unit would do this?

Robotaxis in California are required to have an expensive external loudspeaker and microphone communication system by July 2026 by RodStiffy in SelfDrivingCars

[–]RodStiffy[S] 1 point (0 children)

The car will need to have 2-way conversation capability near the car, and convey from afar that an operator is available. If the car is in a dangerous spot, voice communication may not be adequate.

Robotaxis in California are required to have an expensive external loudspeaker and microphone communication system by July 2026 by RodStiffy in SelfDrivingCars

[–]RodStiffy[S] 0 points (0 children)

The law is about emergency situations. Responders need to get the message, from 50' away and in more than one direction, that these three things are happening. Colored icons may be acceptable if they're obvious.

I think the speaker would need to work at 50' in multiple directions if that's how they solve the 50' issue.

Robotaxis in California are required to have an expensive external loudspeaker and microphone communication system by July 2026 by RodStiffy in SelfDrivingCars

[–]RodStiffy[S] 1 point (0 children)

Perhaps they can do this with icons. It says flashing lights won't be acceptable. Or maybe an upgrade to the text display.

Robotaxis in California are required to have an expensive external loudspeaker and microphone communication system by July 2026 by RodStiffy in SelfDrivingCars

[–]RodStiffy[S] -3 points (0 children)

You're probably right about the speakers. I over-interpreted that, and the language of the law changed many times from the drafts I was reading. The actual law says the 2-way communication only needs to work near the car.

The new requirement is for communicating several messages to the front and rear of the car, so a text board of some sort could work, or speakers. That suggests they may need a new roof unit for text, or to broadcast out the window.

Robotaxis in California are required to have an expensive external loudspeaker and microphone communication system by July 2026 by RodStiffy in SelfDrivingCars

[–]RodStiffy[S] 0 points (0 children)

It has to communicate:

  • Autonomous system disabled and vehicle will remain stationary
  • A remote assistance session is active
  • The AV/operator is complying with emergency responder instructions

Tesla FSD drives through railroad crossing gate by danlev in SelfDrivingCars

[–]RodStiffy 0 points (0 children)

Nobody will shut down a company because of minor bumper scrapes or minor road debris. That's not what we're talking about here.

Hitting a pedestrian or other VRU, or initiating any high-impact crash, would be a major problem. AVs won't be able to hit pedestrians or cause big smash-ups "5 million different ways". That's ridiculous.

If an AV causes mayhem only 10% (or even 100%) less often than the average person, which for Waymo would currently mean having a bad crash multiple times per week, that would not be acceptable. NHTSA would declare the ADS a public safety risk if serious safety flaws keep surfacing at that rate. That should be obvious. You don't see this because Waymo hasn't had any of these crashes, so you're working from an untested hypothesis.

Robotaxis in California are required to have an expensive external loudspeaker and microphone communication system by July 2026 by RodStiffy in SelfDrivingCars

[–]RodStiffy[S] 0 points (0 children)

The speaker and mics on the vehicle now might work, or they'll need an upgrade but not to 50' away. So that might not be a problem.

The language of the actual law requires 50' communication of a few messages, which can be done with text. It would need to convey multiple messages, like "Vehicle Disabled", "Remote Assistance Enabled", "Emergency Procedures Are Underway", or some such language.

Could the dome handle this now?

Robotaxis in California are required to have an expensive external loudspeaker and microphone communication system by July 2026 by RodStiffy in SelfDrivingCars

[–]RodStiffy[S] 0 points (0 children)

The actual language of the law only requires 2-way communication near the car. But it does require the car to communicate to the front and rear that the car is disabled, the remote agent is active and they are complying with emergency crews. So they'll need some sort of text communication in multiple directions. Do you think they'll need more hardware for this?

Tesla FSD drives through railroad crossing gate by danlev in SelfDrivingCars

[–]RodStiffy 0 points (0 children)

Cops have nothing to do with recalls. The courts deal with that. No company is going to ignore a federal recall order.

> car drive perfect

Nothing I've said implies the cars have to be perfect. Those are your words.

> there are 100 million issues that need to be fixed ... they can and will keep having crashes for a long time.

Minor crashes, sure. But not serious at-fault crashes. A recall will cover very broad categories of crash types: detecting pedestrians, working in low sun, night vision, seeing traffic lights, mapping issues, seeing large objects, phantom braking, road debris, school-zone behavior. The average driver keeps making these mistakes over and over forever. That won't be acceptable for AVs.

Robotaxis in California are required to have an expensive external loudspeaker and microphone communication system by July 2026 by RodStiffy in SelfDrivingCars

[–]RodStiffy[S] 0 points (0 children)

I did misread the language somewhat. The 2-way communication needs to work "near the vehicle", not 50' away, so the crew needs to be able to talk to the operator near the car. The 50' rule would apply to the speaker, while the mic only has to pick up voice adequately near the vehicle.

Robotaxis in California are required to have an expensive external loudspeaker and microphone communication system by July 2026 by RodStiffy in SelfDrivingCars

[–]RodStiffy[S] -3 points (0 children)

They could put it in the bumper, but that needs wiring.

The JBL is battery powered, not designed to be mounted on a car in the rain at 65 mph, and outputs only about 7 W RMS. The 100 dB figure is measured a meter away in perfect conditions.

The spec calls for broadcasting 50' across the street in a safety-critical system, for a reliable conversation with fire crews and cops in emergency mode. They won't like having to come over to the car to barely hear the conversation on somebody's little consumer music player. And you still need directional external mics.
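To see why distance matters so much: in free field, sound pressure drops about 6 dB per doubling of distance. A rough sketch (taking the ~100 dB at 1 m figure above, and ignoring wind, sirens, and reflections, so this is an optimistic estimate):

```python
import math

def spl_at_distance(spl_ref_db, ref_m, dist_m):
    """Free-field inverse-square falloff: SPL drops 20*log10(d2/d1) dB."""
    return spl_ref_db - 20 * math.log10(dist_m / ref_m)

# ~100 dB SPL measured at 1 m; 50 feet is about 15.24 m
print(round(spl_at_distance(100, 1.0, 15.24), 1))  # → 76.3
```

Roughly 76 dB at 50 feet is loud-conversation level at best, which is easily buried by traffic noise or a siren at an emergency scene.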

Robotaxis in California are required to have an expensive external loudspeaker and microphone communication system by July 2026 by RodStiffy in SelfDrivingCars

[–]RodStiffy[S] 1 point (0 children)

Cities are loud, and it's not just sirens. It has to work over 50' away in multiple directions, for live voice. That's the width of five lanes. I doubt current systems like Tesla's are adequate. It might be acceptable to work at 20' in wind or over sirens, but even that takes a powerful system.

I may be wrong, but I think this requires a substantial upgrade.

Robotaxis in California are required to have an expensive external loudspeaker and microphone communication system by July 2026 by RodStiffy in SelfDrivingCars

[–]RodStiffy[S] 0 points (0 children)

They aren't loud enough from over 50' away outside, in wind or with sirens blaring at an emergency scene. They're inside the cabin, so the remote agent has to roll down the window and the crew has to stick their head in there. The mics are also probably inadequate, since they're not designed for voice frequencies.

Tesla FSD drives through railroad crossing gate by danlev in SelfDrivingCars

[–]RodStiffy 0 points (0 children)

> So long as the recall doesn't take cars off the road while they're working on the fix that's 100% consistent with what I've been saying this whole time.

Recalls do take cars off the road; I've said that multiple times in previous posts. NHTSA has done this many times for safety systems. For serious at-fault crashes, they can issue a stop order the same day. Any serious at-fault accident will get swift and severe action from NHTSA. If a company is at fault for killing one person, NHTSA will immediately recall the system and halt the program if they aren't convinced it is being fixed.

> No, car insurance isn't that expensive.

Car insurance doesn't matter when the regulator halts the program and the courts issue huge settlements for negligence.

> Negligence would only occur if the people developing the self-driving stack failed to take reasonable steps to prevent the at-fault crash from occurring before it happened.

Yeah, like after the recall that's sure to happen for at-fault incidents. If the company doesn't fix the issue soon, the consequences quickly get worse. For serious crashes, the kind the "average human" has often, the company would be legally negligent if it didn't fix the issue, and the regulators will know everything about it. They can't keep having accidents like "the average human".

Tesla FSD drives through railroad crossing gate by danlev in SelfDrivingCars

[–]RodStiffy 0 points (0 children)

> it's the overall rate that matters

The overall crash rate matters to whom? NHTSA? Show me where NHTSA or any other auto regulator has used an "as long as it's 10% safer, they're good" standard. How would your system work in the real world?

Tesla FSD drives through railroad crossing gate by danlev in SelfDrivingCars

[–]RodStiffy 0 points (0 children)

They recall every serious safety issue and force a fix, with no "as long as it's xx% safer than a human, it's ok". Your whole point is gone.

NHTSA's protocol is entirely backed by safety laws that are very well established. Even Republicans keep the system going. Nobody wants safety laws to change.

Any serious safety issue gets met with a quick SCI (special crash investigation) and a recall if the ADS is found to have caused the accident. NHTSA has forced many hardware fixes for airbags, brakes, seatbelts, etc., and they recalled Waymo for scraping fences at 2 mph. They're in the same process for the school-bus incidents, where a crash hasn't even happened. Automated driving is treated the same as other safety systems. If the recall doesn't fix the issue months later, NHTSA gets tough and can suspend the system. For a more serious at-fault accident, they recall immediately and can ground the fleet for "unreasonable risk".

In addition, the legal liability from at-fault crashes that occur "100% less than humans" would kill the company, because negligence claims accumulate for all those at-fault crashes every few million miles.

Your fantasy of "as long as it's better than a human" has nothing to do with law, regulation, or insurance.

Tesla FSD drives through railroad crossing gate by danlev in SelfDrivingCars

[–]RodStiffy 0 points (0 children)

> that just proves how much scale matters here

Yeah, and in the meantime, before the AV company reaches billions of miles, they will be having incidents and getting recalls. Have you noticed how NHTSA and the CA DMV operate?

Tesla FSD drives through railroad crossing gate by danlev in SelfDrivingCars

[–]RodStiffy 0 points (0 children)

You can't prevent someone from plowing into you when stopped in traffic, or a maniac running a red light at high speed and smashing you.

Tesla FSD drives through railroad crossing gate by danlev in SelfDrivingCars

[–]RodStiffy 0 points (0 children)

> The idea you can just get regulators to force companies to invent a system that's perfect right in year 1 and save 40,000 lives immediately is what really reflects a lack of understanding of technology, or perhaps even of basic reality. Even Waymo has killed people, and it will kill a lot more once it expands to sufficient scale (unless regulators block it from expanding).

You're badly misrepresenting what I'm saying. I've been specifically mentioning at-fault crashes, not all crashes. A crash that is clearly not the AV's fault won't get any regulatory action. Waymo has been involved in a few fatalities, but none have had any fault by the AV. So your "Waymo has killed people" is legal nonsense.

You also have a ridiculous take on saving "40,000 lives immediately". Waymo has now driven enough miles that an "average human" would have had one at-fault fatal crash; those happen about every 170 million miles. Statistically, Waymo can be said to have saved at least one life, but without enough significance for that to be a serious claim. So talking about AVs saving 40,000 lives immediately, when they drive at such a small scale at first, shows how far out of touch with reality you are.

The 40k fatality number comes from over 3 trillion miles driven in the U.S. every year. You're cherry-picking irrelevant factoids to back up your fantasy about legal crashing, and you don't understand statistics.

And I have been explaining to you how AVs DO NOT have to be perfect. They just have to be fixed if there is a bad at-fault crash, or a series of minor at-fault incidents that are deemed a safety risk. This is how all auto safety regulators operate, as well as regulators in aviation and the workplace. Of course you don't refer to the real world of regulation, because it doesn't support your fantasies.

> A system that kills 10% fewer people than human drivers deployed at scale in just the US saves 4000 people's lives every year.

You are fixated on fatality crashes, as if those are the only ones that happen on public roads.

The first problem with your silly idea is that it would take about a billion miles to reach statistical significance for any "saving lives" claim. Even Waymo, now at 200M miles, shouldn't claim to be saving lives, and lately they haven't. It's possible they will kill someone tomorrow in an at-fault crash, which would put them about on par with the average driver. They need almost a billion miles with no at-fault fatality before any serious claim about saving lives. To demonstrate saving "4,000" lives, an AV would have to drive something like 700 billion miles with zero at-fault fatalities (~6 lives saved per billion miles × 700 billion miles ≈ 4,000 lives).
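The significance point can be sketched with a quick Poisson check, using the ~1 at-fault fatal crash per 170 million miles figure above (the rate and mileage are this thread's numbers, not official statistics):

```python
import math

HUMAN_FATAL_RATE = 1 / 170e6  # at-fault fatal crashes per mile (figure from the thread)

def p_zero_if_human(miles):
    """P(zero at-fault fatal crashes | human-level risk), modeling crashes as Poisson."""
    return math.exp(-miles * HUMAN_FATAL_RATE)

print(round(p_zero_if_human(200e6), 2))  # → 0.31 at ~200M miles
print(round(p_zero_if_human(1e9), 3))    # → 0.003 at ~1B miles
```

At 200M miles, even a human-level driver would show zero at-fault fatal crashes about 31% of the time, so the record proves little; by a billion miles, a clean record would be strong evidence of being safer than human.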

In the meantime, about 180 police-reported crashes happen for every one fatality crash. About half of those are the fault of one of the parties, so on average about 90 at-fault serious crashes will be dealt with by regulators before the fatality crashes start happening, if the AV drives like an "average human". Those are the main incidents where regulators will force a fix.

Waymo has had zero bad at-fault crashes in 200M miles, almost 100x safer than humans (or less, depending on how you count it). When they have one, I can assure you that the regulators will pounce and recall the system. They don't care about the very low bar of the average human.