Even After $100 Billion, Self-Driving Cars Are Going Nowhere


Driverless cars are causing more and more traffic issues in San Francisco

John Callaham

Quote:Driverless cars were once touted as the saviour of transportation, but the reality is that the current automated vehicle technology efforts are causing real problems for the city of San Francisco.

Quote:An NBC News report on YouTube talks about how driverless cars from GM's Cruise and Google's Waymo, which got permission to offer cab services in San Francisco in 2022, are now responsible for three 911 calls a day in the city. The report says the cars can get confused and stop entirely if they encounter construction or red lights from emergency vehicles, leading to traffic jams.

It's a particular problem for the city's fire department and first responders. One fire chief says she sees at least one incident a day with driverless vehicles. Other reports filed by the department include autonomous vehicles driving toward active fire scenes and running over hoses. One report claims firefighters had to break one of these cars' windows in order to stop it. The fire chief interviewed stated she doesn't believe these cars are "ready for prime time."

The local Department of Transportation has little power over Google or GM to control these vehicles, thanks to special rules made at the state level that don't allow the city of San Francisco to regulate driverless vehicles.
'Historically, we may regard materialism as a system of dogma set up to combat orthodox dogma...Accordingly we find that, as ancient orthodoxies disintegrate, materialism more and more gives way to scepticism.'

- Bertrand Russell


Self-driving cars are bad at the social part of traffic

Michael Skov Jensen, University of Copenhagen

Quote:Self-driving cars struggle when navigating social interactions in traffic, such as deciding whether to yield to someone or something in the road or to continue driving, researchers report.

Quote:Should I go or give way? It is one of the most basic questions in traffic, one that humans typically answer quickly and intuitively, because doing so relies on social interactions learned from the time we begin to walk.

Self-driving cars on the other hand, aren’t as adept, according to the new study. An analysis of videos uploaded by YouTube users of self-driving cars in various traffic situations shows that self-driving cars have a particularly tough time understanding when to yield.

“The ability to navigate in traffic is based on much more than traffic rules. Social interactions, including body language, play a major role when we signal each other in traffic. This is where the programming of self-driving cars still falls short,” says Barry Brown, professor at the University of Copenhagen who has studied the evolution of self-driving car road behavior for the past five years.

“That is why it is difficult for them to consistently understand when to stop and when someone is stopping for them, which can be both annoying and dangerous.”


(2023-06-03, 02:37 PM)Sciborg_S_Patel Wrote: Self-driving cars are bad at the social part of traffic

I like this part:

Quote:Should I go or give way? It is one of the most basic questions in traffic, one that humans typically answer quickly and intuitively, because doing so relies on social interactions learned from the time we begin to walk.

I agree. This is a basic skill learned over our entire lives; it isn't specifically a car-driving skill. Sometimes driving involves gestures, or even a glance in the direction of another vehicle, and these natural human skills get incorporated into driving. Possibly machine-machine interactions may be easier to incorporate into automated systems, but machine-human ones very much less so.
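
To make that concrete, here is a minimal, purely hypothetical sketch (in Python) of a kinematics-only yield rule. Every name and threshold in it is invented for illustration; it is not any vendor's actual logic. The point is that a rule like this only sees distances and speeds, so the social signals Brown describes (a glance, a wave, a hesitation) have nowhere to enter the decision.

Code:
from dataclasses import dataclass

# Hypothetical sketch of a kinematics-only yield decision.
# All names and thresholds are invented for illustration.

@dataclass
class Agent:
    distance_m: float   # distance to the conflict point, in metres
    speed_mps: float    # current speed, in metres per second

def time_to_conflict(agent: Agent) -> float:
    """Crude estimate of seconds until the agent reaches the conflict point."""
    if agent.speed_mps <= 0.1:      # effectively stationary
        return float("inf")
    return agent.distance_m / agent.speed_mps

def should_yield(car: Agent, other: Agent, margin_s: float = 2.0) -> bool:
    """Yield if the other road user would reach the conflict point within
    margin_s seconds of us. Note what is missing: gestures, eye contact,
    a wave-through -- none of these appear as inputs."""
    return time_to_conflict(other) <= time_to_conflict(car) + margin_s

# Example: a pedestrian 4 m away at walking pace vs. a car 20 m away at 8 m/s
print(should_yield(Agent(20.0, 8.0), Agent(4.0, 1.4)))   # True -> yield

A machine negotiating with another machine could exchange exactly this kind of kinematic data, which is one reason machine-machine coordination may be the easier problem; the human side of the negotiation never arrives in such a tidy form.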
A Waymo self-driving car killed a dog in ‘unavoidable’ accident

Rebecca Bellan

Quote:Public perception aside, Waymo could face investigations from regulatory bodies like the National Highway Traffic Safety Administration (NHTSA). NHTSA requires manufacturers and operators of high-level autonomous vehicles to submit incident reports for crashes if the autonomous driving system was in use at any time within 30 seconds of the crash and the crash resulted in property damage or injury. The agency told TechCrunch it had reached out to Waymo for more information, but has no open investigations into the company at this time.

In 2018, when an autonomous vehicle from Uber’s now-shuttered AV unit hit and killed a pedestrian, the National Transportation Safety Board (NTSB) launched an investigation. Usually, the NTSB launches a highway investigation when there’s been a significant crash that highlights a potential national safety issue. A spokesperson from the agency told TechCrunch she doesn’t believe NTSB has any current investigations involving Waymo.
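
As an aside, the reporting trigger described in the first quoted paragraph can be read as a simple predicate. The sketch below is only a paraphrase of the rule as the article states it, with invented field names; it is not an official NHTSA tool or the agency's actual wording.

Code:
from datetime import datetime, timedelta

# Paraphrase of the reporting trigger described above (field names invented):
# report if the automated driving system was in use at any point in the
# 30 seconds before the crash AND the crash caused property damage or injury.

def must_report(ads_last_in_use: datetime,
                crash_time: datetime,
                property_damage: bool,
                injury: bool) -> bool:
    ads_recently_active = (crash_time - ads_last_in_use) <= timedelta(seconds=30)
    return ads_recently_active and (property_damage or injury)

# Example: system last engaged 10 seconds before a crash that damaged property
print(must_report(datetime(2023, 5, 21, 10, 59, 50),
                  datetime(2023, 5, 21, 11, 0, 0),
                  property_damage=True, injury=False))   # True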


(2023-06-09, 05:53 PM)Sciborg_S_Patel Wrote: A Waymo self-driving car killed a dog in ‘unavoidable’ accident

Quote:The human operator didn’t see the dog, but the vehicle’s autonomous system did. However, a number of factors, including the speed and trajectory of the dog’s path, made the collision unavoidable, according to Waymo.

Quote:Neither the safety operator nor the autonomous system braked to avoid collision, according to Waymo. In both cases, that’s because of the “unusual path” the dog took at “a high rate of speed directly towards the side of the vehicle,” said a Waymo spokesperson.

It seems there's not enough information available to figure out what happened. Since the description is provided by Waymo, it isn't an independent or neutral report.

Of course, collisions with animals occur fairly frequently, so there may not be anything unusual here. But still, sometimes simply assessing the type of environment gives reason to drive more slowly and cautiously, just in case 'something' unexpected takes place. Too many unknowns to really tell, I think.

There's also a factor alluded to by @David001, an extra human sense which clearly doesn't prevent all accidents but may sometimes play a role in either sensing or responding to a situation.
17 fatalities, 736 crashes: The shocking toll of Tesla’s Autopilot

by Faiz Siddiqui and  Jeremy B. Merrill

Quote:The school bus was displaying its stop sign and flashing red warning lights, a police report said, when Tillman Mitchell, 17, stepped off one afternoon in March. Then a Tesla Model Y approached on North Carolina Highway 561.

The car — allegedly in Autopilot mode — never slowed down.

It struck Mitchell at 45 mph. The teenager was thrown into the windshield, flew into the air and landed facedown in the road, according to his great-aunt, Dorothy Lynch. Mitchell’s father heard the crash and rushed from his porch to find his son lying in the middle of the road.
“If it had been a smaller child,” Lynch said, “the child would be dead.”

The crash in North Carolina’s Halifax County, where a futuristic technology came barreling down a rural highway with devastating consequences, was one of 736 U.S. crashes since 2019 involving Teslas in Autopilot mode, far more than previously reported, according to a Washington Post analysis of National Highway Traffic Safety Administration data. The number of such crashes has surged over the past four years, the data shows, reflecting the hazards associated with increasing use of Tesla’s driver-assistance technology as well as the growing presence of the cars on the nation’s roadways.

The number of deaths and serious injuries associated with Autopilot also has grown significantly, the data shows.


(2023-07-13, 01:42 PM)Sciborg_S_Patel Wrote: 17 fatalities, 736 crashes: The shocking toll of Tesla’s Autopilot

by Faiz Siddiqui and  Jeremy B. Merrill

I'm glad the boy survived, but it sounds as though the car was doing something dangerous even if the bus had not been a school bus.

To me, it is obvious that driverless vehicles will ultimately be forbidden from public roads after lots of people have been killed or injured, and after ludicrous amounts of expenditure.

David
(2023-07-13, 07:40 PM)David001 Wrote: it is obvious that driverless vehicles will ultimately be forbidden from public roads after lots of people have been killed or injured

While I'm sensitive to any loss of life, I think this is not only NOT obvious but likely dead wrong.

I remain very optimistic about autonomous travelling.  I find it an inevitable and ultimately superior outcome.
(2023-07-14, 02:32 PM)Silence Wrote: While I'm sensitive to any loss of life, I think this is not only NOT obvious but likely dead wrong.

I remain very optimistic about autonomous travelling.  I find it an inevitable and ultimately superior outcome.

In a very different form, using new hardware and possibly even new programming languages that can handle the domain space of driving...very possible.

But the current strategy of just hoping that accumulations of data will produce a good driverless car...very doubtful IMO. And apparently doubtful in the minds of the companies trying to put out these autonomous vehicles, given the shady way the tech is being foisted onto the public...


(2023-07-14, 07:23 PM)Sciborg_S_Patel Wrote: In a very different form, using new hardware and possibly even new programming languages that can handle the domain space of driving...very possible.

But the current strategy of just hoping that accumulations of data will produce a good driverless car...very doubtful IMO. And apparently doubtful in the minds of the companies trying to put out these autonomous vehicles, given the shady way the tech is being foisted onto the public...

I'm not so sure new hardware is needed per se. We already have quite a bit of hardware, both in terms of sensors and raw computing power.

The 'software' side clearly needs further evolution.

That said, I'm simply not close enough to the technical elements to have an educated position either way. Even so, I wouldn't be surprised if it turns out that, sitting here in July of 2023, we're much closer to it than you might suspect.
