I think we are in danger of ignoring the main relevance of driverless cars to the question of the true nature of the human mind. It is possible that driverless cars could exist in principle but are simply too expensive to develop, or that people really want an alternative to cars altogether, so that this technology has arrived too late in the day. (To be honest, though, I don't really believe either of those suggestions.)
I have recently been re-reading James Carpenter's book, "First Sight". It is a somewhat turgid read, but it proposes an interesting idea. He asserts that the endless variations and successful replications of Dean Radin's presentiment experiment show that human consciousness (maybe all consciousness) enjoys some form of advance knowledge of what is about to happen next - even when the stimulus is driven by a quantum-based stream of random numbers. He seems to suggest that every event that enters our consciousness starts off as an unconscious psychic alert, which helps us make use of the conventional information that may follow.
Presumably, this is completely lacking in any driverless car, and it may explain the problems engineers are having making these vehicles safe, even in a benign urban setting.
The development costs for these cars are phenomenal, and a healthy slice of that money (after bribes to the regulatory authorities) must have been spent on software (in its broadest sense). I'd suggest that there must be a law of diminishing returns here, and that safe, driverless cars are simply not possible if they are to mix with other vehicles and people.
It is also worth remembering that when humans are taught to drive, this starts in quiet back streets, and gradually moves on to main roads and eventually to motorways. Those quiet back streets present all kinds of hazards, such as children playing, non-standard road layouts, potholes, junk in the road, and so on. The fact that the machine-learning process reverses this ordering is rather suggestive.
David
Self-driving trucks struggle to deliver
Joann Muller
Quote:There's been an industry shakeout over the past year, causing once-giddy investors to pull back, while survivors shed workers and struggle to fund continued development.
Quote:Aurora lost $1.7 billion last year and will need to raise more cash to fund its commercial rollout, co-founder Sterling Anderson tells Axios.
Quote:The bottom line: Self-driving trucks are stuck in low gear.
There are a lot of attempts at spin from the executives they quote, but it seems the facts are not looking good for the driverless car/truck companies hoping to replace human workers with machine "learning" pipe dreams...
(2023-04-26, 11:15 PM)Max_B Wrote: [ -> ]You’d just use GPS for speed limits… that’s part of the reason for the EU’s Galileo satellite network.
In theory, yes, but it seems not to work consistently. We'll see whether it works with the satellite network.
Your post fully supports my earlier claim in this thread that the problem with self-driving cars is mainly a legal issue.
Quote:"We are proud of Autopilot’s performance and its impact on reducing traffic collisions," the company states with a January 2023 (https://www.tesla.com/VehicleSafetyReport) Vehicle Safety Report update. The numbers are staggering. In the 3rd quarter, one crash was recorded for every 6.26 million miles driven using Autopilot. Teslas not using Autopilot technology logged one crash for every 1.71 million miles driven.
For comparison, the NHTSA estimated one automobile crash every 652,000 miles. That equates to Tesla drivers without Autopilot engaged being 2.5 times safer and with Autopilot being used ten times safer than the national average.
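The multiples quoted above can be checked with a quick back-of-the-envelope calculation. A minimal sketch (the figures are taken directly from the quoted report; the variable names are mine):

```python
# Miles driven per recorded crash, from the quoted figures.
nhtsa_miles_per_crash = 652_000   # NHTSA national estimate
tesla_no_autopilot = 1_710_000    # Tesla, Autopilot not engaged
tesla_autopilot = 6_260_000       # Tesla, Autopilot engaged

# Safety multiples relative to the national average.
print(round(tesla_no_autopilot / nhtsa_miles_per_crash, 1))  # ~2.6x
print(round(tesla_autopilot / nhtsa_miles_per_crash, 1))     # ~9.6x
```

So the claims of "2.5 times safer" and "ten times safer" are roughly consistent with the raw numbers, though, as the later posts note, the comparison itself is contested.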
https://www.notateslaapp.com/news/1144/t...ety-report
(2023-04-27, 06:18 PM)sbu Wrote: [ -> ]Your post fully supports my earlier claim in this thread that the problem with self-driving cars is mainly a legal issue.
https://www.notateslaapp.com/news/1144/t...ety-report
Seems like nothing more than corporate hype.
edit:
Tesla Again Paints A Crash Data Story That Misleads Many Readers
Brad Templeton
Quote:Every quarter, Tesla releases crash data on their cars in various modes. Recently it also released its annual “impact report,” which for the first time included some data on drivers using their prototype “full self driving” system, which has been pre-purchased by several hundred thousand owners. Much of the coverage of the report has described it as presenting an incredibly positive story of Tesla safety. Their raw numbers do seem very good, but the reality is disturbingly different.
Quote:I have done analysis of these reports before, generally concluding that what they actually show is that users of Autopilot have a roughly similar number of crashes to those not using it. The Impact report did not cite a number for non-Autopilot use with ADAS active safety, but the quarterly safety report for Q4/2022 reported about 0.71 airbag events per million miles, a bit worse than the annual summary.
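To compare the Q4/2022 rate above with the miles-per-crash figures quoted earlier in the thread, the airbag-event rate can be inverted. A simple unit conversion, assuming only the 0.71 figure from the quote:

```python
# 0.71 airbag events per million miles (Q4/2022, quoted above).
events_per_million_miles = 0.71

# Invert to get miles driven per airbag event.
miles_per_event = 1_000_000 / events_per_million_miles
print(round(miles_per_event))  # ~1,408,451 miles per airbag event
```

On that conversion, non-Autopilot driving with active safety comes out in the same rough ballpark as the Autopilot-off figure in Tesla's own report, which is Templeton's point about the crash rates being broadly similar once like is compared with like.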
Quote:Tesla’s numbers give a very incorrect impression — so incorrect that it is baffling why they publish them when this has been pointed out many times by many writers and researchers. Oddly, Tesla has the real data — they have the best data in the world about what happens to their vehicles. The fact that they could publish the truth but decline to, and instead publish numbers which get widely misinterpreted, raises the question of why they are not revealing the full truth, and what it is that they don’t reveal.