Trains.com

Self-driving cars unsafe at any speed

3025 views
44 replies
  • Member since
    March 2016
  • 1,568 posts
Posted by CandOforprogress2 on Friday, July 15, 2016 10:21 PM

A wristband that delivers a mild electroshock when the driver nods off would be a start.

  • Member since
    July 2008
  • 2,325 posts
Posted by rdamon on Thursday, July 14, 2016 3:53 PM

  • Member since
    December 2001
  • From: Northern New York
  • 25,021 posts
Posted by tree68 on Thursday, July 14, 2016 3:28 PM

Back in the 1960s, GM experimented with a wire laid in the pavement.  Their "self-guided" cars had a sensor which detected the wire and followed it - rather like an electronic slot car...

Larry
Resident Microferroequinologist (at least at my house) 
Everyone goes home; Safety begins with you
My Opinion. Standard Disclaimers Apply. No Expiration Date
Come ride the rails with me!
There's one thing about humility - the moment you think you've got it, you've lost it...

  • Member since
    July 2008
  • 2,325 posts
Posted by rdamon on Thursday, July 14, 2016 2:04 PM

Truck and car "platooning" has been shown to increase road capacity by reducing the space between vehicles.

http://arstechnica.com/cars/2016/04/europe-completes-a-demonstration-of-semi-autonomous-truck-platooning/
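To put rough numbers on that capacity claim, here is a back-of-the-envelope sketch in Python; the 1.5-second human following gap, 0.3-second platooning gap, and 70-foot truck length are my own illustrative assumptions, not figures from the linked article:

```python
# Rough lane-capacity estimate: vehicles per hour past a point is speed divided by
# the front-bumper-to-front-bumper spacing (following gap plus vehicle length).

def lane_capacity_per_hour(speed_mph, gap_seconds, vehicle_length_ft=70):
    """Approximate vehicles per lane per hour for a given following time gap."""
    speed_fps = speed_mph * 5280 / 3600        # mph -> feet per second
    spacing_ft = speed_fps * gap_seconds + vehicle_length_ft
    return 3600 * speed_fps / spacing_ft

# Illustrative comparison at 60 mph: a 1.5 s human gap vs. a 0.3 s platooning gap.
print(round(lane_capacity_per_hour(60, 1.5)))   # ~1568 trucks/hour
print(round(lane_capacity_per_hour(60, 0.3)))   # ~3286 trucks/hour
```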

 

  • Member since
    December 2008
  • From: Toronto, Canada
  • 2,560 posts
Posted by 54light15 on Thursday, July 14, 2016 1:30 PM

What Mercedes-Benz found out with one of their semi-self-driving cars is that the lane detector doesn't work when the lines on the road are covered in snow. What I see happening eventually is that the Elon Musks of the world will give money to "help" politicians' campaigns, and those politicians will then enact laws requiring all federal, provincial and state roads to have readable-in-all-weather sensors installed in the middle and sides of roads and in crossings of various types, both railroad and pedestrian. This will be done in the name of safety and will be paid for by your increased taxes. Don't you like safety?

  • Member since
    March 2016
  • 1,568 posts
Posted by CandOforprogress2 on Thursday, July 14, 2016 1:23 PM

One way that I can see this working is on private toll roads and HOV lanes. Imagine that we could repurpose abandoned railroad corridors into robot roads with driverless electric trucks and buses serving industries and stations along the way.

https://techcrunch.com/2016/04/25/the-driverless-truck-is-coming-and-its-going-to-automate-millions-of-jobs/

  • Member since
    December 2001
  • From: Northern New York
  • 25,021 posts
Posted by tree68 on Thursday, July 14, 2016 1:20 PM

Clearly the author of that article lives in the city.

Out here in the sticks, I don't see many of those touted advantages as being advantages.  Even if you can call for a vehicle and have it arrive at your house, that depends on enough vehicles being available.  Will my decision to use one for a two-hour trip to somewhere mean someone doesn't make it to work?  Will one still be available ten hours from now when I want to return?  What if someone at my destination wants to use one of the vehicles to travel even further from my home?  Could one of these vehicles theoretically travel coast to coast, in bits and pieces, used by a number of individuals?

And if individuals don't own the vehicles, who does?

Larry
Resident Microferroequinologist (at least at my house) 
Everyone goes home; Safety begins with you
My Opinion. Standard Disclaimers Apply. No Expiration Date
Come ride the rails with me!
There's one thing about humility - the moment you think you've got it, you've lost it...

  • Member since
    January 2014
  • 8,221 posts
Posted by Euclid on Thursday, July 14, 2016 12:55 PM

8 ways driverless cars will drastically improve our lives

 

http://www.techinsider.io/8-ways-driverless-cars-will-drastically-improve-our-lives-2015-12

One of the 8 ways quoted from the link:

In addition to saving lives, driverless cars may also help save our planet.

Because autonomous vehicles are built to optimize efficiency in acceleration, braking, and speed variation, they help increase fuel efficiency and reduce carbon emissions.

According to McKinsey, adoption of autonomous cars could help reduce car CO2 emissions by as much as 300 million tons per year. To put that into perspective, that’s the equivalent of half of the CO2 emissions from the commercial aviation sector. 

  • Member since
    January 2014
  • 8,221 posts
Posted by Euclid on Thursday, July 14, 2016 12:44 PM

I have no doubt that self-driving cars can be technologically perfected.  However, the concept seems to be about much more than transportation.  It seems to be a facet of the green movement, just like renewable energy.  That virtue seems to be responsible for it being pushed so hard by the public sector.  It has that coercive feel of the banning of the incandescent light bulb.

Some of this is reflected in this piece:

http://www.usnews.com/news/business/articles/2016-01-14/government-developing-policies-for-self-driving-cars

Quotes from the link:

Foxx [of USDOT] said the government believes self-driving vehicles could eventually cut traffic deaths, decrease highway congestion and improve the environment. He encouraged automakers to come to the government with ideas about how to speed their development.

"In 2016, we are going to do everything we can to promote safe, smart and sustainable vehicles. We are bullish on automated vehicles," Foxx said during an appearance at the North American International Auto Show in Detroit.

Bryant Walker Smith, a law professor at the University of South Carolina and an expert on the legal issues surrounding self-driving cars, said the government's action is aggressive and ambitious. He said regulators are following the example of Europe, which has exempted autonomous cars from certain regulations in order to speed their development.

Foxx said President Barack Obama's budget would provide $4 billion over the next decade for programs to test connected vehicles. Some vehicles already on the road can communicate with each other, or with traffic lights or stop signs, through cellular signals. Eventually, that could prevent accidents, since vehicles could warn each other before they crash at an intersection.

 

It seems to me that the timeline for the perfection of the self-driving car is being incredibly over-promised in the name of the politically correct virtues of the green movement and its quest for sustainability.  Most ironically, therefore, the driverless car is an assault on the automobile. 

 

  • Member since
    May 2003
  • From: US
  • 25,292 posts
Posted by BaltACD on Thursday, July 14, 2016 12:43 PM

My observation and experience: either the 'machine' has total control or the human has total control.  Shared control generally means that the human is in no position (mentally and/or physically) to assume control when the machine decides to relinquish it.  A human deciding to disengage machine control is one thing - a machine deciding to cede control is a different animal entirely.

Never too old to have a happy childhood!

              

  • Member since
    May 2013
  • 3,231 posts
Posted by NorthWest on Thursday, July 14, 2016 12:07 PM

erikem
A big factor was the design of the side stick control on the plane that did not give feedback as to what the other pilot was doing with the controls. With traditional control yokes, the captain would have known immediately that the first officer was yanking back on the yoke and thus prolonging the stall.

Certainly part of it, and I am a much bigger fan of Boeing's approach to FBW and Otto. However, I think that the most important outcome of this crash was the understanding of the complex human factors that played a major role.

NorthWest
What if the car suddenly gives back control when an icy skid becomes outside parameters?

erikem
All the more reason to keep the driver informed of what the car was doing, in particular feedback through the steering wheel. A co-worker suggested that the "auto-pilot" mode should do what's done on locomotives, have an alerter function that requires interaction from the driver at intervals to check if the driver is paying attention.

But then what problem are you trying to solve with automation? Unlike in aircraft, there are far fewer efficiency gains beyond increasing road capacity, which would likely reduce vehicle spacing to the point where a human reaction would be too slow anyway. The premise for these systems has mostly been to avoid accidents caused by drunks, texters, and others who are not paying attention to the road, and to let the occupants make more productive use of their time in the vehicle by doing things other than driving. If the driver has to constantly monitor the vehicle, why not simply drive the thing, instead of dealing with the problems of sustaining attention, complacency, and the startle factor producing improper reactions when the car suddenly hands control back?

  • Member since
    November 2003
  • From: Rhode Island
  • 2,289 posts
Posted by carnej1 on Thursday, July 14, 2016 11:49 AM

Ulrich

Completely self driving cars are still a few years off although cars that steer themselves down the highway are already here (check out the Mercedes class E). 

 

Given that vehicles with self-steering also (in all cases AFAIK) have "smart" cruise control systems which automatically detect traffic and operate both the vehicle's throttle and brakes, I would say that for highway operation they qualify as "self-driving". In the next few years there will be more advanced versions of these systems that will automatically change lanes in response to traffic.
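For what it's worth, the core logic of those "smart" cruise systems is conceptually simple. Here is a minimal sketch of a time-gap follower; the gains, limits, and the 1.8-second gap are illustrative assumptions of mine, not any manufacturer's actual tuning:

```python
def acc_command(own_speed, set_speed, lead_distance=None, lead_speed=None,
                time_gap=1.8, kp_gap=0.2, kp_speed=0.5):
    """Return a commanded acceleration in m/s^2 (positive = throttle, negative = brake).

    If no lead vehicle is detected, hold the set speed like ordinary cruise control;
    otherwise follow the lead vehicle at a fixed time gap.
    """
    if lead_distance is None:
        return kp_speed * (set_speed - own_speed)

    desired_gap = time_gap * own_speed          # metres of gap we want to the lead vehicle
    gap_error = lead_distance - desired_gap
    closing_speed = lead_speed - own_speed
    accel = kp_gap * gap_error + kp_speed * closing_speed
    return max(-3.5, min(1.5, accel))           # clamp to comfortable limits

# Example: cruising at 30 m/s with a 33 m/s set speed, radar sees a car
# 50 m ahead doing 29 m/s; the controller asks for gentle braking (about -1.3 m/s^2).
print(acc_command(30.0, 33.0, lead_distance=50.0, lead_speed=29.0))
```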

"I Often Dream of Trains"-From the Album of the Same Name by Robyn Hitchcock

  • Member since
    November 2003
  • From: Rhode Island
  • 2,289 posts
Posted by carnej1 on Thursday, July 14, 2016 11:46 AM

Euclid

Self-driving cars are dramatically over-promised.  It seems like politically correct utopianism.  To jump right into this self-driving dream relying solely on computer technology is naïve.  Here is what I don't quite understand:  Other than the fashion statement of having a car that drives you around automatically, what exactly is the point of the idea?  Is it to remove the fallibility of a human driver, and thus make driving safer?

A few years ago, I wrote about my prediction for the future of driving where cars have a layer of automatic, central control that enforces the laws as you drive.  I thought it would develop as a natural extension of the current proposals to tax drivers by the miles driven, which, in effect, turns all roads into toll roads.  Instead of getting people out of their cars and into mass transit, it would turn people’s cars into a form of mass transit.   This self-driving car development seems to have leapfrogged past my prediction and taken it to its ultimate level.

 

 

Of course, running the entire road system under centralized control would deprive governments of a nice, steady revenue stream: speeding fines. On the other hand, any toll-by-the-mile system would also be able to monitor average speed and automatically ticket speeders.

"I Often Dream of Trains"-From the Album of the Same Name by Robyn Hitchcock

  • Member since
    April 2011
  • 649 posts
Posted by LensCapOn on Thursday, July 14, 2016 11:16 AM

erikem

In addition to dealing with streetcars and RR crossings, I'd wonder if the autopilot knows anything about "Turn around, don't drown" with respect to flooded roads.

 

Much more than streetcars and RR crossings, I'm worried about their treatment of motorcycles! (And their riders.)

 

 

 

Train Nut AND Bike Nut Here.

  • Member since
    September 2003
  • 21,669 posts
Posted by Overmod on Thursday, July 14, 2016 9:58 AM

erikem
NorthWest

What if the car suddenly gives back control when an icy skid becomes outside parameters?

All the more reason to keep the driver informed of what the car was doing, in particular feedback through the steering wheel.

A co-worker suggested that the "auto-pilot" mode should do what's done on locomotives, have an alerter function that requires interaction from the driver at intervals to check if the driver is paying attention.

Every single person I know in human factors engineering, IxD and artificial consciousness thinks that the premise and execution of alerters is pointless and, basically, more dangerous than effective.  And is prepared to back that up with evidence.

The way to 'check that the driver is paying attention' is to monitor their attentiveness in the background, and periodically interact with them through 'normal' methods (such as conversation).  This permits some rather simple confirming analytics, such as voice stress analysis, as well as confirming the right level of 'high functioning' that is necessary for safe response to "emergent situations" (which is how we now have to redefine 'emergencies' with the original word having lost that technical meaning).

An issue with skids is that feedback 'through the wheel' may be inadequate or wrong for many drivers, who will overcompensate or just plain freeze when presented with it.  The same has been true of antilock brakes since the early days of hydraulic servomotor actuation, where the default 'advice' to the uninitiated has been 'stomp and steer' even as some of the instantiations have made that response deadly under what may be fairly common circumstances.  (It happened directly to me, so I know the issue quite well.)  Furthermore, there is little chance that a sufficiently 'total' failure of automatics to cope with a given situation can be addressed by any human response -- nominally "prepared to take over" or not.  That is especially true in the '60s-style vehicles made popular by people like the illustrious Syd Mead, where they'd have to drop their drinks and their pinochle hands, turn their chairs 90 degrees, and scramble for pop-out controls ... good luck with that.  But it may be true for skids as well; by the time the operator is responding, it may take the skills and reflexes of a Bondurant pursuit graduate just to make the resulting impact 'safe' enough to preserve the lives of the passengers.

There is a very wide set of research results on the psychophysics of 'anticipated catastrophe' that indicate that attempting to keep an operator 'ready' to assume the functions of an automated system is actually worse than not having the automation running under "autonomous control" in the first place.  Just imagine the fun with having to reach around and touch the emergency controls -- it sure won't be an 'alerter button' in a car that has to be steered as well as braked! -- every 40 seconds.  Hope the driver or passengers weren't hoping for a restful journey!
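To make the contrast concrete, here is a bare-bones sketch of what background attentiveness monitoring could look like, as opposed to a press-the-button alerter. The signals, weights, and thresholds are purely illustrative placeholders, not anything from a real driver-monitoring product:

```python
import time

def attention_score(gaze_on_road, hands_on_wheel, voice_response_latency_s):
    """Fuse passive signals into a rough 0..1 attentiveness estimate (illustrative weights)."""
    score = 0.0
    score += 0.5 if gaze_on_road else 0.0
    score += 0.3 if hands_on_wheel else 0.0
    score += 0.2 if voice_response_latency_s < 2.0 else 0.0
    return score

def monitor_loop(read_sensors, escalate, threshold=0.6, period_s=1.0, patience=5):
    """Poll passive signals in the background; only escalate (chime, spoken prompt,
    gradual slowdown) after sustained apparent inattention, so an attentive driver
    is never interrupted."""
    low_count = 0
    while True:
        gaze, hands, latency = read_sensors()   # camera, steering torque sensor, dialogue system
        if attention_score(gaze, hands, latency) < threshold:
            low_count += 1
            if low_count >= patience:           # roughly patience * period_s seconds of inattention
                escalate()
                low_count = 0
        else:
            low_count = 0
        time.sleep(period_s)
```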

 

  • Member since
    December 2005
  • From: Cardiff, CA
  • 2,930 posts
Posted by erikem on Thursday, July 14, 2016 8:55 AM

NorthWest

Designers would do well to read the BEA report on the crash of AF 447. The combination of the startle factor, overreliance on technology to the detriment of basic airmanship, and a lack of procedures for what to do when things go wrong led to the pilots stalling and crashing a fully functional aircraft.

A big factor was the design of the side stick control on the plane that did not give feedback as to what the other pilot was doing with the controls. With traditional control yokes, the captain would have known immediately that the first officer was yanking back on the yoke and thus prolonging the stall.

What if the car suddenly gives back control when an icy skid becomes outside parameters?

 

All the more reason to keep the driver informed of what the car was doing, in particular feedback through the steering wheel.

A co-worker suggested that the "auto-pilot" mode should do what's done on locomotives, have an alerter function that requires interaction from the driver at intervals to check if the driver is paying attention.
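For reference, a locomotive alerter boils down to a countdown that the engineer must reset before it times out. A rough sketch of the same idea carried over to a car follows; the 40-second interval, warning window, and penalty action are illustrative assumptions, not anything any automaker actually ships:

```python
import time

ALERT_INTERVAL_S = 40     # time allowed between driver acknowledgments (assumed)
WARNING_WINDOW_S = 8      # audible/visual warning period before the penalty (assumed)

def alerter(driver_acknowledged, sound_warning, penalty_stop):
    """Locomotive-style alerter: require a periodic acknowledgment (button press,
    wheel nudge, pedal tap); warn, then bring the car to a controlled stop if the
    driver never responds."""
    last_ack = time.monotonic()
    warning_active = False
    while True:
        now = time.monotonic()
        if driver_acknowledged():
            last_ack = now
            warning_active = False
        elif now - last_ack > ALERT_INTERVAL_S:
            if not warning_active:
                sound_warning()
                warning_active = True
            if now - last_ack > ALERT_INTERVAL_S + WARNING_WINDOW_S:
                penalty_stop()    # analogous to a penalty brake application
                return
        time.sleep(0.1)
```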

  • Member since
    July 2008
  • 2,325 posts
Posted by rdamon on Thursday, July 14, 2016 8:35 AM
One need look no further than "fly-by-wire" control systems for aircraft. Even though there is a physical control stick and pedals, the computer controls "how much" and "when" things happen. Of course there are multiple systems and overrides, but eliminating the steering wheel in a car may be more about eliminating the direct mechanical linkage than removing the physical wheel.
Compare the complexity of a modern car's anti-lock braking system to an older car's. I have a few brake spoons in my toolbox that are now there for history lessons for my kids.
 
Automation is a journey where things that can be automated get automated. How long have we had cruise control? Now we have adaptive cruise control, headlights, blind spot monitoring, lane departure ... etc. 
 
When I was a freshman in college, the joke was that our job was to create an airplane that had a pilot and a dog in the cockpit. The pilot's job is to feed the dog, and the dog's job is to bite the pilot if he touches any of the controls.

 

I agree that Google really advanced the concept; this will be fun to watch.
 
To steer :) this to a railroad topic, I was amazed when I got into the first car of the L9 train in Barcelona and there was no cab.
 
  • Member since
    April 2016
  • 1,447 posts
Posted by Shadow the Cats owner on Thursday, July 14, 2016 7:18 AM

I would not want a Tesla car equipped with their autopilot.  Here is why: one of their cars on autopilot ran under a truck that was across the road because the autopilot was not able to tell that a 53-foot trailer was not the sky.  It literally took the car owner's head off.  That person was watching a DVD instead of the road.  The NTSB is now looking into the accident.  That should be a fun one for Tesla to get out of.

  • Member since
    September 2003
  • 21,669 posts
Posted by Overmod on Thursday, July 14, 2016 7:14 AM

jeffhergert
One news item I saw had one of the designers/engineers (I think he was from Google) say eventually they hope to remove the steering wheel entirely.

There are idiots and fools everywhere.  Even at Google.

Anyone familiar with industrial control logic (let alone consumer-grade or crApple OS and environment "management") will know how pointless it would be to implement a safety-critical system in a vehicle capable of high momentum without providing capable backup.  That backup won't stop at redundant electronic control systems.

The 'catch' for many years has been removing controls that are either continuously monitored or which provide the 'temptation' to go to manual control unpredictably.  The latter was a particular bugaboo in some of the early 'inductive control' experiments in the late Forties, where a user in an "emergency" might inadvertently grab for the wheel, whack on the brake, and otherwise bung up any semblance of control that the electronics might be able to assert over the situation.  There were similar discussions in military applications over the degree to which pilots should be able to override high-G systems that were monitoring key airframe stresses.

There is no better solution for operating a conventional motor vehicle than a steering wheel, but it poses a number of restrictions for a proper 'autonomous vehicle'.  The problem is that most of the 'other' backup-control modalities, most notably anything with a sidestick proportional controller, do not work well without power boost of some kind, which means that they are functionally useless or worse in a great range of potential failure situations.

Interestingly enough, most of the 'older' systems tacitly accepted the idea that the automatic control was intended for long, continuous sections of 'cruise', or enablement for "autobahn"-style high-speed operation at minimum separation, rather than tooling around town ready to avoid all the children playing ball or old ladies opening car doors and fudster roadhogs.  Google decided to tackle the whole enchilada, and I greatly respect them for it, but I suspect there is really no way to make the situation truly safe enough -- particularly when cars are operating 'out of full warranty' and people start cutting corners to keep them in operation -- to avoid the large money damages that would shut down any entity actually providing autonomous vehicles in any significant quantity.

  • Member since
    December 2009
  • 1,751 posts
Posted by dakotafred on Thursday, July 14, 2016 7:09 AM

It's bad enough that so many people are driving distracted now. Can you imagine their lack of readiness when called upon to perform a sudden override of the autopilot?

After this, we can worry about an even scarier prospect, the flying car. It's in the works, per a recent story in the Wall Street Journal.

  • Member since
    September 2007
  • From: Charlotte, NC
  • 6,099 posts
Posted by Phoebe Vet on Thursday, July 14, 2016 6:39 AM

Automation requires control.  As long as most cars are driven by humans there will be accidents.  The autopilot-operated car failed when it could not react to something stupid that a human in another vehicle did.  A transition to autodrive cars would take decades.

Dave

Lackawanna Route of the Phoebe Snow

  • Member since
    March 2003
  • From: Central Iowa
  • 6,901 posts
Posted by jeffhergert on Thursday, July 14, 2016 1:00 AM

tree68

Another danger with self-driving vehicles is that drivers will forget how to drive (although one could argue that some never knew).  Unless such vehicles have 100% functionality, there will be times when the occupants may have to actually control the vehicle.

Back in the 60s, I recall reading an editorial about a similar concept.  Speed would be controlled, vehicle spacing could be controlled; in short, the vast majority of mistakes that drivers make which result in collisions, etc., could be controlled and avoided.  Included in the piece was a reference to "killer trees," at the time a hot topic, referring to trees close enough to the highway to present a hazard to someone going off the road.

Then, the editorial concluded, some dummy would still roll his vehicle and kill himself anyhow...

I would argue that local governments would fight external control of speed (ie, enforcing speed limits) due to a loss of revenue.

On the other hand, insurance companies would love the opportunity to monitor your driving habits, as they would then have justification for raising rates.

And where such enforcement applies to controlled access highways - those roads were designed for 70+ (except in urban areas).  Anything less is just a way to gain revenue.

 

One news item I saw had one of the designers/engineers (I think he was from Google) say eventually they hope to remove the steering wheel entirely.

Self-driving cars give a new meaning to "computer crashes."

Jeff  

  • Member since
    September 2014
  • 376 posts
Posted by GERALD L MCFARLANE JR on Wednesday, July 13, 2016 8:35 PM

@Overmod...it's called a California roll for a reason, and legally if you can see 500' in either direction at a 4-way stop you do not have to come to a complete stop if the vehicle in front of you already did so...granted this doesn't happen very often, but it is perfectly legal in that situation.  As for "self-driving" cars, the Tesla autopilot isn't a true self-driving car like the Google vehicle...you still have to maintain some road awareness.

  • Member since
    September 2003
  • 21,669 posts
Posted by Overmod on Wednesday, July 13, 2016 3:20 PM

Euclid
Say we are at the point where half the cars are self-driving, and the other half is not. Under that traffic mix, if one were hands-on driving their vehicle as is done now, would that person likely prefer that everybody else was driving manually, or would that person likely not care one way or the other?

I think it depends on your attitude. 

If you're assuming 'defensive driving' on the part of other drivers, you probably won't care much about whether given vehicles are self-driving or not.  I confess that I would be eyeing known autonomous vehicle operation with something of a weather eye toward unexpected strange behavior ... for example, if "passengers" suddenly disengage auto and respond the wrong way in panic.  But I wouldn't be in a permanent half-funk of prescient terror expecting that at any moment, either ... I'd just leave a bit more of a cushion around the vehicle in question.

If you're assuming, as I did, that some 'manual drivers' will try to exploit the self-driving cars, by aggressively cutting in or 'playing with' closing distances, then I suspect the situation will hinge very materially on whether or not the self-driving car is prepared to report any such behavior promptly and believably to 'enforcement'.  I had my fill of nasty California drivers trying to beat me to crossings, or assert the right of way by pretending not to stop at four-way stops, and I do not assume that those people would be anything other than delighted to try their skill on robots programmed to be terrified of even the remote possibility of acting in a way that would produce lawsuits or summonses.  You can judge for yourself what 'correct countermeasures' against that sort of evolved driver behavior could, or should, be.

The trouble I see coming after the Musk/Tesla fiasco, though, is what happens if self-driving cars begin to be frequently encountered and they start having lots of little glitches.  Particularly when they don't recognize that you have your door open in a parking space, or they pull over too far to avoid a situation in the far lane and you're the triage contact, or some sort of deferred maintenance, cumulative unrecognized mechanical failure, or common-mode failure begins to show up.  Then I start leaving more wary space between me and the robot cars ... or get myself one and let it calculate the spaces and angles and vectors for me.

A useful parallel I've used for, let's see now, over 40 years now (where did the time go??) is the scene in The Mote in God's Eye where some of the characters walk across traffic which continues to whiz at top speed but misses them-- barely but effectively -- no matter how quickly or unexpectedly they move.  That, rather than the giant truck that brakes to a stop if you walk to the middle of the road and hold up your hand, is the real paradigm for 'traffic of the future' -- the problem being that neither we nor our cost-effective machines can really be trusted to do this quite as effectively as the Moties would ...

 

  • Member since
    September 2003
  • 21,669 posts
Posted by Overmod on Wednesday, July 13, 2016 2:53 PM

DS4-4-1000
The local townships have come up with a low tech solution, they post calibrated signs on the most traveled roads which indicate the depth of the water.

That would be one of the most direct solutions, if it were placed in a location (and in a format) that the car's 'vision system' could read. 

A follow-on question would be whether money exists in your local budget to replace these signs with a standardized form that would simplify autonomous-vehicle programming, as I suspect little more than that would suffice.  Presumably one of the responses that local law enforcement could use would be to put similar standardized signs out in locations where flooding was being experienced.  But I suspect a better answer is going to be some form of centrally-coordinated geofencing (where a given GIS area is set as 'likely to flood' and this information is broadcast to autonomous vehicles, which then treat that area as 'off limits' until further notice and reroute around it following normal GPS navigation procedure).
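A bare-bones sketch of the vehicle-side check for that kind of broadcast geofence follows; the polygon format and coordinates are made up for illustration, and a real system would lean on proper GIS tooling rather than hand-rolled geometry:

```python
def point_in_polygon(lat, lon, polygon):
    """Ray-casting test: is (lat, lon) inside the polygon given as (lat, lon) vertices?"""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        lat_i, lon_i = polygon[i]
        lat_j, lon_j = polygon[j]
        if ((lon_i > lon) != (lon_j > lon)) and \
           (lat < (lat_j - lat_i) * (lon - lon_i) / (lon_j - lon_i) + lat_i):
            inside = not inside
        j = i
    return inside

def route_is_clear(waypoints, active_geofences):
    """Reject any planned route that enters an area broadcast as 'likely to flood'."""
    return not any(point_in_polygon(lat, lon, fence)
                   for lat, lon in waypoints
                   for fence in active_geofences)

# Example: one broadcast fence around a low-lying crossing; the first waypoint
# falls inside it, so the car should reroute.
fence = [(39.10, -82.00), (39.10, -81.95), (39.05, -81.95), (39.05, -82.00)]
print(route_is_clear([(39.07, -81.97), (39.20, -81.90)], [fence]))   # False
```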

The default would be to recognize the changed 'sight picture' of a road with more than a few inches of standing water across it.  I'm tempted to think that an extension of something like 'Google Street View' might be used as a reference source for this or other unanticipated changes, for example spills or changes in signage that might lead to guidance failure or critical ambiguity if encountered 'autonomically'.  In a sense this is an extension of the 'tercom' navigation historically employed in self-driving systems... who says no good came out of cruise-missile development?

  • Member since
    June 2001
  • From: US
  • 13,488 posts
Posted by Mookie on Wednesday, July 13, 2016 2:42 PM

Euclid

Say we are at the point where half the cars are self-driving, and the other half is not.  Under that traffic mix, if one were hands-on driving their vehicle as is done now, would that person likely prefer that everybody else was driving manually, or would that person likely not care one way or the other? 

 

We are doing that right now.  Some people still "drive" with both hands and one or both feet and....their brain.  Others text or talk on a phone while pushing a very heavy dangerous object down a road.  One or no hands, one foot and...well, very little use of the brain for driving.  Self-driving is looking a lot safer!

She who has no signature! cinscocom-tmw

  • Member since
    March 2016
  • 1,568 posts
Posted by CandOforprogress2 on Wednesday, July 13, 2016 2:31 PM

It's 2016 and we don't even have self-driving trains. And they are on tracks.

I know we have people movers and the DC subway, but they still need people. The people mover in Indianapolis, Indiana, has someone at the controls 24/7 at the main station. Driverless freight trains are only allowed in remote wilderness areas where there are no people to run over, like Labrador or the New Mexico desert. I assume that in the near future trains could be driven remotely by some guy in India from an outsourced control center in a loco simulator. But rail unions are still strong and have huge sway over FRA rules.

  • Member since
    October 2012
  • 225 posts
Posted by DS4-4-1000 on Wednesday, July 13, 2016 2:09 PM

Overmod
 
DS4-4-1000
I don't see anybody putting any flood height transmitters on the roads around here anytime soon.

 

Did I say anything as stupid as that?

The National Weather Service has some fairly good ideas about predicting flood heights in regions with accurate GIS data.  Yes, there might be some simple transponders in key regions, but we're not talking a dense network of devices that would only actuate something like .0002% of the time.

On the other hand, very accurate historical data for flood extent is preserved in a number of places, and it does not take rocket science (or lots of expensive computer time) to figure out the correlation between weather and subsequent flood extent.  A child could figure out how to do this to an extent that performs the desired action: keeping autonomous vehicles away from expected risk.  Perhaps the system 'cries wolf' sometimes -- the weather people around Memphis are notorious for predicting sky-is-falling storm risk which never quite gets to more than the occasional downpour -- but if the purpose of the network is to turn autonomous vehicles around so they don't drown, it does what is required.

 

 
I don't know about where you live, but here in southeastern Ohio we frequently have rain events where one stream is out of its banks by several feet, making the roads it crosses impassable, while the streams on either side, less than half a mile away, remain completely within their banks.  The National Weather Service doesn't even try to predict where the flooding will occur; they simply state that the possibility exists in the following counties....  So will my autonomous car not work in those counties while the flash flood warning exists?
 
The local townships have come up with a low-tech solution: they post calibrated signs on the most-traveled roads which indicate the depth of the water.
  • Member since
    January 2014
  • 8,221 posts
Posted by Euclid on Wednesday, July 13, 2016 2:08 PM

Say we are at the point where half the cars are self-driving, and the other half is not.  Under that traffic mix, if one were hands-on driving their vehicle as is done now, would that person likely prefer that everybody else was driving manually, or would that person likely not care one way or the other? 
