Trains.com

Self-driving cars unsafe at any speed

3025 views
44 replies
  • Member since
    March 2016
  • 1,568 posts
Self-driving cars unsafe at any speed
Posted by CandOforprogress2 on Tuesday, July 12, 2016 12:46 PM

https://www.youtube.com/watch?v=Fu-zJbWK4B4

Now, self-driving cars and RR crossings and streetcars don't mix.

  • Member since
    August 2009
  • 322 posts
Posted by BLS53 on Tuesday, July 12, 2016 4:15 PM

CandOforprogress2

https://www.youtube.com/watch?v=Fu-zJbWK4B4

Now, self-driving cars and RR crossings and streetcars don't mix.

 

 

I just saw a Wall Street analyst on CNBC talking about this. He was ripping Elon Musk a new one, stating how Musk's ego has gotten in the way of developing the self-driving feature of the Tesla. 

  • Member since
    March 2003
  • From: Central Iowa
  • 6,901 posts
Posted by jeffhergert on Tuesday, July 12, 2016 5:51 PM

USA Today had an article reporting that Tesla has had 3 accidents within 2 weeks involving self-driving cars in automatic mode. 

Jeff

  • Member since
    December 2005
  • From: Cardiff, CA
  • 2,930 posts
Posted by erikem on Tuesday, July 12, 2016 10:22 PM

In addition to dealing with streetcars and RR crossings, I'd wonder if the autopilot knows anything about "Turn around, don't drown" with respect to flooded roads.

  • Member since
    December 2001
  • From: Northern New York
  • 25,021 posts
Posted by tree68 on Wednesday, July 13, 2016 7:05 AM

erikem

In addition to dealing with streetcars and RR crossings, I'd wonder if the autopilot knows anything about "Turn around, don't drown" with respect to flooded roads.

Given the problems humans have had following their GPS onto railroad tracks and the like, I wouldn't put much faith in a self-piloting car avoiding such an unexpected obstacle as a flooded road, particularly because it doesn't present a vertical component the way another vehicle would.

If nothing else, perhaps the recent incidents will point up shortcomings in the software or hardware and make way for improvement.  Unfortunately, glitches in that type of software have more consequences than they would in your home computer...

Larry
Resident Microferroequinologist (at least at my house) 
Everyone goes home; Safety begins with you
My Opinion. Standard Disclaimers Apply. No Expiration Date
Come ride the rails with me!
There's one thing about humility - the moment you think you've got it, you've lost it...

  • Member since
    February 2016
  • From: Texas
  • 1,552 posts
Posted by PJS1 on Wednesday, July 13, 2016 8:59 AM

The early days of railroads saw all kinds of disasters, including bridge collapses, head-on collisions, derailments for no known cause, etc.  The problems were overcome, and today railroads are one of the safest modes of transportation in the world.

The early days of aviation saw headline-grabbing disasters while little old ladies and men in tennis shoes proclaimed, "If God wanted man to fly, he would have given him wings."  Today’s passenger statistics show that most people believe air travel is a safe bet. 

Someday driverless cars will reach the same technological excellence that we see in rail and air travel.  You can take it to the bank, although many of us won't live long enough to see it. 

Rio Grande Valley, CFI,CFII

  • Member since
    January 2014
  • 8,221 posts
Posted by Euclid on Wednesday, July 13, 2016 9:06 AM

Self-driving cars are dramatically over-promised.  It seems like politically correct utopianism.  To jump right into this self-driving dream relying solely on computer technology is naïve.  Here is what I don’t quite understand:  Other than the fashion statement of having a car that drives you around automatically, what exactly is the point of the idea?  Is it to remove the fallibility of a human driver, and thus make driving safer?

A few years ago, I wrote about my prediction for the future of driving where cars have a layer of automatic, central control that enforces the laws as you drive.  I thought it would develop as a natural extension of the current proposals to tax drivers by the miles driven, which, in effect, turns all roads into toll roads.  Instead of getting people out of their cars and into mass transit, it would turn people’s cars into a form of mass transit.   This self-driving car development seems to have leapfrogged past my prediction and taken it to its ultimate level.

 

  • Member since
    September 2003
  • 21,669 posts
Posted by Overmod on Wednesday, July 13, 2016 12:09 PM

Euclid
Other than the fashion statement of having a car that drives you around automatically, what exactly is the point of the idea? Is it to remove the fallibility of a human driver, and thus make driving safer? ... This self-driving car development seems to have leapfrogged past my prediction and taken it to its ultimate level.

A good introduction to the possibilities and 'science' of autonomous vehicles is Brad Templeton's blog (and site).  Before that, there are some pretty good 'imagineering' discussions as far back as the GM automated-highway proposals of the Forties (and some before that, but not with evolved hardware to make them practicable).

"Safety" -- especially safety as misdefined as 'rigid adherence to motor-vehicle statute requirements at all times' -- is only one of the things that autonomous vehicles can be programmed to achieve.  One very significant advantage is continuously-optimized route guidance, without the foreground-attention problems that come from human drivers having to cope with crApple-style GPS navigation interfaces.  Another is integration between physical route requirements and vehicle power requirements -- particularly when doing hybrid power management or making 'best use' of BEV-style external charging.  Yet another is tolerance for human failure (which may be as simple as sudden dazzle, as 'preventable' as driving intoxicated, or as catastrophic as sudden stroke or heart attack) with at least the tacit ability to transport passengers safely to what may be an emergent destination rather than just 'going to the curb and stopping when safe' and flashing an alert signal to 'the appropriate authorities'.

I thought the 'enforcement' of safe driving behavior was going to come from a radically different direction: the mandatory installation of driver monitoring modules as a precondition of insurance coverage.  (That would be followed up by nanny-state enforcement of the modules as 'proof of required insurance coverage', probably with some ability of the police to wirelessly interrogate all passing vehicles and shut down/tow/confiscate "violators" and arrest the nominally responsible parties in finest Goliath tradition).  I thought this would actually lead to greater 'embracing' of autonomous vehicle control, as it would produce at least 'plausible denial' that any observed nonlegal operating practices were 'offenses' under motor-vehicle statutes.

One problem that Templeton apparently took up in detail is what happens when savvy violators 'game' the situation where multiple autonomous vehicles meekly yield the right-of-way to more aggressive vehicles.  My conclusion (which was not his) was that pervasive dashcam video could, and should, and would be used to enable any driver or passenger to quickly rat out any violations, and report them reliably (and with at least a good leg up on chain-of-evidence integrity) in precisely a way that can optimize targeted enforcement -- quick, or delayed.  But it might be argued that setting up such a network of dashcams has nothing nominal to do with autonomous operation as a contributor to overall safety, and might be a higher priority to implement (even though its communications infrastructure and many of its instantiations make autonomous operation much more effective and safe).

  • Member since
    February 2003
  • From: Guelph, Ontario
  • 4,819 posts
Posted by Ulrich on Wednesday, July 13, 2016 12:33 PM

Completely self-driving cars are still a few years off, although cars that steer themselves down the highway are already here (check out the Mercedes E-Class). 

  • Member since
    December 2001
  • From: Northern New York
  • 25,021 posts
Posted by tree68 on Wednesday, July 13, 2016 12:39 PM

Another danger with self-driving vehicles is that drivers will forget how to drive (although one could argue that some never knew).  Unless such vehicles have 100% functionality, there will be times when the occupants may have to actually control the vehicle.

Back in the 60's, I recall reading an editorial about a similar concept.  Speed would be controlled, vehicle spacing could be controlled, in short, the vast majority of mistakes that drivers make which result in collisions, etc, could be controlled and avoided.  Included in the piece was reference to "killer trees," at the time a hot topic, referring to trees close enough to the highway to present a hazard to someone going off the road.

Then, the editorial concluded, some dummy would still roll his vehicle and kill himself anyhow...

I would argue that local governments would fight external control of speed (ie, enforcing speed limits) due to a loss of revenue.

On the other hand, insurance companies would love the opportunity to monitor your driving habits, as they would then have justification for raising rates.

And where such enforcement applies to controlled access highways - those roads were designed for 70+ (except in urban areas).  Anything less is just a way to gain revenue.

Larry
Resident Microferroequinologist (at least at my house) 
Everyone goes home; Safety begins with you
My Opinion. Standard Disclaimers Apply. No Expiration Date
Come ride the rails with me!
There's one thing about humility - the moment you think you've got it, you've lost it...

  • Member since
    August 2010
  • From: Henrico, VA
  • 8,955 posts
Posted by Firelock76 on Wednesday, July 13, 2016 12:48 PM

Strikes me this "self-driving car" thing is a high-tech way to bring us back to the horse and buggy days, you know, when a driver had a horse to help him with his thinking?

Fall asleep in the driver seat?  That's OK, Dobbin knows the way home!

  • Member since
    May 2013
  • 3,231 posts
Posted by NorthWest on Wednesday, July 13, 2016 12:55 PM

Tree's right on the mark. There will almost certainly always be situations that require the driver to take control, and probably suddenly. Such is the paradox of Children of the Magenta: the more complex the automation, the more trained the user must be to correct things when they go wrong.

Designers would do well to read the BEA report on the crash of AF 447. The combination of the startle factor, overreliance on technology to the detriment of basic airmanship, and a lack of procedures for what to do when things go wrong led to the pilots stalling and crashing a fully functional aircraft.

What if the car suddenly gives back control when an icy skid goes outside its parameters?

  • Member since
    April 2007
  • From: Iowa
  • 3,293 posts
Posted by Semper Vaporo on Wednesday, July 13, 2016 12:56 PM

erikem

In addition to dealing with streetcars and RR crossings, I'd wonder if the the autopilot knows anything about "Turn around, don't drown" with respect to flooded roads. 

A component of the self-steering is to "see" the road ahead and its edges in some manner for some distance (with allowance for loss of edges, etc., for specific distances at specific places, such as at intersections), so a road covered with water should be detected and avoided somehow.  Of course, that doesn't take into account a missing short bridge span.

 

Semper Vaporo

Pkgs.

  • Member since
    September 2003
  • 21,669 posts
Posted by Overmod on Wednesday, July 13, 2016 1:07 PM

Semper Vaporo
A component of the self-steering is to "see" the road ahead and the edges in some manner for some distance (with allowance for loss of edges, etc. for specific distances at specific places, such as at intersections, etc.) so a road covered with water should be detected and avoided somehow.

A number of the more 'modern' systems are actually arranged so their machine vision detects water, or the distortions of physical features over time typical of rippling or motion of water, and references this against some of the 'database' features associated with the GPS location.  So, in a very real sense, the logic will recognize the presence of flooding at a particular point, make some conclusion about likely flooding effects 'further along the intended route', and almost certainly (litigation being at the state it is!) perform a reversal and reroute; that's the moral equivalent of 'turn around, don't drown' in context.

It is also likely that the system will be primed to receive and parse weather alerts, which may in the future include realtime flash-flooding water heights or GIS referencing and data fusion from 'smart structures'.  That might help prevent a proper autonomous vehicle from coming to a potentially-flooded place (or a dangerously-compromised bridge) in the first place.

There is considerably more to autonomous vehicle operation than 'self-steering vehicles' a la Martin Marietta, just as there is much more to a modern smartphone touch interface than high sensor resolution and many levels of touch sensitivity...

  • Member since
    October 2012
  • 225 posts
Posted by DS4-4-1000 on Wednesday, July 13, 2016 1:29 PM

Overmod
It is also likely that the system will be primed to receive and parse weather alerts, which may in the future include realtime flash-flooding water heights or GIS referencing and data fusion from 'smart structures'. That might help prevent a proper autonomous vehicle from coming to a potentially-flooded place (or a dangerously-compromised bridge) in the first place

I don't see anybody putting any flood height transmitters on the roads around here anytime soon.  It takes years to get any damaged bridges or washed-out culverts replaced, and you expect government bodies to just divert already-scarce funds to install something that will require even more maintenance?

  • Member since
    September 2003
  • 21,669 posts
Posted by Overmod on Wednesday, July 13, 2016 1:37 PM

DS4-4-1000
I don't see anybody putting any flood height transmitters on the roads around here anytime soon.

Did I say anything as stupid as that?

The National Weather Service has some fairly good ideas about predicting flood heights in regions with accurate GIS data.  Yes, there might be some simple transponders in key regions, but we're not talking a dense network of devices that would only actuate something like .0002% of the time.

On the other hand, very accurate historical data for flood extent is preserved in a number of places, and it does not take rocket science (or lots of expensive computer time) to figure out the correlation between weather and subsequent flood extent.  A child could figure out how to do this to an extent that performs the desired action: keeping autonomous vehicles away from expected risk.  Perhaps the system 'cries wolf' sometimes -- the weather people around Memphis are notorious for predicting sky-is-falling storm risk which never quite gets to more than the occasional downpour -- but if the purpose of the network is to turn autonomous vehicles around so they don't drown, it does what is required.
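The correlation Overmod describes between forecast weather and historical flood extent can be sketched in a few lines. Everything below is invented for illustration: the road segments, rainfall thresholds, and margin are hypothetical, and a real system would draw on NWS forecasts and actual flood records rather than a hard-coded table.

```python
# Toy sketch: flag road segments an autonomous vehicle should avoid,
# based on historical rainfall levels at which each segment has flooded.
# All names and numbers are hypothetical illustrations.

HISTORICAL_FLOOD_THRESHOLDS = {
    # road segment -> inches of rain in 24h that historically flooded it
    "County Rd 7 @ Mill Creek": 2.5,
    "State Rt 60 low crossing": 4.0,
    "Main St bridge approach": 6.0,
}

def segments_at_risk(forecast_rain_inches, margin=0.5):
    """Return segments whose historical flood threshold is within `margin`
    inches of the forecast rainfall -- deliberately erring toward
    'crying wolf', since the goal is keeping vehicles away from risk."""
    return sorted(
        seg for seg, threshold in HISTORICAL_FLOOD_THRESHOLDS.items()
        if forecast_rain_inches >= threshold - margin
    )

# A 4-inch forecast flags the two most flood-prone segments.
print(segments_at_risk(4.0))
```

The conservative margin is the design point: as the post says, a system that occasionally turns vehicles around unnecessarily still "does what is required."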

  • Member since
    January 2014
  • 8,221 posts
Posted by Euclid on Wednesday, July 13, 2016 2:08 PM

Say we are at the point where half the cars are self-driving, and the other half is not.  Under that traffic mix, if one were hands-on driving their vehicle as is done now, would that person likely prefer that everybody else was driving manually, or would that person likely not care one way or the other? 

  • Member since
    October 2012
  • 225 posts
Posted by DS4-4-1000 on Wednesday, July 13, 2016 2:09 PM

Overmod
 
DS4-4-1000
I don't see anybody putting any flood height transmitters on the roads around here anytime soon.

 

Did I say anything as stupid as that?

The National Weather Service has some fairly good ideas about predicting flood heights in regions with accurate GIS data.  Yes, there might be some simple transponders in key regions, but we're not talking a dense network of devices that would only actuate something like .0002% of the time.

On the other hand, very accurate historical data for flood extent is preserved in a number of places, and it does not take rocket science (or lots of expensive computer time) to figure out the correlation between weather and subsequent flood extent.  A child could figure out how to do this to an extent that performs the desired action: keeping autonomous vehicles away from expected risk.  Perhaps the system 'cries wolf' sometimes -- the weather people around Memphis are notorious for predicting sky-is-falling storm risk which never quite gets to more than the occasional downpour -- but if the purpose of the network is to turn autonomous vehicles around so they don't drown, it does what is required.

 

 
I don't know about where you live, but here in southeastern Ohio we frequently have rain events where one stream is out of its banks by several feet, making the roads it crosses impassable, while the streams on either side, less than half a mile away, remain completely within their banks.  The National Weather Service doesn't even try to predict where the flooding will occur; they simply state that the possibility exists in the following counties.  So will my autonomous car not work in those counties while the flash flood warning exists?
 
The local townships have come up with a low-tech solution: they post calibrated signs on the most traveled roads which indicate the depth of the water.

  • Member since
    March 2016
  • 1,568 posts
Posted by CandOforprogress2 on Wednesday, July 13, 2016 2:31 PM

It's 2016 and we don't even have self-driving trains. And they are on tracks.

I know we have people movers and the DC subway, but they still need people. The people mover in Indianapolis, Indiana, has someone at the controls 24/7 at the main station. Driverless freight trains are only allowed in remote wilderness areas where there are no people to run over, like Labrador or the New Mexico desert. I assume that in the near future trains could be driven remotely by some guy in India from an outsourced control center in a loco simulator. But rail unions are still strong and have huge sway over FRA rules.

  • Member since
    June 2001
  • From: US
  • 13,488 posts
Posted by Mookie on Wednesday, July 13, 2016 2:42 PM

Euclid

Say we are at the point where half the cars are self-driving, and the other half is not.  Under that traffic mix, if one were hands-on driving their vehicle as is done now, would that person likely prefer that everybody else was driving manually, or would that person likely not care one way or the other? 

 

We are doing that right now.  Some people still "drive" with both hands and one or both feet and....their brain.  Others text or talk on a phone while pushing a very heavy dangerous object down a road.  One or no hands, one foot and...well, very little use of the brain for driving.  Self-driving is looking a lot safer!

She who has no signature! cinscocom-tmw

  • Member since
    September 2003
  • 21,669 posts
Posted by Overmod on Wednesday, July 13, 2016 2:53 PM

DS4-4-1000
The local townships have come up with a low-tech solution: they post calibrated signs on the most traveled roads which indicate the depth of the water.

That would be one of the most direct solutions, if it were placed in a location (and in a format) that the car's 'vision system' could read. 

A follow-on question would be whether money exists in your local budget to replace these signs with a 'standardized' particular form that would simplify autonomous-vehicle programming, as I suspect little more than that would suffice.  Presumably one of the responses that local law enforcement could use would be to put similar standardized signs out in locations where flooding was being experienced.  But I suspect a better answer is going to be some form of centrally-coordinated geofencing (where a given GIS area is set as 'likely to flood' and this information broadcast to autonomous vehicles, which then assume that area is 'off limits' to them until further notice, and they reroute around following normal GPS navigation procedure.)
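The geofencing idea above — broadcast a flood-warning area as a polygon, and have vehicles treat any waypoint inside it as off-limits until further notice — reduces to a point-in-polygon test plus a reroute decision. Here is a minimal sketch with made-up coordinates, using the standard ray-casting test; a production system would use a proper GIS library and real broadcast data.

```python
# Minimal geofencing sketch: a flood zone arrives as a polygon of
# (x, y) points, and the vehicle reroutes if any planned waypoint
# falls inside it. Coordinates are illustrative only.

def point_in_polygon(point, polygon):
    """Ray-casting test: cast a ray east from the point and count how
    many polygon edges it crosses; an odd count means 'inside'."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's latitude
            # x-coordinate where the edge crosses that latitude
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def reroute_needed(route, flood_zones):
    """True if any waypoint on the route falls inside a broadcast zone."""
    return any(point_in_polygon(wp, zone)
               for wp in route for zone in flood_zones)

flood_zone = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]  # square
route = [(5.0, 5.0), (2.0, 2.0)]  # second waypoint is inside the zone
print(reroute_needed(route, [flood_zone]))  # True
```

The appeal of this scheme is that the vehicle needs no new sensors at all: the "turn around, don't drown" decision is made centrally and pushed out as a small polygon.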

The default would be to recognize the changed 'sight picture' of a road with more than a few inches of standing water across it.  I'm tempted to think that an extension of something like 'Google Street View' might be used as a reference source for this or other unanticipated changes, for example spills or changes in signage that might lead to guidance failure or critical ambiguity if encountered 'autonomically'.  In a sense this is an extension of the TERCOM navigation that was historically commonly employed for self-guiding systems... who says no good came out of cruise-missile development? Wink

  • Member since
    September 2003
  • 21,669 posts
Posted by Overmod on Wednesday, July 13, 2016 3:20 PM

Euclid
Say we are at the point where half the cars are self-driving, and the other half is not. Under that traffic mix, if one were hands-on driving their vehicle as is done now, would that person likely prefer that everybody else was driving manually, or would that person likely not care one way or the other?

I think it depends on your attitude. 

If you're assuming 'defensive driving' on the part of other drivers, you probably won't care much about whether given vehicles are self-driving or not.  I confess that I would be eyeing known autonomous vehicle operation with something of a weather eye toward unexpected strange behavior ... for example, if "passengers" suddenly disengage auto and respond the wrong way in panic.  But I wouldn't be in a permanent half-funk of prescient terror expecting that at any moment, either ... I'd just leave a bit more of a cushion around the vehicle in question.

If you're assuming, as I did, that some 'manual drivers' will try to exploit the self-driving cars, by aggressively cutting in or 'playing with' closing distances, then I suspect the situation will hinge very materially on whether or not the self-driving car is prepared to report any such behavior promptly and believably to 'enforcement'.  I had my fill of nasty California drivers trying to beat me to crossings, or assert the right of way by pretending not to stop at four-way stops, and I do not assume that those people would be anything other than delighted to try their skill on robots programmed to be terrified of even the remote possibility of acting in a way that would produce lawsuits or summonses.  You can judge for yourself what 'correct countermeasures' against that sort of evolved driver behavior could, or should, be.

The trouble I see coming after the Musk/Tesla fiasco, though, is what happens if self-driving cars begin to be frequently encountered and they start having lots of little glitches.  Particularly when they don't recognize your having your door open in a parking space, or they pull over too far to avoid a situation in the far lane and you're the triage contact, or some sort of deferred maintenance or cumulative unrecognized mechanical or common-mode failure begins to show up.  Then I start leaving more wary space between me and the robot cars ... or get myself one and let it calculate the spaces and angles and vectors for me.

A useful parallel I've used for, let's see now, over 40 years now (where did the time go??) is the scene in The Mote in God's Eye where some of the characters walk across traffic which continues to whiz at top speed but misses them -- barely but effectively -- no matter how quickly or unexpectedly they move.  That, rather than the giant truck that brakes to a stop if you walk to the middle of the road and hold up your hand, is the real paradigm for 'traffic of the future' -- the problem being that neither we nor our cost-effective machines can really be trusted to do this quite as effectively as the Moties would ...

 

  • Member since
    September 2014
  • 376 posts
Posted by GERALD L MCFARLANE JR on Wednesday, July 13, 2016 8:35 PM

@Overmod...it's called a California Roll for a reason, and legally if you can see 500' in either direction at a 4-way stop you do not have to come to a complete stop if the vehicle in front of you already did so...granted this doesn't happen very often, but it is perfectly legal in that situation.  As for "self-driving" cars, the Tesla autopilot isn't a true self-driving car like the Google vehicle...you still have to maintain some road awareness.

  • Member since
    March 2003
  • From: Central Iowa
  • 6,901 posts
Posted by jeffhergert on Thursday, July 14, 2016 1:00 AM

tree68

Another danger with self-driving vehicles is that drivers will forget how to drive (although one could argue that some never knew).  Unless such vehicles have 100% functionality, there will be times when the occupants may have to actually control the vehicle.

Back in the 60's, I recall reading an editorial about a similar concept.  Speed would be controlled, vehicle spacing could be controlled, in short, the vast majority of mistakes that drivers make which result in collisions, etc, could be controlled and avoided.  Included in the piece was reference to "killer trees," at the time a hot topic, referring to trees close enough to the highway to present a hazard to someone going off the road.

Then, the editorial concluded, some dummy would still roll his vehicle and kill himself anyhow...

I would argue that local governments would fight external control of speed (ie, enforcing speed limits) due to a loss of revenue.

On the other hand, insurance companies would love the opportunity to monitor your driving habits, as they would then have justification for raising rates.

And where such enforcement applies to controlled access highways - those roads were designed for 70+ (except in urban areas).  Anything less is just a way to gain revenue.

 

One news item I saw had one of the designers/engineers (I think he was from Google) say eventually they hope to remove the steering wheel entirely.

Self-driving cars give a new meaning to "computer crashes."

Jeff  

  • Member since
    September 2007
  • From: Charlotte, NC
  • 6,099 posts
Posted by Phoebe Vet on Thursday, July 14, 2016 6:39 AM

Automation requires control.  As long as most cars are driven by humans, there will be accidents.  The autopilot-operated car failed when it could not react to something stupid that a human in another vehicle did.  A transition to autodrive cars would take decades.  

Dave

Lackawanna Route of the Phoebe Snow

  • Member since
    December 2009
  • 1,751 posts
Posted by dakotafred on Thursday, July 14, 2016 7:09 AM

It's bad enough that so many people are driving distracted now. Can you imagine their lack of readiness when called upon to perform a sudden override of the autopilot?

After this, we can worry about an even scarier prospect, the flying car. It's in the works, per a recent story in the Wall Street Journal.

  • Member since
    September 2003
  • 21,669 posts
Posted by Overmod on Thursday, July 14, 2016 7:14 AM

jeffhergert
One news item I saw had one of the designers/engineers (I think he was from Google) say eventually they hope to remove the steering wheel entirely.

There are idiots and fools everywhere.  Even at Google.

Anyone familiar with industrial control logic (let alone consumer-grade or crApple OS and environment "management") will know how pointless it would be to implement a safety-critical system in a vehicle capable of high momentum without providing capable backup.  That backup won't stop at redundant electronic control systems.

The 'catch' for many years has been removing controls that are either continuously monitored or that provide the 'temptation' to go to manual control unpredictably.  The latter was a particular bugaboo in some of the early 'inductive control' experiments in the late Forties, where a user in an "emergency" might inadvertently grab for the wheel, whack on the brake, and otherwise bung up any semblance of control that the electronics might be able to assert over the situation.  There were similar discussions in military applications over the degree to which pilots should be able to override high-G systems that were monitoring key airframe stresses.

There is no better solution for operating a conventional motor vehicle than a steering wheel, but it poses a number of restrictions for a proper 'autonomous vehicle'.  The problem is that most of the 'other' backup-control modalities, most notably anything with a sidestick proportional controller, do not work well without power boost of some kind, which means that they are functionally useless or worse in a great range of potential failure situations.

Interestingly enough, most of the 'older' systems tacitly accepted the idea that the automatic control was intended for long, continuous sections of 'cruise', or enablement for "autobahn"-style high-speed operation at minimum separation, rather than tooling around town ready to avoid all the children playing ball or old ladies opening car doors and fudster roadhogs.  Google decided to tackle the whole enchilada, and I greatly respect them for it, but I suspect there is really no way to make the situation truly safe enough -- particularly when cars are operating 'out of full warranty' and people start cutting corners to keep them in operation -- to avoid the large money damages that would shut down any entity actually providing autonomous vehicles in any significant quantity.

  • Member since
    April 2016
  • 1,447 posts
Posted by Shadow the Cats owner on Thursday, July 14, 2016 7:18 AM

I would not want a Tesla equipped with their autopilot.  Here's why: one of their cars on autopilot ran under a truck that was across the road because the autopilot was not able to tell that a 53-foot trailer was not the sky.  It literally took the car owner's head off.  That person was watching a DVD instead of the road.  NTSB is now looking into the accident.  That should be a fun one for Tesla to get out of. 

  • Member since
    July 2008
  • 2,325 posts
Posted by rdamon on Thursday, July 14, 2016 8:35 AM
One has to look no further than “Fly by wire” control systems for aircraft. Even though there is a physical control stick and pedals, the computer controls “how-much” and “when” things happen. Of course there are multiple systems and over-rides, but elimination of a steering wheel in a car may be more of elimination of the direct mechanical linkage than removing the physical wheel.
Compare the complexity of a modern car's anti-lock braking system to an older car's. I have a few brake spoons in my tool box that are now there as history lessons for my kids.
 
Automation is a journey where things that can be automated get automated. How long have we had cruise control? Now we have adaptive cruise control, headlights, blind spot monitoring, lane departure ... etc. 
 
When I was a freshman in college, the joke was that our job was to create an airplane that had a pilot and a dog in the cockpit. The pilot's job is to feed the dog, and the dog's job is to bite the pilot if he touches any of the controls.

 

I agree Google really advanced the concept, this will be fun to watch.
 
To steer :) this to a railroad topic, I was amazed when I got into the first car of the L9 train in Barcelona and there was no cab.
 
  • Member since
    December 2005
  • From: Cardiff, CA
  • 2,930 posts
Posted by erikem on Thursday, July 14, 2016 8:55 AM

NorthWest

Designers would do well to read the BEA report on the crash of AF 447. The combination of the startle factor, overreliance on technology to the detritement of basic airmanship and lack of procedures as to what to do when things go wrong led to the pilots stalling and crashing a fully functional aircraft.

A big factor was the design of the side stick control on the plane, which did not give feedback as to what the other pilot was doing with the controls. With traditional control yokes, the captain would have known immediately that the first officer was yanking back on the yoke and thus prolonging the stall.

What if the car suddenly gives back control when an icy skid goes outside its parameters?

 

All the more reason to keep the driver informed of what the car was doing, in particular feedback through the steering wheel.

A co-worker suggested that the "auto-pilot" mode should do what's done on locomotives, have an alerter function that requires interaction from the driver at intervals to check if the driver is paying attention.
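That locomotive-style alerter translates naturally into code: reset a timer on any driver input, warn after one interval of inactivity, and initiate a safe stop after a longer one. The class below is a minimal illustration of the idea only; the timings, state names, and interface are all invented, and time is passed in explicitly so the logic stays testable.

```python
# Sketch of a locomotive-style alerter adapted to an "autopilot" car:
# escalate from a warning to a controlled stop if the driver produces
# no input within a timeout. All constants here are hypothetical.

class Alerter:
    WARN_AFTER = 25.0   # seconds of no driver input before warning
    STOP_AFTER = 35.0   # seconds before initiating a safe stop

    def __init__(self, now=0.0):
        self.last_input = now

    def driver_input(self, now):
        """Any touch of wheel, pedal, or acknowledge button resets the timer."""
        self.last_input = now

    def state(self, now):
        idle = now - self.last_input
        if idle >= self.STOP_AFTER:
            return "SAFE_STOP"
        if idle >= self.WARN_AFTER:
            return "WARN"
        return "OK"

a = Alerter(now=0.0)
print(a.state(10.0))   # OK
print(a.state(30.0))   # WARN
a.driver_input(31.0)   # driver acknowledges the alarm
print(a.state(40.0))   # OK again -- only 9 seconds since last input
```

The escalation to a "safe stop" rather than a hard brake mirrors what several posts in this thread suggest autonomous cars should do when the human link fails.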
