Trains.com

Nine Dead, 150 Injured - Never Should Have Happened

8544 views
95 replies
  • Member since
    December 2007
  • From: Southeast Michigan
  • 2,983 posts
Posted by Norm48327 on Sunday, February 14, 2016 5:00 AM

Electroliner 1935
I like flying Southwest Airlines because their pilots are expected to fly the plane into a landing rather than use the auto landing system. There have been articles about pilots not knowing how to react to abnormal situations.

Sometimes autoland may be the better option. To wit: Alaska Airlines landing on the parallel taxiway at Seattle. Flight deck crew will get new ones reamed for that. YUP! Human failure.

Norm


  • Member since
    September 2010
  • 2,515 posts
Posted by Electroliner 1935 on Saturday, February 13, 2016 9:24 PM

I think operating proficiency is still required. I recall, on a trip to Philadelphia, riding a PATCO train to Lindenwold and watching the operator push the button to close the doors, then push the button to start the train; the train accelerated to track speed and automatically slowed and stopped at the next station. On the way back into Philly, the operator pushed the button to close the doors, then manually operated the train with the throttle from station to station all the way back. I asked him why. He replied, "To keep my proficiency up. On dry rail it (the automation) works fine, but with wet rail the train will slide through the station." He went on to explain that the system was set for a heavy braking rate that could not be obtained on wet rail, because a fixed transponder initiated a fixed braking application. So he needed to keep his proficiency up. 
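
Put in back-of-the-envelope terms, the wet-rail problem he described looks something like the sketch below. All of the speeds, deceleration rates, and the transponder distance are made-up illustrative numbers, not actual PATCO values.

```python
# Illustrative only: a rough stopping-distance check showing why a fixed
# braking command calibrated for dry-rail adhesion can overrun the platform
# on wet rail.  All numbers are assumptions, not actual PATCO figures.

def stopping_distance_m(speed_mps: float, decel_mps2: float) -> float:
    """Distance needed to stop from speed_mps at a constant deceleration."""
    return speed_mps ** 2 / (2.0 * decel_mps2)

approach_speed = 20.0        # m/s (~45 mph), assumed approach speed
transponder_to_stop = 180.0  # m from the fixed transponder to the stop marker (assumed)

dry_decel = 1.2              # m/s^2, roughly what the fixed brake command expects (assumed)
wet_decel = 0.8              # m/s^2, adhesion-limited braking on wet rail (assumed)

for label, decel in [("dry rail", dry_decel), ("wet rail", wet_decel)]:
    needed = stopping_distance_m(approach_speed, decel)
    overrun = max(0.0, needed - transponder_to_stop)
    print(f"{label}: needs {needed:.0f} m, overruns the stop marker by {overrun:.0f} m")
```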

I like flying Southwest Airlines because their pilots are expected to fly the plane into a landing rather than use the auto landing system. There have been articles about pilots not knowing how to react to abnormal situations. 

  • Member since
    July 2006
  • 9,610 posts
Posted by schlimm on Saturday, February 13, 2016 5:05 PM

Wizlish
schlimm
Based on the information so far in the German press, it appears the automated safety system did not fail; the dispatcher committed an error.

So you're contending that the safety system didn't fail, but an accident still happened?  That's an oxymoron.  And if there is an accident, it's a failure of the safety system as a whole, whether or not its component parts all were working correctly and it was a short circuit between the headphones or whatever.

The PZB system did not fail; the dispatcher did, by overriding it, as almost anyone would correctly read my comment.  As a whole, the "system" failed only by permitting human interference at that level.

 

C&NW, CA&E, MILW, CGW and IC fan

  • Member since
    October 2014
  • 1,644 posts
Posted by Wizlish on Saturday, February 13, 2016 9:49 AM

Yes to all the above, but I would note that both these were technically more complicated than the point of failure in the West Side collision.  The problem there was that a fairly large number of 'standard operating practices' all contributed in a perfect storm that particular day, at least according to the government's report.

In the German accident it was being claimed that all the safety equipment was operating perfectly, and it was just a mistake that produced the wreck.  And I continue to claim that it is the 'operating perfectly' claim that is defective here; as with another 'safety system' failure (the rear-end collision at Naperville that led more or less directly to the ICC's enforcement of the 79 mph ATC requirement by early 1951) it cannot be said to operate 'perfectly' if the system itself is imperfect (or incomplete, which is one of the semantic senses of 'imperfect') by design.

Of course I sympathize with the idea that accidents can occur even when all the equipment does what it was designed to do.  That is not the full definition of what a 'system' needs to provide, though.  That is one of the great things that came out of Cold War systems development, along with PERT and other approaches to getting results under uncertainty, and some things like PAL and the original and restarted B-70 programs, that managed both complex uncertainty and effective midcourse coordination when necessary.

I have a fundamental issue with deterministic fragile systems that purport to be 'safety' systems.  And I consider this German accident to be another demonstration why I do.

  • Member since
    October 2006
  • From: Allentown, PA
  • 9,810 posts
Posted by Paul_D_North_Jr on Saturday, February 13, 2016 9:32 AM

Wizlish: "an unanticipated but very predictable way" - you know, I get that.  It's not the oxymoron it might seem to be. 

A great article about a very similar accident - the interlocking was being upgraded, and there were some temporary 'work-arounds':  

"The accident that couldn't happen - collision between CB&Q and RI trains 9/25/64, Montgomery, Illinois", by Shaw, Robert B., from Trains, October 1965, pg. 23 &etc. [keywords: accident CB&Q Montgomery RI ]

And another one, more about the fallibility of human-dependent systems and the DS relying too much on the operators in a case where there's some uncertainty:

"Had GR&I No. 5 passed Mill Creek? - the operator was asleep", by Norman, Harold B., from Trains, January 1974, pgs. 32 - 36. [keywords: accident dispatch GR&I ] 

- Paul North.

"This Fascinating Railroad Business" (title of 1943 book by Robert Selph Henry of the AAR)
  • Member since
    October 2014
  • 1,644 posts
Posted by Wizlish on Saturday, February 13, 2016 8:28 AM

schlimm
Based on the information so far in the German press, it appears the automated safety system did not fail; the dispatcher committed an error.

So you're contending that the safety system didn't fail, but an accident still happened?  That's an oxymoron.  And if there is an accident, it's a failure of the safety system as a whole, whether or not its component parts all were working correctly and it was a short circuit between the headphones or whatever.

I understand that it's pretty common practice in critical-systems design to make safety systems immune to stupid operator errors.  And that it is indeed the fault of the systems designer if that is not done.  From what I understand, this was not an unusual combination of factors (like the situation leading up to Lac Megantic); it was a straightforward assertion of authority that a proper safety system (which by definition includes the procedures and paper workarounds used when any part of the engineering functionality is 'down' or tagged out) SHOULD have caught long before rail vehicles collided.

Not to make a cheap shot, but I seem to remember that this is the nation that computer-engineered a collision into one of its early scheduling algorithms -- Netz B, wasn't it? -- and I have to wonder if this represents a similar approach to technological implementation that turns out to be fragile in an unanticipated but very predictable way.

  • Member since
    July 2006
  • 9,610 posts
Posted by schlimm on Saturday, February 13, 2016 7:44 AM

Based on the information so far in the German press, it appears the automated safety system did not fail; the dispatcher committed an error.   

It is not surprising that the tendency here is to blame automation for this tragedy, the NEC crash in North Philly, and the Metro-North crash, when the primary cause in each was faulty human judgment.

C&NW, CA&E, MILW, CGW and IC fan

  • Member since
    October 2014
  • 1,644 posts
Posted by Wizlish on Saturday, February 13, 2016 7:36 AM

Norm48327

mudchicken
Buttonpusher's generation doesn't like the wake-up calls, but here's another one.
  • Member since
    March 2016
  • From: Burbank IL (near Clearing)
  • 13,540 posts
Posted by CSSHEGEWISCH on Saturday, February 13, 2016 6:45 AM

This is starting to sound like a hi-tech version of a lap order.

The daily commute is part of everyday life but I get two rides a day out of it. Paul
  • Member since
    December 2001
  • From: Northern New York
  • 25,013 posts
Posted by tree68 on Friday, February 12, 2016 3:04 PM

Norm48327
Computers smarter than humans? Dubious.

 
Jack R.
I understand your sentiment. I was once very dubious about computers, but I have evolved and I believe we need to embrace our creative intelligence and utilize it for purposes that can best be described as vital to our survival. 

Computers are getting pretty smart - but computers at the level of rail traffic control are simply working through a set of instructions; they aren't going to make decisions on the fly.  Presumably, most possible situations that could be encountered are programmed into said computer.

If the situation doesn't square with the rules, someone has to sort it out.

I won't use the term "shut off," but I will use the term "override," which appears to be what the dispatcher in question did.  Clearly there was an error involved.

Dispatchers have been overriding the computer/timetable for years.  And every now and then, something untoward occurs.

Larry
Resident Microferroequinologist (at least at my house) 
Everyone goes home; Safety begins with you
My Opinion. Standard Disclaimers Apply. No Expiration Date
Come ride the rails with me!
There's one thing about humility - the moment you think you've got it, you've lost it...

  • Member since
    July 2006
  • 9,610 posts
Posted by schlimm on Friday, February 12, 2016 2:52 PM

(from Friday's SZ.de - Süddeutsche Zeitung - South German News, the main Munich newspaper)

"The dispatcher at the Bad Aibling signal box was apparently still trying, at the last second, to prevent the collision of the two trains, according to the investigation group on Friday. Shortly before the collision he used a special cellular network, which allows direct communication between the dispatcher and the train drivers, to send a first emergency call. Such a call signals the highest level of danger to all drivers on the section concerned, and they must stop immediately. Immediately afterward he sent a second emergency call, but by then it was too late. The call was recorded and is currently being evaluated, as are the records from the third black box, which has now been recovered. Only then will it be possible to reconstruct the sequence of events conclusively.

At present, however, the indications are that the dispatcher mistakenly allowed two trains into the section at the same time. Although the system issues a warning in that case, he could have bypassed the warning by switching on substitute ('replacement') signals."

C&NW, CA&E, MILW, CGW and IC fan

  • Member since
    February 2016
  • 101 posts
Posted by Jack R. on Friday, February 12, 2016 1:29 PM

Norm48327

Jack R.
Humans are only capable of "X" amount of functions. Computers, if properly programmed, can predict and if necessary, stop, deter or prevent such catastrophic events.

Computers smarter than humans? Dubious.

I understand your sentiment. I was once very dubious about computers, but I have evolved and I believe we need to embrace our creative intelligence and utilize it for purposes that can best be described as vital to our survival. 

Human intelligence, it has been suggested, is limited to our own abilities to learn. If something tragic happens, there are people who use this intelligence to figure out the who, what, where, how, and why of it all. Hopefully, a miracle happens and something is created to stop, deter, prevent or otherwise keep these awful, terrible things from occurring.

Train derailments ought to be a thing of the past in the 21st century. Certainly, train collisions ought to. Yet here we are in the 21st century, and people are still losing their lives to something that is 100% preventable. Technology must be employed at some point to ensure such things do not occur. In the railroad industry we have, in this country and others, accomplished quite a lot in the past 100 years, but one thing has never been tried before: putting an intelligent system into a locomotive. This system would run completely independent of the engineer. It would, in fact, make calculations based upon data that is known - weight, length, itinerary, time, space, track, switches... every aspect of the train. Such a locomotive would have taken immediate control of that Amtrak locomotive and successfully stopped that train long before the curve, because the locomotive is literally thinking for itself.
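
As a rough illustration of what such a system would be doing continuously, here is a minimal braking-curve supervision sketch: the on-board computer keeps checking whether the train can still meet the next speed target (a curve restriction, say) with its known braking capability, and intervenes before the crew runs out of room. All of the numbers and the simple constant-deceleration model are hypothetical; real PTC/ETCS braking-curve calculations are far more involved (grade, consist, brake propagation, safety margins).

```python
# A minimal sketch of on-board speed supervision.  All values are assumptions
# for illustration only; this is not any real PTC or ETCS implementation.

def braking_distance_m(speed_mps: float, target_mps: float, decel_mps2: float) -> float:
    """Distance to slow from speed_mps to target_mps at a constant deceleration."""
    return max(0.0, (speed_mps ** 2 - target_mps ** 2) / (2.0 * decel_mps2))

def supervise(speed_mps: float, dist_to_restriction_m: float,
              restriction_mps: float, decel_mps2: float = 0.6,
              margin_m: float = 150.0) -> str:
    """Return the intervention level for the current situation."""
    needed = braking_distance_m(speed_mps, restriction_mps, decel_mps2)
    if needed + margin_m < dist_to_restriction_m:
        return "OK"               # crew has it in hand; system stays silent
    if needed < dist_to_restriction_m:
        return "WARN"             # audible alarm: start braking now
    return "PENALTY BRAKE"        # too late to rely on the crew; system takes over

# Example: 45 m/s (~100 mph) approaching a 22 m/s (~50 mph) curve
for dist in (2500.0, 1400.0, 1200.0):
    print(f"{dist:.0f} m out: {supervise(45.0, dist, 22.0)}")
```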

Now before anyone says, "Hey Jack, you're crazy," consider this: the F-117 Nighthawk, now officially retired, used a very advanced computer system that not only worked with the pilot but literally helped him fly the craft. Without such a system, the F-117 cannot fly, period. 

The point is, human life is worth something, man. We have the technology right now, today, to put into all these beautiful locomotives, like the ES44AC. Such a system would only enhance an engineer's ability to control his train more safely, more effectively and with greater assurance. The railroads can afford such technology. At the very least, passenger trains ought to be required to have it. Doing so makes sense. Human sense!

 

 

  • Member since
    May 2003
  • From: US
  • 25,281 posts
Posted by BaltACD on Thursday, February 11, 2016 10:12 PM

schlimm

Returning to reality from Ludditeville, the definitive cause of the crash in Bavaria is still unknown, but appears to be the dispatcher's human error, not a failure of the system or a computer.

I don't know the specifics of the system the Germans are using. 

In Train Control territory - which is in effect on my territory and, from descriptions of the German system, sounds similar - there is NO WAY for the Train Dispatcher to 'turn the system off'.  When there is a failure at a location, the Train Dispatcher can give the train that got stopped at an Absolute Stop signal permission to pass the signal at Restricted Speed - AFTER assuring himself that there are NO OPPOSING TRAINS in the block the train is being given permission to enter.  That assurance comes from interrogating the CTC (CADS) system at the next control point in the train's route to ensure that no signal is lined and no train is in the block the train is being given permission to enter at RESTRICTED SPEED.

Under NO Circumstance is a train allowed track speed in these instances.

Logs are kept of all signal apparatus manipulations by the Train Dispatcher, as well as of actions that take place in the field through the CADS system.  All Dispatcher radio communications are recorded - no matter which channel the Dispatcher conducts the communication on.
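
In rough pseudocode terms, the check amounts to something like the sketch below. This is strictly an illustration of the logic, not actual CADS/CTC dispatching software; the names and structure are made up.

```python
# A toy model of the pre-authorization check described above, NOT real
# CADS/CTC software: before a train may pass an Absolute Stop signal at
# Restricted Speed, the block ahead must show no occupancy and no opposing
# route lined at the next control point.  Track speed is never an option here.

from dataclasses import dataclass

@dataclass
class Block:
    occupied: bool                 # field indication: is anything in the block?
    opposing_route_lined: bool     # is the next control point lined against us?

def authorize_past_stop(block: Block) -> str:
    if block.occupied or block.opposing_route_lined:
        return "DENIED - hold the train at the Absolute Stop signal"
    # Permission is only ever at Restricted Speed, never track speed.
    return "GRANTED - proceed at RESTRICTED SPEED only"

print(authorize_past_stop(Block(occupied=False, opposing_route_lined=False)))
print(authorize_past_stop(Block(occupied=True,  opposing_route_lined=False)))
```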

Never too old to have a happy childhood!

              

  • Member since
    October 2006
  • From: Allentown, PA
  • 9,810 posts
Posted by Paul_D_North_Jr on Thursday, February 11, 2016 9:24 PM

http://ten90solutions.com/enhancing_vitality

Excerpt: "But that doesn't change the fact that such systems as PZB do not remedy the underlying, and fatal, weakness of any train control system that is not based, first last and always, on the separation of the field from the office in determining occupancy."
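
In code terms, the principle in that excerpt boils down to something like the sketch below (purely illustrative, not any vendor's vital logic): occupancy is established by field detection, and nothing the office does can make an occupied block look clear; an office command can only ever make things more restrictive.

```python
# A small sketch of "field over office" in determining occupancy.  This is an
# assumption-laden illustration of the principle, not real interlocking code.

def signal_aspect(field_detects_occupancy: bool, office_requests_clear: bool,
                  office_blocking_applied: bool) -> str:
    if field_detects_occupancy:
        return "STOP"        # the field says occupied; the office request is irrelevant
    if office_blocking_applied:
        return "STOP"        # the office may always impose a MORE restrictive condition
    return "PROCEED" if office_requests_clear else "STOP"

# The case a field-based system refuses to permit, whatever the office believes:
print(signal_aspect(field_detects_occupancy=True,
                    office_requests_clear=True,
                    office_blocking_applied=False))   # -> STOP
```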

- Paul North.

"This Fascinating Railroad Business" (title of 1943 book by Robert Selph Henry of the AAR)
  • Member since
    July 2006
  • 9,610 posts
Posted by schlimm on Thursday, February 11, 2016 2:45 PM

Returning to reality from Ludditeville, the definitive cause of the crash in Bavaria is still unknown, but appears to be the dispatcher's human error, not a failure of the system or a computer.

C&NW, CA&E, MILW, CGW and IC fan

  • Member since
    December 2007
  • From: Southeast Michigan
  • 2,983 posts
Posted by Norm48327 on Thursday, February 11, 2016 2:32 PM

mudchicken
Buttonpusher's generation doesn't like the wake-up calls, but here's another one.

And sometimes they don't know what to do when the automation fails. Asiana 214 wound up as a ball of twisted aluminum because the pilots didn't know how to fly the plane. They were totally dependent on automation.

Norm


  • Member since
    December 2001
  • From: Denver / La Junta
  • 10,820 posts
Posted by mudchicken on Thursday, February 11, 2016 1:27 PM

BaltACD

Where both computers and humans share responsibility for an operation, the human will become complacent about the computer's ability and will be less vigilant about the ultimate outcome of the operation.  WMATA's incidents have proven this over and over.

 

and they just did it again... (fortunately with an empty train and a virtually empty one, it sounds like)

The old adage about designing something to be foolproof, while true, still has consequences. Being alert, being cautious around automated systems, and playing devil's advocate when encountering new situations is an unwritten rule that unfortunately is not always followed. Buttonpusher's generation doesn't like the wake-up calls, but here's another one.

Mudchicken Nothing is worth taking the risk of losing a life over. Come home tonight in the same condition that you left home this morning in. Safety begins with ME.... cinscocom-west
  • Member since
    January 2014
  • 8,217 posts
Posted by Euclid on Thursday, February 11, 2016 10:53 AM

tree68
 
Euclid
I also wonder about the human decision to design an automatic protection system that could be turned off by another human. 

 

How would you like to drive a car that sensed stoplights?  And knew that a dark stoplight meant stop, which meant you couldn't drive any further until the car sensed a green light.  Unless, of course, there was an override designed into the system.  Which would have to be activated by the driver.

Any control system like that needs the ability to be over-ridden (including shut off) should there be an issue that requires manual control.

Think airplane autopilots...

 

 

All I said is that I “wonder” about the system, meaning that I would like to know the details about the manual override.  I did not assert the points that you infer from my comments and then disagree with.  

I certainly never said or suggested that there should not be a manual override. 

  • Member since
    August 2005
  • From: At the Crossroads of the West
  • 11,013 posts
Posted by Deggesty on Thursday, February 11, 2016 10:44 AM

tree68

 

 
Euclid
I also wonder about the human decision to design an automatic protection system that could be turned off by another human. 

 

How would you like to drive a car that sensed stoplights?  And knew that a dark stoplight meant stop, which meant you couldn't drive any further until the car sensed a green light.  Unless, of course, there was an override designed into the system.  Which would have to be activated by the driver.

Any control system like that needs the ability to be over-ridden (including shut off) should there be an issue that requires manual control.

Think airplane autopilots...

 

I wonder how such a system in a vehicle would manage a flashing red light, especially one that comes into play when the lights in all four directions are flashing red. Not all drivers understand that such a situation calls for being treated as though the intersection were an all-way stop (there is an intersection in Salt Lake City that used to have signs reading "five-way stop").

Johnny

  • Member since
    December 2001
  • From: Northern New York
  • 25,013 posts
Posted by tree68 on Thursday, February 11, 2016 10:27 AM

Euclid
I also wonder about the human decision to design an automatic protection system that could be turned off by another human. 

How would you like to drive a car that sensed stoplights?  And knew that a dark stoplight meant stop, which meant you couldn't drive any further until the car sensed a green light.  Unless, of course, there was an override designed into the system.  Which would have to be activated by the driver.

Any control system like that needs the ability to be over-ridden (including shut off) should there be an issue that requires manual control.

Think airplane autopilots...

Larry
Resident Microferroequinologist (at least at my house) 
Everyone goes home; Safety begins with you
My Opinion. Standard Disclaimers Apply. No Expiration Date
Come ride the rails with me!
There's one thing about humility - the moment you think you've got it, you've lost it...

  • Member since
    January 2014
  • 8,217 posts
Posted by Euclid on Thursday, February 11, 2016 10:20 AM
In the case of this wreck in Germany, the automatic system that would have prevented it was turned off by a human using human judgement.  I don’t know if it has been determined just what happened next. 
Other than turning off the automatic system, did the human do something else that actually caused the wreck, or was it caused by some other circumstance beyond the control of the human who turned off the automatic protection? 
I also wonder about the human decision to design an automatic protection system that could be turned off by another human.   
  • Member since
    May 2003
  • From: US
  • 25,281 posts
Posted by BaltACD on Thursday, February 11, 2016 10:19 AM

Where both computers and humans share responsibility for an operation, the human will become complacent about the computer's ability and will be less vigilant about the ultimate outcome of the operation.  WMATA's incidents have proven this over and over.

Never too old to have a happy childhood!

              

  • Member since
    April 2007
  • From: Iowa
  • 3,293 posts
Posted by Semper Vaporo on Thursday, February 11, 2016 10:07 AM

As a computer programmer and designer of 45 years’ experience, I agree with BOTH Jack R. and edblysard.  Given enough PROPERLY DESIGNED hardware with enough memory in the computer to contain the software and enough speed of the computational components, a computer could handle anything.

The problem is the two constraints...

 

1 -- The system designers (programmers) must think of every possible problem that might crop up and then find the appropriate response to each one so the system can take the best course of action.  (The Apollo designers may have thought of the possibility of the hole in the side of the service module, but discounted needing hardware backup since such an explosion could well have killed the human "backup" hardware and rendered the need for non-human "backup" moot.  As it turned out, the "problem" was not as catastrophic as it could have been and they were able to rely on the available backup... i.e., the humans.)

And

2 -- You have to have the proper hardware which is ALWAYS limited (Apollo 13 was limited due to weight and size considerations to just one system with very limited redundancy so as long as the humans were still alive they were the "backup" system of the payload).

Contrary to popular belief, computers are not, and possibly never will be, as fast as the human brain at computational abilities (comparing signals and conjuring up a solution to discrepancies in the signals)... at least not in the same volume/weight as the human brain.  The human mind is a massively simultaneous parallel processor and not, as computers are presently designed, just multiple individual processors, handling data in serially executed programs (like your wiz-bang computer with a Quad-core Multi-threading processor, w/graphics controller and RAID disk drives).

Besides, no computer programmer can think of all the possible problems that might occur, so as to develop contingency procedures that handle them perfectly.

 

A long time ago, I formulated a maxim that may not completely fit here, but I repeat it often just for the fun of it.

 

"Computers are neither smart nor dumb...

they are just plain MEAN!"

 

Semper Vaporo

Pkgs.

  • Member since
    January 2014
  • 8,217 posts
Posted by Euclid on Thursday, February 11, 2016 9:36 AM
schlimm

From what I have read in most of the German media, likely the PZB 90 system was turned off by a human signal controller (dispatcher).  He meant well, but clearly the automatic train control did not fail.  Once again those fatal two words, human error.  But the final word is not yet known, pending thorough examination of the black boxes..

(from Deutsche Welle):

What's an example of some new safety technology?

There's the newer European train safety system, the European Train Control System (ETCS). Right now it's in operation on the newly built line between Leipzig and Erfurt. And then there's a line between Berlin and Leipzig where this technology is also installed, although at varying levels. Between Berlin and Leipzig there are also still the conventional signals as a failsafe.

On the Leipzig-Erfurt line, they've completely done away with the signals. It's done with so-called beacons. You can imagine them as small wireless sensors that are laid in the track bed. The information is thereby exchanged between the train and the dispatcher. This is already a very strong move toward automated driving. A train driver is of course still there, but it's mostly automated."

 
I think there is a place for both human consciousness and computer control, but either one can fail.  And either one can take over in case the other fails.  However, if the reports about the cause of this wreck are true, it is hardly a poster child for the claim that human consciousness is more reliable than computers.    
  • Member since
    March 2002
  • 9,265 posts
Posted by edblysard on Thursday, February 11, 2016 7:13 AM

Computers are only capable of "X" amount of functions.

Humans, if experienced and properly trained, can predict and, if necessary, stop, deter or prevent such catastrophic events.

Anyone remember Apollo 13?

Hole in the side of the service module, almost out of electric power, limited breathable air supply, needing to do a corrective re-entry main engine burn... no computers at all, no guidance system... and a one-time-only, no-second-chance burn.

Either they "guess" right, or they are dead.

With a wristwatch, a slide rule, and eyeball aiming, they executed a perfect burn and hit the glide window dead center.

Computers are great for performing the same function over and over and over...it takes skill, intuition, and instinct to make that critical decision.

Computers don't "think"... they just do as they are programmed.

I will (and do) bet my life on a human engineer over any computer.

23 17 46 11

  • Member since
    July 2006
  • 9,610 posts
Posted by schlimm on Thursday, February 11, 2016 7:02 AM

From what I have read in most of the German media, likely the PZB 90 system was turned off by a human signal controller (dispatcher).  He meant well, but clearly the automatic train control did not fail.  Once again those fatal two words, human error.  But the final word is not yet known, pending thorough examination of the black boxes..

(from Deutsche Welle):

What's an example of some new safety technology?

There's the newer European train safety system, the European Train Control System (ETCS). Right now it's in operation on the newly built line between Leipzig and Erfurt. And then there's a line between Berlin and Leipzig where this technology is also installed, although at varying levels. Between Berlin and Leipzig there are also still the conventional signals as a failsafe.

On the Leipzig-Erfurt line, they've completely done away with the signals. It's done with so-called beacons. You can imagine them as small wireless sensors that are laid in the track bed. The information is thereby exchanged between the train and the dispatcher. This is already a very strong move toward automated driving. A train driver is of course still there, but it's mostly automated."
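
To illustrate the beacon idea in rough terms (this is a toy sketch, not real ETCS message formats or braking math): the trackside balises give the train its position, the control centre replies with a movement authority saying how far the train may go, and the on-board computer brakes if that authority could be overrun.

```python
# A toy illustration of position reports and movement authorities.  The
# safety margin, deceleration value, and distances are all assumptions for
# illustration; real ETCS supervision is far more detailed.

def movement_authority_m(train_position_m: float, next_obstruction_m: float,
                         safety_margin_m: float = 200.0) -> float:
    """Distance the train is authorised to travel, ending short of the obstruction."""
    return max(0.0, next_obstruction_m - safety_margin_m - train_position_m)

def onboard_supervision(speed_mps: float, authority_m: float,
                        service_decel_mps2: float = 0.7) -> str:
    """Brake if the train could not stop within its remaining authority."""
    braking_distance = speed_mps ** 2 / (2.0 * service_decel_mps2)
    return "BRAKE" if braking_distance >= authority_m else "OK"

# Train at km 12.0; the train ahead (or an unset route) is at km 13.5
authority = movement_authority_m(12_000.0, 13_500.0)
print(f"{authority:.0f} m of authority at 40 m/s -> {onboard_supervision(40.0, authority)}")
print(f"{authority:.0f} m of authority at 45 m/s -> {onboard_supervision(45.0, authority)}")
```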

C&NW, CA&E, MILW, CGW and IC fan

  • Member since
    December 2007
  • From: Southeast Michigan
  • 2,983 posts
Posted by Norm48327 on Thursday, February 11, 2016 6:06 AM

Jack R.
Humans are only capable of "X" amount of functions. Computers, if properly programmed, can predict and if necessary, stop, deter or prevent such catastrophic events.

Computers smarter than humans? Dubious.

Norm


  • Member since
    February 2016
  • 101 posts
Posted by Jack R. on Wednesday, February 10, 2016 11:38 PM

Well, one thing not mentioned thus far in this thread is.....

" may those who lost their lives rest in the Lord "

.....and may God grant their families peace in this most difficult time.⛪

 


One thing I have learned from being in, around and through trains: accidents are totally, 100% preventable. Human error is at the heart of most, if not all, train mishaps. Many years ago in Japan, a very young train engineer found out the hard way why one should never rush when running late. He pushed his train, all right, and in doing so he caused one of the worst - if not the worst - commuter train crashes in Japanese history. It was just plain awful.

I suspect that human error is to blame for countless accidents, but as I suggested in another thread regarding the Amtrak crash in Philly, we need locomotives that can think - intelligent systems that can literally take control when control seems all but impossible in extreme conditions. I submit that this accident in Germany was 100% preventable, and that if a locomotive could literally have taken control of the situation, those people would still be alive and everyone would have been okay.

Smart technology is something humans must embrace at some point. We must accept it and evolve, or deny it and suffer the consequences of our lack of faith in technology.

If a locomotive were equipped with this intelligence, as I call it, it would basically have said:

"Hey, wait a second here... there is something ahead and I need to stop."

Humans are only capable of "X" amount of functions. Computers, if properly programmed, can predict and if necessary, stop, deter or prevent such catastrophic events. Such a smart system would also require an advanced signal system. Nothing like we see today. No, this signal system would be literally light years ahead of these archaic systems in place today. 

Locomotives, signals, track working as one to prevent such tragedies.

  • Member since
    December 2001
  • 1,486 posts
Posted by Victrola1 on Wednesday, February 10, 2016 4:48 PM

Bavaria train crash: what we know so far

 

Published: 09 Feb 2016 13:56 GMT+01:00
Updated: 10 Feb 2016 08:00 GMT+01:00

Check here for a quick overview of all the latest information on the train accident in Bad Aibling, Bavaria on Tuesday morning.   

http://www.thelocal.de/20160209/bavaria-train-crash-what-we-know-so-far

  • Member since
    July 2010
  • From: Louisiana
  • 2,310 posts
Posted by Paul of Covington on Wednesday, February 10, 2016 4:37 PM

   I first heard of this from the Chinese news service (cctv.com), and they mentioned that it appeared that the signal system was turned off to manually move a late train.   Later, on CBS news, they said the cause was unknown.  Here is one report:

http://english.cntv.cn/2016/02/10/VIDEWujJ1vbJ3hWUwROQIgKe160210.shtml

   I would have offered this earlier, but I have had trouble getting on to the site until today.  It just occurred to me that the site may be swamped because of coverage of the Chinese New Year celebrations.  If you try the above link, it may be very sluggish.

_____________ 

  "A stranger's just a friend you ain't met yet." --- Dave Gardner
