I must assume that today all makers of locomotives use compatible MU connections, so GE and EMD can work together. And even 60 years ago, I regularly saw EMD and Alco and even FM mixed together, presumably with no compromise. But it seems to me that in the dim origins, various makers had their own systems and you could not readily mix brands. I know nothing of the details of MU. But I am sure everyone recognized the value of inter-brand compatibility.
So my question is this: in the early days, did EMD just "win" as the 300-pound gorilla in the room, with other makers adapting to their system (or EMD adopting some other)? Or was an industry committee formed to set up a standard? How long did incompatibilities linger before the industry got it all worked out? And has the modern system evolved? I mean, would a 1955 F unit plug into a present-day locomotive and proceed? Or are there limits to backward compatibility?
That's enough questions.
There's a reason the 8-notch system is standardized as 'AAR' (and there are reasons it persists long past tilting plates to implement binary relay logic).
Manufacturers have gone far beyond 8-notch engine governing in a number of respects; GE tried for some time to provide 'intermediate notches' by tinkering with excitation, for example, and at least some versions of DPU are proportional in servo.
It makes just as much sense to retain interworkable 8-notch (whether you call the notches 'run' positions as EMD did or not) as it did for Vail to implement the modern Bell system to be backward-compatible with older equipment and systems for so many years. A modern computer-controlled engine does not need to respond to notch commands, nor does a program like TO or Leader need to convert its commanded speeds into notch settings directly. But it helps if any engine has the ability to control any consist it 'finds' itself controlling in MU, and contrariwise can be controlled by anything if necessary.
At one time in Panama City, FL, the Bay Line (A&SAB) had (or has) an Alco road switcher in a park. The MU connector only had a 19(?) pin connector.
Since you are asking about MU cables, I have a question. When I was employed by the PRR in the late '50s, diesels had three or four hoses on each side of the pilot at both ends. I was told they were for MU purposes. But the units also had multipin cable connectors. Were the hoses for use with non-EMD units? Or older EMD units?
Electroliner 1935: Since you are asking about MU cables, I have a question. When I was employed by the PRR in the late '50s, diesels had three or four hoses on each side of the pilot at both ends. I was told they were for MU purposes. But the units also had multipin cable connectors. Were the hoses for use with non-EMD units? Or older EMD units?
The hoses handle the various pneumatic functions involved in the MU hookup. The cable connections carry the electrical control functions involved in the MU hookup; both are required.
Never too old to have a happy childhood!
Some of the really old-time diesels had pneumatic throttles.
Erik_Mag: Some of the really old-time diesels had pneumatic throttles.
If you look at GG1s, which are straight electrics with transformer-tap throttle control, you will see the hoses involved in MU brake connections. We have discussed what these do in past threads.
A passenger engine would also need a connection of some kind for the conductor's signal line. We had a thread years ago about a mystery hose from the nose of an early E unit that I think was such a connection.
I thought it would be easy to find 'manual' pictures of the connection arrangements for BLW power, or to locate some of Matthew Imbrogno's comments on modernizing the system for shortline and special use. I am sorry to have failed in this so far. I am sure there are people here, or following the Classic Trains forums, who will know; someone might consult the nooks and crannies of the Baldwin Diesel Zone site or a Wayback version of it to see what's there.
The air brake one is simplicity itself: if there is no air pressure, the emergency brakes are automatically applied. Now, with the MU hoses, it could be that they're reversed just in case someone mistakenly got one of them hooked up to the main air line of the train. Don't laugh, it has more than likely happened. By having the MU and other signal lines arranged the opposite of the air brake line, you prevent mistakes when coupling up cars or locomotives together.
The glad hands on the brake pipe hose and the MU hoses are different sizes. I don't think it's possible to cross the brake pipe with one of the MU hoses. You can cross MU hoses, I've seen it happen. (Coming off the diesel ramp, no less.)
There's probably a reason why the different orientation was used. I've never heard why, though.
Jeff
Shadow the Cats owner: The air brake one is simplicity itself: if there is no air pressure, the emergency brakes are automatically applied.
You can get the MR and trainline gladhands to go together, but you have to twist one hose around 180 degrees, and even then I don't think the gladhands will 'lock' together properly.
I've always guessed that the different gladhand orientation was a deliberate choice to prevent 140 PSI MR air pressure from being introduced to things like brake valves that were not designed to handle that amount of pressure.
Every now and then the shops will put the wrong type of angle cock on the brake pipe or one of the MU air lines, so I got in the habit of feeling for the little raised line on the handle that indicates which position is open. I suppose someone could rebuild the angle cock and mount the handle wrong, but I haven't encountered that yet.
Greetings from Alberta
-an Articulate Malcontent
With some exceptions, all the builders used Woodward governors to control (or help control) diesel engine speed and load. Fundamentally, this created 8 notches and the ability to shut down the diesel engines from the lead locomotive. The MU connector, and what pins did what, were left to the builders and the roads to negotiate. The railroads often specified to the builders exactly what they wanted.
The RRs eventually standardized on the 27-pin MU system, with specifically designated pin assignments. Many of the pins were unassigned or flexibly assigned for future use (Conrail's Select-a-Power system, for example).
Sometimes these unassigned pins could create problems. Early in Conrail, they needed to have special jumper cables for EL locomotives because EL used some pins for sanding control that PC (and the others) didn't.
A good explainer of pin assignments is here: http://www.railway-technical.com/trains/rolling-stock-index-l/diesel-locomotives/us-locomotive-mu-control.html
-Don (Random stuff, mostly about trains - what else? http://blerfblog.blogspot.com/)
Overmod There's a reason the 8-notch system is standardized as 'AAR' (and there are reasons it persists long past tilting plates to implement binary relay logic). Manufacturers have gone far beyond 8-notch engine governing in a number of respects; GE tried for some time to provide 'intermediate notches' by tinkering with excitation, for example, and at least some versions of DPU are proportional in servo. It makes just as much sense to retain interworkable 8-notch (whether you call the notches 'run' positions as EMD did or not) as it did for Vail to implement the modern Bell system to be backward-compatible with older equipment and systems for so many years. A modern computer-controlled engine does not need to respond to notch commands, nor does a program like TO or Leader need to convert its commanded speeds into notch settings directly. But it helps if any engine has the ability to control any consist it 'finds' itself controlling in MU, and contrariwise can be controlled by anything if necessary.
As long as the Woodward flyball governor is back there on the engine, you'll have 8 engine speeds (and one or two low idle speeds)... and a load regulator to protect the diesel engine from overloading.
SD70Dude You can get the MR and trainline gladhands to go together, but you have to twist one hose around 180 degrees, and even then I don't think the gladhands will 'lock' together properly.
Our local mechanical troubleshooters have rigged up a hose that has a gladhand fitting the brake pipe on one end, while the other end's gladhand fits the main reservoir hose. They did this so they could recharge the MR on a dead DP equipped with the air start system. This way they don't have to get a live engine next to the dead one to restart it.
It comes in quite handy when the mid-train DP has been shut down by the auto stop but then doesn't automatically restart, and the MR bleeds down. That happened to me once. I figured we would have to set over half the train to the other main track to get the head end next to the dead DP. The mechanical guy came out, applied the compromise hose, and was able to get enough pressure in the main reservoir and the auxiliary air start reservoir to restart it.
The bad habit of DPs shutting themselves down, but not starting up again when needed, is why I don't care for the Auto Engine Stop/Start systems. At one time, engines in DP mode weren't supposed to have the AESS systems active because of that problem. It doesn't happen every day, but way more often than it should.
What about the "MU" cables for what is called train control? Do they use the same identical connectors? Does anyone know how the pins are assigned?
I can’t say that I have ever found a clear-cut statement as to the origin of the AAR 8-throttle notch control and its associated MU system.
The fairly well-known “Trains” 1968 December article (1) stated that the 8-notch control was of GE origin, developed during the 1930s.
In a 1947 AIEE paper by GE (2), in discussing the then-new Amplidyne control system, it was said:
“Standard practice for several years by most manufacturers of diesel-electric road locomotives has established the use of eight throttle handle notches for obtaining eight different engine operating speeds on units which are operated in multiple. The amplidyne excitation control circuits therefore were designed on this same basis in anticipation of probable requests by ultimate customers to operate in multiple with existing locomotives.”
That comes across as more of a “following the crowd” exercise, with no claim to having pioneered the system. It could also be interpreted as recognition that by then it was too late to change, even were such technically desirable.
Be that as it may, the 8-position remote control, which could be obtained using three binary actuators and so three control wires, probably appealed as a reasonable trade-off between functionality and complexity, and was found elsewhere in the world at about the same time.
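Just to make that arithmetic concrete, here is a minimal Python sketch of the trade-off (purely my own illustration, not anything from a builder's documentation): three on/off trainwires give exactly eight distinct combinations, i.e. eight remotely selectable positions.

```python
from itertools import product

# Three binary actuators / trainwires give 2**3 = 8 distinct states,
# which is where the eight-position remote control comes from.
WIRES = ("wire_A", "wire_B", "wire_C")   # hypothetical names, for illustration only

for position, states in enumerate(product((0, 1), repeat=len(WIRES)), start=1):
    energized = [w for w, on in zip(WIRES, states) if on] or ["none"]
    print(f"position {position}: {', '.join(energized)}")
```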
In some respects, Woodward may have been more of a follower than an initiator. Early locomotive applications of the Woodward governors, including the SI and UG8 models, had speed control units that were external to the governors themselves, and so could be configured as the locomotive builder required. Load control pilot valves and floating levers, where used, were also external. Both EMD and GE had such speed control devices, of the electropneumatic type, although I have yet to find diagrams of their internal workings. A GE device of that type was used on the small Export Universal models fitted with Caterpillar engines, which in those applications had Caterpillar governors.
The change to internal speed control, and an internal load control pilot valve, came with the Woodward "New SI" model circa 1945, which I think may have made its debut on the EMD F3 (although I am not sure). This could be fitted with either electric-hydraulic or pneumatic-hydraulic speed control. The electric-hydraulic version could provide up to 15 engine speeds, using four speed control solenoids. However, the modal speed configuration was 8 evenly spaced speeds, also using the four solenoids, in the same pattern as had been established by the existing external speed controllers used by EMD and GE. To do this, the fourth, or D, solenoid did double duty as both a speed decrement solenoid and a shutdown solenoid. This speed control mechanism was described in patent US2496284 of 1950 February 07, filed 1945 May 03. The same speed control options were carried over to the ubiquitous Woodward PG governor of the late 1940s. I don't know if there were any Woodward governor applications that used all 15 speeds, but there were some (non-US builders) examples that used 14, and also 10 and 6.
(1) Trains Magazine, 1968 December, p.44ff: “Lash ‘em Up – How to Mate Miscellaneous Makes and Models”, by Jerry Pinkepank.
(2) AIEE paper 47-37, 1947 January: “Developments in Diesel-Electric Traction-Generator Excitation Control Systems”, by C.A. Brancke and G.M. Adams.
Cheers,
Hey, thanks.
It was Herman Lemp at GE who developed the control system for GE's early gas-electrics that was eventually used in diesel-electrics:
https://www.asme.org/wwwasmeorg/media/resourcefiles/aboutasme/who%20we%20are/engineering%20history/landmarks/229-great-northern-2313-montana-western-31-gas.pdf
Frank Sprague invented MU, dynamic braking, axle-hung motors, and more for the rail industry:
https://lemelson.mit.edu/resources/frank-sprague
The diesel MU case probably deserves an in-depth historical and technical treatment by someone in or with close connections to the industry. As an outside observer I still have more questions than answers after several decades of looking at it.
As well as the central issues of origins and the steps towards a common standard, some other aspects worthy of consideration are:
AAR involvement in what was originally probably a locomotive builder matter.
Air brake compatibility issues (resolved in principle with the arrival of the 26L brake, in particular with its universal version.)
Dynamic braking – EMD field loop vs. GE potential wire – somewhat resolved with dual systems in the mid-1950s, probably led by railroad initiatives more than the locomotive builders, then finally addressed by EMD's adoption of potential wire control in 1961, with retrofits available for older units. Incidentally, this was never a problem in export markets, as EMD adopted potential wire control for export models from the start. There, compatibility did not seem to have been the primary driver; rather, the field loop control was found to be functionally less suitable for the export models.
The inclusion of supplementary functions, such as the humping control, which – in some export applications anyway, where low adhesive weight-to-power output ratios obtained - was also used to obtain finely graduated control during starting and initial acceleration.
The various engine governors, engine speed controls and load control systems that were/are used.
The “other” systems that were abandoned as the 8-notch system became the modal type and then the standard. These included the Baldwin pneumatic throttle, the Fairbanks-Morse pneumatic throttle (different to the Baldwin), and the Alco-GE three-solenoid, eight-speed control used on early switchers. It may also be noted that the GE 70-tonner, when MU equipped, had a four solenoid, seven-speed control. Some export derivatives of the 70-tonner had pneumatic seven-speed control, as I think did the US Gypsum 54-ton model.
The situations where the standard system was interfaced with control/MU systems that were somewhat different or quite different. Some examples:
The Milwaukee “Wylie Throttle” that allowed leading DC electrics to control trailing diesel-electrics. Good information on this is available.
The UP system that allowed leading GTELs to control trailing diesel-electrics. There seems to be no detailed information available as to how this was done.
The EMD system used on the FL9 that allowed 8-notch throttle handle control of the 28-step DC electric side. Reasonable information on this is available.
The GE 16-notch system. Only partial information seems to be available.
The SP diesel-hydraulic case. Finding the details is a work-in-progress, see: https://cs.trains.com/trn/f/111/t/65687.aspx?page=2#3410266.
There were some export situations of this nature, as well. Two examples: circa 1965, EMD developed for New Zealand Railways a two-way interface between the AAR control and the English Electric 110-volt, 10-notch protocol. And in the early 1970s, for Queensland Railways Australia, Locotrol developed a one-way interface from AAR to the English Electric 110-volt, pneumatic protocol that allowed Locotrol receiver cars to be coupled to either Clyde-GM (AAR) or English Electric mid-train units.
Returning to AAR involvement, in the Trains 1968 December article, Pinkepank made a couple of observations. Firstly: “By far the most prevalent jumper is the 27-wire type first standardized by GM in the 1940's. The 27-point receptacle succeeded the 17-point jumper originally used for FT units and the 16-point jumper still used by some roads for E units. It is now the Association of American Railroads recommended standard, but since locomotives are not interchange equipment, the AAR standard is not binding.”
And secondly: “The AAR has tried for the past 12 years to settle on a standard 27-point receptacle for adoption by all railroads, but the recommended standard has been changed so often that any road which attempted to keep up with it would have spent a fortune by now in electrician man-hours.”
From that we may deduce that the AAR had been involved at least since 1956, and that by 1968 it had standardized on the type of jumper connection (27-pin), but not the pinout sequence thereof. Presumably that came later.
Apparently, the applicable AAR standard today is S-512, although I have not yet found a copy. But I have found APTA (American Public Transportation Association) RP-E-019-99, Recommended Practice for 27-Point Control and Communication Trainlines for Locomotives and Locomotive-Hauled Equipment. This was edited 2004 March 22; I do not know if it is the latest issue.
This shows four cases for the 27-point connector:
System for Diesel-Electric Locomotives (black colour-coded receptacle) [presumably the same as the AAR standard]
MU System for Cab Car Compatible to Diesel-Electric Locomotive (black colour-coded receptacle)
MU System for Electric Locomotives (white colour-coded receptacle)
MU System for Electric Locomotive Equipped for Diesel Logic Cab Car Control (black colour-coded receptacle)
The use of diesel logic for electric locomotive control effectively goes back to the FL9, but I imagine that it was much easier to do once suitable electronics became available, to the extent that it was more-or-less routine. The description of the available control systems for the EMD GM6C electric locomotive (Railway Age, 1975 May 12, p.10) suggests this: "Availability of five different types of throttle control, including the typical constant-horsepower diesel-electric type with eight throttle steps (this to facilitate m-u operation with diesels); constant-tractive effort controls which result in the throttle's controlling the actual tractive effort regardless of speed; and various combinations of the diesel-electric type and the constant-tractive-effort type, whatever best suits the railroad's train-handling."
Supposed Herman Lemp quote: "To hell mit der volts, it's der amps dat count".
Trains had a series of articles ca 1979 on the Lemp system of control, basically wiring feedback windings on the generator to give it an approximation of constant power output for a given shaft speed.
Both of Lemp’s basic load control approaches, namely the 1914 speed control with a load regulator rheostat, and the 1924 inherent characteristic type, in which by differential decompounding and other means the main generator curves were made to approximate the hyperbolic ideal, have been widely used in practice. In some cases, elements of both have been used together. The original speed control system had a single load setpoint, appropriate for the engines of the time with their relatively flat torque curves. Later developments allowed multiple load setpoints, either proportional to engine speed or independent of it.
The inherent characteristic type was mostly used on lower-powered locomotives, say under 1000 hp, although in the 1950s both GE and Alsthom used it on some of their more powerful designs. With this type, the key control parameter was engine speed, which is perhaps wherefrom came the concept that each throttle notch represented a specific engine speed, with a monotonic relationship. Some of the systems derived from the 1914 patent, such as that from Brown Boveri, assigned more than one load setpoint to each engine speed, thus there might be for example four engine speeds but eight throttle notches. Where each engine speed had its own load control setpoint that was proportional to speed, then a single throttle operator could be used to set both, engine speed directly and engine load via a floating lever. So again the one engine speed for each throttle notch was a logical arrangement.
The choice of eight notches might not have been derived as a standalone answer to the question - how many steps are reasonably needed to have fine enough control - but rather because in the first instance that number was seen as the highest that could be implemented without undue complexity, and so was accepted as a reasonable trade-off point. The eight steps could be provided by a three-unit binary operator, which in turn required three trainwires for speed control alone. Once established by early use, the eight-notch control essentially became a fixture, regardless of the underlying details.
On the classification of actual excitation control systems according to Lemp type, one could say that EMD’s 1950s approach was essentially 1914, but with a hint of 1924 in that the main generator self-field winding provided some measure of decompounding.
GE’s Amplidyne system is more difficult to characterize. It constructed highly drooping basic main generator curves, so in that sense there was a 1924 element. But it also used load regulation (1914) to provide the necessary hyperbolic portions of these curves. It appears that just six load control setpoints were used, though, notches 1 through 3 sharing the same setpoint.
GE’s three-field system, used on the Cooper-Bessemer engined export Universals of the 1950s (except the UD18B, which had the Static system), and as far as I know on the Alco DL541/543 export models, seems to have been basically a 1914 system with a significant element of 1924 through differential decompounding of the exciter. The small export Universals with Caterpillar engines had a pure 1924 system.
On the other hand, the Alco export models of the 1950s and 1960s that were fitted with the 6-251 engine had what was basically an inherent characteristic system assisted by a load regulator rheostat, so might have been characterized as majority 1924 with some 1914. Possibly the same system had been used on earlier 6-251-engined locomotives, such as the GE White Pass & Yukon shovel-noses and the Alco S-5 and S-6 switchers. Earlier 539-engined Alco switchers I think had a 1924 system.
An earlier GE approach, used for example on the New Haven Alco-GE DL109, combined the differential exciter (Lemp 1924) with a centrifugal speed switch. The latter moved to reduce excitation whenever the engine rotational speed reduced below its maximum due to the load, so in principle conformed to the 1914 system. I suppose that it was a rudimentary form of chopper, effectively doing pulse width modulation of the supply to the exciter battery field.
The advent of electronics, such as the GE Type E system, allowed the construction of very close approximations to hyperbolic constant power main generator curves without the help of the customary load regulator. In such cases the latter served simply to trim the curve, and to protect the engine from overloading in the event of a malfunction. An interesting early application was in the UP GTEL8500s, in which a series of constant power hyperbolic main generator curves were constructed electronically. Doing this by progressive load regulation as in the diesel case was not possible, although there was supervening exhaust temperature load regulation/limiting. The earlier GTEL4500s had a set of “natural” (convex with respect to the origin) main generator curves, with only the supervening exhaust temperature load limiting providing a hyperbolic element.
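To put the "hyperbolic constant power" idea in concrete terms (a sketch of the mathematics only, with made-up figures; this is not a description of GE's Type E circuitry), the target main generator curve for a given notch is simply voltage = power / current, bounded by the machine's voltage and current limits:

```python
def generator_voltage(current_a: float, power_w: float,
                      v_max: float, i_max: float) -> float:
    """Idealized constant-power main generator curve for one notch.

    On the hyperbolic portion V = P / I, so V * I = P at every point;
    the V_max ceiling and the I_max limit bound the curve at the
    low-current and high-current ends respectively.
    """
    current_a = min(current_a, i_max)          # current limit
    return min(power_w / current_a, v_max)     # hyperbola, voltage-limited

# Example with made-up numbers: a 2,000 kW notch, 1,200 V and 4,000 A machine limits.
for amps in (500, 1000, 2000, 3000, 4000):
    volts = generator_voltage(amps, 2_000_000, 1_200, 4_000)
    print(f"{amps:>5} A -> {volts:7.1f} V  (V*I = {volts * amps / 1e6:.2f} MW)")
```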
I am not aware that relative power outputs, or relative power output ranges have been formally assigned to the notches in the AAR system. Rather, whatever convention might exist, e.g. that notch 5 is around half-power, has more likely arisen from actual practice, with subsequent desire, in the interests of compatibility, not to depart too much therefrom.
Anyway, I think it is fair to say that the eight-notch control was a product of what was reasonably possible back in the early days of the electromechanical era.
One puzzle though is the standard AAR solenoid pattern for eight-step speed control, which is as follows:
Shutdown D
1st speed no solenoids active
2nd speed A
3rd speed C
4th speed A, C
5th speed B, C, D
6th speed A, B, C, D
7th speed B, C
8th speed A, B, C
The respective relative speed increments are:
A = 1
B = 4
C = 2
D = negative 2 and shutdown
Thus it may be seen that at the 5th and 6th speeds, the C and D solenoids cancel each other out. 5th speed could be obtained by B alone, and 6th speed by the A and B combination. And in fact that pattern was used in some non-US applications of the Woodward PG governor.
The inclusion of the D solenoid in the array for shutdown purposes was, I think, driven by the use of governors with rod shutdown, where the speed control rod was moved in the direction opposite to that for speed increase in order to effect shutdown. Thus it could also be used to provide a speed decrement in association with the other solenoids, when it was set not to exactly offset the increment provided by one of them. For example, to obtain 15 speeds on the Woodward PG governor, the relative settings would be: A = 2, B = 8, C = 4, and D = negative 1. Apparently, the settings for each solenoid were variable over a wide range.
So why were the C and D solenoids, cancelling each other out, used together for the 5th and 6th speeds in the AAR system? As the answer is nowhere to be found, I imagine that whatever the reason was, it goes back a very long way. Perhaps there was an early use of an electropneumatic speed setter where the eight speeds were not evenly spaced, and C and D differed in magnitude? And when eight evenly-spaced speeds were to be used, it just happened to work out that C and D had the same magnitude.
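As a check on the arithmetic, here is a small Python sketch (my own illustration, not taken from any AAR or Woodward document) that applies the relative increments listed above to the AAR solenoid pattern, and also verifies the 15-speed Woodward PG settings mentioned earlier:

```python
from itertools import combinations

# Relative speed increments per solenoid, as listed above.
INCREMENT = {"A": 1, "B": 4, "C": 2, "D": -2}

# AAR eight-notch solenoid pattern, as listed above (1st speed = no solenoids active).
AAR_PATTERN = {
    1: (),
    2: ("A",),
    3: ("C",),
    4: ("A", "C"),
    5: ("B", "C", "D"),
    6: ("A", "B", "C", "D"),
    7: ("B", "C"),
    8: ("A", "B", "C"),
}

for notch, solenoids in AAR_PATTERN.items():
    step = sum(INCREMENT[s] for s in solenoids)
    print(f"notch {notch}: {solenoids or 'none'} -> relative step {step}")

# The steps come out as 0..7, i.e. eight evenly spaced speeds.  Since C (+2) and
# D (-2) cancel, notch 5 could equally be B alone and notch 6 could be A + B.
assert INCREMENT["B"] == sum(INCREMENT[s] for s in AAR_PATTERN[5])
assert INCREMENT["A"] + INCREMENT["B"] == sum(INCREMENT[s] for s in AAR_PATTERN[6])

# The 15-speed Woodward PG settings quoted earlier (A=2, B=8, C=4, D=-1) give every
# relative step from 0 to 14 when all combinations other than "D alone" (shutdown)
# are used.
PG15 = {"A": 2, "B": 8, "C": 4, "D": -1}
steps = {sum(PG15[s] for s in combo)
         for r in range(len(PG15) + 1)
         for combo in combinations(PG15, r)
         if combo != ("D",)}
assert steps == set(range(15))
```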
In that regard, I understand that Lima-Hamilton did use a different solenoid pattern, namely:
5th speed B
6th speed A, B
7th speed A, B, C, D
I do not know the speed increments associated with that solenoid pattern, nor the respective engine speeds. But assuming that the 7th speed was somewhere between the 6th and 8th speeds, then it is clear that the magnitude of D must have been less than the magnitude of C, as well as opposite in sign. In this case compatibility was maintained by using a sequence converter between the master controller, which generated an AAR-type output, and the governor.
The "odd" control sequences look a bit similar to a Gray Code, which strictly has only one bit changing between steps. This is a bit safer than straight binary when using a rotary encoder (throttle controller in this case), where a slight error in positioning whatever encodes the individual bits would lead to short term spurious codes being sent when the encoder/controller was moved.
From posts on other sites, it appears that the new Siemens ALCs and ACS-64s have MU issues due to the fact that Siemens does not use the relay-based 27-pin operation. They use some kind of voltage sensing that can be affected by moisture on the 27-pin connectors.
Erik_Mag ....where a slight error in positioning whatever encodes the individual bits would lead to short term spurious codes being sent when the encoder/controller was moved.
Newer units display the throttle position on the computer screen. If you move the throttle halfway between notches on GEs the computer display will briefly show notch 1, even if you were between, say, 4 and 5.
I had wondered if the AAR sequence was chosen to simplify the solenoid changes required at notch transitions, although I had not made the connection with the Gray Code. In any event, the A solenoid alternated between off and on for the odd and even notches respectively. But the C solenoid was on from notch 3 through notch 8, which would not have been the case for a straight binary sequence, where it would be on only for notches 3 & 4 and 7 & 8. Either way, the B solenoid was on from notch 5 through notch 8.
Nonetheless, there were other railroad applications that did use the straight binary sequence.
One example was the throttle control used in the Alco 539-engined switchers when equipped for MU. These had a Woodward SI governor with external speed control and solenoid shutdown of the energize-to-run type. The shutdown solenoid played no part in engine speed control. The sequence was:
Shutdown no solenoids active
2nd speed T1
3rd speed T2
4th speed T1, T2
5th speed T3
6th speed T1, T3
7th speed T2, T3
8th speed T1, T2, T3
I don’t have the engine speed schedule, but I suspect that the relative speed increments were, or were proximate to: T1 = 1, T2 = 2 and T3 = 4.
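If those suspected increments are right (an assumption on my part, not a documented figure), the straight binary nature of the sequence is easy to see in a quick sketch:

```python
# Suspected relative increments, as noted above -- an assumption, not documented.
INCREMENT = {"T1": 1, "T2": 2, "T3": 4}

ALCO_539_PATTERN = {
    2: ("T1",),
    3: ("T2",),
    4: ("T1", "T2"),
    5: ("T3",),
    6: ("T1", "T3"),
    7: ("T2", "T3"),
    8: ("T1", "T2", "T3"),
}

for speed, solenoids in ALCO_539_PATTERN.items():
    step = sum(INCREMENT[s] for s in solenoids)
    # With T1/T2/T3 as bits 0/1/2, the active set is just (speed - 1) in binary.
    print(f"speed {speed}: {solenoids} -> relative step {step} (binary {speed - 1:03b})")
```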
As I understand it, the governor operator was a version of the GE 17MK electropneumatic unit devoid of one of its customary four operating cylinders, but I have not been able to confirm this.
The 17MK3 operator, with all four cylinders, was used on the standard GE 70-tonner when fitted for MU. The non-MU version had a mechanical throttle control. Here a Woodward UG8 governor with solenoid shutdown of the energize-to-run type, not used for engine speed control, was employed. Seven-notch rather than eight-notch control was used. Seven notches appear to have been an established GE norm for industrial switchers when the 70-tonner was introduced.
The sequence was:
2nd speed T1, T2
3rd speed T1, T3
4th speed T1, T2, T3
5th speed T1, T4
6th speed T3, T4
7th speed T1, T2, T3, T4
I have no information about the associated engine speeds. If T1 is set aside, then T2, T3 and T4 appear to form a normal binary sequence except that the T2+T4 step is missing, with a jump from T4 alone to T3+T4. T1 might have provided a small speed increment, or perhaps a small decrement. Which of those it was would have affected where 6th speed sat in relation to 5th and 7th speeds.
In the early days of dieselization, locomotives could be purchased with or without MU equipment (the non-MU option was probably cheaper).
One railroad that did this in the late 1940s was the NYS&W. To easily determine which locomotives did or did not MU, the railroad used odd numbers on the non-MU locos and even numbers on the MU locos. The NYS&W continues this practice to this day.
When the Delaware Otsego system purchased the NYS&W around 1980, they assigned an odd-numbered locomotive (with MU) to the Little Ferry yard. The unit suddenly developed minor, and a few major, operating problems. The DO ended up transferring it to another one of their railroads, where it ran fine.
The NYS&W occasionally leases locomotives from CSX or NS. I've seen a few leased units with odd road numbers, but haven't heard anything yet about any "strange" problems.
Susie Q's numbering plan has continued to this day but it has become more of an anomaly than anything else since everything on the roster is MU-equipped anyway.
In my June 26 post, I mentioned “The Milwaukee “Wylie Throttle” that allowed leading DC electrics to control trailing diesel-electrics. Good information on this is available.”
The main source is a 1959 April AIEE paper “Multiple-Unit Operation of Diesel and Electric Locomotives on the Milwaukee Road”, by Lawrence Wylie.
A more abbreviated description was provided in Railway Locomotives and Cars, 1958 September, p.58ff in an article “Diesel Boosters for Electric Locomotives”.
The scheme was simple but really quite ingenious. Coupling between the electric and diesel control systems was entirely mechanical, with no electric interconnections. And the link could be disconnected to allow independent control of the diesel if desired. Supply current for the diesel control was obtained from the diesel locomotive itself.
The throttle coupling was by a rack-and-pinion device that provided a variable ratio by having an eccentrically mounted pinion.
The electric controller had 37 notches (plus separate control for two stages of field shunting), and the mapping to the 8 diesel notches was as follows:
Electric notches    Diesel notch
1 – 3               1
4 – 6               2
7 – 9               3
10 – 12             4
13 – 15             5
16 – 17             6
18 – 23             7
24 – 37             8
(Electric notches 16 (series), 26 (series-parallel) and 37 (parallel) were the running positions.)
The objective was to have the trailing diesel-electric consist in notch 8 at about 15 mile/h, by which speed slipping was less likely.
As built, there was no control of trailing diesel dynamic braking, but that was noted as a future possibility.
This was an early example of mating two different control systems that were not originally designed to work together.
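As a quick illustration (my own, not from the Wylie paper), the mapping in the table above reduces to a simple banded lookup:

```python
def diesel_notch(electric_notch: int) -> int:
    """Map a Milwaukee electric-controller notch (1-37) to the trailing
    diesel consist's throttle notch (1-8), per the table above."""
    if not 1 <= electric_notch <= 37:
        raise ValueError("electric notch out of range (1-37)")
    bands = [(3, 1), (6, 2), (9, 3), (12, 4), (15, 5), (17, 6), (23, 7), (37, 8)]
    for upper, notch in bands:
        if electric_notch <= upper:
            return notch

# The series, series-parallel and parallel running positions map to notches 6, 8 and 8:
for pos in (16, 26, 37):
    print(f"electric notch {pos} -> diesel notch {diesel_notch(pos)}")
```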
Pneudyne A more abbreviated description was provided in Railway Locomotives and Cars, 1958 September, p.58ff in an article “Diesel Boosters for Electric Locomotives”.
That issue is available for viewing and downloading from "The Internet Archive". In addition to that issue, the complete collections of "Railway Locomotives and Cars", "Railway Mechanical Engineer" and "American Railway Journal" are available for viewing and downloading.
The viewing option became a lot more palatable for me after getting fiber internet service as the new pages load almost instantly.