At the 6-stand cold reduction mill at U.S. Steel's Gary Works, they needed a system to control the pairs of 9,000 HP motors that drove the top and bottom rollers to stretch cold steel inches thick.
Solid state electronics, and even tubes, weren't up to the task.
Earl Pace was the Westinghouse engineer who figured out how to do it. He used a set of synchronous AC motors to take the incoming three-phase power and drive a set of DC generators. The technology to control the field windings on the DC generators was just within reach, so he used the biggest SCR systems of the day to drive those. Driving the field windings let you zero out or even reverse the output of the generator.
In effect, it was one of the largest magnetic amplifiers in history.
Those generators then drove the DC motors on 6-stand.
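If it helps to picture that middle stage, here's a toy model (my own sketch with made-up constants, not Westinghouse's actual control law): the generator spins at constant speed, and a modest field current sets, zeroes, or reverses the big DC output.

```python
# Toy model of the drive: a small field current steers the output of
# a large, separately excited DC generator spun at constant speed.
# All constants here are invented for illustration.

def generator_emf(field_current_a, speed_rpm=900.0, k=2.0, sat_a=50.0):
    """EMF = k * flux * speed; flux follows the field current but
    saturates. Reversing the field reverses the output polarity."""
    flux = max(-sat_a, min(sat_a, field_current_a))  # crude saturation
    return k * flux * (speed_rpm / 60.0)

for i_field in (-40.0, 0.0, 10.0, 40.0, 80.0):
    print(f"field {i_field:+6.1f} A -> armature EMF {generator_emf(i_field):+8.1f} V")
```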
Some of the control circuitry used magnetic amplifiers where it needed multiple isolated inputs. The amazing thing is that they can pass signals from DC up to about 1 kHz. (They had a 10 kHz drive signal.)
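For intuition on that carrier-versus-signal relationship, here's a toy numerical sketch; it's not a circuit model, and the only numbers taken from the real system are the 10 kHz drive and the roughly 1 kHz signal band.

```python
import numpy as np

fs = 1_000_000                       # simulation rate, Hz
t = np.arange(0, 0.02, 1 / fs)
f_carrier = 10_000                   # the 10 kHz drive
carrier = np.sign(np.sin(2 * np.pi * f_carrier * t))

def recovery_error(f_signal):
    """Modulate the carrier with a control signal, demodulate by
    rectifying and averaging over one carrier period, and report
    how faithfully the control signal comes back."""
    control = 0.5 + 0.3 * np.sin(2 * np.pi * f_signal * t)
    modulated = carrier * control
    win = fs // f_carrier
    recovered = np.convolve(np.abs(modulated), np.ones(win) / win, "same")
    return np.sqrt(np.mean((recovered[win:-win] - control[win:-win]) ** 2))

for f in (100, 1_000, 6_000):
    print(f"{f:>5} Hz control signal -> RMS recovery error {recovery_error(f):.3f}")
```

Signals well below the carrier come back cleanly; push toward the carrier frequency and the recovery falls apart, which is why a 10 kHz drive supports roughly a DC-to-1 kHz signal band.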
All of this from 1964. I repaired some of the control modules back in the 1980s, which is how I learned about it all.
Unfortunately, no. That was before my first digital camera. It was all Westinghouse controls in modules. There were many three-phase SCR packs, good for about 100 amps each. I never got to see the motor-generator sets.
When I read the title, my first thought was about the cryotron: "Tantalum in superconducting state can carry large amount of current as compared to its normal state. Now when current is passed through the niobium coil (wrapped around tantalum) it produces a magnetic field, which in turn reduces (kills) the superconductivity of the tantalum wire and hence reduces the amount of the current that can flow through the tantalum wire. Hence one can control the amount of the current that can flow in the straight wire with the help of small current in the coiled wire."
I actually mentioned the Cryotron in the original draft of this article, but there wasn't room for it. Some of the other forgotten computing technologies of the 1960s are tunnel diodes, microwave oscillators (Parametron), and electroluminescent photoconductors.
It was an experiment set for the International Physics Olympiad. There was a black box with a tunnel diode, inductor, and capacitor inside. If the diode had been impossible to source, another setup would have been used for the black box instead. The experiment is described here: http://hedgehog.ee/ipho
As far as I understand superconductivity, the devices are similar but opposite. Superconductors have both a critical current (in the absence of a field) and a critical magnetic field (in the absence of a current). So cranking up the control line raises the field, which squelches the current capacity to zero (and/or, since there's an excess of either or both, superconductivity is lost).
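A toy version of that critical surface, using the common linear trade-off as a first approximation (the numbers are made up; real materials differ):

```python
def is_superconducting(wire_current_a, applied_field_t,
                       i_critical_a=1.0, b_critical_t=0.08):
    """Toy criterion: superconductivity survives only inside the
    critical surface; current capacity shrinks linearly as the
    applied field approaches the critical field."""
    if applied_field_t >= b_critical_t:
        return False
    return wire_current_a < i_critical_a * (1 - applied_field_t / b_critical_t)

# The control coil's field squelches the gate wire:
for b in (0.00, 0.04, 0.08):
    print(f"B = {b:.2f} T, I = 0.5 A -> superconducting: "
          f"{is_superconducting(0.5, b)}")
```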
> “Many engineers are under the impression that the Germans invented the magnetic amplifier; actually it is an American invention. The Germans simply took our comparatively crude device, improved the efficiency and response time, reduced weight and bulk, broadened its field of application, and handed it back to us.”
Quoting a quote: I found this quote from the 1951 US manual amusing.
It’s funny today. But to put it in context, many Americans wouldn’t use German products after the war. My parents refused to buy German cars and other items for decades.
Much like Russian products are abhorred in America right now: caviar, Lukoil, Kaspersky antivirus, etc.
Very little vodka sold in the US is actually from Russia. Svedka is Swedish, Smirnoff is a British brand produced in the US and Tito’s is from Texas, to name a few.
But incredibly expensive. Really, it's just for showing off. Beluga sturgeons can't be farmed, so they are endangered. Farmed Italian sevruga caviar is good enough for a couple of blinis. I'd say it's still ridiculously expensive stuff, but it's a small fraction of the price of Russian beluga caviar.
So when you're buying an antivirus, you're buying paranoia. Russian paranoia is simply much more refined and impregnable than American paranoia: much worse persecution, much worse consequences. It's spinal.
There's other stuff, like parachutes and ejection seats. They made much better parachutes for aerospace because they'd seen uniquely fucked-up accidents. And instead of slapping lasers and computers on everything, they were more like old-school American inventors who actually used cheap stuff, ideally stuff nobody wanted, and there was tons of that in the Soviet Union! Factories produced too little of one thing and too much of another, as if the incentives didn't match up. But if the inventor figures out a way to work with that, and of course he can't, it was proven impossible, BUT IF HE DOES ANYWAY, if he thinks the unthinkable, it may be the last time but it won't be the first.
In addition, there was some really high-quality stuff the West just refused to talk about, carrying out historical and media blackouts against it. So just as the Soviet Union didn't talk about the American lunar landing, in the West they don't talk about the Soviet lunar landings, 1967 I think it was? First they got a rocket to do a hard landing, then hard landings all over the nearby solar system, largely because they had bigger rockets and much better rocketry. Then after the hard landings came soft landings, photos from the surface of Venus (the only ones humanity has), and a little drone car on the Moon.
What is funny about it? Boycotts do have economic effects, depending on participation. Also, what the Germans did in WWII was far, far beyond what Russia has done (not downplaying at all what Russia has done).
Once, while helping to assemble a brand-new bike, I was amazed when somebody came up behind me and spun the wheel I had just attached. It didn't spin straight at all. They grabbed a spoke wrench and went to work. Tweak-tweak, spin. Tweak-tweak, spin. They patiently worked it into a perfect spin. I could not believe that something had come from the factory so crooked.
That was when I realized that it pays to keep improving on the wheel.
While it may have one less bump, the bump is bigger. What you want to do is minimize the size of the bumps. You can improve your design by basing your wheels not on triangles but on a shape such as the Reuleaux triangle. That will essentially remove all of the bumps.
Thanks! I'd long wondered exactly how radio carriers were modulated before vacuum tubes. (Technical details of that era are often skimpy in histories.)
Might want to get a head start on the receiving hardware, though. Most SDRs don't go that low, and though soundcards can go that high, their input impedance may not be well suited to whatever antenna you can cobble up. Oh, and you'll want an absolutely enormous antenna. Get a roll of cheap fence wire and string it halfway down the block...
Magnetic amplifiers remained in use long past WWII; they were never obsoleted by the war's end.
For example, many stage lighting systems used them for decades. The Sydney Opera House, which opened in 1973 with the latest equipment, used them for its stage lighting dimming (although they have since been replaced with solid-state dimmers).
Magnetic amplifiers are wonderful devices, albeit a bit slow for some applications. Moreover, unlike SCRs and other solid-state switches, they produce no RF switching noise.
> In the 1920s, improvements in vacuum tubes made this combination of Alexanderson alternator and magnetic amplifier obsolete. This left the magnetic amplifier to play only minor roles, such as for light dimmers in theaters.
Very true, but things get messy when one wants to dim or control power at, say, an mVA level and still do so at domestic mains voltage (in my example, the feed to the building is 3 mVA). At that power, vacuum tubes require anode voltages of many kV, which isn't helpful when running loads that have to stay at domestic mains voltage for safety reasons. Also, dimmer installations often have hundreds of channels; until solid state came along, using vacuum tubes would have made the electronics very complicated, not to mention highly inefficient.
Another reason for using magnetic amplifiers is their utter reliability: being so simple, they just don't fail. In fact, they're still used in some industrial applications specifically for that reason (unlike the Alexanderson alternator, which was a monster to operate and keep reliable, but then they each served different purposes).
BTW, I'm not anti vacuum tube as for part of my career I worked with FM and TV transmitters. Even today, replacing, say, a 4CX10000D vacuum tube with a solid state equivalent is certainly no easy feat.
Here, I mean 'mega' of course. Sorry about that. Conventions can be confusing and I was taught both SI and the naming convention where nouns based on inanimate objects/properties are in lowercase and those based on proper names in uppercase. For example, 'kilo', 'deci', '(V)olt' (Volta), '(A)mp' (Ampère). Of course, this all falls apart when we run out of symbols, 'Giga' being another notable example of needing to use u/c. In this case, it was me being slack and not using the now standard convention. Incidentally, I often do the same by using Gaussian (cgs) units when I should be using SI. Trouble is I was taught CGS before SI so I'm used to working in it.
BTW, you'll note that after 'Gaussian' 'cgs' is in l/c, that's deliberate on my part as it's often designated that way despite the fact that the browser's speller automatically corrected it to u/c. The second instance I've left as the browser corrected it.
An important transition is missed here: the magnetic amplifiers described so far are analog amplifiers.
It was William Steagall of Sperry-Rand who realized that the flux saturation could be used as a digital logic element. If you send a pulse in the same direction as the core is already magnetized, you get no output pulse. From this, and a multi-phase clock to supply drive power, digital computing elements were created.
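A minimal sketch of that behavior as a state machine (schematic, not a real core design):

```python
# Toy model of a saturable-core logic element, per the description
# above: a drive pulse in the direction the core is already
# magnetized produces no output; a pulse that flips the core induces
# an output pulse on the sense winding.

class MagneticCore:
    def __init__(self, state=-1):
        self.state = state      # remanent magnetization: -1 or +1

    def pulse(self, direction):
        """Apply a drive pulse (+1 or -1); return True if the flux
        flipped, i.e. an output pulse was induced."""
        flipped = direction != self.state
        self.state = direction  # the pulse leaves the core set this way
        return flipped

core = MagneticCore(state=-1)
print(core.pulse(-1))   # False: already magnetized that way, no output
print(core.pulse(+1))   # True: flux reverses, output pulse induced
print(core.pulse(+1))   # False: no change, no output
```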
A paper bag of magnetic cores disappeared while the engineers were out to lunch:
> But shortly after, the engineer called and asked if the shipment was there. This did not sound too good. With a little detective work we found a cleaning crew had worked in the office while we were gone. A little more sleuthing revealed that the bag had been accidentally knocked into a waste basket, and that load of waste had already been dumped into the plant incinerator. The incinerator ashes were spread over a concrete floor, and sure enough there were small magnetic cores, about one sixteenth of an inch in outside diameter, mixed in with the ashes. The CP-642 B had 32,768 30-bit words in its memory, meaning, with spares, there were just about one million magnetic cores in the ashes. At ten cents per core, the ashes held about one hundred thousand dollars worth of cores.
> We reasoned the cores were the result of a firing process, and the heat of the incinerator probably had not hurt them. Maybe it even made them better. A quick test of some of the cores picked from the ashes revealed the cores were as good as ever. We and a contingent of Univac engineers & technicians spent a fun filled day rescuing the cores from the ashes with long needles. The cores were strung into the machine’s memory planes, and it passed all performance and environmental tests with flying colors.
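The arithmetic in the story checks out:

```python
words = 32_768           # CP-642B memory size
bits_per_word = 30
cores = words * bits_per_word
print(f"{cores:,} cores")        # 983,040 -- "just about one million" with spares
print(f"${cores * 0.10:,.0f}")   # $98,304, about a hundred grand at ten cents each
```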
That entire history is worth a read if you are interested in computer history of the military variety.
Thanks for that charming bit of history. Humorous, but also astonishing that the cores were salvageable after such rude treatment. The use of such cores was before my time in the computer world, so I have no experience with them. I wondered if the tiny cores could have been scooped out of the ashes with a magnet. I'm sure the engineers would have thought of it; I'm guessing that would have damaged them. (Or, more likely, it wasn't even possible to collect them that way...)
Threading the burnt cores onto needles would definitely not be my idea of fun. Though I imagine a needle full of them would resemble a string of beads. Come to think of it, as described those cores would make a mighty interesting necklace (and FWIW my wife thought so too).
Interesting story. By the way, the NTDS computer system you mentioned was very successful and important for military computing, but it's almost forgotten now.
> Researchers soon constructed what was called core memory from dense grids of magnetic cores. And these technologists soon switched from using wound-metal cores to cores made from ferrite, a ceramic material containing iron oxide. By the mid-1960s, ferrite cores were stamped out by the billions as manufacturing costs dropped to a fraction of a cent per core.
> In the mid-1990s, the ATX standard for personal computers required a carefully regulated 3.3-volt power supply. It turned out that magnetic amplifiers were an inexpensive yet efficient way to control this voltage, making the mag amp a key part of most PC power supplies.
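For anyone curious how that regulation works: as I understand it, the mag amp sits on a secondary winding and swallows volt-seconds at the start of each switching pulse, trimming the effective duty cycle that reaches the output filter. A back-of-envelope sketch, with every number an illustrative assumption rather than a figure from any real ATX supply:

```python
# Back-of-envelope sketch of a mag-amp post-regulator of the kind
# used to derive 3.3 V in ATX supplies. All numbers are assumptions.

v_sec = 8.0                # rectified secondary pulse amplitude, volts
f_sw = 100_000             # switching frequency, Hz
period = 1 / f_sw
t_pulse = 0.5 * period     # raw pulse width set by the main PWM

v_target = 3.3
# With an averaging LC output filter, V_out = v_sec * t_on / period,
# so the saturable reactor must blank the pulse's leading edge down to:
t_on = v_target / v_sec * period
t_blank = t_pulse - t_on
print(f"pulse {t_pulse * 1e6:.2f} us, conduct {t_on * 1e6:.2f} us, "
      f"blank {t_blank * 1e6:.2f} us")
```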
Somebody think of poor yttrium -- once the world's go-to source for electron-beam stimulated red dots of color TVs (although the red photons came, confusingly, out of europium instead) -- now languishing in relative obscurity, used mainly to make high-temperature superconductors, solid-state garnet lasers, and (with iridium and manganese) an intensely blue pigment. Or its cousins, ytterbium, terbium, and erbium, all of which were first refined out of an obscure mineral, ytterbite, and all named with a stunning deficit of creativity.
Erbium has found a use, in picogram quantities, to amplify optical signals in fibers directly from exposure to light. Terbium was once used for the green dots on TV screens, but is still used in a material that expands and contracts in response to magnetic fields. Uses for ytterbium are more obscure.
High-temperature superconductivity might be useful someday.
Curiously, the most important CURRENT application of magnetic amplifiers was not mentioned. In every excimer laser used for photolithography and thin-film panel annealing, and possibly for LASIK, there is a multistage saturable inductor chain that compresses and amplifies the electrical pumping pulse generated by an IGBT or SCR switch. There are indeed few uses for this technology OTHER than the excimer laser.
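Roughly, each stage of such a chain passes the same pulse energy in a shorter time, multiplying the peak power. A sketch, with the per-stage compression ratio and starting numbers as assumptions:

```python
# Sketch of magnetic pulse compression as used in excimer-laser
# drivers: energy per pulse is (ideally) conserved while each
# saturable-inductor stage shortens the pulse. Numbers are invented.

energy_j = 5.0          # energy per pulse
width_s = 5e-6          # pulse width out of the IGBT/SCR switch
ratio = 5.0             # compression per stage

for stage in range(4):
    peak_w = energy_j / width_s
    print(f"stage {stage}: width {width_s * 1e9:8.0f} ns, "
          f"peak power {peak_w / 1e6:7.1f} MW")
    width_s /= ratio
```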
> One Navy training manual of 1951 explained magnetic amplifiers in detail -- although with a defensive attitude about their history: "Many engineers are under the impression that the Germans invented the magnetic amplifier; actually it is an American invention. The Germans simply took our comparatively crude device, improved the efficiency and response time, reduced weight and bulk, broadened its field of application, and handed it back to us."
Does that sound "defensive" to everyone?
I don't know this area of engineering, so, to my ear, this could also plausibly be an almost admiring acknowledgement of someone else succeeding where you'd failed, combined (non-defensively) with confidence, because you had the strength or other superior merit to take it from them?
Not that it should matter, but considering the source (and time), the phrasing could be understood to imply a level of pride that we got there first. Although, they do seem to give the Germans credit for expanding the technology. Nevertheless, it's the reader that chooses how to interpret the text, and what lessons to learn from it.
Various nationalisms and other kinds of thinking, in different groups at different times, are still largely a mystery to me. It seems relevant to the group thinking we see today, which is still confusing (e.g., why is much of the thinking and dialogue around the two main US political parties doing what it's doing, and how representative or determinative is that of what the broader population actually thinks).
At the height of early computing there were relays (Turing-Welchman Bombe) competing with vacuum tubes (Manchester Mk1 etc.), and all had MTBF times measured in hours. Constant replacement of valves and relays was a full-time job. Why didn't magamps replace them? I don't think it was necessarily the efficiency, but the problem of doing logic using AC signals. Any EEs have thoughts on this?
"After the war, U.S. intelligence officers scoured Germany for useful scientific and technical information. Four hundred experts sifted through billions of pages of documents and shipped 3.5 million microfilmed pages back to the United States, along with almost 200 tonnes of German industrial equipment."
Presumably refers to technology that was supplanted by an alternative technology or implementation for wider use. Inferred meaning, no sources to cite.
Not supplanted, but parallel evolution (theoretical or otherwise) that could have become the modern/main way of doing things. Compare to "alternate history" used in fiction.
"Amplification occurs because a relatively small DC control current can modify a much larger AC load current."
Doesn't the larger AC load current itself also saturate the core though, overpowering whatever the much weaker current does? Or is only DC current able to saturate it?
To use them in a computer for logic gates, if you want to have multiple logic gates work together (such as an adder where bits propagate through many gates), does that then mean you'd need to convert AC outputs to DC control inputs for every single connection between gates? Or is there some more efficient way they did this?
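My mental model, for what it's worth (a toy, and possibly incomplete): before the core saturates, the AC winding holds off the line voltage and only a small magnetizing current flows, so the load current never gets a chance to act on the core; the volt-seconds accumulating each half-cycle, pre-biased by the DC control, decide when saturation and conduction begin. A sketch:

```python
import numpy as np

# Toy model of one half-cycle of a self-saturating mag amp: the core
# integrates volt-seconds until it saturates, after which the winding
# looks like a short and the load conducts. The DC control pre-biases
# the starting flux, so a small control change moves the "firing"
# point a lot. Units and constants are arbitrary.

def load_conduction(control_bias, flux_sat=1.0, steps=1000):
    """Fraction of the half-cycle during which the load conducts."""
    theta = np.linspace(0, np.pi, steps)
    volt_seconds = np.cumsum(np.sin(theta)) * (np.pi / steps)
    flux = control_bias + volt_seconds   # starts where the control left it
    fired = flux >= flux_sat
    return fired.mean()

for bias in (0.0, 0.5, 0.9):
    print(f"control bias {bias:.1f} -> load conducts "
          f"{load_conduction(bias):.0%} of the half-cycle")
```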
For logic gates, the signals are pulses, so it doesn't really make sense to categorize them as AC or DC.
If you want the exact circuit, take a look at the Univac Solid State manual page 6-89: http://www.bitsavers.org/pdf/univac/uss/UNIVAC_7900_Central_...
Apparently, due to lagging semiconductor fabrication technology, they were still state-of-the-art devices in the USSR for several years after losing out to transistors in the US. I was terribly confused when I first heard about "ferrite-diode systems", since you obviously can't make a diode out of ferrite!
I think that is correct. But maybe you can use diode logic instead for most of your gates, resorting to ferrite only when you need inversion, amplification, or memory.
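A boolean stand-in for that division of labor (the primitives here are hypothetical placeholders, just to show where the amplifying core stage would go):

```python
# Cheap diode AND/OR gates for most of the logic, with a core stage
# only where gain or inversion is needed. Plain booleans stand in for
# signal levels; real diode logic also degrades the level at every
# stage, which is exactly why amplifying stages were required.

def diode_and(*inputs):      # diodes pulling a common node low
    return all(inputs)

def diode_or(*inputs):       # diodes pulling a common node high
    return any(inputs)

def core_not(x):             # inversion needs an active element
    return not x

def half_adder(a, b):        # sum = (a OR b) AND NOT (a AND b)
    carry = diode_and(a, b)
    return diode_and(diode_or(a, b), core_not(carry)), carry

print([half_adder(a, b) for a in (0, 1) for b in (0, 1)])
```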