During the 1950s, the world saw the emergence of the first commercial (and military) computers, both general and special purpose.
As each computer built had its own logic design, at first they were only programmable by technicians and engineers who were intimately familiar with that specific logic design. Using a set of switches on a control panel, special sequences of switches were used to start and stop the computer and to enter program instructions as well as data. Later in the decade, both general and special purpose computers became programmable using media -- both "punched cards" and "paper tape" -- and large printers were produced that could show program code, data and results.
The first computers stored information in either "core" memory or on "magnetic drums". Later in the decade, magnetic tapes -- like those used in Hi-Fi equipment -- were built to hold both programs and data.
Computers were built in the US, UK, Holland, Germany and Japan. New IT companies were formed and older ones completely revamped their business around the new machines.
These efforts also produced some of the first automation devices -- robots -- that could be used for specific repetitive tasks on factory lines. Finally, towards the end of the decade, we see the emergence of transistors taking over from vacuum tubes. Transistors were more reliable and, of course, smaller, resulting in "cheaper" computers.
In short, the 1950s saw the beginning of an explosion in the development and utilization of the computer. This explosion has not abated. In fact, 60 years later, the rate of the explosion continues to increase.
Probably the blooper of the decade was this:
On election night, 1952, a UNIVAC I computer made by Remington Rand and leased by CBS accurately predicted the outcome of the US presidential election -- a landslide for Eisenhower -- within an hour or two of the polls closing. But Walter Cronkite, the anchor of the news broadcast, and his editorial team could not believe the results that the machine calculated, and they had the predictions changed to report a closer margin. So it appeared to the public that the computer's results were wrong. The journalists felt it not to be "politically correct" to make the UNIVAC's predictions public at such an early stage. Or, as others said later: they could not believe the statistical analysis. As the true nature of what happened came to light, it brought the computer to the attention of the general public, and began the era of poll manipulation -- er, I meant computer projections.
(Above) Univac I and co. during the election. Note the magnetic tapes in the background. These were used to store data, much like hard drives and flash drives today.
The statement from Eckert and Cronkite, afterwards:
The image above is the EDSAC computer. If you look closely at the image, you can see the vacuum tube assemblies. They blew out often (like light bulbs) and replacement was extremely hard, as you had to diagnose which tube was blown. Thus diagnostic tools started to be developed.
Maurice V. Wilkes at Cambridge University uses "symbolic assembly language" on the EDSAC. The assembler was developed by David Wheeler under Wilkes' management. Assembly language was a second generation language, in that a program -- the assembler -- well, assembled the symbolic code into the instruction set of the specific computer. Thus each major computer built after this time -- well into the 1980s -- had its own assembler version. The neat part was that the assembly code was similar across machines, and it was the assembler program that made the translation to the specific machine.
Programmers only had to know generic (albeit arcane) terms and the specific assembler would take care of the translation to the computer.
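The idea can be sketched as a toy assembler in Python. The mnemonics and opcodes below are hypothetical illustrations, not EDSAC's actual instruction set; the point is simply that a table maps generic mnemonics onto machine-specific numeric codes:

```python
# Toy assembler sketch. OPCODES is a made-up table; a real assembler of the
# era carried the table for its one specific machine.
OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011, "HALT": 0b1111}

def assemble(source):
    """Translate lines like 'ADD 7' into 16-bit machine words."""
    program = []
    for line in source.strip().splitlines():
        parts = line.split()
        mnemonic = parts[0]
        operand = int(parts[1]) if len(parts) > 1 else 0
        # Pack a 4-bit opcode and a 12-bit address into one word.
        program.append((OPCODES[mnemonic] << 12) | (operand & 0xFFF))
    return program

words = assemble("""
LOAD 10
ADD 11
STORE 12
HALT
""")
print([hex(w) for w in words])  # ['0x100a', '0x200b', '0x300c', '0xf000']
```

Porting a program to another machine then only meant re-assembling it with that machine's table, rather than rewriting every numeric instruction by hand.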
The first generation of computers were essentially either hard-wired -- specific functional tubes were connected by wire to other tubes -- or control panel switches had to be precisely set to enter program and data. The first programs were actual machine code, telling the computer specifically which tubes to turn on and off for each step of the program execution. Each computer design had its own "instruction set", and computers still do today.
Obviously, these required massive attention to detail, and the slightest error required re-entry.
Konrad Zuse in Germany made available the world's first commercial computer -- the Z4.
The Standards Western Automatic Computer (SWAC) was built under Harry Huskey's management at UCLA (University of California, Los Angeles, USA). This computer was built for the US Government's National Bureau of Standards (NBS), now the National Institute of Standards and Technology (NIST), ushering in a new era of FLAs (Four Letter Acronyms). In the sixties, advances in acronyms led to TLAs, or Three-Letter Acronyms. Acronyms were developed so that mere mortals could never understand what programmers and engineers were doing with the millions of dollars in research funds, and would just smile politely so as not to appear to be idiotic.
The first entirely commercial special purpose computer is made in the USA: the ERA 1101. The name was quickly changed to something less understandable, but tangy -- the Univac 1101. This new name allowed more people to smile knowingly. Seriously, this computer used a magnetic drum as memory and could store 1 million bits. The "special purpose" of this computer grew out of the code-breaking work ERA had been doing for the US Navy.
THE MAGNETIC DRUM FROM THE IBM 650 COMPUTER.
THE ADVENT OF THE PRINTED CIRCUIT BOARDS (PCB)
Moe Abramson and Stanislaus F. Danko developed the "Auto-Sembly" process, in which component leads are inserted into a copper foil interconnection pattern and dip soldered. With the development of board lamination and etching techniques, this concept evolved into the standard printed circuit board fabrication process in use today. In 1956 they were granted a patent by the US Patent Office.
PATENT DRAWING FOR THE PRINTED CIRCUIT BOARD
The Pilot ACE was one of the first computers built in the United Kingdom, at the National Physical Laboratory (NPL) in the early 1950s. It was a preliminary version of the full ACE, which had been designed by Alan Turing. After Turing left NPL (in part because he was disillusioned by the lack of progress on building the ACE), James H. Wilkinson took over the project; Harry Huskey helped with the design. The Pilot ACE ran its first program on May 10, 1950 and was demonstrated to the press in December 1950. Although originally intended as a prototype, it became clear that the machine was a potentially very useful resource, especially given the lack of other computing devices at the time. After some upgrades to make operational use practical, it was put into service in late 1951, and saw considerable operational service over the next several years. It had approximately 800 vacuum tubes, and used mercury delay lines for its main memory. The original size of the latter was 128 32-bit words, but that was later expanded to 352 words; a 4096-word drum memory was added in 1954. Its basic clock rate, 1 megahertz, was the fastest of the early British computers. The time to execute instructions was highly dependent on where they were in memory (due to the use of delay line memory). An addition could take anywhere from 64 microseconds to 1024 microseconds.
The machine was so successful that a commercial version of it, named the DEUCE, was constructed and sold by the English Electric Company. The Pilot ACE was shut down in May, 1955, and was given to the Science Museum, where it remains today.
Remington Rand bought the Eckert-Mauchly Computer Corporation. (Eckert and Mauchly had designed and built the first American computer -- the ENIAC.) Remington Rand (1927–1955) was an early American business machines manufacturer, best known originally as a typewriter manufacturer and in a later incarnation as the manufacturer of the UNIVAC line of mainframe computers. They had made their fame (infamy) manufacturing fast-action Remington rifles in the nineteenth century. For a time, the word "UNIVAC" was recognized as a generic synonym for "computer". Remington Rand was a diversified conglomerate making other office equipment, electric shavers, etc. The Remington Rand Building at 315 Park Avenue South in New York City is a 20-floor skyscraper completed in 1911. This company later changed its name to UNIVAC, and later to Sperry. In the mid-1980s, Sperry could not keep up with the paradigm change to personal computers and was bought by the Burroughs Corporation. The combined company became known as UNISYS.
Howard Aiken, chief designer of the Mark I, in a conversation with John Curtiss of the US National Research Council, on the state of research and development of the UNIVAC:
"We will never have enough problems to have enough work for one or two computers working on it".
In 1950, Claude Shannon proposes the idea of a chess program. Shannon was one of those special scientists who did not have the fame of others, yet without him, nothing that we take for granted today would have been possible. His idea that, using logic, he could program a computer to play a decent chess game was in reality the forerunner of both commercial software and Artificial Intelligence systems (such as pattern recognition). He used this skill in a somewhat humorous way as well. Shannon and his wife Betty also used to go on weekends to Las Vegas with M.I.T. mathematician Ed Thorp, and made very successful forays in blackjack using game theory type methods co-developed with fellow Bell Labs associate, physicist John L. Kelly Jr., based on principles of information theory.
They made a fortune, as corroborated by the writings of Elwyn Berlekamp, Kelly's research assistant in 1960 and 1962. Shannon and Thorp also applied the same theory, later known as the Kelly criterion, to the stock market with even better results. But it was his groundbreaking paper on computer chess, entitled Programming a Computer for Playing Chess, that paid long-term dividends for the industry. His process for having the computer decide which move to make is called a minimax procedure, based on an evaluation function of a given chess position.
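The minimax idea can be sketched with a toy game tree. The nested lists below are an invented example with made-up scores, not an actual chess position; in Shannon's scheme the leaf values would come from an evaluation function of the board:

```python
# Minimax over a hand-built game tree: nested lists are choice points,
# numbers are leaf evaluations (higher is better for the maximizing player).
def minimax(node, maximizing):
    if isinstance(node, (int, float)):   # leaf: a position's evaluation
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# Maximizer moves first; each sublist is the opponent's set of replies.
tree = [[3, 5], [2, 9], [0, 7]]
print(minimax(tree, True))  # 3: the opponent picks the minimum of each branch
```

The same recursion, plus move generation and a board evaluation function, is the skeleton of every chess engine descended from Shannon's paper.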
He was known to invent fun and silly objects. One of his more humorous devices was a box kept on his desk called the "Ultimate Machine", based on an idea by Marvin Minsky. Otherwise featureless, the box possessed a single switch on its side. When the switch was flipped, the lid of the box opened and a mechanical hand reached out, flipped off the switch, then retracted back inside the box.
In addition he built a device that could solve the Rubik's cube puzzle.
He is also considered the co-inventor of the first wearable computer along with Edward O. Thorp. The device was used to improve the odds when playing roulette!
According to Neil Sloane, an AT&T Fellow who co-edited Shannon's large collection of papers in 1993, the perspective introduced by Shannon's communication theory (now called information theory) is the foundation of the digital revolution, and every device containing a microprocessor or microcontroller is a conceptual descendant of Shannon's 1948 publication: "He's one of the great men of the century. Without him, none of the things we know today would exist. The whole digital revolution started with him."
However, in his final years Shannon was oblivious to the marvels of the digital revolution, because his mind was ravaged by Alzheimer's disease. His wife mentioned in his obituary that "he would have been bemused" by it all.
One final point about Dr. Shannon: Theseus, which he created in 1950, was a magnetic mouse controlled by a relay circuit that enabled it to move around a maze of 25 squares. Its dimensions were the same as an average mouse. The maze configuration was flexible and could be modified at will. The mouse was designed to search through the corridors until it found the target. Having travelled through the maze, the mouse could then be placed anywhere it had been before, and because of its prior experience it would go directly to the target. If placed in unfamiliar territory, it was programmed to search until it reached a known location, then proceed to the target, adding the new knowledge to its memory -- thus learning. Shannon's mouse appears to have been the first learning device of its kind.
As a footnote to the chess-playing computer, it would be another 40 years before a computer could beat a chess grandmaster in a match. The computer below -- IBM's Deep Blue -- beat the Russian Garry Kasparov 3½ to 2½ (two wins to one, with three draws) in 1997.
IBM'S DEEP BLUE
A discussion of these early days of technology cannot be complete without including Britain's brightest and probably most troubled scientist, Alan Turing. If Claude Shannon, Eckert and Mauchly led the computer revolution in the US, Turing was Britain's answer and is considered one of the fathers of modern computer science. Time Magazine, in naming Turing one of the 100 most influential people of the 20th century, states: "The fact remains that everyone who taps at a keyboard, opening a spreadsheet or a word-processing program, is working on an incarnation of a Turing machine." Turing was brilliant in too many things to quote here without making me envious; however, he conceptualized a universal machine -- a machine that was capable of performing whatever was required of it. He also came up with the concept of the Turing Test, which essentially said that a human being, posing questions, would not be able to tell whether the answers were coming from another human or from a computer.
By detecting which cores were magnetized in one direction and which in the other, one could determine certain values, and with these values calculations could be made. Cores were more solid and reliable than vacuum tubes. When you think of electricity passing through specific points -- or not -- you have a series of cores, each in a state of zero or one. This was how the binary system was (initially) used for storage, first for data, and later for simple logical operations. The image shows an early core plane. This piece was 6.25" square and held 2000 bits of information. For reference, today's computers often come with billions of bits of storage capacity, and for specialized applications can hold much more.
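The principle can be sketched in a few lines of Python (a toy illustration of the encoding, not of actual core-memory circuitry): each core's magnetization state is read as one binary digit, and a row of cores together encodes a number.

```python
# Toy illustration of core-memory readout: each magnetic core holds one
# binary digit; a row of cores encodes a number in base two.
cores = [1, 0, 1, 1]   # states read from four cores, most significant first

value = 0
for bit in cores:
    value = (value << 1) | bit   # shift previous bits left, append this one

print(value)  # 11 (binary 1011)
```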
Independently of Wang Labs, F.W. Viehe also invented the ferrite core memory, as did Jay Forrester of MIT a couple of years later (using an enhanced technology).
Coronado Corporation changed its name to Texas Instruments Incorporated.
The largest vacuum tube computers ever built were those for the SAGE project, a US Air Force project built by IBM and consisting of 50 computers. SAGE was also the very first air defense system and the first system with real-time human-computer interfaces.
THE WHIRLWIND PROGRAM
During World War II, the U.S. Navy approached MIT about the possibility of creating a computer to drive a flight simulator to train flight crews. They envisioned a fairly simple system in which the computer would continually update a simulated instrument panel based on control inputs from the pilots. Unlike older systems like the Link Trainer, the system they envisioned would have a considerably more realistic aerodynamics model that could be adapted to any type of plane.
A short study by the MIT Servomechanisms Laboratory concluded that such a system was certainly possible. The Navy decided to fund development under Project Whirlwind, and the lab placed Jay Forrester in charge of the project. They soon built a large analog computer for the task, but found that it was inaccurate and inflexible. Solving these problems would require a much larger system, perhaps one so large as to be impossible to construct.
In 1945 Perry Crawford, another member of the MIT team, saw a demonstration of ENIAC and suggested that a digital computer was the solution. Such a machine would allow the accuracy of the simulation to be improved with the addition of more code in the computer program, as opposed to adding parts to the machine. As long as the machine was fast enough, there was no theoretical limit to the complexity of the simulation.
Up until this point, all computers constructed were dedicated to single tasks, run in batch mode: a series of inputs was set up in advance and fed into the computer, which would work out the answers and print them. This was not appropriate for the Whirlwind system, which needed to operate continually on an ever-changing series of inputs. Speed became a major issue: whereas with other systems slowness simply meant waiting longer for the printout, with Whirlwind it seriously limited the amount of complexity the simulation could include.
After Whirlwind was completed and running, a design for a larger and faster machine to be called Whirlwind II was begun. But the design soon became too much for MIT's resources. It was decided to shelve the Whirlwind II design without building it and concentrate MIT's resources on programming and applications for the original machine, now called Whirlwind I.
When the Air Force decided to construct the Semi Automatic Ground Environment (SAGE) air defense system, IBM, the prime contractor for the AN/FSQ-7 computer based the machine's design more on the stillborn Whirlwind II design than on the original Whirlwind. Thus the AN/FSQ-7 is sometimes incorrectly referred to as "Whirlwind II", even though they were not the same machine or design.
Here is a great example of a deep Cold War propaganda film on SAGE.
THE SAGE CONTROL ROOM
A transistor is a semiconductor device commonly used to amplify or switch electronic signals. A transistor is made of a solid piece of a semiconductor material, with at least three terminals for connection to an external circuit. A voltage or current applied to one pair of the transistor's terminals changes the current flowing through another pair of terminals. Because the controlled (output) power can be much larger than the controlling (input) power, the transistor provides amplification of a signal. The transistor is the fundamental building block of modern electronic devices, and is used in radio, telephone, computer and other electronic systems. Some transistors are packaged individually but most are found in integrated circuits.
As a boy, I remember the shift from vacuum tube radios to so-called "Transistor Radios". In fact, my Dad and I built a transistor radio from a kit. It involved holding a wire to a water pipe (which acted as an antenna) and being very quiet while we heard the radio through a tiny earpiece.
As transistors could mimic the behavior of vacuum tubes (holding, amplifying, releasing electrical currents), they quickly were adopted as a replacement for vacuum tubes in almost all electronic devices, including of course, computers.
How did this happen?
On 17 November 1947 John Bardeen and Walter Brattain, at AT&T Bell Labs, observed that when electrical contacts were applied to a crystal of germanium, the output power was larger than the input.
William Shockley saw the potential in this and worked over the next few months greatly expanding the knowledge of semiconductors. He is considered by many to be the "father" of the transistor, although truly the invention was a series of events linked to several physicists. The term "transistor" was coined by John R. Pierce. The transistor is considered by many to be one of the greatest inventions of the twentieth century. It is the key active component in practically all modern electronics. Its importance in today's society rests on its ability to be mass produced using a highly automated process that achieves astonishingly low per-transistor costs and astonishingly reliable devices.
The first patent for the field-effect transistor (FET) principle was filed in Canada by Austrian-Hungarian physicist Julius Edgar Lilienfeld on October 22, 1925, but Lilienfeld did not publish any research articles about his devices. Hence the term "publish or perish". The term "field effect" meant that the behavior of the transistor could be modified depending on the requirement at the moment (i.e., as a switch, a gate or a store).
Although several companies each produce over a billion individually-packaged (known as discrete) transistors every year, the vast majority of transistors produced are in integrated circuits (often shortened to IC, microchips or simply chips) along with diodes, resistors, capacitors and other electronic components to produce complete electronic circuits. A logic gate consists of about twenty transistors whereas an advanced microprocessor, as of 2006, can use as many as 1.7 billion transistors (MOSFETs). "About 120 million transistors were built this year  ... for [each] man, woman, and child on Earth."
The transistor's low cost, flexibility and reliability have made it a ubiquitous device. Transistorized mechatronic circuits have replaced electromechanical devices in controlling appliances and machinery. It is often easier and cheaper to use a standard microcontroller and write a computer program to carry out a control function than to design an equivalent mechanical control function.
EARLY TRANSISTOR DEVICES
Singer introduced the first electronic sewing machine. This was a prime example of what electronics could do: 350 expensive fine-mechanical precision parts were replaced by logic in a cheap special purpose processor.
The first product to use transistors is tremendously useful to a lot of people: hearing aids. This is what they looked like in the 1920s:
Here's what they looked like with transistor technology (my mother, who was deaf, had one of these):
And this is today:
Before transistors, radios looked like this:
In the 1950s, "Transistor Radios" became the rage.
A first concept of Integrated Circuits is published by Geoffrey Dummer in Washington.
IC patent, Noyce et al., 1959 (Fairchild)
The first "flat" transistor -- the planar transistor -- is designed by Jean Hoerni et al.
It is called flat because this form of transistor is built up from thin layers of semiconductor material.
Jean Hoerni, Kurt Lehovec, and Robert N. Noyce at Fairchild laboratories take part in the development of the Integrated Circuit - a circuit on a single slice of silicon.
Noyce´s practical integrated circuit, invented at Fairchild Camera and Instrument Corp., allowed printing of conducting channels directly on the silicon surface.
The illustration above shows a technical drawing from the patent that Hoerni, Lehovec, and Noyce submitted. In this case it is a so-called NPN type of transistor: negative-positive-negative layered. In 1959 they completed the project successfully, and in 1961 the first commercial integrated circuit was put on the market.
Though which team invented the IC is controversial, it is generally accepted that the teams at Fairchild and Texas Instruments developed the IC independently.
In December, Jack St. Clair Kilby at Texas Instruments completed building the first IC (integrated circuit), containing five components on a piece of germanium half an inch long and thinner than a toothpick.
IC by Jack Kilby
Kilby created the circuit to prove that resistors and capacitors could exist on the same piece of semiconductor material. His circuit consisted of a sliver of germanium with five components linked by wires.
Both Texas Instruments (February) and Fairchild Semiconductor corporation (July) file for a patent for the process to produce transistors on a flat layer.
This is the process that will take the IC to mass production in about two years' time. The two firms engaged in a legal battle that lasted through the decade of the 60's, until both companies decided to cross-license their technologies.
Fairchild Camera and Instrument Corp. invented the resistor-transistor logic (RTL) product -- a set/reset flip-flop -- the first integrated circuit available as a monolithic chip.
G.W.A. Dummer, exhibits during the Royal Radar Exhibition in England a model of an Integrated circuit.
With their integrated circuits built on silicon wafers, Jack Kilby and Robert Noyce lay the foundation of the Third Generation of Computers.
DATA COMPRESSION AND EARLY DATA COMMUNICATIONS
David A. Huffman (1925–1999) (USA) developed the Huffman Code. This code would be used to compress data transmitted over networks and modems, in programming for video recorders, and in high definition television. The algorithm Huffman designed made it possible to compress data by 25% or more, depending on the type of data. That saved a lot of time and money when you consider that transmissions ran at 110 bits per second.
Professor Robert M. Fano (MIT, Boston, USA) put Huffman to a choice: either an assignment or a final exam in order to graduate. Huffman chose the assignment.
The assignment was:
"Design, with the help of binary code (0 and 1), the most efficient method to represent characters, figures and symbols."
Such a code could be sent over a network or saved in a computer's memory.
The assignment sounded simple, but after some months of studying and trying, it appeared to Huffman that this was not the case. On the verge of giving up, he suddenly had the insight of a solution: the so-called binary tree technique.
The binary tree
The principle of the code is as follows:
Assign the shortest binary codes to the most frequently used characters, while the less used symbols get the longest binary codes. This is done with a kind of coding tree. The probability that a symbol occurs is represented as a leaf of the tree. The two lowest probabilities are added to form a new node, and these combinations continue up the branches of the tree until a single node with probability 1 remains, forming the root of the tree.
Each probability is a leaf, and each branch gets a 1 or a 0. Code words are formed by following the branches from the root down to each leaf. In that way the binary code is formed.
For example, an E -- having a probability of approximately 0.13 in English text -- would sit near the root of the tree and thus receive a short binary code.
Example of Huffman's binary tree
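The procedure can be sketched compactly in Python, using the standard library's heap to repeatedly merge the two lowest weights (here raw character counts stand in for probabilities):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code table by repeatedly merging the two
    lowest-weight nodes until only the root remains."""
    heap = [[weight, [sym, ""]] for sym, weight in Counter(text).items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)      # the two lowest probabilities...
        hi = heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]   # ...one side gets a 0 branch
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]   # ...the other a 1 branch
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return {sym: code for sym, code in heap[0][1:]}

text = "this is an example of a huffman tree"
codes = huffman_codes(text)
encoded = "".join(codes[c] for c in text)
# Frequent symbols (like the space) get short codes, rare ones long codes,
# so the message comes out shorter than 8 bits per character.
print(len(encoded), "bits, versus", 8 * len(text), "bits uncompressed")
```

Because no code word is a prefix of any other (each symbol sits at a leaf), the encoded bit stream can be decoded unambiguously without separators.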
The first high speed printer is developed by Remington Rand for use on the UNIVAC.
ILLIAC -- the most powerful computer yet -- is built at the University of Illinois.
"We can only see a short distance ahead, but we can see plenty there that needs to be done."
Alan Turing, on computers, in his 1950 paper Computing Machinery and Intelligence (Turing died in 1954)
You will notice the beginning of change in the format for the rest of this story. While there were certain individuals who made tremendous contributions to the industry, and I will talk about them, the IT industry really changed in the mid 1950s so that corporations -- with teams of people -- started to develop the technology that we use today. Thus I will start talking more about the companies more than individuals at this point. Further, what you will see is that huge companies -- seemingly invincible -- were often unable to adapt to paradigm changes in technology, and a lot of those early leaders do not even exist today.
Paul Niquette is sometimes credited with coining the word 'software.' However, its first use in print was by John Tukey in a 1958 article for the American Mathematical Monthly, and thus some attribute the term to him. We'll never know if they meant the same thing. Software in computers is -- well, anything that isn't hardware. Software includes websites such as this one, computer programs, video games, etc., coded in programming languages like C, C++, etc. Hierarchically, software includes the following elements:
Firmware, which is software resident in electrically programmable memory devices on mainboards or other integrated hardware carriers. A typical firmware-driven device is a TV remote control. However, back in the 1950s, people actually had to stand up, walk to the TV set and change the channel using a dial switch (hardware). Can you imagine?
System software is software that basically makes the computer work, such as operating systems, which interface with hardware to provide the necessary services for application software. Besides operating systems, other examples are anti-virus software, communication software and printer drivers.
Operating Systems (commonly abbreviated to either OS or O/S) is an interface between hardware and user; it is responsible for the management and coordination of activities and the sharing of the limited resources of the computer. The operating system acts as a host for applications that are run on the machine. As a host, one of the purposes of an operating system is to handle the details of the operation of the hardware. This relieves application programs from having to manage these details and makes it easier to write applications. Almost all computers, including handheld computers, cell phones, desktop computers, supercomputers, and even video game consoles, use an operating system of some type.
Application software, such as word processors, which allow people to write ridiculous volumes of text for use in reports, newspapers, etc. Just like having a paintbrush does not make you an artist, having word processing software does not make you a writer. There is no substitute for learning good writing skills. A book written well before computers were dreamed of is "The Elements of Style", written by William Strunk in 1918. White had written a piece for The New Yorker lauding Professor Strunk and his devotion to "lucid" English prose; Strunk himself had died in 1946, and Macmillan and Company commissioned E. B. White to recast a new edition of The Elements of Style, published in 1959. White described it as a "forty-three-page summation of the case for cleanliness, accuracy, and brevity in the use of English." In this revision, White independently expanded and modernized the 1918 work, creating the handbook now known to millions of writers and students as, simply, "Strunk and White." White's first edition sold some two million copies, with total sales of three editions (over a span of four decades) surpassing ten million copies.