By John Kopplin
The first computers were people! That is, electronic computers (and the earlier mechanical computers) were given this name because they performed the work that had previously been assigned to people. "Computer" was originally a job title: it was used to describe those human beings (predominantly women) whose job it was to perform the repetitive calculations required to compute such things as navigational tables, tide charts, and planetary positions for astronomical almanacs. Imagine you had a job where hour after hour, day after day, you were to do nothing but compute multiplications. Boredom would quickly set in, leading to carelessness, leading to mistakes. And even on your best days you wouldn't be producing answers very fast. Therefore, inventors have been searching for hundreds of years for a way to mechanize (that is, find a mechanism that can perform) this task.
This picture shows what were known as "counting tables" [photo courtesy IBM]
A more modern abacus. Note how the abacus is really just a representation of the human fingers: the 5 lower rings on each rod represent the 5 fingers and the 2 upper rings represent the 2 hands.
A slide rule
Leonardo da Vinci (1452-1519) made drawings of gear-driven calculating machines but apparently never built any.
A 6 digit model for those who couldn't afford the 8 digit model
A Pascaline opened up so you can observe the gears and cylinders which rotated to display the numerical result
Just a few years after Pascal, the German Gottfried Wilhelm Leibniz (co-inventor with Newton of calculus) managed to build a four-function (addition, subtraction, multiplication, and division) calculator that he called the stepped reckoner because, instead of gears, it employed fluted drums having ten flutes arranged around their circumference in a stair-step fashion. Although the stepped reckoner employed the decimal number system (each drum had 10 flutes), Leibniz was the first to advocate use of the binary number system, which is fundamental to the operation of all modern computers.
Leibniz's Stepped Reckoner (have you ever heard "calculating" referred to as "reckoning"?)
In 1801 the Frenchman Joseph Marie Jacquard invented a power loom that could base its weave (and hence the design on the fabric) upon a pattern automatically read from punched wooden cards, held together in a long row by rope. Descendants of these punched cards have been in use ever since (remember the "hanging chad" from the Florida presidential ballots of the year 2000?).
By selecting particular cards for Jacquard's loom you defined the woven pattern [photo © 2002 IEEE]
A small section of the type of mechanism employed in Babbage's Difference Engine [photo © 2002 IEEE]
A few Hollerith desks still exist today [photo courtesy The Computer Museum]
Incidentally, the Hollerith census machine was the first machine to ever be featured on a magazine cover.
IBM continued to develop mechanical calculators for sale to businesses to help with financial accounting and inventory accounting. One characteristic of both financial accounting and inventory accounting is that although you need to subtract, you don't need negative numbers, and you really don't have to multiply, since multiplication can be accomplished via repeated addition.

But the U.S. military desired a mechanical calculator more optimized for scientific computation. By World War II the U.S. had battleships that could lob shells weighing as much as a small car over distances up to 25 miles. Physicists could write the equations that described how atmospheric drag, wind, gravity, muzzle velocity, etc. would determine the trajectory of the shell, but solving such equations was extremely laborious. This was the work performed by the human computers, and their results were published as ballistic "firing tables" in gunnery manuals. During World War II the U.S. military scoured the country looking for (generally female) math majors to hire for the job of computing these tables, but not enough humans could be found to keep up with the need for new tables. Sometimes artillery pieces had to be delivered to the battlefield without the necessary firing tables, which made them close to useless because they couldn't be aimed properly.

Faced with this situation, the U.S. military was willing to invest in even harebrained schemes to automate this type of computation. One early success was the Harvard Mark I computer, built as a partnership between Harvard and IBM in 1944. This was the first programmable digital computer made in the U.S. But it was not a purely electronic computer. Instead the Mark I was constructed out of switches, relays, rotating shafts, and clutches. The machine weighed 5 tons, incorporated 500 miles of wire, was 8 feet tall and 51 feet long, and had a 50 ft rotating shaft running its length, turned by a 5 horsepower electric motor. The Mark I ran non-stop for 15 years, sounding like a roomful of ladies knitting. To appreciate the scale of this machine note the four typewriters in the foreground of the photo below.
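To get a concrete feel for the arithmetic those human computers ground through, here is a minimal modern sketch in Python. Everything in it is made up for illustration (the drag constant, the time step, the muzzle velocity); real firing tables rested on far more detailed atmospheric models. It simply steps a shell forward through time under gravity and a simple drag force:

    import math

    def shell_range(muzzle_velocity, elevation_deg, drag=0.0001, dt=0.01):
        """Toy model: horizontal distance flown under gravity plus
        drag proportional to speed squared."""
        angle = math.radians(elevation_deg)
        vx = muzzle_velocity * math.cos(angle)
        vy = muzzle_velocity * math.sin(angle)
        x = y = 0.0
        while y >= 0.0:
            speed = math.hypot(vx, vy)
            vx -= drag * speed * vx * dt           # drag opposes horizontal motion
            vy -= (9.81 + drag * speed * vy) * dt  # gravity, plus drag vertically
            x += vx * dt
            y += vy * dt
        return x

    # One slice of a toy "firing table": range versus barrel elevation at 800 m/s.
    for elevation in (15, 30, 45, 60):
        print(elevation, "degrees:", round(shell_range(800, elevation)), "m")

Every entry in a real firing table represented hours of exactly this kind of step-by-step hand arithmetic.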
A central shaft driven by an outside waterwheel and connected to each machine by overhead belts was the customary power source for all the machines in a factory
One of the four paper tape readers on the Harvard Mark I (you can observe the punched paper roll emerging from the bottom)
One of the primary programmers for the Mark I was a woman, Grace Hopper. Hopper found the first computer "bug": a dead moth that had gotten into the Mark I and whose wings were blocking the reading of the holes in the paper tape. The word "bug" had been used to describe a defect since at least 1889 but Hopper is credited with coining the word "debugging" to describe the work to eliminate program faults.
(That's just the operator's console; here's the rest of its 33-foot length:)
…to be bested by a home computer of 1976 such as this Apple I, which sold for only $600:
The Apple I, which was sold as a do-it-yourself kit (without the lovely case seen here)
Typical wiring in an early mainframe computer [photo courtesy The Computer Museum]
Two views of ENIAC: the "Electronic Numerical Integrator and Calculator" (note that it wasn't even given the name of computer since "computers" were people) [U.S. Army photo]
To reprogram the ENIAC you had to rearrange the patch cords that you can observe on the left in the prior photo, and the settings of 3000 switches that you can observe on the right. To program a modern computer, you type out a program with statements like:

    Circumference = 3.14 * diameter

To perform this computation on ENIAC you had to rearrange a large number of patch cords and then locate three particular knobs on that vast wall of knobs and set them to 3, 1, and 4.
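For contrast, that statement really is the whole program on a modern machine. A trivial runnable version (Python, purely for illustration):

    # The modern equivalent of ENIAC's patch cords and knobs is a line of text.
    diameter = 10.0
    circumference = 3.14 * diameter
    print(circumference)

Changing the constant from 3.14 to 3.14159 is a couple of keystrokes here; on ENIAC it meant locating and resetting still more knobs on that wall.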
ILLIAC II, built at the University of Illinois (it is a good thing computers were one-of-a-kind creations in those days; can you imagine being asked to duplicate this?)
HAL from the movie "2001: A Space Odyssey". Look at the previous picture to understand why the movie makers in 1968 assumed computers of the future would be things you walk into.

Another of these one-of-a-kind machines, JOHNNIAC, took its name from John von Neumann, who was unquestionably a genius. At age 6 he could tell jokes in classical Greek. By 8 he was doing calculus. He could recite books he had read years earlier word for word. He could read a page of the phone directory and then recite it backwards. On one occasion it took von Neumann only 6 minutes to solve a problem in his head that another professor had spent hours on using a mechanical calculator. Von Neumann is perhaps most famous (infamous?) as the man who worked out the complicated method needed to detonate an atomic bomb.

Once the computer's program was represented electronically, modifications to that program could happen as fast as the computer could compute. In fact, computer programs could now modify themselves while they ran (such programs are called self-modifying programs; a toy sketch appears below). This introduced a new way for a program to fail: faulty logic in the program could cause it to damage itself. This is one source of the general protection fault made famous by MS-DOS programs and of the blue screen of death made famous by Windows.

Today, one of the most notable characteristics of a computer is that its ability to be reprogrammed allows it to contribute to endeavors as unrelated as: the creation of special effects for movies; the compression of music so that more minutes of it fit within the limited memory of an MP3 player; the monitoring of car tire rotation to detect and prevent skids in an anti-lock braking system (ABS); and the analysis of the writing style in Shakespeare's work with the goal of determining whether a single individual really was responsible for all those pieces.
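As promised above, here is a toy sketch of a self-modifying program. It is hypothetical Python, not the code of any historical machine, but it captures the stored-program idea: instructions and data sit in one shared memory, so a store aimed at the wrong cell can overwrite an instruction just as easily as a number.

    # Memory holds both instructions (tuples) and data (plain numbers).
    memory = [
        ("add", 4, 5, 6),         # cell 0: memory[6] = memory[4] + memory[5]
        ("store", ("halt",), 2),  # cell 1: overwrites the INSTRUCTION in cell 2
        ("add", 6, 5, 6),         # cell 2: would add 3 more, but never gets to run
        ("halt",),                # cell 3
        2,                        # cell 4: data
        3,                        # cell 5: data
        0,                        # cell 6: result
    ]

    pc = 0  # program counter
    while True:
        op = memory[pc]
        if op[0] == "halt":
            break
        if op[0] == "add":
            _, a, b, dest = op
            memory[dest] = memory[a] + memory[b]
        elif op[0] == "store":
            _, value, dest = op
            memory[dest] = value  # nothing stops dest from pointing at an instruction
        pc += 1

    print(memory[6])  # prints 5, not 8: the add in cell 2 was destroyed before it ran

Here the self-modification was deliberate, but a buggy program can do the same thing to itself by accident, which is exactly the kind of self-inflicted damage described above.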
By the end of the 1950's computers were no longer one-of-a-kind hand-built devices owned only by universities and government research labs. Eckert and Mauchly, the designers of ENIAC, left the University of Pennsylvania over a dispute about who owned the patents for their invention, and decided to set up their own company. Their first product was the famous UNIVAC computer, the first commercial (that is, mass produced) computer. In the 1950's, UNIVAC (a contraction of "Universal Automatic Computer") was the household word for "computer" just as "Kleenex" is for "tissue". The first UNIVAC was sold, appropriately enough, to the Census Bureau. UNIVAC was also the first computer to employ magnetic tape. Many people still confuse a picture of a reel-to-reel tape recorder with a picture of a mainframe computer.
There were 2 ways to interact with a mainframe. The first was called time sharing because the computer gave each user a tiny sliver of time in a round-robin fashion. Perhaps 100 users would be simultaneously logged on, each typing on a teletype such as the following:
The Teletype was the standard mechanism used to interact with a time-sharing computer
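To make "round-robin" concrete, here is a toy sketch in Python. The user "jobs" are just counters standing in for real teletype sessions, and the numbers are invented:

    from collections import deque

    def run_round_robin(jobs, slice_size=3):
        """jobs maps each user to the number of work units their job needs."""
        queue = deque(jobs.items())
        while queue:
            user, remaining = queue.popleft()
            done = min(slice_size, remaining)
            print(user, "ran", done, "unit(s),", remaining - done, "left")
            if remaining - done > 0:
                queue.append((user, remaining - done))  # rejoin the back of the line

    run_round_robin({"alice": 5, "bob": 2, "carol": 7})

Because each slice is tiny, every user sees the machine respond almost immediately, even though it is strictly doing one thing at a time.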
After observing the holes in paper tape it is perhaps obvious why all computers use binary numbers to represent data: a binary bit (that is, one digit of a binary number) can only have the value of 0 or 1 (just as a decimal digit can only have the value of 0 through 9). Something which can only take two states is very easy to manufacture, control, and sense. In the case of paper tape, the hole has either been punched or it has not. Electro-mechanical computers such as the Mark I used relays to represent data because a relay (which is just a motor-driven switch) can only be open or closed. The earliest all-electronic computers used vacuum tubes as switches: they too were either open or closed. Transistors replaced vacuum tubes because they too could act as switches but were smaller, cheaper, and consumed less power.

Paper tape has a long history as well. It was first used as an information storage medium by Sir Charles Wheatstone, who used it to store Morse code that was arriving via the newly invented telegraph (incidentally, Wheatstone also invented the concertina, a close cousin of the accordion).

The alternative to time sharing was batch mode processing, where the computer gives its full attention to your program. In exchange for getting the computer's full attention at run-time, you had to agree to prepare your program off-line on a key punch machine which generated punch cards.
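Whether the medium was paper tape, punch cards, relays, tubes, or transistors, the representation underneath was the same two-state code. Here is a small illustrative sketch in Python; the 8-bit-per-character encoding is a modern convention, not the code of any particular historical tape:

    def tape_row(ch):
        """Render a character as punched ('o') and unpunched ('.') positions."""
        bits = format(ord(ch), "08b")  # e.g. 'A' -> '01000001'
        return "".join("o" if b == "1" else "." for b in bits)

    for ch in "HI":
        print(ch, tape_row(ch))
    # H .o..o...
    # I .o..o..o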
An IBM Key Punch machine which operates like a typewriter except it produces punched cards rather than a printed sheet of paper
University students in the 1970's bought blank cards a linear foot at a time from the university bookstore. Each card could hold only 1 program statement. To submit your program to the mainframe, you placed your stack of cards in the hopper of a card reader. Your program would be run whenever the computer made it that far. You often submitted your deck and then went to dinner or to bed and came back later hoping to see a successful printout showing your results. Obviously, a program run in batch mode could not be interactive. But things changed fast. By the 1990's a university student would typically own his own computer and have exclusive use of it in his dorm room.
Bibliography: "ENIAC: The Triumphs and Tragedies of the World's First Computer" by Scott McCartney.