
Brief History of Electronics

Maxwell's theory of electromagnetism was developed in the 1870s.

About fifteen years later the German physicist Heinrich Hertz generated radio waves. He supplied an electric charge to a capacitor and an inductor, and then short-circuited the capacitor through a spark gap. The charge surging back and forth created an oscillating electric discharge, and some of the energy of this oscillation was radiated from the spark gap in the form of electromagnetic waves at radio frequencies (a sketch of the relation governing the frequency of such an oscillation follows this paragraph). Practical radio communication required good amplifiers, but these were not available until the development of the vacuum tube, which marks the real beginning of electronics. The vacuum tube traces its development to a discovery made by the American inventor Thomas Alva Edison: a current will flow between the hot filament of an incandescent lamp and another electrode placed in the same lamp, and this current will flow in only one direction. In 1904 the British physicist and electrical engineer John Ambrose Fleming exploited this one-way conduction in a practical device. Because the device consisted of an anode and a cathode, it was called a diode, and it was used as a radio detector. A revolutionary advance, the one that made the science of electronics possible, came in 1906 when the American inventor Lee De Forest mounted a third element, the grid, between the filament and the plate of a vacuum tube.
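As a brief technical aside (not part of the original historical account), the frequency of the oscillation in a capacitor-inductor circuit like the one Hertz discharged is set by the standard LC resonance relation; the component values below are purely illustrative:

\[
  f = \frac{1}{2\pi\sqrt{LC}}, \qquad \text{e.g. } L = 1\,\mu\mathrm{H},\; C = 100\,\mathrm{pF} \;\Rightarrow\; f = \frac{1}{2\pi\sqrt{10^{-6}\cdot 10^{-10}}}\,\mathrm{Hz} \approx 16\,\mathrm{MHz}.
\]

A spark-gap discharge is heavily damped rather than a steady sine wave, so this relation gives only a rough guide to the frequencies such early apparatus produced.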

De Forest's tube, which he called an audion, is now known as the triode (three-element tube). It was first used only as a detector, but its potential as an amplifier and oscillator was soon developed. The rectifying properties of crystals, an early form of solid-state device, were discovered in 1912 by the American electrical engineer Pickard. Improvements in electronics continued through the 1930s and were especially rapid during World War II. The development of the cavity magnetron greatly improved the ability of the Allies to detect enemy ships and planes, and airborne radar even provided a means of bombing without actually seeing the target. The beginnings of electronic countermeasures can also be traced to this period, including the use of "Window" (chaff) to hinder radar detection in 1943. Another significant development was the decoding of enemy radio transmissions, one of the factors that greatly sped up the development of the computer. Encoding schemes that both Germany and Japan believed unbreakable were broken, with disastrous consequences for the armed forces of those countries.

Howard H. Aiken, a Harvard engineer working with IBM, succeeded in producing a large-scale electromechanical calculator, the Harvard Mark I, by 1944. The purpose of this machine was to create ballistic charts for the U.S. Navy. It was about half as long as a football field and contained about 500 miles of wiring.

The machine was slow (taking 3-5 seconds per calculation) and inflexible (its sequence of calculations could not be changed). Another computer development spurred by the war was the Electronic Numerical Integrator and Computer (ENIAC), produced by a partnership between the U.S. government and the University of Pennsylvania. Consisting of 18,000 vacuum tubes, 70,000 resistors and 5 million soldered joints, the computer was such a massive piece of machinery that it consumed 160 kilowatts of electrical power, enough to dim the lights in an entire section of Philadelphia.

In the mid-1940s John von Neumann (1903-1957) joined the University of Pennsylvania team, introducing concepts in computer design that remained central to computer engineering for the next 40 years. In 1951, the UNIVAC I (Universal Automatic Computer), built by Remington Rand, became one of the first commercially available computers to take advantage of these advances.

In 1948, the invention of the transistor greatly changed the electronics industry. The transistor replaced the large, cumbersome vacuum tube in televisions, radios and computers, and the size of electronic machinery has been shrinking ever since. Coupled with early advances in magnetic-core memory, transistors led to the second-generation computers that began to appear in 1956 and were smaller, faster, more reliable and more energy-efficient than their predecessors. Throughout the early 1960s, a number of commercially successful second-generation computers were used in business, universities, and government. The next major advances were the invention of the integrated circuit and the microprocessor.

Though transistors were clearly an improvement over the vacuum tube, they still generated a great deal of heat, which damaged a computer's sensitive internal parts. Jack Kilby, an engineer with Texas Instruments, developed the integrated circuit (IC) in 1958; his first IC combined three electronic components on a small chip of germanium. In the following year, Robert Noyce of Fairchild Semiconductor independently developed an integrated circuit fabricated with techniques similar to those used today. By the 1980s, very-large-scale integration (VLSI) squeezed hundreds of thousands of components onto a single chip, and ultra-large-scale integration (ULSI) pushed that number into the millions. The ability to fit so much onto an area about half the size of a U.S. dime helped shrink the size and price of computers while increasing their power, efficiency and reliability.

The Intel 4004 chip, developed in 1971, took the integrated circuit one step further by placing the central processing unit of a computer on a single minuscule chip. Whereas earlier integrated circuits had to be manufactured for a special purpose, a microprocessor could be manufactured once and then programmed to meet any number of demands. The 1980s saw a great expansion in computer use as personal computers became more affordable. The number of personal computers in use more than doubled from 2 million in 1981 to 5.5 million in 1982. Ten years later, 65 million PCs were in use, and the number has continued to grow rapidly.
