
Moore's Law

The observation made in 1965 by Gordon Moore, co-founder of Intel, that the number of transistors per square inch on integrated circuits had doubled every year since the integrated circuit was invented. Moore predicted that this trend would continue for the foreseeable future. In subsequent years the pace slowed, but data density has doubled approximately every 18 months, and this is the current definition of Moore's Law, which Moore himself has blessed. Most experts, including Moore, expect Moore's Law to hold for at least another two decades.

Moore's Law is a computing term that originated around 1970. The simplified version of the law states that processor speeds, or overall processing power for computers, will double every two years. A quick check among technicians at different computer companies shows that the term itself is not widely used, but the rule is still accepted.
To break the law down further, it specifically states that the number of transistors on an affordable CPU will double every two years (essentially the same statement as above), but counting transistors is the more accurate formulation.
If you were to look at processor speeds from the 1970s to 2009, you might think that the law has reached, or is nearing, its limit. In the 1970s, processor speeds ranged from 740 KHz to 8 MHz; note that the 740 is KHz (kilohertz) while the 8 is MHz (megahertz).
From 2000 to 2009 there has not been much of a speed difference: clock speeds ranged from 1.3 GHz to 2.8 GHz, which suggests that speeds barely doubled within a ten-year span. This is because we are looking at clock speeds rather than the number of transistors; in 2000 the number of transistors in the CPU was 37.5 million, while by 2009 the number had risen to an outstanding 904 million. This is why it is more accurate to apply the law to transistors than to speed.
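To make the contrast concrete, here is a minimal sketch in Python, using only the figures quoted above, that computes the implied doubling period for clock speed versus transistor count over the 2000-2009 span:

```python
import math

def doubling_period(start, end, years):
    """Years per doubling, assuming smooth exponential growth."""
    return years / math.log2(end / start)

# Clock speed, 2000-2009: 1.3 GHz -> 2.8 GHz
print(doubling_period(1.3, 2.8, 9))       # ~8.1 years per doubling

# Transistor count, 2000-2009: 37.5 million -> 904 million
print(doubling_period(37.5e6, 904e6, 9))  # ~2.0 years per doubling
```

Transistor counts kept to roughly the two-year doubling Moore described, while clock speeds fell far behind it.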
With all this talk of transistors, the average technician or computer user may not grasp what the figures mean. A simpler way to explain it is that the earlier CPUs on the market had a single speed or frequency rating, while the newer models carry a rating that refers to more than one CPU core.
If you've purchased a computer recently, you may have an idea of what this means, as salespeople have likely sold you on the wonders of multi-core CPUs. In the example above, speeds over many years went from 1.3 GHz to 2.8 GHz, which is barely double; what needs to be kept in mind is that the 2.8 GHz chip is a QUAD CORE while the 1.3 GHz chip is a single CORE. A rough measure of the 2.8 GHz chip's aggregate power is found by multiplying by four, which gives a whopping 11.2, a far cry from 1.3.
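The arithmetic behind that claim, as a sketch only; real multi-core scaling is rarely this clean:

```python
# Aggregate throughput as the text computes it: clock speed x core count.
# This is an upper bound; real workloads rarely scale perfectly across cores.
single_core = 1.3 * 1    # 1.3 GHz, one core
quad_core   = 2.8 * 4    # 2.8 GHz, four cores
print(quad_core)                 # 11.2 "GHz-equivalent"
print(quad_core / single_core)   # ~8.6x the single-core figure
```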
Due to the rapid rate at which technology has grown in the past few years, most computer technicians you speak with, whether they have heard of Moore's Law or not, will tell you that CPU speeds double each year. Though Moore's Law says every two years, this rapid pace of technological production has shortened the period in the minds of technicians and users alike.
The limitation is that once transistors can be made as small as atomic particles, there will be no more room for growth in the CPU market where speeds are concerned.


His prediction has proven to be accurate, in part because the law is now used in the semiconductor industry to guide long-term planning and to set targets for research and development.[4] The capabilities of many digital electronic devices are strongly linked to Moore's law: quality-adjusted microprocessor prices,[5] memory capacity, sensors, and even the number and size of pixels in digital cameras.[6] All of these are improving at roughly exponential rates as well. This exponential improvement has dramatically enhanced the impact of digital electronics in nearly every segment of the world economy.[7] Moore's law describes a driving force of technological and social change, productivity, and economic growth in the late 20th and early 21st centuries.[8][9][10][11]

The period is often quoted as 18 months because of Intel executive David House, who predicted that chip performance would double every 18 months (a combination of the effect of more transistors and the transistors being faster).[12]
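A back-of-the-envelope check, a sketch assuming the two growth rates simply multiply, shows how much of House's 18-month figure is attributable to faster transistors rather than more of them:

```python
import math

# Performance = (transistor count) x (per-transistor speed).
# Counts double every 24 months; House's combined figure is 18 months.
monthly_count_growth = 2 ** (1 / 24)
monthly_perf_growth  = 2 ** (1 / 18)
monthly_speed_growth = monthly_perf_growth / monthly_count_growth
print(1 / math.log2(monthly_speed_growth))  # 72.0 months per speed doubling
```

Under that multiplicative assumption, transistor speed alone would need to double only about every six years.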

Although this trend has continued for more than half a century, Moore's law should be considered an observation or conjecture and not a physical or natural law. Sources in 2005 expected it to continue until at least 2015 or 2020.[note 1][14] However, the 2010 update to the International Technology Roadmap for Semiconductors predicted that growth would slow at the end of 2013,[15] when transistor counts and densities are to double only every three years.

The term "Moore's law" was coined around 1970 by the Caltech professor, VLSI pioneer, and entrepreneur Carver Mead in reference to a statement by Gordon E. Moore.[2][16] Predictions of similar increases in computer power had existed years prior. Moore may have heard Douglas Engelbart, a co-inventor of today's mechanical computer mouse, discuss the projected downscaling of integrated circuit size in a 1960 lecture.[17] A New York Times article published August 31, 2009, credits Engelbart as having made the prediction in 1959.[18]

Moore's original statement that transistor counts had doubled every year can be found in his publication "Cramming more components onto integrated circuits", Electronics Magazine, 19 April 1965. The paper noted that the number of components in integrated circuits had doubled every year from the invention of the integrated circuit in 1958 until 1965,[19] and then concluded:

"The complexity for minimum component costs has increased at a rate of roughly a factor of two per year. Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built on a single wafer."[1]
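Moore's 1975 figure follows from straight doubling. A minimal sketch, assuming a single component at a 1959 starting point (an assumption, but one from which both the roughly 64 components of 1965 and the 65,000 of 1975 fall out of annual doubling):

```python
# Annual doubling from one component in 1959, as the passage above implies.
count = 1
for year in range(1960, 1976):  # sixteen doublings, 1960 through 1975
    count *= 2
    if year in (1965, 1975):
        print(year, count)      # 1965: 64    1975: 65536 ("65,000")
```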

Moore slightly altered the formulation of the law over time, in retrospect bolstering the perceived accuracy of his law.[20] Most notably, in 1975, Moore altered his projection to a doubling every two years.[21][22] Despite popular misconception, he is adamant that he did not predict a doubling "every 18 months." However, David House, an Intel colleague, had factored in the increasing performance of transistors to conclude that integrated circuits would double in performance every 18 months.[note 2]

In April 2005, Intel offered US$10,000 to purchase a copy of the original Electronics Magazine issue in which Moore's article appeared.[24] An engineer living in the United Kingdom was the first to find a copy and offer it to Intel.[25]

On 13 April 2005, Gordon Moore stated in an interview that the law cannot be sustained indefinitely: "It can't continue forever. The nature of exponentials is that you push them out and eventually disaster happens." He also noted that transistors would eventually reach the limits of miniaturization at atomic levels:

"In terms of size [of transistors] you can see that we're approaching the size of atoms, which is a fundamental barrier, but it'll be two or three generations before we get that far -- but that's as far out as we've ever been able to see. We have another 10 to 20 years before we reach a fundamental limit. By then they'll be able to make bigger chips and have transistor budgets in the billions."
In January 1995, the Digital Alpha 21164 microprocessor had 9.3 million transistors. This 64-bit processor was a technological spearhead at the time, even if the circuit's market share remained average. Six years later, a state-of-the-art microprocessor contained more than 40 million transistors. It is theorised that with further miniaturisation, by 2015 these processors should contain more than 15 billion transistors, and by 2020 will be in molecular-scale production, where each molecule can be individually positioned.[103]
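A rough check of those figures, a sketch using only the two data points quoted above:

```python
import math

# Implied doubling period between the two data points above.
transistors_1995 = 9.3e6   # Digital Alpha 21164, January 1995
transistors_2001 = 40e6    # a state-of-the-art microprocessor six years later
doublings = math.log2(transistors_2001 / transistors_1995)
print(6 / doublings)       # ~2.85 years per doubling over that stretch
```

Reaching 15 billion transistors by 2015 would have required a considerably faster pace than those six years exhibited.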

In 2003, Intel predicted the end would come between 2013 and 2018 with 16-nanometer manufacturing processes and 5-nanometer gates, due to quantum tunnelling, although others suggested chips could simply get bigger, or become layered.[104] In 2008 it was noted that for the last 30 years it had been predicted that Moore's law would last at least another decade.[91]

Some see the limits of the law as lying in the distant future. Lawrence Krauss and Glenn D. Starkman announced an ultimate limit of around 600 years in their paper,[105] based on a rigorous estimation of the total information-processing capacity of any system in the Universe, which is limited by the Bekenstein bound. On the other hand, based on first principles, there are predictions that Moore's law will collapse in the next few decades (20 to 40 years).[106][107]

One could also limit the theoretical performance of a rather practical "ultimate laptop" with a mass of one kilogram and a volume of one litre. This is done by considering the speed of light, the quantum scale, the gravitational constant, and the Boltzmann constant, giving a performance of 5.4258 × 10^50 logical operations per second on approximately 10^31 bits.[108]
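The headline number can be reproduced from the Margolus-Levitin bound, which caps a system of energy E at 2E/(πℏ) operations per second. A sketch, assuming all of the laptop's one-kilogram mass counts as energy via E = mc²:

```python
import math

m    = 1.0               # kg, the "ultimate laptop's" mass
c    = 2.99792458e8      # m/s, speed of light
hbar = 1.054571817e-34   # J*s, reduced Planck constant

E = m * c**2                    # total energy if all mass is used
ops = 2 * E / (math.pi * hbar)  # Margolus-Levitin operation rate
print(f"{ops:.4e}")             # ~5.426e50 logical operations per second
```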

Then again, the law has often met obstacles that first appeared insurmountable but were surmounted before long. In that sense, Moore says he now sees his law as more beautiful than he had realized: "Moore's law is a violation of Murphy's law. Everything gets better and better."[109]




[Figure: Kurzweil's extension of Moore's law from integrated circuits to earlier transistors, vacuum tubes, relays, and electromechanical computers.]

[Figure: If the current trend continues to 2020, the number of transistors would reach 32 billion.]

Technological change is a combination of more and of better technology. A 2011 study in the journal Science showed that the peak rate of change of the world's capacity to compute information was in the year 1998, when the world's technological capacity to compute information on general-purpose computers grew at 88% per year.[114] Since then, technological change has clearly slowed. In recent times, every new year allowed mankind to carry out roughly 60% of the computations that could possibly have been executed by all existing general-purpose computers before that year.[114] This is still exponential, but it shows the varying nature of technological change.[115]
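To see why a constant 60% fraction is still exponential growth, consider a small sketch: if each year's new computations equal 60% of everything computed before, the cumulative total grows by a fixed factor of 1.6 per year.

```python
# Each year contributes 60% of the cumulative total computed so far,
# so the cumulative total grows geometrically by a factor of 1.6 per year.
cumulative = 1.0
for year in range(1, 6):
    cumulative += 0.6 * cumulative
    print(year, round(cumulative, 3))  # 1.6, 2.56, 4.096, 6.554, 10.486
```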

The primary driving force of economic growth is the growth of productivity,[116] and Moore's law factors into productivity. Moore (1995) expected that the rate of technological progress would come to be controlled by financial realities.[26] However, the reverse could and did occur around the late 1990s, with economists reporting that "productivity growth is the key economic indicator of innovation."[11] An acceleration in the rate of semiconductor progress contributed to a surge in US productivity growth,[117][118][119] which reached 3.4% per year in 1997-2004, outpacing the 1.6% per year during both 1972-1996 and 2005-2013.[120] As economist Richard G. Anderson notes, "Numerous studies have traced the cause of the productivity acceleration to technological innovations in the production of semiconductors that sharply reduced the prices of such components and of the products that contain them (as well as expanding the capabilities of such products)."
If you were to chart the evolution of the computer in terms of processing power, you would see that progress has been exponential. The man who first made this famous observation is Gordon Moore, a co-founder of the microprocessor company Intel. Computer scientists, electrical engineers, manufacturers, and journalists extrapolated Moore's Law from his original observation. In general, most people interpret Moore's Law to mean that the number of transistors on a 1-inch (2.5-centimeter) diameter of silicon doubles every x number of months.
The number of months shifts as conditions in the microprocessor market change. Some people say it takes 18 months and others say 24. Some interpret the law to be about the doubling of processing power, not the number of transistors. And the law sometimes seems to be more of a self-fulfilling prophecy than an actual law, principle, or observation.

The discovery of semiconductors, the invention of transistors and the creation of the integrated circuit are what make Moore's Law -- and by extension modern electronics -- possible. Before the invention of the transistor, the most widely used element in electronics was the vacuum tube. Electrical engineers used vacuum tubes to amplify electrical signals. But vacuum tubes had a tendency to break down, and they generated a lot of heat, too.
Bell Laboratories began looking for an alternative to vacuum tubes to stabilize and strengthen the growing national telephone network in the 1930s. In 1945, the lab concentrated on finding a way to take advantage of semiconductors. A semiconductor is a material that can act as both a conductor and an insulator. Conductors are materials that permit the flow of electrons -- they conduct electricity. Insulators have an atomic structure that inhibits electron flow. Semiconductors can do both.
The control of the flow of electrons is what makes electronics work. Finding a way to harness the unique nature of semiconductors became a high priority for Bell Labs. In 1947, John Bardeen and Walter Brattain built the first working transistor. The transistor is a device designed to control electron flows -- it has a gate that, when closed, prevents electrons from flowing through the transistor. This basic idea is the foundation for the way practically all electronics work.
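As a toy illustration of that foundation (a sketch only; real transistors are analog devices, not Boolean functions), modeling each transistor as a gated switch is enough to build logic:

```python
def transistor(gate_open: bool, current_in: bool) -> bool:
    """Toy model: current passes through only while the gate is open."""
    return current_in and gate_open

# Two such switches in series form an AND gate: current reaches the
# output only if both gates are open.
def and_gate(a: bool, b: bool) -> bool:
    return transistor(b, transistor(a, True))

for a in (False, True):
    for b in (False, True):
        print(a, b, and_gate(a, b))
```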
Early transistors were huge compared to the transistors manufacturers produce today. The very
first one was half an inch (1.3 centimeters) tall. But once engineers learned how to build a
working transistor, the race was on to build them better and smaller. For the first few years,
transistors existed only in scientific laboratories as engineers improved the design.
In 1958, Jack Kilby made the next huge contribution to the world of electronics: the integrated
circuit. Earlier electric circuits consisted of a series of individual components. Electrical
engineers would construct each piece and then attach them to a foundation called a substrate.
Kilby experimented with building a circuit out of a single piece of semiconductor material and
overlaying the metal parts necessary to connect the different pieces of circuitry on top of it. The
result was an integrated circuit.
The next big development was the planar transistor. To make a planar transistor, components
are etched directly onto a semiconductor substrate. This makes some parts of the substrate higher
than others. Then you apply an evaporated metal film to the substrate. The film adheres to the
raised portions of the semiconductor material, coating it in metal. The metal creates the
connections between the different components that allow electrons to flow from one component
to another. It's almost like printing a circuit directly onto a semiconductor wafer.

While Moore's original observation focused on technological advances and the economics
behind producing circuits, many people reduce his observation to the simple statement we call
Moore's Law. The most common version of Moore's Law is that the number of transistors on a
circuit doubles every 18 (or 24) months. Remarkably, this prediction has held true -- today,
Intel's Core i7 microprocessor has 731 million transistors, while its Xeon processor has 1.9
billion transistors [source: Intel].
