What is Moore's Law?
Moore’s Law Definition
Moore’s Law refers to a prediction made in 1965 by Gordon E. Moore, cofounder of Intel, that the number of transistors that can be packed onto an integrated circuit of a given size would double roughly every two years, while the cost of that computing power would be halved.
When Moore made this prediction, he had not set out to coin a law or state a truism; nonetheless, time proved his assumption not only accurate but conservative, as transistor counts doubled faster than he had projected. Today, the doubling period for transistor counts is commonly cited as around 18 months.
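As a back-of-the-envelope illustration, the law describes simple exponential growth: a starting count N0 that doubles every T years reaches N0 * 2^(t/T) after t years. Below is a minimal Python sketch of that arithmetic; the starting count of one million transistors is an assumed round number for illustration, not a real chip.

```python
def projected_transistors(n0: float, years: float, doubling_period: float) -> float:
    """Project a transistor count forward assuming a fixed doubling period."""
    return n0 * 2 ** (years / doubling_period)

# Compare Moore's original two-year rate with the ~18-month rate,
# starting from an assumed 1 million transistors.
for years in (2, 6, 10):
    two_year = projected_transistors(1e6, years, 2.0)   # doubling every 2 years
    fast = projected_transistors(1e6, years, 1.5)       # doubling every 18 months
    print(f"after {years:>2} years: {two_year:,.0f} (2-yr) vs {fast:,.0f} (18-mo)")
```

After a decade, the 18-month rate yields roughly 100x the starting count versus about 32x at the two-year rate, which is why the exact doubling period matters so much in practice.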
Moore’s Law in Simple Terms
Moore’s Law has been a guiding force in the semiconductor industry for planning research and development goals.
In turn, it has led to the prevalence of cheap, microscopic transistors that have shaped nearly every facet of society.
From consumer smartphones to weather forecasting to life-saving hospital equipment, every economic sector has seen improvements in productivity and efficiency due to the shrinking size of transistors.
Limitations of Moore’s Law
There is a limit to Moore’s Law, however.
As transistors approach the size of a single atom, their behavior becomes unreliable: at that scale, electrons can tunnel through the barriers meant to confine them, so a transistor can no longer switch cleanly on and off.
In a 2005 interview, Moore himself stated that his law “can’t continue forever.”
Most experts agree, estimating that the physical limits of transistor technology will be reached sometime in the 2020s.
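To see why estimates cluster in the 2020s, consider a rough sketch of how many more halvings of feature size remain before features reach atomic dimensions. Both numbers below are illustrative assumptions (a silicon atom is roughly 0.2 nm across; the 5 nm figure is a stand-in for a modern feature size, not a vendor roadmap):

```python
import math

current_feature_nm = 5.0   # assumed representative modern feature size
atom_diameter_nm = 0.2     # approximate diameter of a silicon atom

# Each halving of feature size is one more "generation" of shrinking.
halvings_left = math.log2(current_feature_nm / atom_diameter_nm)
print(f"~{halvings_left:.1f} halvings before features reach atomic scale")
# Prints ~4.6 -- at one halving every couple of years, that is on the
# order of a decade of shrinking left under these assumptions.
```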
Quantum Computing and Moore’s Law
Exponential growth in computing power, however, might not end with the traditional transistor. Quantum computing is largely exempt from the limitations of conventional transistors, since it encodes information in qubits rather than in ever-smaller physical switches.
While household quantum computers are still a long way off, in April 2020 Intel announced that it had built a quantum computer that could operate at a cost of only a few thousand dollars, far less than earlier models that cost millions.
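One way to see why quantum hardware sidesteps the transistor-count treadmill: describing the state of n qubits classically requires 2^n complex amplitudes, so capability grows exponentially with qubit count rather than with transistor density. A toy sketch of that scaling, purely for illustration and not tied to any particular machine:

```python
# A register of n classical bits holds one of 2**n values at a time;
# n qubits are described by 2**n complex amplitudes simultaneously.
for n_qubits in (1, 10, 50):
    amplitudes = 2 ** n_qubits
    print(f"{n_qubits:>2} qubits -> {amplitudes:,} amplitudes to describe")
```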