
With the debut of theories like the technological singularity and the realization of "the internet of things" on the horizon, there has been clamorous panic among technocrats as they debate whether we can continue to accurately predict or control technological advancement. The lens we have used to predict computational power for the last fifty years or so has been Moore's Law. Without getting into the highly intellectualized rigmarole of digital electronics, Moore's Law reads like this: "the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years." It is popularly interpreted, however, to mean that the number of transistors that can be placed on an integrated circuit doubles approximately every two years, increasing computational power or performance exponentially without diminishing returns.
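The doubling claim is easy to put into numbers. Here is a minimal sketch of the projection; the starting point (the Intel 4004's roughly 2,300 transistors in 1971) is a well-known historical figure used for illustration, not something taken from Moore's paper:

```python
def transistors(start_count, start_year, year, doubling_period=2):
    """Project a transistor count under Moore's Law:
    one doubling every `doubling_period` years."""
    doublings = (year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Twenty years = ten doublings = a 1024x increase.
count_1991 = transistors(2_300, 1971, 1991)
print(f"{count_1991:,.0f}")  # 2,355,200 -- roughly 2.4 million
```

The striking part is not the arithmetic but that real chips tracked this curve for decades.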

How did we get here? A simple thought experiment called the Sand Heap Paradox (a version of the classic sorites paradox) can put things in perspective. We have a heap of sand and we continuously remove one grain from it. Each change in the size of the heap is so nominal that we fail to realize it is shrinking, although it is, very slowly and on a minuscule scale. Fast forward a few years and there is only a single grain of sand left and no heap. Think of the end of Moore's Law as the moment we realize that there isn't an infinite amount of sand available and that all predictions have their limits. Sand, of course, is almost poetic in our case, since silica is refined into the silicon that is a key ingredient of every microprocessor transistor.


This is where we find ourselves. The number of transistors you can cram into a chip can't increase forever because of the physical limitations of silicon-based chips. Some research suggested this was already the case at 28 nm (nanometers), yet microprocessor giant Intel reported a 14 nm achievement in 2014. The biggest hurdles to shrinking transistors toward atomic sizes are heat and leakage: around 5 nm, the laws of physics threaten to turn the chip into a frying pan, and quantum effects at that scale let electrons slip past the gate, disrupting the ability of signals to travel through the logic gates on a silicon wafer in a coordinated fashion.

So Moore's Law falls short at postulating leaps in computational power primarily because the axiom is untenable below a certain size, and that limit is fast approaching. Cutting-edge research is instead looking at quantum and molecular computing to usher in the next paradigm for processing power with post-silicon transistors. In a TED talk, Ray Kurzweil gives silicon-based transistors another ten years before we reach the performance apex. I should mention that Kurzweil has an impressive history of predicting trends in technology, and renowned futurist Michio Kaku echoes his sentiments.

The more closely we examine Moore's Law, or rather its inaccurate interpretation, the more it appears to be a rule of thumb or self-fulfilling prophecy that merely coincided with Intel's success in the microprocessor industry; for any scientific purpose, Moore's Law is already dead and survives mainly as a marketing device. So really the question is not whether Moore's Law is still valid, but how long it will remain the conceptual framework we use to fuel our postulations about computational processing. Pundits say ten years, but add on innovations like 3D transistor arrangements and we have roughly fifty years more.
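The node progression above (28 nm, 14 nm, 5 nm) follows a simple pattern worth making explicit: halving transistor area means shrinking each linear dimension by roughly 1/√2 ≈ 0.7 per generation. A small sketch, assuming that conventional 0.7x-per-node approximation (not a figure from this post), shows how few shrinks remain between 14 nm and the 5 nm wall:

```python
import math

def nodes_until(start_nm, target_nm, shrink=1 / math.sqrt(2)):
    """Count how many ~0.7x linear shrinks it takes to get
    from one feature size down to another."""
    return math.ceil(math.log(target_nm / start_nm, shrink))

print(nodes_until(14, 5))  # 3 shrinks: 14 -> ~9.9 -> ~7.0 -> ~4.9 nm
```

Three generations is all that separates Intel's 2014 milestone from the region where heat and quantum leakage take over, which is why the ten-year estimates below are not far-fetched.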

In conclusion, the debate on Moore's Law can be polarized into two camps: those who think computational power on silicon-based transistors will keep increasing indefinitely under the Moore paradigm, and those who think the days of increasing computational power using silicon-based transistors are numbered. Now you're probably wondering whether any of this matters to you as a consumer. The answer is that it probably doesn't, but whatever paradigm we next use to conceptualize leaps in computational performance will probably give rise to greater computational power. When we move on from Moore's Law, and believe me we will, it will mark a transformation of our technological civilization. Think positronic brains and human-like interactions with virtual personas. The silver lining on the dark cloud of Moore's Law might be, as Ray Kurzweil puts it, that

“the dwindling of any paradigm is that it creates research pressure to come up with another paradigm that improves on and supplants the previous paradigm”.

Moshe Y. Vardi, who wrote the article "Is Moore's Party Over?", also seems to agree, adding that the death of Moore's Law will plunge us into a time when we will have to become creative with algorithms and systems in order to leverage the stagnation. Exponential growth of computing power under Moore's Law will definitely slow, perhaps to continue under molecular computing or some other far-out concept. That is it for now; time to retire Moore's Law to the same place we put Ptolemaic planetary theories.

You can read Intel co-founder Gordon Moore's original paper here.

Published by Mark Mushiva

Otaku sub-culture enthusiast, gamer, developer, UCD research assistant, technophile and uncommitted investigator. Founding member at The Tech Guys
