Today's processors are made from silicon, which itself is fashioned from one of the most abundant materials on earth: sand. But as it gets harder and harder to make ever more miniature circuits – processor technology has moved from 90nm fabrication in the mid-2000s to 14nm now, with that predicted to shrink further to a barely believable 7nm or even 5nm by 2021 – chipmakers are looking for alternatives: not just new materials, but maybe even biological components.
A little bit of history
Intel's first microprocessor, the 4004, had 2,300 transistors. Modern processors have several billion. That's been achieved by cramming ever more transistors into the same amount of silicon, but as you do that the laws of physics kick in and your processor starts generating heat – and the more power you want, the more heat you generate. The fastest Pentium 4 processors could be overclocked beyond 8GHz, but to achieve that you needed liquid nitrogen to stop them burning up.
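The scale of that growth is easier to appreciate with a quick back-of-the-envelope calculation. Here's a minimal Python sketch assuming the classic Moore's law doubling period of roughly two years and the 4004's 1971 launch date (both our assumptions, not figures from the article):

```python
# Rough Moore's-law sanity check: transistor counts doubling every ~2 years,
# starting from the Intel 4004 (1971) with 2,300 transistors.
START_YEAR = 1971
START_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2  # assumed doubling interval

def projected_transistors(year: int) -> int:
    """Project the transistor count for a given year, assuming steady doubling."""
    doublings = (year - START_YEAR) // DOUBLING_PERIOD_YEARS
    return START_TRANSISTORS * 2 ** doublings

# By the mid-2010s the projection lands in the billions,
# in line with the "several billion" figure for modern processors.
print(projected_transistors(2015))  # 9646899200 – roughly 9.6 billion
```

Twenty-two doublings take you from a few thousand transistors to nearly 10 billion, which is why even a modest slowdown in miniaturisation matters so much.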
Today's processors are much more complicated than the single-core Pentiums, with multiple cores and three-dimensional architectures performing incredible feats of engineering, but sooner or later silicon will hit a wall. It won't be able to provide the exponential growth in processing power we're used to, because we'll be down to components that are only a few atoms wide.
What happens then?
Rack 'em and stack 'em
One option is to stick with silicon but to use it in different ways. For example, today's processors are largely flat – rather than try to cram ever more transistors into the same amount of space, we could take an architectural approach and build up to make the silicon equivalent of skyscrapers (hopefully avoiding the silicon equivalent of The Towering Inferno).
Or we could take what's known as a III-V approach, which uses elements from either side of silicon in the periodic table – it's in what's known as group IV, so you'd use materials from groups III and V in layers above the silicon. This would reduce the amount of power needed to move electrons around, which should make it possible to manufacture transistors smaller and pack them in more tightly. The favourite candidate for III-V manufacturing is gallium nitride, which has been used in LEDs for a few decades and can operate at much higher temperatures than the previous favourite, gallium arsenide.
Another option is to rethink the CPU itself. Intel, Nvidia and AMD are all moving in the same direction: towards chips that blur the line between CPU and GPU. Look at the processor die of any chip with integrated graphics and you'll see more and more real estate being handed over to the GPU.
Traditionally the CPU has done the difficult stuff while the GPU has handled the graphics – in a game, for example, it's the CPU that runs the AI – but the GPU's ability to do simpler tasks in massively parallel fashion means designers are increasingly looking at sharing the overall workload between CPU and GPU according to which is better suited to the job.
The more the GPU can do the better, because GPUs are massively parallel circuits, which is another way of saying they're fairly simple components crammed together in big numbers – thousands of cores compared to the handful in a CPU. Unlike CPUs, which are already pushing the limits of miniaturisation, GPUs have a long way to go before the laws of physics ruin their exponential growth.
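The serial-versus-parallel split described above can be sketched in a few lines of Python. This is a toy illustration (the brightness-adjustment task and the numbers are ours, not the article's): the same per-element work expressed first as a one-at-a-time loop, CPU-style, and then as a single data-parallel operation of the kind GPUs excel at.

```python
import numpy as np

# A simple per-element task: a brightness adjustment on 100,000 "pixels".
pixels = np.arange(100_000, dtype=np.float64)

# CPU-style: one core walks through the data element by element.
serial = [p * 0.5 + 10 for p in pixels]

# GPU-style: the same operation applied to every element at once.
# On a real GPU, thousands of simple cores would each handle a slice.
parallel = pixels * 0.5 + 10

# Both routes produce identical results; only the execution model differs.
assert np.allclose(serial, parallel)
```

The point isn't that NumPy is a GPU – it's that any workload expressible as "do the same simple thing to lots of data" maps naturally onto thousands of simple cores rather than a handful of complex ones.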
What if silicon runs out of road? Carbon could come to the forefront instead, in the form of carbon nanotubes. In October 2015, IBM published a paper in the journal Science describing a new method for creating carbon nanotubes from sheets of the "miracle material" graphene.
Unlike previous attempts, IBM's method didn't encounter increasing electrical resistance as contact sizes were reduced. IBM says that we may see carbon nanotube processors "within the decade".
However, as IBM's Shu-Jen Han explains on IBM's blog, those processors are still a long way from reality. "We've developed a way for carbon nanotubes to self-assemble and bind to specialised molecules on a wafer. The next step is to push the density of these placed nanotubes (to 10nm apart) and reproducibility across an entire wafer," he says.
But when IBM finally cracks it, the potential is enormous. "Better transistors can offer higher speed while consuming less power. Plus, carbon nanotubes are flexible and transparent. They could be used in futuristic 'more than Moore' applications, such as flexible and stretchable electronics or sensors embedded in wearables that actually attach to skin – and are not just bracelets, watches, or eyewear," Han says.
In 2011, researchers in Belgium created a plastic microprocessor by printing 4,000 plastic transistors on flexible plastic foil. It's hardly a Core i7 – it can run just one program, consisting of 16 instructions – but what it lacks in power it makes up for in price. And if research into roll-to-roll or sheet-to-sheet printing pays off, it could get cheaper still: processors would be printed using organic 'inks'.
For that to happen, though, we'd need to get much more accurate organic printers – while silicon processors head for single digits in the nanometre scale, lab-scale printing is still working in micrometres. There are issues of variability too, because plastic transistors aren't as predictable as silicon ones.
Should have gone to Specsavers
Maybe the problem isn't the silicon. Maybe it's electricity. One of the most promising alternatives to silicon is the optical or photonic computer, which computes with light. Unfortunately, such computers face a number of issues: light doesn't like sharp bends; optical wires need to be significantly bigger than electronic ones – think 1,000 nanometres compared to the 14 nanometres silicon is achieving; and miniaturising optical transistors is tough.
In late 2015, researchers at the University of Colorado-Boulder, MIT and Berkeley made a breakthrough: they combined photonic circuitry and electronic circuitry on a single chip. "It's the first processor that can use light to communicate with the external world," said project lead professor Vladimir Stojanović. "No other processor has photonic I/O in the chip."
The chip achieved a bandwidth density of 300Gbps per square millimetre – up to 10 times better than electrical microprocessors. Once again the prototype isn't very powerful, but it demonstrates a middle way between electrical and photonic circuitry that could bring some form of optical computing to our computers relatively soon.
Last but not least, there's quantum computing. Google's working on it, and the first working quantum processor was built at Yale in 2009 – but it's exceptionally hard to explain.
Quantum computers aren't binary like traditional ones, where 0 is off and 1 is on. Quantum bits, or qubits, can exist in multiple states at the same time, which means quantum computers could consider multiple options simultaneously. In a 64-qubit quantum computer, each 64-qubit register is capable of holding 18,446,744,073,709,551,616 different values at once – and any computation would, in effect, be carried out on all of those values at the same time.
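That eye-watering figure is simply 2 raised to the 64th power: each additional qubit doubles the number of states a register can hold in superposition. A quick Python check:

```python
# An n-qubit register spans 2**n simultaneous values in superposition,
# since each added qubit doubles the state space.
def register_states(n_qubits: int) -> int:
    """Number of distinct values an n-qubit register can hold at once."""
    return 2 ** n_qubits

print(f"{register_states(64):,}")  # 18,446,744,073,709,551,616
```

That exponential growth is precisely why quantum hardware is so tantalising – and why even a few hundred reliable qubits would dwarf any classical register.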
There's an astonishing amount of money being spent on quantum computing research by companies and governments, but for now it remains a distant if tantalising possibility.