Undergraduate engineering and science consists of learning the various rules and laws that govern the domains of interest. For me, it was Maxwell’s equations for electromagnetics, the Navier-Stokes equations for acoustics, the Rayleigh criterion for imaging, the speed of light, et cetera ad nauseam. What is frequently missed or neglected, in teaching and in practice, is that these rules and limits are simply the boundaries of the game – endpoints on a spectrum of possibilities. That’s why a recent headline caught my attention: “Computers to Get Faster Only for 75 More Years”[1]. I find it hard to believe that humans a thousand years from now will commemorate 2084 as “The Year Computers Stopped Getting Faster”. After reading the research paper from which this headline arose, I was reminded that innovative science doesn’t set limits; it uses them as tools. Since this is precisely what we do in Applications Engineering at Maplesoft, I thought it would be worth looking into a little further.

The headline in question was a flawed interpretation of an otherwise interesting research result from quantum mechanics. One of the most talked-about research areas in quantum mechanics is quantum computing, which is being touted as the next revolution in computing. Whereas traditional computers perform sequential operations on binary bits, quantum computers perform parallel operations on many so-called “qubits”. A qubit is a quantum bit that has a state with some probability of being measured as 0 and some probability of being measured as 1. These properties may allow quantum computers to break traditional data encryption schemes, search databases in a fraction of the usual time, and speed up computation in general.
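To make the measurement idea concrete, here is a toy classical simulation of a single qubit (the function names and numbers are my own, purely for illustration): the state is a pair of amplitudes, and measurement returns 0 or 1 with probabilities given by the squared magnitudes of those amplitudes.

```python
import random
from math import sqrt

def measure(alpha, beta):
    """Simulate measuring a qubit with amplitudes (alpha, beta),
    where |alpha|^2 + |beta|^2 = 1. Returns 0 with probability
    |alpha|^2 and 1 with probability |beta|^2."""
    p0 = abs(alpha) ** 2
    return 0 if random.random() < p0 else 1

# An equal superposition: over many trials, roughly half the
# measurements come out 0 and half come out 1.
random.seed(1)
outcomes = [measure(1 / sqrt(2), 1 / sqrt(2)) for _ in range(10000)]
frac_ones = sum(outcomes) / len(outcomes)
print(frac_ones)  # roughly 0.5
```

Of course, a real quantum computer exploits interference between many such amplitudes at once; this sketch only shows the probabilistic nature of a single measurement.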

Towards this end, researchers were recently able to pin down the fundamental time it takes to perform a quantum computation[2]. It is fast – on the order of 10⁻²⁵ seconds. That’s so fast that most people have never even heard of the prefix used to describe that time scale: 0.1 yoctoseconds. It is approximately ten quadrillion times faster than the transistors in your home computer and ten trillion times faster than the fastest transistors ever fabricated.
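Those comparisons check out on the back of an envelope. Taking round-number switching times as assumptions – roughly a nanosecond for a GHz-class desktop transistor and a picosecond for a THz-class laboratory device – the ratios come out to the factors quoted above:

```python
# Back-of-the-envelope check of the time scales in the text.
# The transistor figures are illustrative assumptions, not measurements.
quantum_limit = 1e-25        # s: the fundamental quantum computation time
yoctosecond = 1e-24          # s: 1 ys, so the limit is 0.1 ys
home_transistor = 1e-9       # s: ~1 GHz switching period (assumed)
fastest_transistor = 1e-12   # s: ~1 THz research device (assumed)

print(quantum_limit / yoctosecond)         # 0.1 (yoctoseconds)
print(home_transistor / quantum_limit)     # 1e16: "ten quadrillion"
print(fastest_transistor / quantum_limit)  # 1e13: "ten trillion"
```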

Given the enormous speedup possible, it was simultaneously laughable and disheartening to see that this important and exciting fundamental research appeared in the popular media with the unfortunate headline touting the end of computational performance gains. The gist of the article is that Moore’s Law, which has successfully predicted an exponential increase in computing power over the last several decades, would push computing speed up against this quantum limit in 75 years. But there are a number of fundamental issues that will slow down this exponential growth well before any quantum limits are reached. The headline, article, and conjecture are flawed in many ways, and represent a gross failure on the part of scientists to speak human. As a scientist I can’t help but be elated that there is still “plenty of room at the bottom”, to quote Richard Feynman. Some basic messaging from the paper’s authors could have given us the headline: “Quantum Computing gives Nearly Limitless Speedup”. Engineers and scientists must remember that limits only serve to limit unless a big-picture view of a problem is maintained. Consider the following two examples of how creativity has trumped the passive acceptance of physical limits.

Most of us have been taught that light has a speed and that’s how fast it goes, end of story. Some materials can slow it down, but even then it’s still much faster than any of us can perceive, right? Well, stating the speed of light as a limit neglects the opportunities that lie away from this endpoint. Recently, researchers have slowed light to speeds on the order of µm/s[3] – so slow that even the effect of gravity on its trajectory can be measured in the lab[4]. Devices using such slow and stationary light could lead to quantum optical memories – the RAM of a quantum computer.

In the systems modeling world, it was a commonly held belief that to simulate a model in a reasonable amount of time, you must ultimately end up with ordinary differential equations (ODEs); any more complicated equations would cripple your computational performance. In addition to faster hardware, new computational routines have made this “limit” completely irrelevant. MapleSim solves systems of differential algebraic equations (DAEs), the ungainly big brother of ODEs, without even the slightest hiccup. Figure 1 shows a hydraulically powered Stewart platform governed by DAEs, a model that would have been very difficult to simulate even just ten years ago but that MapleSim now runs in a matter of seconds.
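To give a flavor of what makes a DAE different from an ODE, here is a minimal hand-rolled sketch (a toy semi-explicit index-1 system of my own invention, nothing like the Stewart-platform model, and a far cry from MapleSim’s actual solvers): the system couples a differential equation with an algebraic constraint, and each integration step must satisfy the constraint before the differential variable can advance.

```python
# Toy semi-explicit index-1 DAE, solved with explicit Euler:
#     x'(t) = -x(t) + z(t)       (differential equation)
#     0     = x(t) + z(t) - 1    (algebraic constraint)
# At each step the constraint is solved for the algebraic
# variable z, then the differential variable x is advanced.

def simulate(x0=0.0, dt=1e-3, t_end=5.0):
    x = x0
    for _ in range(int(t_end / dt)):
        z = 1.0 - x          # enforce the algebraic constraint
        x += dt * (-x + z)   # Euler step on the differential variable
    return x

# Substituting z = 1 - x gives x' = 1 - 2x, whose exact solution
# x(t) = 0.5 * (1 - exp(-2t)) tends to 0.5.
x_final = simulate()
print(x_final)  # close to 0.5
```

For this toy system the constraint can be solved by hand; in a real mechanism like the Stewart platform, the constraints are nonlinear and coupled, which is why robust DAE solvers are such an achievement.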

Apart from performing the research that leads to these breakthroughs, scientists and engineers must also provide a message with their results to avoid such unfortunate misinterpretations by the general public as the aforementioned headline. As the advertiser George Lois succinctly put it, “Creativity can solve almost any problem. The creative act, the defeat of habit by originality, overcomes everything.” We as scientists and engineers must be in the habit of breaking the habit. All of our education and experience must lead us to the spaces between the rules and limits that we have been taught, for within these spaces we find great ideas and true innovation.

Figure 1 – A MapleSim model of a Stewart platform governed by DAEs.


[2] PRL 103, 160502 (2009)

[3] PRL 95, 253601 (2005)

[4] EPL 82, 60002 (2008)

