Sunday, March 21, 2010

Moore's Law No More

Gordon Moore was a co-founder of Intel. In the 1960s he wrote several papers on circuits and computing, and from those papers emerged what we now call Moore's law. It began with the prediction that the number of transistors that could be placed on an integrated circuit would double roughly every two years. Many corollaries followed, one stating that processing speed would also double every two years. That should mean, of course, that your computer, your smartphone, and your DVR should work faster and faster as you upgrade over time, right?

Wrong.

In fact, what will more likely happen is that while computers get faster, the user experience will become more and more variable. There are two explanations for this: a technical reason and an intuitive reason. The technical reason is that processor speed is not the only factor that determines how fast a computer feels. Speed also depends on the speed of the RAM, the speed of the hard drive, and the order in which calculations are performed. This is why one of the easiest ways to speed up a sluggish computer is to add more RAM or install a faster-spinning boot drive. Unfortunately, while RAM can keep pace with Moore's law and its corollaries, hard drive speeds do not. As processors continue to get faster, they will increasingly be held back waiting on hard drive seek times for data. Much software is also not written to exploit faster processors, so it cannot take full advantage of the speed; calculations end up done out of order or inefficiently, which slows the perceived speed of the program. In fact, although dual-core processors have been out for some time, software that truly uses both cores simultaneously has only recently become common.
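To make that last point concrete, here is a rough sketch of my own, not from any particular product, of why software has to be written for multiple cores before it can benefit from them. It is Python, the workload sizes are arbitrary, and the exact timings will vary from machine to machine; the only point is that the single-process version leaves the second core idle while the two-process version does not.

# A minimal sketch (Python, illustrative numbers only) of why software must be
# written for multiple cores to benefit from them: the same CPU-bound work is
# run once in a single process and once split across two worker processes.
import time
from multiprocessing import Pool

def busy_work(n):
    # Purely CPU-bound loop; nothing here touches the disk or network.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    n = 5_000_000

    start = time.perf_counter()
    busy_work(n)
    busy_work(n)
    print("one process  :", round(time.perf_counter() - start, 2), "s")

    start = time.perf_counter()
    with Pool(processes=2) as pool:
        pool.map(busy_work, [n, n])  # each core can take one chunk
    print("two processes:", round(time.perf_counter() - start, 2), "s")

On a dual-core machine the second run should finish in roughly half the time of the first; on a single core it will not. That gap is exactly the gap between hardware speed and software able to use it.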

The intuitive reason is slightly different, and it is the one I believe better explains why computers do not seem much faster despite having faster processors. People's wants always supersede their needs. There is no clearer example of this than the everyday sight of people spending more money on rock concerts, sports tickets, or car upgrades than on healthcare. People want to do bigger and showier things all the time. They want their games to have better graphics and more complex gameplay. They want to play ultra-high-definition video. They want to run browsers with hundreds of add-ons. But to have these things, the computer has to do more calculations per unit time, and faster computers allow exactly that. If you had that kind of speed, would you write a program that uses only eight percent of it? Of course not. You would use all of it, plus more if you could. The problem is that operating system developers are thinking the same thing: more background tasks, fancier desktops, and more widgets. So a lot of the computer's processing power is already spoken for.

So you can see that as processing speed increases, the operating system's needs increase in parallel with it. And because new software is also written to take advantage of the faster processor, it becomes, well, a power struggle. In a very simplistic sense, with the amount of multitasking today's computers do, the processor time left over for the user's own applications keeps shrinking. After all, a software developer tests his program on a machine running only the most basic OS components he needs, so the application can claim all the processing power it requires; the typical user's machine is far more cluttered. This suggests that as computers get faster, they will feel slower.
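As a rough illustration of that power struggle, here is another small Python sketch, again my own and purely illustrative: the same fixed task is timed on a quiet machine and then while busy-loop processes stand in for background services and widgets. The specific numbers mean nothing; the gap between the two timings is the point.

# A sketch of the "power struggle": the same fixed task is timed alone,
# then alongside background processes that play the role of OS services,
# widgets, and everything else competing for the cores.
import time
from multiprocessing import Process, cpu_count

def fixed_task():
    # The "user's application": a fixed amount of CPU-bound work.
    total = 0
    for i in range(20_000_000):
        total += i % 7

def background_noise():
    # Endless busy loop standing in for background tasks.
    while True:
        sum(range(1000))

if __name__ == "__main__":
    start = time.perf_counter()
    fixed_task()
    print("quiet machine:", round(time.perf_counter() - start, 2), "s")

    noise = [Process(target=background_noise, daemon=True)
             for _ in range(cpu_count())]
    for p in noise:
        p.start()

    start = time.perf_counter()
    fixed_task()
    print("busy machine :", round(time.perf_counter() - start, 2), "s")

    for p in noise:
        p.terminate()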