It's been more than thirty-five years since Gordon Moore, the co-founder of Intel, observed that the transistor density of semiconductor chips doubles roughly every 18 months. The observation was both accurate and profound, and it became known as Moore's Law. Its effect has been simple: each new generation of PCs is almost twice as powerful as the one before it. That, in turn, drove mass-market adoption, as every generation of hardware, and the software it enabled, increased capability and functionality and widened the market to new adopters. By most accounts Moore's Law remains in force and will continue for at least some time. But I often wonder whether raw speed still benefits most users, and in what ways it will empower them.
While I don't want to sound like the apocryphal quote Bill Gates was said to have made about no one needing more than 640K of RAM, I feel that the impact of Moore's Law on processing speed matters to fewer and fewer users. With multi-core processors the norm rather than the exception, multi-gigahertz speeds are mainstream even on entry-level machines. In many cases, fast has finally become fast enough. For consumers doing knowledge-worker tasks such as word processing, spreadsheets, e-mail, or Web browsing, the performance difference between a mainstream chip and a state-of-the-art high-end processor just isn't readily apparent.
I can already hear some of you arguing that this line of reasoning is wrong. For certain classes of users, you are no doubt correct. Sure, if you're an engineer decoding and reengineering genomes, or you're trying to play the latest and greatest game at resolutions that make high definition seem clunky, fast might not be fast enough. If you're building a giant engineering simulation or trying to render "Toy Story 4" in real time, fast might not be fast enough. But if you're outside that power curve and primarily use an office suite for word processing, spreadsheets, and graphics, play some casual games, and run some version of Windows, Mac OS, or Linux, well, fast is definitely fast enough.
A threshold has been crossed for most users. There have always been relatively cheap PCs available, but no one wanted to buy them because they couldn't run the current generation of applications. It is this slowdown in the benefits of Moore's Law that has made cheap PCs such as netbooks possible. It also means people are rethinking the purchase decisions of the past. In the old days, decision making was easy: you bought the fastest processor on the market and added as much memory and disk space as you could afford. I suspect that's no longer the best approach. Even sophisticated users don't need the fastest processor Intel or AMD offers in their new systems.
So is this the end game for high-end computing? Naah! Nothing lasts forever, and I suspect neither will this lull in the benefits of Moore's Law. Technology tends to move in curves rather than at right angles, and new technologies will emerge over time to take advantage of what's next, both new applications and UI advances that finally move beyond the mouse and keyboard. Real progress will require not just faster processors but also the software advances that go with them. The good news is that most users can safely ride the price curves down and extend the life of older systems. Has Moore's Law been suspended for you, or are you still feeling the need for speed?