Rik's Treehouse


Trends in Computing

Years ago I heard it said that computers were doubling in performance every year or two (a claim related, but not identical, to Moore's Law); that is, two years from now you could buy a computer twice as powerful/fast as today for the same price or less. I was curious to see if this was true, so I started tracking the prices of individual components on a monthly basis. I am biased towards Wintel PCs with a total price under $2000 Canadian, so I track components typical for such machines.

Motherboards

  Architecture        Speed Factor
  i8086               0.0025
  i286                0.011
  i386sx              0.016
  i386dx              0.023
  i486dx              0.047
  i486dx2             0.039
  i486dx4             0.031
  Pentium             0.093
  Pentium MMX         0.11
  Pentium 2           0.12
  Pentium 3           0.14
  AMD Duron           0.14
  AMD Athlon XP 1)    0.13
  Intel Core 2 Duo    0.51
  Intel Core 2 Quad   0.72
  Intel Core i5       0.90
  Intel Core i7       1.00

Estimated relative speed factors of past chip architectures (if running at the same clock speeds).

Units: MHz/$, megahertz per dollar (Cdn)
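To make the weighting concrete, here is a minimal sketch of how a speed factor converts a raw clock rating into "i7-equivalent" MHz per dollar (the function name and the prices are my own, purely illustrative inventions; the factors come from the table above):

```python
# Estimated per-clock speed factors relative to a current Intel Core i7
# (values taken from the table above).
SPEED_FACTOR = {
    "AMD Athlon XP": 0.13,
    "Intel Core 2 Duo": 0.51,
    "Intel Core i7": 1.00,
}

def performance_per_dollar(arch, clock_mhz, price_cad):
    """Weighted motherboard performance: i7-equivalent MHz per dollar (Cdn)."""
    return SPEED_FACTOR[arch] * clock_mhz / price_cad

# A 2.4 GHz Core 2 Duo vs. a 2 GHz-class Athlon XP at the same (made-up) price:
ratio = performance_per_dollar("Intel Core 2 Duo", 2400, 500) / \
    performance_per_dollar("AMD Athlon XP", 2000, 500)
print(round(ratio, 1))  # roughly 5, matching the comparison in the text
```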
Doubling time: 2)

Motherboard performance, measured in MHz/$, is currently doubling every … months. Notice that I am measuring performance for the motherboard as a whole, not just the CPU. There have been fluctuations, but the trend does look more or less exponential. A friend of mine (Andy Horton) pointed out there was a noticeable dip every year around December, most likely due to prices being jacked up for Christmas sales.

Processor speed depends on how many transistors can be fit on a chip, which has a theoretical maximum set by quantum mechanics: if the circuits get too close together, the electrons will start "tunneling" through barriers and the chip won't function properly. I'm not sure what the limit is, but as we approach it, expect the doubling time to stretch out, at least until manufacturers find a new way to get around (or take advantage of) this problem.

Of course, every new generation of processor may perform better (or worse!) than the last, even at the same clock speed, so I weight the clock speed by an estimate of how fast each chip architecture is compared to a current chip: a 2.4 GHz Core 2 Duo E6600, for example, will perform about five times as fast as an Athlon XP 2000+. Comparing different processors opens up a can of worms, such as the impact of other components (e.g. RAM, front-side bus, etc.) on performance. But I don't really care about all that; I just want an estimate of the typical speed of these machines, so I get my numbers by comparing application and game benchmarks for typically-configured machines.

RAM

Units: MiB/$, mebibytes per dollar (Cdn)
Doubling time: 3)

RAM performance, measured in MiB/$, is currently doubling every … months, but it's a real wild ride! Notice that the graph is virtually flat until November 1995 and then explodes upwards. As I understand it, this is due to a monopoly on SIMM modules which toppled around then. At the time (October 1995) the doubling time was a whopping 53 ± 9 months! After some fluctuations when the market opened up, it looks like RAM performance grew at a natural rate for a while. There was a huge hiccup at the end of 2001 which was apparently due to price fixing. Again, performance depends on how many transistors can be fit on a chip, which has a theoretical maximum. Unlike motherboard performance, however, I don't expect this to slow down the RAM performance doubling time because, hey, the manufacturers can always just add more memory banks, right?

Hard Drives

Units: GB/$, gigabytes per dollar (Cdn)
Doubling time: 4)

Hard drive capacity, measured in GB/$, is currently doubling every … months. The graph was showing exponential growth until late 2011, when flooding in Thailand caused production shortages.

Bandwidth

Units: kbps/$/month, kilobits per second per dollar (Cdn) per month
Doubling time: 5)

Bandwidth (the speed of an internet connection) is difficult to measure because it depends on your internet provider and on current conditions (e.g. is there heavy usage?), so I don't have as good data here. But the data I do have suggest the doubling time for bandwidth performance (versus monthly fees) is …. I calculate my bandwidth by doing speed tests weekly and then taking the median value from the last five weeks to reduce the noise.
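The median-of-five smoothing is simple enough to sketch in a few lines (the function name and the sample numbers are hypothetical):

```python
import statistics

def smoothed_bandwidth(weekly_kbps):
    """Median of the last five weekly speed tests, to reduce noise.

    weekly_kbps: measurements in kbps, oldest first (hypothetical data).
    """
    return statistics.median(weekly_kbps[-5:])

# One congested test week barely moves the smoothed value:
print(smoothed_bandwidth([5200, 5150, 1800, 5300, 5250]))  # → 5200
```

Using the median rather than the mean is what makes a single bad test week (heavy usage, a throttled server) mostly disappear from the trend.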

Math Stuff

The component prices come from advertisements in local shops. I compiled the data and made estimates of what the actual doubling time, $\tau$, is, as follows. If there is a doubling trend over time, $t$, then the performance (per dollar), $P(t)$, should increase as $P(t) = P(0) 2^{t/\tau}.$ Taking the logarithm of both sides gives $\log_2(P(t))=t/\tau + \log_2(P(0))$ which is just the straight line equation $y(t)=mt+b$ where $m=1/\tau$. So all we have to do is take the logarithm of the performance data and fit it to a straight line to get the parameters $m$ and $b$, which allow us to calculate $\tau$. The exact calculations for fitting to a straight line can be found in any statistics textbook or try Press et al. 1992, Chapter 15.
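The whole procedure can be sketched in a few lines (a hand-rolled least-squares fit; the data below are synthetic, with a 12-month doubling time built in, and the function name is mine):

```python
import math

def doubling_time(months, performance):
    """Fit log2(performance) vs. time to a straight line; return (tau, b).

    months: sample times in months; performance: per-dollar performance.
    Slope m = 1/tau, intercept b = log2(P(0)).
    """
    n = len(months)
    y = [math.log2(p) for p in performance]
    t_bar = sum(months) / n
    y_bar = sum(y) / n
    m = sum((t - t_bar) * (yi - y_bar) for t, yi in zip(months, y)) / \
        sum((t - t_bar) ** 2 for t in months)
    b = y_bar - m * t_bar
    return 1.0 / m, b

# Synthetic data following P(t) = 100 * 2**(t/12), so tau should be 12 months:
t = [0, 6, 12, 18, 24]
P = [100 * 2 ** (ti / 12) for ti in t]
tau, b = doubling_time(t, P)
print(round(tau, 1))  # → 12.0
```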

Also given therein is a procedure to estimate the uncertainties in the parameters even without knowing the measurement errors. So it is possible to calculate the uncertainty in the slope $m$ which is related to the uncertainty in the doubling time by $\frac{\sigma_\tau}{\tau} = \frac{\sigma_m}{m}.$ By using this method the “goodness of fit” is incorporated into the doubling time uncertainty: a large uncertainty means a poor fit.
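That error estimate can be sketched as follows, assuming equally weighted points whose (unknown) errors are estimated from the scatter of the residuals, in the spirit of Press et al. 1992, Chapter 15 (the function names are mine):

```python
import math

def slope_uncertainty(t, y):
    """Straight-line fit y = m*t + b; return (m, sigma_m), with sigma_m
    estimated from the fit residuals (no measurement errors known)."""
    n = len(t)
    t_bar = sum(t) / n
    y_bar = sum(y) / n
    s_tt = sum((ti - t_bar) ** 2 for ti in t)
    m = sum((ti - t_bar) * (yi - y_bar) for ti, yi in zip(t, y)) / s_tt
    b = y_bar - m * t_bar
    # Assume equal errors and size them from the residual scatter
    # (chi-square per degree of freedom):
    chi2 = sum((yi - (m * ti + b)) ** 2 for ti, yi in zip(t, y))
    sigma_m = math.sqrt(chi2 / (n - 2) / s_tt)
    return m, sigma_m

def tau_with_uncertainty(t, y):
    """Doubling time tau = 1/m, with sigma_tau / tau = sigma_m / m."""
    m, sigma_m = slope_uncertainty(t, y)
    tau = 1.0 / m
    return tau, tau * sigma_m / m
```

A perfect straight line gives sigma_m = 0; scattered points inflate it, so a poor fit automatically shows up as a large uncertainty in the doubling time.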

References

Press, W. H., S. A. Teukolsky, W. T. Vetterling, and B. P. Flannery. 1992. Numerical Recipes in C: The Art of Scientific Computing. 2nd ed. Cambridge: Cambridge University Press. http://www.nr.com/.