There’s an old adage in the auto industry that your vehicle loses value as soon as you drive it off the lot. The computer and technology industry is not so different: new parts and systems are constantly pushing the envelope of performance and value, making it impossible to own a top-of-the-line system for very long. The purpose of this article is to step back in time 10 years, examine the technology landscape of that era, and see how it compares to the technologies we know and love today.
Much has been written, and much is still debated, about the technology axiom known as Moore’s Law. Some people interpret the law to mean a doubling of computing power every 18 months to 2 years, while others maintain it means a doubling of the number of transistors (microscopic on/off switches, if you will) engineers are able to print into the same amount of silicon. In fact, Moore stated that transistor counts would double every year, a figure he later amended to every 2 years. Regardless, this trend of performance and transistor growth has held true for over 40 years.
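To get a feel for what that doubling rate implies, here is a quick back-of-the-envelope sketch. The doubling period comes from the article; the starting count and 40-year span are illustrative assumptions, not figures from any specific chip.

```python
def transistor_count(start_count, years, doubling_period=2):
    """Project a transistor count after `years`, assuming it doubles
    once every `doubling_period` years (Moore's amended estimate)."""
    return start_count * 2 ** (years / doubling_period)

# Over 40 years of 2-year doublings, counts grow by a factor of 2**20.
growth = transistor_count(1, 40) / transistor_count(1, 0)
print(growth)  # 1048576.0 -- roughly a million-fold increase
```

Twenty doublings in 40 years works out to about a million-fold increase, which is why a decade-old machine feels so quaint so quickly.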
More recently, Moore’s Law has been applied to graphics processing units (GPUs) as well. The GPU is actually not even 10 years old at the time of writing; the term was coined with the nVidia GeForce 256 series of 3D cards. Announced in August 1999, this was the first chip able to compute vertex transformations and lighting in hardware. Since its release, graphics processing power has followed a similar trend, doubling in performance every 18 months to 2 years. This article is not a technical document, but rather a glimpse back 10 years to give you an idea of how far we’ve come in such a short time.
A Little History About the Author:
The year was 1999. It was my sophomore/junior year in high school. Now I’ll pause so that the 90% of readers who grew up in the ’70s and ’80s can turn up their noses and stop reading.
Okay, now that that’s out of the way, let’s continue.
I am an internet child. I got into computers, and all manner of electronic geekery, just as the Internet was becoming something that “just might catch on.” My high school was not quite on the bleeding edge of technology, but we were one of the first public schools in my area to be equipped with a full T1 line and the 1.5Mbps of awesome dedicated bandwidth it piped through the network. Our network was built from a hodge-podge of decent PC computer labs (mostly K6 200MHz/Win 95 machines) and a network engineer’s nightmare concoction of Macintoshes: LC IIs up through new G3 iMacs.
In 1999 I was still rocking a 486 DX-33 PC with 8MB of RAM and Windows 3.11 at home, while my parents had a nice Pentium 100 Windows 95 machine my siblings and I would fight for time on. So what was the point of those last couple of paragraphs, you’re probably asking? Well, that is where I was, and what I was using, in computing 10 years ago, and hopefully it got you thinking back a decade to what you were computing on.
10 years ago today, the computer world was working itself into a frenzy over the imminent release of the brand-new Intel Pentium III processor. The best your money could buy was a Pentium II processor at 450MHz with PC100 memory. Rival AMD was doing all right for itself peddling its budget-conscious K6-2 series of processors. An average PC with a solid Pentium II processor cost around $2000–$2500 at the time. The Pentium III was due to be released on February 26th and was the subject of a vicious debate over the merits of its built-in unique ID number. Also noteworthy was its inclusion of the first SSE instruction set (the modern Core i7 processor is at SSE4+). However, without software designed to take advantage of the new instructions, the processor was not much faster than an equivalently clocked Pentium II and took quite a bit of heat from the press.
On the graphics front, the war between nVidia and 3DFx was raging. nVidia, a relative newcomer to the field, was riding high on its very successful TNT (Twin-Texel) video processor, while 3DFx was fighting to retain its market-leadership title with its Voodoo3 line of graphics cards. The Voodoo cards were excellent performers, but the TNT cards had the advantage of 32-bit color while the Voodoos were stuck at 16 bits. ATI was around and kicking with its Rage series of 3D cards (especially popular as integrated chips in servers and workstations), but it never really entered the hardcore 3D accelerator race until the Radeon line debuted in 2000.
In 1999 the internet boom was just beginning, attracting massive amounts of attention and investment capital from all over the globe. The big money of the time was in the search and portal business. Yahoo, AltaVista, Excite, and Lycos were perhaps the most recognizable and sizable players in the industry. Google? Say wha? Companies from other industries, such as television media and ISPs, were taking notice and buying up internet portals right and left in an attempt to tap into the viral spread of the internet. New technologies like Java were coming into their own and beginning to lay the groundwork for the internet as we know it today. Studies determined that by 1999 about half of US households had a computer, and that most new computer purchases were being made to get online.
PDAs were quickly becoming the new must-have gadget, with the Palm Pilot leading the charge. I myself had a Palm IIIe toward the latter half of 2000. Personal pagers were still hot sellers, but cell phones were getting smaller, more practical, and more affordable. A war was also heating up in the computer memory segment: Samsung introduced new DDR memory chips that promised to double the effective bandwidth of existing memory in future systems, while Rambus was busy developing its own high-speed RAM for use in high-end Intel systems.
This field is perhaps a bit too large to cover here, but a brief OS recap: the PC world was running strong on Windows 98, anxiously awaiting the updated Windows 98 Second Edition due out in May 1999. The Macintosh crowd was plugging away happily on Mac OS 8.5 machines while awaiting an update to OS 9 sometime later that year. Linux was also starting to come into its own, as big-name vendors like IBM began bundling distributions from Red Hat, Caldera, and others with some servers.
I’ll admit this article was more for my own amusement and reminiscing than anything else. I hope that somewhere in this mass of words and broken sentences, you too had some good recollections of computer equipment past. It’s good to remember the technologies we grew up with, and we are fortunate to have the robust internet of today to help conjure up images and technical specs from those glory days. Sites like the Tech Time Machine exist to help us relive those days and have a few chuckles at equipment we thought we’d never use to its full potential. I remember touching a Pentium 200 MMX machine for the first time and being in awe at its raw speed and seemingly infinite possibilities. Now I constantly get upset with the speed of my current 3GHz dual-core Athlon X2 6000+ PC. I can’t even imagine how many windows that Pentium 200 would be thrown out of today. It’s a crazy world we live in.