This is a 3DFX Voodoo 2 V2 1000 PCI still sealed in the box.
I actually own three Voodoo 2’s. The first one is a Metabyte Wicked 3D (below, with the blue colored VGA port) that I bought from a friend in high school. The second one is the new-in-box 3DFX branded Voodoo 2 I bought off of ShopGoodwill last year. The third one (below, with the oddly angled middle chip) is a Guillemot Maxi Gamer 3D2 I bought at the Cuyahoga Falls Hamfest earlier this year.
The Voodoo 2, in all of its manifestations, is my favorite expansion board of all time. It’s one of the last 3D graphics boards that normally operated without a heatsink, so you can gaze upon the bare ceramic chip packages and the lovely 3DFX logos emblazoned upon them. It was also pretty much the last 3D graphics board where the various functions of the rendering pipeline were still broken out into three separate chips (two texture mapping units and a frame buffer processor). The way the three chips are laid out in a triangle is like watching jet fighters flying in formation.
It’s hard to explain to someone who wasn’t playing PC games in the late-1990s what having a 3DFX board meant. It was like having a lightswitch that made all of the games look much, much better. There are perhaps three tech experiences that have utterly blown my mind. One of them was seeing DVD for the first time. Another is seeing HDTV for the first time. Seeing hardware accelerated PC games on a Voodoo 2 was on the same level.
A friend of mine in high school won the Metabyte Wicked 3D in an online contest. I remember the day it arrived at his house: I had walked home from school, trudging up and down the piles of snow that had been shoved onto the sidewalk to clear the roads, and I got home exhausted…Then he called asking if I wanted to come over while he installed the Voodoo 2 card and fired up some games. Even though I was exhausted I eagerly accepted.
I think I saw hardware accelerated Quake 2 that day…Nothing else would ever be the same. I was immensely jealous.
Ever since personal computers have been connected to monitors there has been some sort of display hardware in the computer that outputs video signals. Oftentimes this hardware included capabilities that enhanced or took over some of the CPU’s role in creating graphics.
When we talk about 2D graphics we mean graphics where, for the most part, the machine is copying an image from one place in the computer’s memory and putting it in another place in the computer’s memory. For example, if you imagine a scene from, say, Super Mario Bros., the background is made up of pre-prepared blocks of pixels (ever notice how the clouds and the shrubs are the same pattern of pixels with a color shift?) and Mario and the bad guys are each a particular frame of a short animation called a sprite. These pieces of images are combined together in a special type of memory that is connected to a chip that sends the final picture to the TV screen or monitor.
It’s sort of like someone making one of those collages where you cut images out of a magazine and paste them on a poster-board. The key to speeding up 2D graphics in a computer is speeding up the process of copying all of the pieces of the image to the special memory where they need to end up. You might have heard about the special “blitter” chips in some Atari consoles and the famous Amiga computers that made their graphics so great. 2D graphics were ideal for the computing power of the time but they gave videogame designers limited ability to show depth and perspective in videogames.
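If you want to see the idea in miniature, here is a toy Python sketch of what a blitter fundamentally does. The names and the tiny “sprite” are invented for illustration; real blitters did this in hardware, moving whole runs of pixels at a time.

```python
# Toy illustration of 2D "blitting": a sprite is just a small block of
# pixels copied into the framebuffer at a given position.
def blit(framebuffer, sprite, x, y, transparent=0):
    """Copy sprite pixels into the framebuffer at (x, y),
    skipping the 'transparent' color so the background shows through."""
    for row, sprite_row in enumerate(sprite):
        for col, pixel in enumerate(sprite_row):
            if pixel != transparent:
                framebuffer[y + row][x + col] = pixel

# A 6x4 "screen" filled with background color 1, and a 2x2 sprite
# using color 7 (0 marks transparent pixels).
screen = [[1] * 6 for _ in range(4)]
mario = [[7, 0],
         [7, 7]]
blit(screen, mario, 2, 1)
```

Everything else in a 2D game engine (scrolling, animation) is largely variations on repeating this copy very, very quickly.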
Outside of flight simulator games and the odd title like Hard Drivin’ and Indy 500, almost all videogames used 2D graphics until the mid-1990s. PC games during the 2D graphics era were mostly being driven by the CPU. If you bought a faster CPU, the games got more fluid. There were special graphics boards you could buy to make games run faster, but the CPU was the main factor in game performance.
Beginning in about 1995-1996 there was a big switch to 3D graphics in videogames (which is totally different from the thing where you wear glasses and things pop out of the screen…that’s stereoscopics) and that totally changed how the graphics were being created by the computer. In 3D graphics the images are represented in the computer by a wireframe model of polygons that make up a scene and the objects in it. Special image files called textures represent what the surfaces of the objects should look like. Rendering is the process of combining all of these elements to create an image that is sent to the screen. The trick is that the computer can rotate the wireframe freely in 3D space and then place the textures on the model so that they look correct from the perspective of the viewer, hence “3D”. You can imagine it as being somewhat akin to making a diorama.
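The heart of the “draw the wireframe from the viewer’s perspective” trick is surprisingly small. Here is a toy Python sketch of perspective projection; the function name and numbers are illustrative, not how any real renderer is organized.

```python
# Toy sketch of perspective projection: a point in 3D space lands on
# the 2D screen by dividing its position by its distance from the viewer.
def project(x, y, z, focal_length=100.0):
    """Perspective-project a 3D point onto a 2D plane.
    Points farther away (larger z) land closer to the center of the
    screen, which is what creates the sense of depth."""
    screen_x = focal_length * x / z
    screen_y = focal_length * y / z
    return screen_x, screen_y

# The same corner of a cube, near and far: the far one shrinks inward.
near = project(10, 10, 100)
far = project(10, 10, 200)
```

A renderer does this (plus rotation and clipping) for every vertex of every polygon, every frame, which is a big part of why 3D taxed the CPUs of the era.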
With 3D graphics videogame designers gained the same visual abilities as film directors: Assuming the computer could draw a scene they could place the player’s perspective anywhere they desired.
The problem with 3D graphics is that they are much more taxing computationally than 2D graphics. They taxed even the fastest CPUs of the era.
In 1995-1996, when the first generation of 3D games started appearing on PCs, they looked like pixelated messes on a normal computer. You could only play them at about 320×240, objects like walls in the games would get pixelated very badly when you got close to them, and the frame rate was a jerky 20 fps if you were lucky. Games had started using 3D graphics and as a result required the PC’s CPU to do much, much more work than previous games. When Quake, one of the first mega-hit 3D graphics-based first person shooters, came out it basically obsoleted the 486 overnight because it was built around the Pentium’s floating-point (read: math) capabilities. But even then you were playing it at 320×240.
At the same time, arcade games had been demonstrating much better looking 3D graphics. When you sat down in front of an arcade machine like Daytona USA or Virtua Fighter 2 what you saw was fluid motion and crisp visuals that clearly looked better than what your PC was doing at home. That’s because those machines had dedicated hardware for producing 3D graphics that took some types of work away from the CPU. These types of chips were also used in flight simulators and they were known to be insanely expensive. However, by the time the N64 came out in 1996 this type of hardware was starting to make its way into homes. What PCs needed was their own dedicated 3D graphics hardware. They needed hardware acceleration.
That’s what the Voodoo 2 is. The Voodoo 1 and its successor the Voodoo 2 were 3DFX’s arcade-quality 3D graphics products for consumer use.
A texture mapping unit (the two upper chips labeled 3DFX on the Voodoo 2) takes the textures from the memory on the graphics board (many of those smaller rectangular chips on the Voodoo 2) and places them on the wireframe polygons with the correct scaling and distance. The textures may also be “lit”, where the colors of pixels are changed to reflect the presence of one or more lights in the scene. A framebuffer processor (the lower chip labeled 3DFX) takes the 3D scene with the textures and produces a 2D image that is built up in the framebuffer memory (the rest of those smaller, rectangular chips on the Voodoo 2) that can be sent to the monitor via the RAMDAC chip (like a D/A converter for video, it is labeled GENDAC on the Voodoo 2).
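As a rough illustration of what the texture mapping units are doing for every single pixel, here is a toy Python sketch of a nearest-texel lookup plus lighting. All the names are invented for illustration; the real hardware also filters between neighboring texels, in parallel, millions of times per second.

```python
# Rough sketch of one pixel's worth of texture mapping: look up a color
# in the texture (a small image), then "light" it by scaling the color
# toward black based on a light intensity.
def sample_and_light(texture, u, v, intensity):
    """u and v are texture coordinates in [0, 1); intensity is 0..1.
    This is a nearest-texel lookup; real hardware blends neighboring
    texels, which is what makes accelerated walls look smooth."""
    h = len(texture)
    w = len(texture[0])
    r, g, b = texture[int(v * h)][int(u * w)]  # pick the nearest texel
    return (int(r * intensity), int(g * intensity), int(b * intensity))

# A 2x2 checkerboard texture; sampling the white texel at half
# light intensity darkens it to mid-gray.
checker = [[(255, 255, 255), (0, 0, 0)],
           [(0, 0, 0), (255, 255, 255)]]
lit = sample_and_light(checker, 0.25, 0.25, 0.5)
```

Doing this in software for every pixel of every frame is exactly the work the Voodoo’s texture mapping units lifted off the CPU.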
3DFX was the first company to produce really great 3D graphics chips for consumer consumption in PCs. Their first consumer product was the Voodoo 1 in late 1996. It was soon followed in 1998 by the Voodoo 2.
The Voodoo 2 is a PCI add-in board that does not replace the PC’s 2D graphics card. Instead, there’s a cable that goes from the 2D board to the Voodoo 2, and the Voodoo 2 then connects to the monitor. This meant that the Voodoo 2 could not display 3D in a window, but what you really want it for is playing full-screen games, so it’s not much of a loss.
My friend who won the Metabyte Wicked 3D card later bought a Voodoo 3 card and sold me the Voodoo 2 sometime in 1999.
I finally had hardware acceleration. At the time we had a Compaq Presario that had begun life as a Pentium 100 and had been upgraded with an Overdrive processor to a Pentium MMX 200. It was getting a bit long in the tooth by then. Previously I had made the mistake of buying a Matrox Mystique card with the hope of it improving how games looked, and was bitterly disappointed in the results.
Having been a big fan of id Software’s Doom I had paid full price ($50) for their follow-up game Quake after having played the generous (quarter of the game) demo over and over again. Quake was by far my favorite game (and it’s still in my top 5).
id had known that Quake could look much better if it supported hardware acceleration. They had become frustrated with the way they needed to modify Quake in order to support each brand of 3D card. Basically, the game needs to instruct the card on what to do, and each card used a different set of commands. id decided to create their own set of commands (called a miniGL because it was a subset of SGI’s OpenGL API) in the hope that 3D card makers would supply a library that would convert the miniGL commands into commands for their card. The version of Quake they created to use the miniGL was called GLQuake and it was available as a freely downloadable add-on.
It’s a little hard to show you this today, but this is what GLQuake (and the Voodoo 2) did for Quake. First, a screenshot of Quake without hardware acceleration (taken on the Pentium III with a Voodoo 3 3000):
Now, with hardware accelerated GLQuake:
Suddenly the walls and floor look smooth and not blocky. Everything is much higher resolution. In motion everything is much more fluid. It may seem underwhelming now, but this was very hot stuff in 1997 and blew me away when I first saw it in 1999.
What we didn’t realize at the time was that it was pretty much all downhill for 3DFX after the Voodoo 2. After the Voodoo 2, 3DFX decided to stop selling chips to third-party OEMs like Metabyte and Guillemot and produce their own cards. That’s why my boxed board is branded just 3DFX. This turned out to be disastrous because suddenly they were competing with the very companies that had sold their very successful products in the 1996-1998 period. They also released the Voodoo 3, which combined 2D graphics hardware with 3D graphics hardware on a single chip (that was hidden under a heatsink).
The Voodoo 3 was an excellent card and I loved the Voodoo 3 3000 that was in the Dell Pentium III my parents bought to succeed the Presario. However, 3DFX was having to make excuses for features that the Voodoo 3 didn’t have and their competitors did have (namely 32-bit color). Nvidia’s TNT2 Ultra suddenly looked like a better card than the Voodoo 3.
3DFX was having trouble producing their successor to the Voodoo line and instead was having to adapt the old Voodoo technology to keep up. The Voodoo 4 and 5, which consisted of several updated Voodoo 3 chips working together on a single board, ended up getting plastered by Nvidia’s GeForce 2 and finally GeForce 3 chips, which accelerated even more parts of the graphics rendering process than 3DFX did. 3DFX ceased to exist by the end of 2000. Supposedly prototypes of “Rampage”, the successor to Voodoo, were sitting on a workbench being debugged the day the company folded.
Back in the late-1990s 3D acceleration was a luxury meant for playing games. Today, that’s no longer true: 3D graphics hardware is an integral part of almost every computer. Every PC, every tablet, and every smartphone sold has some sort of 3D acceleration. 3DFX showed us the way.
I found this Quadboard at the Abbey Ann’s off the Circle in Tallmadge sometime last year. It was sitting on their counter in a pile of items that had just come in.
When I saw it I immediately recognized that it was some sort of old RAM upgrade (from the ordered banks of DRAM chips on the left side of the board) but when I looked closer I was impressed with just how old it was. As third-party add-ons for IBM PC hardware go, this is almost as early as it gets.
I suspect this is one of those instances where the original owner replaced the older Quadboard with the newer model and the old one ended up in the newer one’s box with the newer one’s documentation.
The box and the documentation are for a 384K Quadboard from 1984 (with six banks of nine 64 kilobit DRAM chips) but the card inside is labeled 1982 and only has four banks of nine chips. Also notice how the picture on the box has two connectors for ribbon cables while the card I have only has one.
In doing some research about the Quadboard, it looks like in 1982 Quadram sold several models, all with the same number of DRAM chip sockets but with different numbers of chips installed. The one I have seems to be the high end of the original Quadboard line, which means it’s fully populated with 256K RAM. If you had one of the cheaper models you could install more chips yourself, as long as you added a whole bank at a time.
I believe the leftmost of the two sets of DIP switches on the top of the board are how you tell it how much RAM the PC has on the motherboard (so it knows what address to start with) and how much RAM is installed on the Quadboard.
Incidentally the reason that four banks of nine 64 kilobit memory chips equals 256K is that the 9th chip in each bank is used for parity checking, like on the IBM PC’s motherboard.
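For the curious, here is a toy Python sketch of how that ninth parity bit works. I’m using even parity for illustration; the exact convention on the real hardware may differ, but the principle is the same.

```python
# Sketch of parity checking: for every 8-bit byte stored in the eight
# data chips, the ninth chip stores one extra bit chosen so the total
# number of 1 bits comes out even (even parity, assumed here).
def parity_bit(byte):
    """Return the parity bit stored alongside the byte."""
    return bin(byte).count("1") % 2

def check(byte, stored_parity):
    """A mismatch means some chip flipped a bit: on the IBM PC a
    parity error halted the machine with a PARITY CHECK message."""
    return parity_bit(byte) == stored_parity

p = parity_bit(0b10110010)   # four 1 bits, so the parity bit is 0
ok = check(0b10110010, p)    # contents intact, check passes
bad = check(0b10110011, p)   # a single flipped bit is detected
```

Note that a single parity bit can detect any one-bit error in a byte but cannot say which bit flipped, and two flipped bits cancel out, which is why later machines moved to ECC memory.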
The Quadboard is interesting because from it you can learn a lot about the state of hardware in the early days of the PC.
The original IBM PC was well known to be an open architecture and expandable because of its five expansion slots. What is less well known is that you basically needed to use at least two or three slots in order to use the machine.
Other than a keyboard connector (and a rarely used and later removed cassette recorder port) the PC had almost no peripherals on the motherboard. In order to use a monitor or a disk drive or a printer, or a connect to an external modem you needed cards that supported these things.
In the simplest configuration using IBM’s hardware you could install a disk controller card and a monochrome display adapter (which included a printer port) and the machine would be usable with only two cards. But you didn’t have a serial port. And because the display adapter for a color monitor didn’t have a printer port, you were looking at using three of the precious expansion slots just to be able to use the machine with a color monitor.
If you wanted to add something simple, like a clock that remembered the time when the PC was off, you were going to need to burn one of the few remaining slots. You could easily run out of slots and your expandable IBM PC would be a lot less expandable. As a result, expansion board manufacturers like Quadram started selling boards with multiple functions crammed onto one board.
This Quadboard gives your PC a battery-backed real time clock, a serial port, a parallel port, and a RAM upgrade on a single board. I would imagine that because of the expense of the RAM, the reason you bought the Quadboard was for the memory upgrade aspect of the board.
The IBM PC models being sold in 1982 only came with up to 64K of RAM. In 1983 a later PC model came with a whopping 256K RAM and the XT started with 256K. If you wanted more RAM, upgrades consisted of an 8-bit ISA board like the Quadboard.
In the early 1980s RAM was expensive. According to this ad in Infoworld a fully populated 256K Quadboard like this one cost $995 in 1982.
An often overlooked aspect of the early history of personal computers is how important the cost of RAM was. In 1977 when Atari introduced the Atari 2600, they could only afford to put 128 bytes of memory in it.
The reason why the Macintosh in last week’s entry was released in January 1984 with only 128KB of memory was that Apple had a limited amount of space on the board for RAM. They either had to wait for the next generation of higher capacity RAM chips to fit more RAM in the same space (and add $1000 more to the cost of a machine that was already $2495) or send it out the door with only 128K memory, knowing that would limit the machine’s potential. But hey, real artists ship.
So, it must have been a huge deal for whoever bought this Quadboard in 1982 to be adding 256KB into their PC.
What would someone have gained at the time by adding 256K to their PC? You have to remember that unlike the period from about 1990 to 2005 or so, when it seemed that your PC was obsolete after 2-3 years, the period from 1981 to 1990 was very different.
Once the basic platform had been established of a PC with an 8088 processor and MS-DOS it sort of stuck that way for almost 10 years. When subsequent processors like the 286 and 386 came out most popular software largely used them as fast 8088s and didn’t take that much advantage of their new capabilities. It wasn’t really until 1990 and Windows 3.0 that the needs of graphics and multitasking finally forced the platform to move forward, which obsoleted the 8088 and 8086.
From 1981 to 1990 you probably wanted a full 640KB of memory in your PC, but if you didn’t have it, it wasn’t the end of the world.
For example, in 1987 the popular dBase III Plus only required a PC with 256KB RAM, though it recommended 384KB.
Lotus 1-2-3 Release 2.2 from 1989, which was considered a pretty heavy application at the time, wanted at least 320KB memory, though they recommended 512KB.
Even a pretty serious game like 1986’s Starflight from Electronic Arts just asked for 256KB.
So, someone dropping a grand on 256KB memory in 1982 was keeping their PC from the scrap heap through about 1989. That’s pretty good value for their money.
On the other hand, the owner probably wanted more and that’s why this 256KB Quadboard ended up in the box of the 1984 model that accommodated even more memory.
If you, like me, have a bizarre fascination with the early days of PC hardware, these old books can be very informative.
PC’s From Scratch
by Corey Sandler, Tom Badgett, and Wade Stallings
Published by Bantam Computer Books 1990
The PC Upgrader’s Manual
by Gilbert Held
Published by John Wiley & Sons 1988