This is my Panasonic Senior Partner, a “luggable” portable MS-DOS computer from the mid-1980s.
All buttoned up like this, you might wonder if it’s some sort of old video camera case.
When you open it up and plug in the keyboard, it becomes apparent that this is actually a very old PC…A very old PC that works.
I found this Senior Partner in the Fall of 2011 at Village Thrift and it’s probably one of my proudest thrift store finds of the last 5 or so years. It’s become an incredibly rare experience to find 1980s PC hardware at thrift stores and it blows my mind that this one is still in working order.
As PC hardware goes, this is almost as basic as it gets. You have an 8088, a monochrome CRT monitor (with a DB-9 connector for color RGB on the back), a serial port, a parallel port, and two 5.25″ disk drives. There’s no hard drive. There’s no built-in clock. This machine predates mice on the PC by several years (unless you used a serial mouse). The only “luxury” this machine has is 512K RAM and a built-in thermal printer hidden under a flap on the top of the computer. I suspect the computer’s name derives from the fact that with a built-in printer this machine could be considered a portable office for mid-1980s businesspeople.
But, there’s no battery. This is not a mobile machine. It’s a machine you lugged from place to place where you had a place to sit it down and AC power available to plug into.
The keyboard doubles as a cover to enclose the monitor and floppy drives.
When you detach the keyboard you have to pull the retractable keyboard cable out of its hiding place below the “Panasonic Sr. Partner” label to the left of the CRT and attach it to the connector that hides under a cap on the keyboard.
There are also little lifts you can pull out from the keyboard to place it at a comfortable angle.
As the name implies, this was a machine its designers intended for business users. The monochrome CRT is extremely crisp for word processing and spreadsheets. When I fired up an old copy of Print Shop (before the paper ran out), the thermal printer gladly printed with no additional setup.
One could imagine some business travelers in a hotel room preparing for a meeting the next day huddled around the tiny green screen furiously printing curled up thermal printed documents…Almost.
Consider the fact that this thing is 35 pounds. Imagine lugging that around an airport. There’s a good reason why the luggable form factor that began with the Osborne and Kaypro luggables and continued with the famous Compaq Portable was a technological dead end. The Senior Partner is even larger than the Macintosh, despite that machine having a larger screen and not actually being intended to be luggable.
The reason for this, as I understand it, is that luggables were just normal PC components with all of their heft and hungry power consumption, wedged into an unorthodox case that happened to have a handle. The engineering advances that needed to happen to make portable computers into “laptops” happened later in the PC realm (though certainly the Grid Compass and a few others were showing the way even when the Senior Partner was on store shelves).
As an antique though, this thing is fantastic. The Senior Partner is a self-contained retro-computing party.
Easy to set up and quick to put away when you’re done. When it goes back on the shelf you can easily stack stuff on its hard shell.
And simply as an object it looks fantastic. Sure, it does not look (or act) like the glorious 80s vision of the future embodied in the brilliant Macintosh and Macintosh SE designs. There’s no Snow White timelessness here. But, what the Senior Partner does look like is the offspring of a Mission Control command console and an armored personnel carrier. You have no doubt as to which floppy drive is which because there are huge thick drive letters printed beneath the drives. The huge embossed “Panasonic” name looks like what you see on the back of a pickup truck. This machine looks serious in a way that I just adore.
Nothing says retro quite like a brilliant glowing green CRT screen.
When you’re sitting with a machine like this you feel a closeness to technology that is unlike using a computer today. When you use a modern computer you are swathed in warm colors and pictures designed to make you feel comfortable. You can quickly switch between multiple programs or browser tabs. There are a million things saying “use me”.
On a machine like the Senior Partner you basically have one thing in front of you. You have one program with a handful of options so it demands concentration, but the high contrast of the screen makes it easier to concentrate because only the program is glowing and all else is empty darkness. This is the cyberspace equivalent of a sensory deprivation chamber.
The closest thing I can compare that feeling to is using an e-Ink Kindle.
I suspect that this machine spent a lot of its life “buttoned up” and that accounts for what great shape it’s in today. Despite being almost 30 years old, the only things wrong with it seem to be a few scuffs and a missing pad on the “bottom” side that faces downward when the machine is lying handle side up. There was little opportunity for dust to get into the keyboard and the disk drives. I also suspect that this machine may not have gotten that much use in general considering the lack of burn-in on the monitor.
As a retro-computing machine, it is not perfect. For one thing I have no idea how to get inside of the machine, or if that is even a good idea. On the one hand, generally if a machine has a CRT I don’t want to get inside of it. On the other hand, I can’t find an obvious way to replace the printer paper and I wonder if they just intended you to open the case for that. The back of the machine has what looks to be an internal expansion slot, which would be more evidence that you are intended to be able to safely get inside of the machine.
Having only a monochrome screen, no hard drive, only 512K RAM, and no joystick port makes this less than ideal for playing many old games or running some of the more prominent software I’ve collected. As you can imagine, finding software for a PC with 512K RAM, no hard drive, and only 5.25″ floppies might be an issue.
However, I’ve had some good luck in this area.
When I first bought this machine I remembered that in my parents’ attic I had saved the 5.25″ floppies from an Epson 286 we had gotten as a hand-me-down from my aunt in Cleveland in 1995. When we had discarded the Epson I had made sure to save the 5.25″ MS-DOS boot and installation disks as well as some educational programs, including the immortal classics The Oregon Trail and Where in the World is Carmen Sandiego?.
This meant that when we brought the Senior Partner home from Village Thrift I had a working DOS startup disk and a few programs so I had the bare minimum needed to see the machine working.
Several months later I found this insane lot of 5.25″ PC games on ShopGoodwill. I think I paid $16.25 for this lot including ShopGoodwill’s usual exorbitant shipping cost. What I received is a treasury of late 80s/early 90s PC games.
Here are just a few of the games in that box.
Many of these games require hard disk installation but several, like Ultima I (which we saw running on the Senior Partner in the Commodore 1084 post) and Hitchhiker’s Guide to the Galaxy are perfectly at home running on an early floppy-only PC. It turns out that many games from the late 1980s basically assumed a PC with 384K-512K RAM so they run just fine on the Senior Partner.
Finally, last year my uncle gave me his old PC and the Commodore monitor I mentioned previously. Along with that was his box of 5.25″ disks that went with the PC.
The best thing in the box was a disk labeled IBM DOS 3.2.
The Epson MS-DOS 3.30 disk I had been using was fine for booting the machine but because it was only intended as a minimal OS to be used to install the other disks it was missing several important utilities like CHKDSK. With my uncle’s DOS 3.2 disk I could finally confirm how much memory the machine had.
There was also a disk labeled Lotus 1-2-3, which I had badly wanted to see running on a vintage machine.
There was a time when this screen was as common as the Google homepage to computer users.
Using this machine also taught me a lot about MS-DOS. Today DOS is remembered as a difficult monster of an OS; cold to use and brutal to configure. Some of that is true. Some of that was Apple advertising crud. But I think a lot of that image of MS-DOS came from the time after about 1988 until the release of Windows 95 (and even a little after), when so many odd tricks had to be crammed into DOS so that it could use more than 640K of memory and new hardware like sound cards that were not supported without strange autoexec.bat and config.sys changes. The nonsense you had to go through to use the hardware your PC had was truly insulting.
However, in the earlier period the Senior Partner belongs to, DOS seems almost tame. You change directories. You list the files in a directory. You run a program. You change drives. You format a disk. It almost seems quaint compared to the ordeals that people had using DOS later. DOS was clearly meant for a machine like the Senior Partner; this was its heyday. After that point it slowly turned into a curmudgeonly antique.
I remember reading DOS for Dummies and seeing all of these commands the author basically told you you shouldn’t touch with a ten foot pole. I wondered what had happened for these commands to have been put into the OS and never taken out in the intervening years. These were things for configuring serial ports and display modes that made sense on machines like the Senior Partner in the 1980s but were increasingly less relevant as time wore on. The large group of people who first encountered PCs in the early 90s ran headfirst into this confusing period where DOS was a geological dig of successive eras stacked on top of each other.
To use a machine as old as the Senior Partner seems quaint not just because of its age but because it’s so old that DOS actually makes sense.
This is a 3DFX Voodoo 2 V2 1000 PCI still sealed in the box.
I actually own three Voodoo 2s. The first one is a Metabyte Wicked 3D (below, with the blue colored VGA port) that I bought from a friend in high school. The second one is the new-in-box 3DFX branded Voodoo 2 I bought off of ShopGoodwill last year. The third one (below, with the oddly angled middle chip) is a Guillemot Maxi Gamer 3D2 I bought at the Cuyahoga Falls Hamfest earlier this year.
The Voodoo 2, in all of its manifestations, is my favorite expansion board of all time. It’s one of the last 3D graphics boards that normally operated without a heatsink so you can gaze upon the bare ceramic chip packages and the lovely 3DFX logos emblazoned upon them. It was also pretty much the last 3D graphics board where the various functions of the rendering pipeline were still broken out into three separate chips (two texture mapping units and a frame buffer). The way the three chips are laid out in a triangle is like watching jet fighters flying in formation.
It’s hard to explain to someone who wasn’t playing PC games in the late-1990s what having a 3DFX board meant. It was like having a lightswitch that made all of the games look much, much better. There are perhaps three tech experiences that have utterly blown my mind. One of them was seeing DVD for the first time. Another is seeing HDTV for the first time. Seeing hardware accelerated PC games on a Voodoo 2 was on the same level.
A friend of mine in high school won the Metabyte Wicked 3D in an online contest. I remember the day it arrived at his house: I had walked home from school trudging up and down piles of snow that had been pushed onto the sidewalk to clear the roads, and I got home exhausted…And then he calls me asking if I wanted to come over while he installed the Voodoo 2 card and fired up some games. Even though I was exhausted I eagerly accepted.
I think I saw hardware accelerated Quake 2 that day…Nothing else would ever be the same. I was immensely jealous.
Ever since personal computers have been connected to monitors there has been some sort of display hardware in the computer that outputs video signals. Often this hardware included capabilities that enhanced or took over some of the CPU’s role in creating graphics.
When we talk about 2D graphics we mean graphics where, for the most part, the machine is copying an image from one place in the computer’s memory and putting it in another place in the computer’s memory. For example, if you imagine a scene from say, Super Mario Bros., the background is made up of pre-prepared blocks of pixels (ever notice how the clouds and the shrubs are the same pattern of pixels with a color shift?) and Mario and the bad guys are each a particular frame in a short animation called a sprite. These pieces of images are combined together in a special type of memory that is connected to a chip that sends the final picture to the TV screen or monitor.
It’s sort of like someone making one of those collages where you cut images out of a magazine and paste them on a poster-board. The key to speeding up 2D graphics in a computer is speeding up the process of copying all of the pieces of the image to the special memory where they need to end up. You might have heard about the special “blitter” chips in some Atari consoles and the famous Amiga computers that made their graphics so great. 2D graphics were ideal for the computing power of the time but they give videogame designers limited ability to show depth and perspective in videogames.
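If you’re curious what that copying actually looks like, here’s a toy sketch in Python (purely illustrative — real blitter chips did this copy in hardware, vastly faster than any CPU loop, and the sprite data here is made up):

```python
# A tiny sketch of a 2D "blit": copying a sprite's pixels into a
# framebuffer at a given position. This nested copy loop is, at heart,
# the job that dedicated blitter hardware took away from the CPU.

def blit(framebuffer, sprite, x, y):
    """Copy every pixel of `sprite` into `framebuffer` at (x, y)."""
    for row_offset, row in enumerate(sprite):
        for col_offset, pixel in enumerate(row):
            framebuffer[y + row_offset][x + col_offset] = pixel

# An 8x4 framebuffer filled with background color 0
fb = [[0] * 8 for _ in range(4)]
sprite = [[1, 2],
          [3, 4]]  # a made-up 2x2 "sprite"
blit(fb, sprite, 3, 1)
print(fb[1])  # → [0, 0, 0, 1, 2, 0, 0, 0]
```

Speeding up 2D graphics mostly meant speeding up that copy, which is why a dedicated chip doing it made such a difference.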
Outside of flight simulator games and the odd game like Hard Drivin’ and Indy 500, almost all videogames used 2D graphics until the mid-1990s. PC games during the 2D graphics era were mostly being driven by the CPU. If you bought a faster CPU, the games got more fluid. There were special graphics boards you could buy to make games run faster, but the CPU was the main factor in game performance.
Beginning in about 1995-1996 there was a big switch to 3D graphics in videogames (which is totally different from the thing where you wear glasses and things pop out of the screen…that’s stereoscopics) and that totally changed how the graphics were being created by the computer. In 3D graphics the images are represented in the computer by a wireframe model of polygons that make up a scene and the objects in it. Special image files called textures represent what the surfaces of the objects should look like. Rendering is the process of combining all of these elements to create an image that is sent to the screen. The trick is that the computer can rotate the wireframe freely in 3D space and then place the textures on the model so that they look correct from the perspective of the viewer, hence “3D”. You can imagine it as being somewhat akin to making a diorama.
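The core of that perspective trick is small enough to show. Here’s a sketch in Python of projecting a 3D point onto a 2D screen (the screen size and focal length are made-up numbers; a real renderer does this for every vertex of every polygon, plus the texturing and lighting):

```python
# A minimal sketch of the central 3D graphics trick: the perspective
# divide. A point's screen position is its x and y scaled down by its
# distance z, so farther points crowd toward the center of the screen.

def project(x, y, z, screen_w=320, screen_h=240, focal=160):
    """Project camera-space point (x, y, z) to pixel coordinates."""
    sx = screen_w / 2 + focal * x / z
    sy = screen_h / 2 - focal * y / z  # screen y grows downward
    return sx, sy

# The same corner of a wall, near and then farther from the viewer:
print(project(1.0, 1.0, 2.0))  # → (240.0, 40.0)
print(project(1.0, 1.0, 8.0))  # → (180.0, 100.0)
```

Doing this divide (and the texture lookups that follow it) millions of times per second is exactly the work that was crushing mid-90s CPUs.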
With 3D graphics videogame designers gained the same visual abilities as film directors: Assuming the computer could draw a scene they could place the player’s perspective anywhere they desired.
The problem with 3D graphics is that they are much more taxing computationally than 2D graphics. They taxed even the fastest CPUs of the era.
In 1995-1996 when the first generation of 3D games started appearing in PCs they looked like pixelated messes on a normal computer. You could only play them at about 320×240, objects like walls in the games would get pixelated very badly when you got close to them, and the frame rate was a jerky 20 fps if you were lucky. Games had started using 3D graphics and as a result required the PC’s CPU to do much, much more work than previous games. When Quake, one of the first mega-hit 3D graphics-based first person shooters came out it basically obsoleted the 486 overnight because it was built around the Pentium’s floating-point (read: math) capabilities. But even then you were playing it at 320×240.
At the same time arcade games had been demonstrating much better looking 3D graphics. When you sat down in front of an arcade machine like Daytona USA or Virtua Fighter 2 what you saw was fluid motion and crisp visuals that clearly looked better than what your PC was doing at home. That’s because they had dedicated hardware for producing 3D graphics that took some types of work away from the CPU. These types of chips were also used in flight simulators and they were known to be insanely expensive. However, by the time the N64 came out in 1996 this type of hardware was starting to make its way into homes. What PCs needed was their own dedicated 3D graphics hardware. They needed hardware acceleration.
That’s what the Voodoo 2 is. The Voodoo 1 and its successor the Voodoo 2 were 3DFX’s arcade-quality 3D graphics products for consumer use.
A texture mapping unit (the two upper chips labeled 3DFX on the Voodoo 2) takes the textures from the memory on the graphics board (many of those smaller rectangular chips on the Voodoo 2) and places them on the wireframe polygons with the correct scaling and distance. The textures may also be “lit”, where the colors of pixels are changed to reflect the presence of one or more lights in the scene. A framebuffer processor (the lower chip labeled 3DFX) takes the 3D scene with the textures and produces a 2D image that is built up in the framebuffer memory (the rest of those smaller, rectangular chips on the Voodoo 2) that can be sent to the monitor via the RAMDAC chip (like a D/A converter for video; it is labeled GENDAC on the Voodoo 2).
3DFX was the first company to produce really great 3D graphics chips for consumer consumption in PCs. Their first consumer product was the Voodoo 1 in late 1996. It was soon followed in 1998 by the Voodoo 2.
The Voodoo 2 is a PCI add-in board that does not replace the PC’s 2D graphics card. Instead, there’s a cable that goes from the 2D board to the Voodoo 2 and then the Voodoo 2 connects to the monitor. This meant that the Voodoo 2 could not display 3D in a window, but what you really want it for is playing full-screen games, so it’s not much of a loss.
My friend who won the Metabyte Wicked 3D card later bought a Voodoo 3 card and sold me the Voodoo 2 sometime in 1999.
I finally had hardware acceleration. At the time we had a Compaq Presario that had begun life as a Pentium 100 and had been upgraded with an Overdrive processor to a Pentium MMX 200. It was getting a bit long in the tooth by this time, which was probably 1999. Previously I had made the mistake of buying a Matrox Mystique card with the hope of it improving how games looked and being bitterly disappointed in the results.
Having been a big fan of id Software’s Doom I had paid full price ($50) for their follow-up game Quake after having played the generous (quarter of the game) demo over and over again. Quake was by far my favorite game (and it’s still in my top 5).
id had known that Quake could look much better if it supported hardware acceleration. They had become frustrated with the way that they needed to modify Quake in order to support each brand of 3D card. Basically, the game needs to instruct the card on what it needs to do and each card used a different set of commands. id had decided to create their own set of commands (called a miniGL because it was a subset of SGI’s OpenGL API) in the hope that 3D card makers would supply a library that would convert the miniGL commands into commands for their card. The version of Quake they created to use the miniGL was called GLQuake and it was available as a freely downloadable add-on.
It’s a little hard to show you this today, but this is what GLQuake (and the Voodoo 2) did for Quake. First, a screenshot of Quake without hardware acceleration (taken on the Pentium III with a Voodoo 3 3000):
Now, with hardware accelerated GLQuake:
Suddenly the walls and floor look smooth and not blocky. Everything is much higher resolution. In motion everything is much more fluid. It may seem underwhelming now, but this was very hot stuff in 1997 and blew me away when I first saw it in 1999.
What we didn’t realize at the time was that it was pretty much all downhill for 3DFX after the Voodoo 2. After the Voodoo 2, 3DFX decided to stop selling chips to 3rd party OEMs like Metabyte and Guillemot and produce their own cards. That’s why my boxed board is just branded 3DFX. This turned out to be disastrous because suddenly they were competing with the very companies that had sold their very successful products in the 1996-1998 period. They also released the Voodoo 3, which combined 2D graphics hardware with 3D graphics hardware on a single chip (that was hidden under a heatsink).
The Voodoo 3 was an excellent card and I loved the Voodoo 3 3000 that was in the Dell Pentium III my parents bought to succeed the Presario. However, 3DFX was having to make excuses for features that the Voodoo 3 didn’t have and their competitors did have (namely 32-bit color). Nvidia’s TNT2 Ultra suddenly looked like a better card than the Voodoo 3.
3DFX was having trouble producing their successor to the Voodoo line and instead was having to adapt the old Voodoo technology to keep up. The Voodoo 4 and 5, which consisted of several updated Voodoo 3 chips working together on a single board ended up getting plastered by Nvidia’s GeForce 2 and finally GeForce 3 chips which accelerated even more parts of the graphics rendering process than 3DFX did. 3DFX ceased to exist by the end of 2000. Supposedly prototypes of “Rampage”, the successor to Voodoo were sitting on a workbench being debugged the day the company folded.
Back in the late-1990s 3D acceleration was a luxury meant for playing games. Today, that’s no longer true: 3D graphics hardware is an integral part of almost every computer. Today every PC, every tablet, and every smartphone sold has some sort of 3D acceleration. 3DFX showed us the way.
This week I’m going to indulge myself with some extremely nerdy PC history. Find a comfortable chair because this is going to take awhile.
This is a sealed copy of Windows/386 which I purchased on eBay recently. Windows/386 was a version of Windows 2 released in late 1987 that was able to multitask DOS applications on a 386.
There’s a good chance you’ve never heard of Windows/386. It’s possible you may have seen this over-the-top 12 minute marketing film that Microsoft created for Windows/386 where a businesswoman uses the multitasking power of Windows/386 (and 80s fashions) to save the day.
I had really only heard of Windows 2 in the context of books written later like Windows for Dummies that basically said “don’t bother with Windows 2, buy Windows 3”.
If you’re wondering what Windows 2 looked like, here’s a close up of that screenshot on the front of the box:
I started collecting boxed software around the same time that I started buying the old Macintoshes in the late 1990s. One of the first copies of Windows I found at a thrift store was this sealed copy of Windows 3.1 I found amongst the toys and board games at the old State Road Goodwill 10-15 years ago. Apparently I paid $3.00 for it.
At the time I bought it because I loved the way the front of the Windows 3.1 box looks. It’s still my favorite OS box.
Now, Windows 3.1 in a box is not hard to find. In fact, I also have an open box copy of Windows 3.1. But, Windows 3.0 is much harder to find in a thrift store. I recently found a copy of Windows 3.0’s disks and manuals without the box at the 2013 Cuyahoga Falls Hamfest and shortly after that I bought a sealed copy on eBay. I’ve never seen any Microsoft OSes older than Windows 3.1 “in the wild” just hanging around in a thrift store.
Windows 2.0 is even harder to find than Windows 3.0. Up until I bought this copy on eBay I had never seen it in person.
I was inspired to buy this copy of Windows/386 (which as you can imagine is a bit pricey) because I’ve been reading Andrew Schulman’s really fascinating (and some say controversial) book Unauthorized Windows 95, which I found for a few bucks on ABEBooks.com.
So what does a book about Windows 95 have to do with Windows/386 which came out years before? This is the point where this post is going to get extensively nerdy.
Let me begin by showing you the back of the Windows/386 box:
The description on the back of this box is glorious. This is one of the most horribly tortured chimeras of Microsoft marketing and technical jargon I’ve ever seen. But, by decoding this mess you can learn an enormous amount about what was going on with the PC in 1987-1989.
Here it is typed out so that it’s easier to read:
Integrate the next generation of Windows applications and existing DOS applications using multi-tasking Microsoft Windows/386. It’s the one graphical environment that gives you a standard path to OS/2, the operating system of the future.
Microsoft Windows/386 turns your 80386-based personal computer into a multi-tasking virtual machine. You can run any number of Microsoft Windows applications and existing DOS applications at the same time, limited only by the memory in your system. Each DOS application gets its own 640K “8086” environment to run in and will run in the background regardless of what else is running. And each DOS application can run in a window or use the full screen.
Your future is secure with Microsoft Windows/386 because once you learn its standard user interface, you’ll be well on your way to knowing how to use all the new graphical applications developed for Windows, even those developed for OS/2.
- Enhanced support for DOS applications–cut and paste selected data between Microsoft Windows applications and existing DOS applications.
- Full multi-tasking virtual machine environment–both DOS and Windows applications run concurrently in their own 640K memory space.
- Consistent with OS/2 presentation manager including overlapping windows and enhanced keyboard and mouse controls.
- Emulation of expanded memory specifications–no special hardware needed.
- Supports memory-resident applications.
- Compatible with programs written under Microsoft Windows versions 1 and 2.
- Supports Intel 80287 and 80387 math coprocessors.
- Includes a wide range of desktop applications, including Microsoft Windows Write and Microsoft Windows Paint.
You can probably make out a few important themes here.
The first is that in 1987 it’s very exciting to be able to multitask DOS applications (today we seem to have lost the hyphen in multi-task).
Second, is that for some reason, even though you just bought Windows/386, Microsoft would really rather you buy OS/2. Usually the back of an expensive thing’s box is trying to stroke your ego and make you feel real smart for buying this expensive thing. Here, oddly, they’re saying you’re smart for buying this thing because then you’ll be ready to buy the next thing.
Third, they’re really eager to tell you that they are able to multitask DOS applications through the magic of virtual machines.
All of these things are really about memory: the nightmare that was the 640K “barrier” created by the 8086/8088 and DOS, and what had to happen to get around it.
The 8086 (along with its cheaper version, the 8088) was a 16-bit processor. Processor “bit-ness” is a complicated subject but in general it means that the 8086 operated on numbers that were 16 bits long. The 8086 was designed to address (i.e. talk to) up to 1MB of RAM. 1MB of RAM requires 20-bit memory addresses, so Intel came up with a technique where they added together two 16-bit numbers in a way that produced a 20-bit memory address.
Here’s a diagram from page 25 of Peter Norton’s 1985 book Programmer’s Guide to the IBM PC that shows how they did it:
This method of dealing with memory addresses was a central part of how every DOS program worked. This became known as Real Mode.
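The arithmetic in that diagram is simple enough to show directly. Here it is sketched in Python (purely illustrative — the real calculation happened in the CPU’s address hardware on every single memory access):

```python
# Real Mode address calculation on the 8086: the 16-bit segment value
# is shifted left 4 bits (i.e. multiplied by 16) and added to the
# 16-bit offset, producing a 20-bit physical address (0 to 1MB).

def real_mode_address(segment, offset):
    return ((segment << 4) + offset) & 0xFFFFF  # wraps at 20 bits

# The classic example: color text-mode video memory at B800:0000
print(hex(real_mode_address(0xB800, 0x0000)))  # → 0xb8000

# A quirk of the scheme: many different segment:offset pairs point
# at the same physical byte.
assert real_mode_address(0xB800, 0x0010) == real_mode_address(0xB801, 0x0000)
```

Note the overlap quirk in the last line: a physical address doesn’t have one unique segment:offset name, which every DOS programmer of the era had to internalize.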
DOS was an operating system for the 8086 microprocessor, so it too operated in Real Mode.
IBM decided to use some of the memory addresses to talk to ROM chips, graphics card memory, and other things so that the memory addresses that could actually be connected to real RAM were limited to 640K. That’s where the so-called “barrier” came from.
Windows 1.0 was an attempt to create an “Operating Environment” that still ran in Real Mode where programmers could create graphical applications. As you can imagine, Windows 1 was held back by the limited amount of memory available in Real Mode and the lack of built-in multitasking features in the 8086.
In general, when you’re building a multitasking processor you want several things. One thing you want is a timer that the operating system can set inside the CPU. That way a program gets a set block of time in which to do some processing; when the timer goes off, the CPU wakes up the OS, and the OS checks whether the program needs more time or whether another program should get it instead.
These are the basic building blocks of task switching. If the CPU switches fast enough it can appear to a human being that their CPU, which can only do one thing at a time, is actually running multiple programs at once. In reality the programs are all getting bits of time given to them by the operating system in which to execute.
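Here’s a toy illustration of that round-robin idea in Python (a sketch only — Python generators yield cooperatively, so the `yield` stands in for the hardware timer firing, and the “programs” are made up):

```python
# A toy round-robin scheduler. Each "program" is a generator that
# runs until its time slice ends (yield), and the OS loop repeatedly
# picks the next ready program until all of them finish.
from collections import deque

def program(name, steps):
    for i in range(steps):
        yield f"{name} step {i}"   # one time slice of "work"

def schedule(programs):
    """Give each program one time slice in turn until all finish."""
    ready = deque(programs)
    log = []
    while ready:
        current = ready.popleft()
        try:
            log.append(next(current))  # run one time slice
            ready.append(current)      # send it to the back of the queue
        except StopIteration:
            pass                       # program finished, drop it
    return log

print(schedule([program("A", 2), program("B", 2)]))
# → ['A step 0', 'B step 0', 'A step 1', 'B step 1']
```

Interleaved fast enough, that alternation is what looks like two programs running “at the same time” on one CPU.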
Another thing you want is a sort of built-in security system in the CPU that separates the programs from the OS so that the OS is in charge of the computer and can juggle these programs. Ideally you want it to be so that when the program needs to do something important (like say, putting a pixel on the screen or writing a file to the disk) it has to ask the OS to do that thing for the program.
That way multiple programs asking for important things at once don’t interfere with each other. The OS becomes like a traffic light at a major intersection that prevents cars from running into each other. If a program tries to get the CPU to do something only the OS should do, the CPU should stop it.
Those are both things that Intel built into the 80286, their follow-up CPU to the 8086. The 286 was built for multitasking and it was built to talk to up to 16MB of RAM. These multitasking features were called Protected Mode.
This would have been great except Protected Mode was incompatible with Real Mode. That method by which Real Mode came up with memory addresses in that diagram above did not exist in Protected Mode. It used a different method. You could not simply load up a Real Mode program (like all DOS programs at the time) into Protected Mode.
Fortunately, the 286 could also pretend to be a fast 8086 operating in Real Mode. But you also couldn’t switch between the two in an elegant way.
So when people brought home their expensive new IBM PC ATs with 286 CPUs in late 1984 they were mostly running them in Real Mode, running DOS applications.
What was needed was a new Protected Mode operating system and new Protected Mode programs to really use the multitasking Protected Mode on the 286.
This is why in 1985 IBM and Microsoft decided to create OS/2 (aka Operating System/2), which was a Protected Mode operating system for the 80286. They also came up with a way where OS/2 could run a single DOS application at a time by fooling the 286 into switching between Protected Mode and Real Mode in a way Intel had not really intended. Running DOS applications on OS/2 version 1 got a reputation for being slow and terrible.
OS/2 was a large undertaking and while it was announced in April 1987 and version 1.0 was released in December 1987, it wasn’t really complete. OS/2 1.0 did not come with a graphical interface at all. That would not come until OS/2 1.1 in October 1988.
By the way, OS/2 1.1 called its graphical interface the Presentation Manager, which is why Windows/386 is also calling itself a presentation manager.
So, now we have two messes. The first mess is the “640K barrier” that limited the PC platform. The second mess was the fact that the 286 could not multitask DOS applications.
So, let’s say it’s 1987 and you have a pile of very expensive, serious DOS applications.
Are you happy with OS/2? Do you really want to have to replace all of your applications with new Protected Mode versions? Or, would you rather have something that could just multitask the pile of applications you have now?
Meanwhile, Intel released the successor to the 286, the 80386, in late 1985, and by late 1986 you could buy one in outrageously expensive PCs like the Compaq Deskpro 386. Here Intel finally laid the groundwork to clean up both messes.
The 386 was now a 32-bit processor and could address up to 4GB of RAM. The 286’s protected mode was enhanced to become 32-bit Protected Mode in the 386. It also contained a Memory Management Unit, which breaks memory up into pages that can be swapped out to disk if necessary. This is the technique called virtual memory. As you can imagine this is very useful for multitasking programs.
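The page-based translation the 386's Memory Management Unit performs can be sketched briefly: a 32-bit linear address is split into a 10-bit page directory index, a 10-bit page table index, and a 12-bit offset within a 4KB page. The operating system fills in the directory and tables, which is how it decides which pages live in RAM and which get swapped to disk.

```python
def split_linear_address(addr: int):
    """Split a 32-bit linear address the way the 386's paging unit does."""
    directory = (addr >> 22) & 0x3FF   # top 10 bits: page directory index
    table     = (addr >> 12) & 0x3FF   # next 10 bits: page table index
    offset    = addr & 0xFFF           # bottom 12 bits: offset in the 4KB page
    return directory, table, offset

d, t, o = split_linear_address(0x00403025)
print(d, t, hex(o))  # directory 1, table 3, offset 0x25
```

Because each process (or each program being multitasked) can be given its own set of tables, the same linear address can point at completely different physical memory for different programs, which is exactly what multitasking needs.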
With the 386, a PC contained most of the hardware features that gave expensive mainframes and minicomputers the ability to multitask large programs, but it would need a new operating system to take advantage of them.
There were ways for new programs that ran on DOS to use the new features in the 386. If you ever ran Doom on DOS you saw a bunch of text gobbledegook fly past before the game loaded. Part of what it was doing was loading a program called a DOS Extender, basically a sort of mini-OS that let Doom’s 32-bit Protected Mode code talk to Real Mode DOS.
But, running old DOS programs needed something else. In order to finally solve the 8086 and DOS’s memory mess, Intel borrowed a trick from the mainframe world: virtual machines.
In order to explain what a virtual machine is, here is a modern example:
That’s Super Mario Bros., which was an NES game from 1985. But it’s running on a Nintendo 3DS from 2011 rather than an NES from 1985. The game thinks it’s running on an NES, but in reality it’s running on a piece of software called a virtual machine that just looks like an NES to the game but is really a different machine entirely.
What Intel did with the 386 was to build in a new mode called Virtual 8086 Mode (aka V86 Mode) that created 8086 virtual machines. V86 mode was really a version of Protected Mode modified so that the Memory Management Unit would take care of handling the different methods of dealing with memory addresses.
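A toy model of what V86 Mode buys you: the program inside the virtual machine still does its old 8086 segment:offset arithmetic, but the MMU's page tables (set up by the monitor software) quietly redirect each 4KB page to wherever physical memory the monitor chose. The page mappings below are hypothetical numbers purely for illustration, not anything a real VMM would use.

```python
PAGE = 4096

def v86_physical(vm_page_map, segment, offset):
    """Toy model: an 8086-style address, remapped per-VM by paging."""
    linear = ((segment << 4) + offset) & 0xFFFFF   # what the program computes
    page, off = divmod(linear, PAGE)
    return vm_page_map[page] * PAGE + off          # where it really lands

# Two virtual machines both touch segment 0x0040 (the BIOS data area),
# but hypothetical page tables send each to a different physical page:
vm1 = {0: 100}   # VM 1: its 8086 page 0 lives at physical page 100
vm2 = {0: 200}   # VM 2: its 8086 page 0 lives at physical page 200
assert v86_physical(vm1, 0x0040, 0x0000) != v86_physical(vm2, 0x0040, 0x0000)
```

Each program believes it owns the whole first megabyte, yet no two virtual machines ever collide in physical RAM.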
A 386 could create multiple virtual machines running in V86 Mode. In order to do this Intel said a software vendor would have to supply a program called a Virtual Machine Monitor, or VMM, which was actually a small 32-bit Protected Mode operating system that would manage the virtual machines. The VMM was really in charge of the computer. A program running in a V86 virtual machine thought it was talking to DOS, but in reality the VMM was deciding whether to let it talk to a copy of DOS. The VMM is like the 3DS’s Virtual Console in my example above.
This is where Windows/386 comes in. Windows/386 was the first Microsoft operating system to contain Microsoft’s VMM. In fact, the whole thing ran inside of virtual machines that were really controlled by the VMM. So, there was one virtual machine that contained a copy of Windows 2 that ran Windows programs and then as many other virtual machines as DOS programs you needed to run.
There was other software that contained VMMs that let you run multiple DOS applications. You can read about some of them in this February 1989 Infoworld article. But the raison d’être for Windows/386 was that it ran graphical Windows applications too.
In these screenshots from the Windows/386 box notice how some applications have the garish yellow bar with drop down menus and others do not. The windows with the menus are Windows applications and the ones without are DOS applications and they’re all shown living in harmony together.
You can see how this came dangerously close to dooming OS/2 already in 1987. Why buy a new OS that couldn’t run your existing applications very well when there was an alternative?
At the time however, in 1987, Microsoft and IBM were spending oodles on developing OS/2, and this is why you see such tortured marketing language on the Windows/386 box. They still really wanted to get you into a real Protected Mode OS.
The big problem for Windows/386 was that at the time very few people had 386s. When Windows/386 came out the Deskpro 386/20 (the 20MHz update to the original Deskpro 386) listed at $7,499 and went for a lot more when you added a larger hard disk and a better graphics card. Most people and businesses were still buying 286s, which is why IBM and Microsoft continued to bet on OS/2.
The other problem for Windows/386 was that people were not enthused about Windows 2 applications. Back in 1987-1989 Microsoft was not yet dominant.
The most notable Windows 2 application was Excel, but for the most part people were sticking with DOS applications they knew and loved like WordPerfect and Lotus 1-2-3.
But, Windows/386 was just the first VMM-based OS from Microsoft. This is where we come to that Unauthorized Windows 95 book.
Unauthorized Windows 95 is all about what the VMM did in Windows 3.0 through Windows 95. Windows 3.0, Windows 3.1, Windows 95, Windows 98, and Windows ME all were based on the VMM I described above. You may not have heard of Windows/386 but you most certainly have heard of Windows 95.
Windows 3.0 changed everything in 1990. It moved Windows into the 286’s Protected Mode, and if you ran it on a 386 it also ran under the VMM, which allowed you to multitask DOS applications. Suddenly people wanted to use Windows applications and sales took off. Suddenly there was no need for OS/2.
Windows 3.0 basically killed OS/2 as the successor to DOS and led to the breakup of the joint development deal Microsoft had with IBM. Windows 3.0 is what made Microsoft the dominant beast that people remember today.
The ultimate expression of Microsoft’s power over the PC market was Windows 95. When Microsoft said “jump” everyone jumped to Windows 95.
When Windows 95 came out a lot of people wondered what exactly it was. Was it a real Protected Mode operating system? Was DOS still in there somewhere? Microsoft said it was 32-bit…Was it really?
The answers to these questions are contained within the 500+ pages of Unauthorized Windows 95. Basically, DOS existed in Windows 95 as a servant controlled by the VMM, used for compatibility with DOS applications and drivers. By Windows 95 the VMM had become modular, and whole pieces of DOS could be replaced with newer Protected Mode code. It was true that the whole OS was not 32-bit, but the most important part, the VMM, was definitely 32-bit.
In a real way, the seeds of Windows 95 started in 1987 with Windows/386.
If you’ve read this far, I thank you. I collect this stuff because I love tying together the strings of history attached to these old objects. Normal service will continue next week.
I found this Quadboard at the Abbey Ann’s off the Circle in Tallmadge sometime last year. It was sitting on their counter in a pile of items that had just come in.
When I saw it I immediately recognized that it was some sort of old RAM upgrade (from the ordered banks of DRAM chips on the left side of the board), but when I looked closer I was impressed with just how old it was. As third-party add-ons for IBM PC hardware go, this is almost as early as it gets.
I suspect this is one of those instances where the original owner replaced the older Quadboard with the newer model and the old one ended up in the newer one’s box with the newer one’s documentation.
The box and the documentation are for a 384K Quadboard from 1984 (with six banks of nine 64-kilobit DRAM chips), but the card inside is labeled 1982 and only has four banks of nine chips. Also notice how the picture on the box has two ribbon-cable connectors while the card I have only has one.
In doing some research about the Quadboard, it looks like in 1982 Quadram sold several models, all with the same number of DRAM chip sockets but with different numbers of chips installed. The one I have seems to be the high end of the original Quadboard line, which means it’s fully populated with 256K RAM. If you had one of the cheaper models you could install more chips yourself, as long as you added a whole bank at a time.
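The bank arithmetic works out neatly: each bank holds eight data chips plus one parity chip, and each 64-kilobit data chip supplies one bit of every byte across 64K addresses, so each bank adds 64KB of usable RAM.

```python
def board_capacity_kb(banks: int, chip_kilobits: int = 64) -> int:
    """Usable RAM from banks of nine chips: 8 data chips plus 1 parity chip."""
    # Each data chip supplies one bit of every byte, so a bank of
    # 64-kilobit chips contributes 64KB of addressable memory.
    return banks * chip_kilobits

assert board_capacity_kb(4) == 256   # the 1982 card as found
assert board_capacity_kb(6) == 384   # the 1984 model pictured on the box
```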
I believe the leftmost of the two sets of DIP switches on the top of the board is how you tell it how much RAM the PC has on the motherboard (so it knows what address to start with) and how much RAM is installed on the Quadboard.
Incidentally the reason that four banks of nine 64 kilobit memory chips equals 256K is that the 9th chip in each bank is used for parity checking, like on the IBM PC’s motherboard.
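The parity scheme itself is simple enough to sketch. The ninth bit is chosen so every stored 9-bit value has a fixed count of 1s (even parity in this sketch; whether the hardware used even or odd doesn't change the idea), and a mismatch on readback means a bit flipped in memory:

```python
def parity_bit(byte: int) -> int:
    """The ninth bit stored alongside a byte, for even parity."""
    ones = bin(byte & 0xFF).count("1")
    # The extra bit makes the total count of 1 bits even.
    return ones % 2

def check(byte: int, stored_parity: int) -> bool:
    """A mismatch means a bit flipped; the PC raised an NMI on a parity error."""
    return parity_bit(byte) == stored_parity

assert parity_bit(0b10110000) == 1   # three 1s, so the parity bit makes four
assert check(0b10110000, 1)
assert not check(0b10110001, 1)      # a single flipped bit is detected
```

Note that parity can detect a single-bit error but not correct it, which is why a PC of this era simply halted with a memory error rather than carrying on.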
The Quadboard is interesting because from it you can learn a lot about the state of hardware in the early days of the PC.
The original IBM PC was well known to be an open architecture and expandable because of its five expansion slots. What is less well known is that you basically needed to use at least two or three slots in order to use the machine.
Other than a keyboard connector (and a rarely used and later removed cassette recorder port), the PC had almost no peripherals on the motherboard. In order to use a monitor, a disk drive, or a printer, or to connect to an external modem, you needed cards that supported those things.
In the simplest configuration using IBM’s hardware you could use a disk controller card and a monochrome display adapter (which came with a printer port), and the machine would be usable with only two cards. But you didn’t have a serial port. And because the display adapter for a color monitor didn’t have a printer port, you were looking at using three of the precious expansion slots just to be able to use the machine with a color monitor.
If you wanted to add something simple, like a clock that remembered the time when the PC was off, you were going to need to burn one of the few remaining slots. You could easily run out of slots and your expandable IBM PC would be a lot less expandable. As a result, expansion board manufacturers like Quadram started selling boards with multiple functions crammed onto one board.
This Quadboard gives your PC a battery-backed real time clock, a serial port, a parallel port, and a RAM upgrade on a single board. I would imagine that because of the expense of the RAM, the reason you bought the Quadboard was for the memory upgrade aspect of the board.
The IBM PC models being sold in 1982 only came with up to 64K of RAM. In 1983 a later PC model came with a whopping 256K, and the XT started with 256K. If you wanted more RAM, upgrades consisted of an 8-bit ISA board like the Quadboard.
In the early 1980s RAM was expensive. According to this ad in Infoworld a fully populated 256K Quadboard like this one cost $995 in 1982.
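To put that price in perspective, the arithmetic works out to nearly four dollars per kilobyte:

```python
price = 995.00     # 1982 list price from the Infoworld ad
kilobytes = 256
print(f"${price / kilobytes:.2f} per KB")  # about $3.89 per KB, in 1982 dollars
```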
An often overlooked aspect of the early history of personal computers is how important the cost of RAM was. In 1977 when Atari introduced the Atari 2600, they could only afford to put 128 bytes of memory in it.
The reason why the Macintosh in last week’s entry was released in January 1984 with only 128KB of memory was that Apple had a limited amount of space on the board for RAM. They either had to wait for the next generation of higher capacity RAM chips to fit more RAM in the same space (and add $1000 more to the cost of a machine that was already $2495) or send it out the door with only 128K memory, knowing that would limit the machine’s potential. But hey, real artists ship.
So, it must have been a huge deal for whoever bought this Quadboard in 1982 to be adding 256KB into their PC.
What would someone have gained at the time by adding 256K to their PC? You have to remember that unlike the period from about 1990 to 2005 or so, when it seemed that your PC was obsolete after 2-3 years, the period from 1981 to 1990 was very different.
Once the basic platform had been established of a PC with an 8088 processor and MS-DOS it sort of stuck that way for almost 10 years. When subsequent processors like the 286 and 386 came out most popular software largely used them as fast 8088s and didn’t take that much advantage of their new capabilities. It wasn’t really until 1990 and Windows 3.0 that the needs of graphics and multitasking finally forced the platform to move forward, which obsoleted the 8088 and 8086.
From 1981 to 1990 you probably wanted a full 640KB memory in your PC but if you didn’t, it wasn’t the end of the world.
For example, in 1987 the popular dBase III Plus only required a PC with 256KB RAM, though it recommended 384KB.
Lotus 1-2-3 Release 2.2 from 1989, which was considered a pretty heavy application at the time, wanted at least 320KB memory, though they recommended 512KB.
Even a pretty serious game like 1986’s Starflight from Electronic Arts just asked for 256KB.
So, someone dropping a grand on 256KB memory in 1982 was keeping their PC from the scrap heap through about 1989. That’s pretty good value for their money.
On the other hand, the owner probably wanted more and that’s why this 256KB Quadboard ended up in the box of the 1984 model that accommodated even more memory.
If you, like me, have a bizarre fascination with the early days of PC hardware, these old books can be very informative.
PC’s From Scratch
by Corey Sandler, Tom Badgett, and Wade Stallings
Published by Bantam Computer Books 1990
The PC Upgrader’s Manual
by Gilbert Held
Published by John Wiley & Sons 1988