Graphics Cards


Throughout the days of DOS, an evolution took place in display technology which took us from text-only (seriously, no graphics at all) through to beautiful, lifelike graphics that aren't too far off what we see today. The first IBM PC, being very modular in design, came with a "display card" that could be replaced or upgraded with ease. Without a display card the system would not be able to show anything to the user.

Third parties also produced expansion cards and video chipsets that were compatible with the standards set by IBM from the early days of the PC. Most of the key manufacturers have their own dedicated pages here at DOS Days - just head over to the DOS Hardware Index.


A computer's performance originally stemmed from the microprocessor's clock speed, because nearly every operation was conducted by the CPU. As graphics-intensive programs (CAD, games, etc.) came around, more of the burden was placed on a computer's graphics card. As time went on, Microsoft Windows and 3D games arrived, which led to the advent of 2D and 3D 'accelerator' cards.

The earliest PC graphics cards were 'full length' - they filled the IBM PC's chassis from front to back - and consisted of many discrete components with a single display controller chip and a small amount of DRAM to store the screen attributes. From 1987, it became more commonplace to integrate the majority of graphics circuitry on a card into a 'chipset'. Chipset manufacturers could sell their chipsets to OEMs (Original Equipment Manufacturers) to use on their own branded cards, requiring the OEM to need a much smaller quantity of other supporting components to produce a fully functional graphics card - the total component count could be reduced from over 200 to about 15. The chipset was typically either one or two chips. While the chipset does have a big impact on a card's performance, it's important to note that two cards that use the same chipset can still differ widely in overall performance.

Up to early 1992 all PC expansion cards used the ISA bus, which ran at 8 MHz - some motherboards supported overclocking this bus to 10 or 12 MHz, but a lot of ISA cards weren't designed to run at these higher speeds. Naturally, a 16-bit VGA card can transfer data up to twice as fast as an 8-bit card because of the wider bus (same clock frequency but twice as many 'lanes on the highway' to carry data to and from the card).
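As a rough illustration of the 'twice as many lanes' point, the theoretical peak figures work out as below. This is a back-of-envelope sketch, not a benchmark - real ISA transfers took several clock cycles plus wait states, so actual throughput was far lower.

```python
# Back-of-envelope peak transfer rate for an ISA card, assuming the
# (optimistic) idealisation of one transfer per bus clock.  Real ISA
# cycles took multiple clocks and often added wait states.
def peak_bytes_per_second(clock_hz, bus_width_bits):
    """Theoretical peak: one transfer per clock, bus_width_bits wide."""
    return clock_hz * (bus_width_bits // 8)

# An 8 MHz ISA bus, 8-bit vs 16-bit card:
eight_bit   = peak_bytes_per_second(8_000_000, 8)    # 8 MB/s ceiling
sixteen_bit = peak_bytes_per_second(8_000_000, 16)   # 16 MB/s ceiling
```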

From early 1992 until late 1993, VESA Local Bus (sometimes called "VL Bus" or simply "VLB") cards arrived, and these eclipsed ISA-based cards in terms of performance due to no longer being bound by the slow 8 MHz ISA bus. A VLB graphics card typically transferred data to and from the motherboard at 33 MHz.

In late 1993, PCI graphics cards took over from VLB, making the separation of expansion bus clock from system clock universal for all expansion cards; VESA Local Bus had proven to be unreliable at higher clock speeds. AGP, which arrived in 1997, was a derivative of the PCI standard and returned us to having a dedicated slot for graphics (VLB was not explicitly for graphics cards, but graphics was the primary motivation for its inception). ISA cards continued to be produced until around 1995.


Just as with sound cards, VGA graphics cards have a digital-to-analogue converter (DAC) - before IBM introduced the VGA standard, graphics cards produced digital TTL signals for display by a digital monitor, so there was no need to convert signals to analogue.

For graphics cards, the DAC is usually referred to as the RAMDAC (Random Access Memory Digital-to-Analogue Converter). It stores the colour palette (in embedded fast Static RAM) and converts the digital colour 'codes' into analogue signals that are then sent to an analogue monitor. Each colour code is made up of separate Red, Green and Blue (RGB) codes, and these are sent to separate embedded DAC circuits inside the RAMDAC, which produce an analogue voltage on each colour signal line to represent the given colour.
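A minimal sketch of that lookup-and-convert step, assuming the standard VGA arrangement of 256 palette entries with 6-bit codes per gun and a 0.7 V full-scale analogue output:

```python
# Illustrative model of what a VGA-era RAMDAC does for every pixel:
# the pixel value indexes the palette SRAM, and each 6-bit R/G/B
# code drives a DAC producing a proportional voltage (0.7 V full
# scale on the VGA analogue interface).
FULL_SCALE_VOLTS = 0.7

def ramdac_lookup(palette, pixel_index):
    """Return the analogue (R, G, B) voltages for one pixel."""
    r6, g6, b6 = palette[pixel_index]          # 6-bit codes, 0-63
    return tuple(round(c / 63 * FULL_SCALE_VOLTS, 4) for c in (r6, g6, b6))

palette = [(0, 0, 0)] * 256                    # 256 entries of (R, G, B)
palette[1] = (63, 63, 63)                      # entry 1: full white
```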

Since the mid-90s, the SRAM (Static RAM) portion of a RAMDAC is usually bypassed, with the DACs being sent display data directly for True Color modes. In such cases, the SRAM-stored palette is retained only for backward-compatibility with older software.

The speed of a graphics card is determined more by the main chipset and supporting circuitry than by the RAMDAC used. The RAMDAC and the size of the card's video memory have a greater bearing on the number of display colours and palette size available.


The first display card for the PC was launched in 1981 - the MDA (Monochrome Display Adapter) - and was available for both the 5150 (IBM PC) and 5160 (IBM PC/XT). It was designed to connect directly to the IBM 5151 monochrome [green phosphor] monitor. The card also came with a printer port to work with IBM printers, so it was often referred to as the Monochrome Display and Printer Adapter.

IBM Monochrome Display and Printer Adapter

The card came with 4 KB of video memory, mapped at addresses B0000-B0FFF, to store the characters and their attributes, plus an 8 KB character ROM which held the character set images, though only 4 KB of it was actually used. The card could be accessed via I/O ports 3B0 - 3BF.
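The buffer layout is simple: each of the 80 x 25 = 2,000 character cells takes two consecutive bytes (the character code, then its attribute byte), so any cell's address can be computed directly. A small illustrative sketch:

```python
# Address calculation for the MDA text buffer: 2 bytes per cell
# (character code, attribute byte), 80 cells per row, buffer at B0000.
def mda_cell_address(row, col):
    """Physical address of the character byte for (row, col) in 80x25 text."""
    return 0xB0000 + (row * 80 + col) * 2
```

Note that 80 x 25 x 2 = 4,000 bytes, which is why 4 KB of video memory was sufficient.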

The MDA standard was intended to be used with a fixed-frequency monitor, providing a 720 x 350 image format, although it was only capable of producing text output using its ROM-based character generator. These characters were "cells" of 9 x 14 pixels, so the image format is more accurately described as 25 lines of 80 characters each. This was known as 'mode 7', or 'mode 07h'. The character set was not redefinable. MDA cards sold in the USA contained what was called code page 437 - the set of characters that includes all printable U.S. ASCII characters (32-126), extended codes for accented characters and some line-drawing ones. For foreign markets, the ROM's character set was modified to suit Greek, Cyrillic, Hebrew and other scripts.

The character set used by MDA

It uses a digital TTL output, meaning it fires either a 'high' or 'low', where a 'high' switches the CRT's beam on and a 'low' switches it off to create the characters. A separate 'intensity' output is also provided which could be used to change the brightness on a character-by-character basis.

It also provides separate horizontal sync and vertical sync signals - something that is still retained in PC standards today. These synchronisation signals are required by CRT monitors and televisions to tell them when to start a new scan line and a new frame. The horizontal frequency used by the MDA standard is 18.432 kHz.

The MDA card could live happily alongside a CGA card (see below), but it's important to note that the MDA card can damage monitors that are not designed to take MDA video/sync signals, such as the IBM 5153 (CGA) monitor and the IBM 5154 (EGA) monitor. When fitting different display cards into these older PCs you usually had to set some 'jumpers' or 'DIP' switches on the motherboard to tell it what was installed. Typically in an XT or clone, there were 8 DIP switches in 4 pairs. The third pair told the system what display card was installed: either monochrome, CGA, or no video at all. This setting used to confuse people, as for the more advanced display cards (EGA and VGA) you had to set the mainboard switches to 'no video'. In fact, it was more logical than it seemed at first: the XT BIOS handled mono and CGA screens itself, but EGA and VGA cards had their own BIOS, so you needed to tell the onboard BIOS you had 'no video' in order to allow the EGA or VGA expansion card to have control.

On the MDA card itself were some jumpers. J1 is to be kept open, J2 is for an optional light pen, though IBM never released one of these themselves. If the card's bracket appears to be too large for your system it is because it was designed for the original IBM 5150, which had a wider spacing between expansion slots.



The HGC (Hercules Graphics Card) became a standard of sorts as it fully supported MDA but added a high-resolution monochrome graphics mode of 720 x 348 pixels. This was widely adopted in the early 80s, and actually outlived CGA in terms of usefulness over the years due to the high-quality text mode from its MDA backward compatibility whilst also offering a high-resolution graphics mode (albeit in 2 colours). It was often used in a dual-monitor setup with a CGA card outputting to one display and the Hercules card outputting to a second display. Dithering techniques were often employed by games and other software to produce what appeared to be intermediate shades from the 2 colours. To support the graphics mode, a Hercules card came with 64 KB of video RAM - 16 times that of an MDA card.
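The 64 KB buffer (at B0000) is arranged as four interleaved 8 KB banks of scan lines, with 90 bytes (720 one-bit pixels) per line. A sketch of the usual address calculation for plotting a pixel:

```python
# Hercules 720x348 graphics mode: scan line y lives in bank (y mod 4),
# banks are 2000h bytes apart, and each line is 90 bytes (720 / 8).
def hgc_pixel(x, y):
    """Byte offset into the frame buffer at B0000 and the bit mask
    for pixel (x, y); the leftmost pixel of a byte is bit 7."""
    offset = 0x2000 * (y % 4) + 90 * (y // 4) + x // 8
    mask = 0x80 >> (x % 8)
    return offset, mask
```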

"Before VGA, monochrome graphics meant but one thing - the Hercules Graphics adapter. Fully compatible with IBM's Monochrome Display Adapter (MDA) in displaying text, the Hercules board added graphics to the capabilities of ordinary digital monochrome displays. Along the way, it brought the highest graphics resolution available to standard PC monitors before VGA: 720 by 348 pixels.

For nearly 5 years, the Hercules card was the standard for monochrome graphics. Dozens of programs were rewritten to take advantage of its capabilities, including the one that made the Hercules card a success, Lotus 1-2-3. But in 1987, the introduction of the VGA standard challenged Hercules' dominance. In the long run, VGA will likely triumph as the graphics standard for both color and monochrome."
     PC Magazine, August 1989




Arguably the lowest-grade of early PC colour graphics was IBM's Color Graphics Adapter (CGA). Introduced in 1981, it was the first to be able to display colour, and the first to display graphics (addressable dots) rather than just character text. IBM intended that CGA be compatible with a home television set. The 40×25 text and 320×200 graphics modes are usable with a television, and the 80×25 text and 640×200 graphics modes are intended for a monitor.

It was designed to be connected to the IBM 5153 (IBM Colour Display) or IBM 5154 (IBM Enhanced Colour Display) monitor via a 9-pin D connector. It's important to note that the CGA card can damage monitors that are not designed to take CGA video/sync signals such as the IBM 5151 (MDA) monitor.

IBM Color Graphics Monitor Adapter

With its 16 KB of onboard video RAM, it could show 4 colours on-screen simultaneously at a graphical resolution of 320 x 200, or 2 colours at 640 x 200. You could choose from two palettes, which looked like this:

As well as the 9-pin connector for the IBM 5153 colour display, the card provides an RCA connector for connecting a television or composite video monitor.

The IBM CGA card stores its screen content at memory addresses B8000-BBFFF and uses I/O ports at 3D0-3DF. Just like the IBM MDA card, it also has an 8 KB character ROM chip, of which 4 KB stores the MDA font, and two variants of the CGA font use up the other 4 KB in 2 KB chunks. At the heart of the CGA card, like its older MDA brother, is a Motorola MC6845 CRT controller chip.
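In the 320 x 200 4-colour mode, each byte holds four 2-bit pixels, and scan lines are interleaved into two 8 KB banks (even lines first, odd lines at offset 2000h). A sketch of the address calculation:

```python
# CGA 320x200 4-colour mode: even scan lines in the first 8 KB bank,
# odd scan lines at offset 2000h; 80 bytes per line, 4 pixels per byte.
def cga_pixel(x, y):
    """Byte offset into the frame buffer at B8000 and the right-shift
    needed to extract the 2-bit pixel (x, y); leftmost pixel is in
    bits 7-6 of its byte."""
    offset = 0x2000 * (y % 2) + 80 * (y // 2) + x // 4
    shift = 6 - 2 * (x % 4)
    return offset, shift
```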

The character ROM contains two different character fonts, both of which are square 8 x 8 pixel blocks.

Most graphics cards that supersede the CGA graphics standard offer either full CGA register compatibility, or emulate some or all of the original CGA card's capabilities. To test how compatible one of these cards is, you can use CGA_COMP - a compatibility checker that runs in DOS.


Plantronics ColorPlus

The Plantronics ColorPlus card of 1982 provided a superset of the CGA standard, using the same monitor and the same resolutions. The Plantronics card has twice the memory of an IBM CGA card which can be used to double the colour depth, and provides graphics modes of 320 x 200 in 16 colours and 640 x 200 in 4 colours. It also offers a high-resolution text font. These are more or less the same as the PCjr/Tandy modes, but they require you to program the registers directly (no BIOS support). Some applications like Lotus 1-2-3 support them, but you can't use games configured to run in PCjr/Tandy mode.

Some third-party graphics cards could also display these "extended CGA" modes, usually describing them simply as "Plantronics mode".



When IBM launched the PCjr (model number 4860), it came with built-in colour graphics and 3-channel audio. Compared to the IBM PC or XT this was a marvel, since those required expensive expansion cards to achieve the same output.

The PCjr video output is compatible with all seven BIOS-supported CGA modes, plus additional 160 x 200 (16-colour), 320 x 200 (16-colour), and 640 x 200 (4-colour) modes. The latter two, plus the 80 x 25 text mode, require the optional 64 KB internal memory upgrade card, which doubles the PCjr's system RAM to 128 KB. Like CGA, PCjr video output includes a composite signal that supports using a colour or black & white television set.

The main problem with emulating Tandy graphics is that the video buffer is not at a fixed location - it moves higher in memory as more RAM is installed in the computer.

When Tandy entered the PC clone market with their Tandy 1000 they liked the idea of having integrated video and audio circuitry, and so effectively copied the design of the PCjr.



IBM created the Enhanced Graphics Adapter (EGA) standard and released their card in October 1984 to coincide with the release of the new IBM PC/AT personal computer. Designed to connect directly to IBM's new Enhanced Colour Display, or ECD, monitor (Model 5154), it lasted as the best graphics standard right up to the release of MCGA and VGA standards in 1987 when IBM launched the PS/2 range of personal computers.

The IBM Enhanced Graphics Adapter

The card came with 64 KB of video memory and a 16 KB ROM, which permitted resolutions up to 350 lines in text mode. The EGA standard can handle up to 256 KB of video memory, allowing 16 on-screen colours (from a palette of 64) at the full 640 x 350 resolution. While the base card only had 64 KB of RAM, you could purchase the Graphics Memory Expansion Card into which you could plug a further 192 KB of DRAM via twenty-four 16K x 4-bit chips.

The EGA palette allows all 16 CGA colors to be used simultaneously, and it allows substitution of each of these colors with any one from a total of 64 colors, at a resolution of 640 x 350. Text mode was an 8x14 character box which could also display text in colour. The card ran at two frequencies: 22 kHz for the new 640x350 mode, and 15.75 kHz for compatibility with the older 640x200 and 320x200 modes. The EGA card includes a 16 KB ROM chip which extends the IBM PC system BIOS for additional graphics functions. It also includes a CRT controller chip that has a backward compatibility mode so the EGA card is able to generate video signals from earlier graphics cards.
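The 64-colour palette comes from 6-bit palette register values: two bits per gun, conventionally laid out as rgbRGB (bits 2-0 are the high-order R, G, B bits and bits 5-3 the low-order ones). A sketch of the decoding, assuming the usual 1/3-intensity weighting:

```python
# Decode a 6-bit EGA palette register value (rgbRGB layout) into an
# 8-bit (R, G, B) triple: each gun has 4 levels - 0, 85, 170, 255.
def ega_to_rgb(value6):
    def gun(hi_bit, lo_bit):
        level = 2 * ((value6 >> hi_bit) & 1) + ((value6 >> lo_bit) & 1)
        return level * 255 // 3
    return gun(2, 5), gun(1, 4), gun(0, 3)
```

Two bits per gun gives 4 x 4 x 4 = 64 possible colours, of which the palette registers select 16 at a time.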

Both the EGA card and ECD monitor were made available at launch to be sold to existing owners of the IBM PC and IBM PC/XT, as well as the new IBM PC/AT.

The EGA palette of 64 colours



The Multi-Color Graphics Array or MCGA is a video subsystem built into the motherboard of the IBM PS/2 Model 30, introduced on April 2, 1987, and Model 25, introduced later on August 11; no standalone MCGA cards were ever made.

The MCGA supports all CGA display modes plus 640 × 480 monochrome at a refresh rate of 60 Hz, and 320 × 200 with 256 colors (out of a palette of 262,144) at 70 Hz. The MDA monochrome text mode is not supported.

MCGA is similar to VGA in that it has a 256-color mode (the 256-color mode in VGA was sometimes referred to as MCGA) and uses 15-pin analog connectors. The PS/2 chipset's limited abilities prevent EGA compatibility and high-resolution multi-color VGA display modes.

The tenure of MCGA was brief; the PS/2 Model 25 and Model 30 were discontinued by 1992, and since the VGA standard introduced at the same time was considered superior, the display adapter was never cloned apart from its appearance in the Epson Equity Ie.


Video Graphics Array (VGA) is the display hardware first introduced with the IBM PS/2 line of computers in 1987. Through widespread adoption, the term has also come to mean either an analog computer display standard, the 15-pin D-subminiature VGA connector, or the 640 × 480 resolution characteristic of the VGA hardware. Unlike MCGA, it supports 16 simultaneous colours on-screen at 640 x 480, retains EGA compatibility, and offers its 256-colour mode at 320 x 200.

The 256-colour VGA palette
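The 256-colour mode (BIOS mode 13h, 320 x 200) is also the simplest to program: its frame buffer at A0000 is a flat array of one byte per pixel, with no bank or scan-line interleaving at all.

```python
# VGA mode 13h (320x200, 256 colours): one byte per pixel, rows
# stored consecutively - 64,000 bytes in total at segment A000.
def mode13h_offset(x, y):
    """Offset of pixel (x, y) within the A0000 frame buffer."""
    return y * 320 + x
```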

VGA was the last IBM graphics standard to which the majority of PC clone manufacturers conformed, making it the lowest common denominator that virtually all post-1990 PC graphics hardware can be expected to implement. It was officially followed by IBM's Extended Graphics Array (XGA) standard, but was effectively superseded by numerous slightly different extensions to VGA made by clone manufacturers, collectively known as Super VGA.

Today, the VGA analog interface is used for high definition video, including resolutions of 1080p and higher. While the transmission bandwidth of VGA is high enough to support even higher resolution playback, there can be picture quality degradation depending on cable quality and length. How discernible this degradation is depends on the individual's eyesight and the display, though it is more noticeable when switching to and from digital inputs like HDMI or DVI.

IBM 8514/A

In late 1988/early 1989, graphics board manufacturers were still trying to standardise the specifications for 'extended VGA' (what would be later called "Super VGA" with an 800 x 600 resolution). While this was going on, some of them made attempts to bypass it completely with their eyes set on IBM's 8514/A resolution of 1024 x 768. This was designed to be used for graphics-intensive applications such as CAD (Computer Aided Design). This was short-lived, being soon overtaken by Super VGA and the VESA standards that supported and then exceeded the 8514/A resolution.


Super VGA and VESA VBE

Originally, it was an extension to the VGA standard first released by IBM in 1987. Unlike VGA—a purely IBM-defined standard—Super VGA was never formally defined. The closest to an "official" definition was in the VBE (Video BIOS Extensions) defined by the Video Electronics Standards Association (VESA), an open consortium set up to promote interoperability and define standards. In this document, there was simply a footnote stating that "The term 'Super VGA' is used in this document for a graphics display controller implementing any superset of the standard IBM VGA display adapter." When used as a resolution specification, in contrast to VGA or XGA for example, the term SVGA normally refers to a resolution of 800x600 pixels.

Though Super VGA cards appeared in the same year as VGA (1987), it wasn't until 1989 that a standard for programming Super VGA modes was defined by VESA. In that first version, it defined support for (but did not limit to) a resolution of 800x600 4-bit pixels. Each pixel could therefore be any of 16 different colors. It was quickly extended to 1024x768 8-bit pixels, and well beyond that in the following years.

Although the number of colors is defined in the VBE specification, this is irrelevant when referring to Super VGA monitors as (in contrast to the old CGA and EGA standards) the interface between the video card and the VGA or Super VGA monitor uses simple analog voltages to indicate the desired color. In consequence, so far as the monitor is concerned, there is no theoretical limit to the number of different colors that can be displayed. This applies to any VGA or Super VGA monitor.

While the output of a VGA or Super VGA video card is analog, the internal calculations the card performs in order to arrive at these output voltages are entirely digital. To increase the number of colors a Super VGA display system can reproduce, no change at all is needed for the monitor, but the video card needs to handle much larger numbers and may well need to be redesigned from scratch. Even so, the leading graphics chip vendors were producing parts for high-color video cards within just a few months of Super VGA's introduction.
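As a sketch of what 'handling much larger numbers' means in practice: a typical 16-bit 'high colour' card stores each pixel as a packed RGB value rather than a palette index, commonly in the RGB565 layout (5 bits of red, 6 of green, 5 of blue - the extra green bit reflecting the eye's greater sensitivity to green).

```python
# Pack 8-bit R, G, B components into one 16-bit RGB565 pixel, the
# common "High Color" layout: RRRRRGGG GGGBBBBB.
def pack_rgb565(r, g, b):
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)
```

With this layout there is no palette lookup at all; the RAMDAC's DACs are fed the colour bits directly, as described above.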

On paper, the original Super VGA was to be succeeded by Super XGA, but in practice the industry soon abandoned the attempt to provide a unique name for each higher display standard, and almost all display systems made between the late 1990s and the early 2000s are classed as Super VGA.

Monitor manufacturers sometimes advertise their products as XGA or Super XGA. In practice this means little, since all Super VGA monitors manufactured since the late 1990s have been capable of at least XGA and usually considerably higher performance.

SVGA uses a VGA connector, the same DE-15 (a.k.a. HD-15) as the original standard.


VESA 2.0

In late 1994, VESA introduced version 2.0 of their VBE (Video BIOS Extensions) standard, which supersedes VESA VBE 1.2. It specified new features that compliant video cards had to provide, including:-

  • Linear framebuffer access - direct access to the framebuffer in protected mode as a single chunk of memory.
  • Protected mode banking - access to the framebuffer in protected mode without having to go down to real mode.
  • SVGA page flipping - allows higher-performance animation.
  • Display window control - allows software to setup virtual display resolutions larger than the displayed resolution (the viewport), and be able to smoothly scroll or pan around the larger image.
  • "High Color" and "TrueColor" modes - industry standard 16-bit and 24-bit graphics modes for resolutions ranging from 320x200 up to 1600x1200.

Once games began to support resolutions above 640x480, VESA VBE compliance became one of the most critical attributes of a graphics card. Even on cards that supported a given version of VESA VBE, games would push the capabilities of the card, which sometimes exposed weaknesses or gaps in the card's VBE ROM code. Where a card did not directly support all of the VBE functionality, DOS TSRs were written to provide it. These include UniVESA, later renamed UniVBE, which would override the card's own VESA VBE ROM code with a full VESA VBE implementation, thus providing better compatibility with DOS games. UniVBE only worked on graphics cards with at least 512K of memory.

Both nVidia and 3dfx (Voodoo3, 4 and 5) cards famously have excellent VESA implementations.


Frequently Asked Questions

Q) Can I use a 16-bit graphics card in an 8-bit ISA slot?

A) The short answer is, it depends! Usually 16-bit cards require a 16-bit slot, but some cards were released at the transition between the XT/8086/8088 era and the 80286 era, when the 16-bit data bus became available. As such, graphics card manufacturers sometimes allowed their cards to work in either slot. Below are known 16-bit cards that will work in 8-bit slots.

WARNING: Some 16-bit cards are advertised as being 8-bit compatible, but require a 286 or better CPU, i.e. they have been designed for an 8-bit slot but only in an AT-class computer.

WARNING: Beware of clone cards. For example, just because a VGA card's main chip is labelled "Trident TVGA9000i" does not mean that the card is a Trident TVGA9000i. The card could be an ACME 1234 - a card not made by Trident that uses the Trident TVGA9000i chip but has an on-board BIOS that requires a 286 or better CPU.


ATI

  • VGA Wonder (autosense)
  • Mach-8 (JU1 to position 2/3)
  • Ultra (JU1 to position 2/3)
Cirrus Logic

  • CL-GD5320 chipset (JP6 to position 1/2)
  • CL-GD5402 chipset (autosense)
  • CL-GD5401 chipset (autosense)
  • GD5426

  • Gotham Pass 4 TV/VGA Output.
Oak Technology

  • OTI037C
  • OTI067 (autosense)
  • OTI077

Trident

  • TVGA 8800CS (one or more jumpers have to be changed; information varies)
  • TVGA 8900C - auto detects 8 bit. Successfully tested on IBM PS/2 Model 30 8086.
  • TVGA 8900D - Manual [here] indicates compatibility with "486, 386, 286 and PC compatibles" and that jumpers need to be changed for 8-bit operation.
  • TVGA 8900CL - Jumper settings that work on a Zenith 4MHz 8088 are: J1=on, J2=off, J6=on, J7=on, J8=on, J10-->J9=off,on,on,on (left to right) Note that the card did not work on a generic turbo 8088 board.
  • TVGA 9000B - From archeocomp: For 8-bit operation, all three jumpers on J9 need to be on. archeocomp verified 8-bit operation in an XT.
  • TVGA 9000C - From Caluser2000: "Connected the jumpers to J9 then had to remove the one off J10(blue) towards the rear ... and it worked in the 8 bit slot on the 286."
  • TVGA 9000C MKII - From modem7: Second version of 9000C - Jumper settings for 8-bit operation in manual [here] - Works in an 8-bit slot in my IBM AT. Does not work in my IBM XT or in my XT clone.
  • TVGA 9000I - Manual [here] indicates compatibility with "486, 386, 286 and PC compatibles" and that jumpers need to be changed for 8-bit operation.

Tseng Labs

  • ET4000 chipset XVGA-based card from Focus Information Systems Inc. (autosense)
  • ET4000 chipset from Diamond Speedstar. (switches 1/3 off)
Western Digital

  • Paradise VGA Professional Card (autosense, WD PVGA1B chipset)
  • Paradise 4088
  • Paradise 4089
  • Paradise88 VGA
Unknown Manufacturer

  • AVGA1 chipset, FCC NO:EUNLEOVGA-10710 (autosense)
Video Seven

  • VGA-16 (autosense, however switches 6/7 may need to be turned off)


Laptop Display Technologies

In addition to the graphics card in a DOS PC, if you were a laptop owner you had the added confusion of choosing a type of display. Head over to my page dedicated to laptop displays!