CRT Monitors
Introduction
This page provides details on the various monitor technologies used with DOS PCs. It should be read in conjunction with the Graphics Cards page for completeness. This page uses the term 'display card' as a more generalised term for 'graphics card', as it includes cards that can only display text as well as later cards that produce both text and graphics. Also, if you're interested in laptop/portable PC screen technology, head on over to my laptop displays page.
Before the modern day of LED televisions and monitors, we had CRT (Cathode Ray Tube) displays. These were necessarily bulky because the electron gun that fired a beam at the back of the display area needed to be positioned well back from the display itself. CRTs, for all their girth, do still have some advantages over modern flat screens: they don't suffer from dead pixels, they have a better viewing angle, and of course they have retro authenticity. Running old software on a modern screen looks decidedly odd, showing up the low resolutions that existed back in the day; this is far less noticeable when viewed on a monitor the software was designed for. But I digress... let's take a journey through the various display technologies.
MDA and Hercules Monitors
From the first MDA (Monochrome Display Adapter) cards up to the EGA standard, all PC video output was digital, which required a monitor that accepted a digital signal input. With MDA, each character is displayed in a "cell" of 9x14 pixels, of which 7x11 depicts the character itself and the rest provides spacing between character columns and rows. The theoretical total screen resolution is 720 x 350; however, the MDA card cannot address individual pixels, which makes it a "display card" rather than a "graphics card".
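As a quick sanity check on those numbers, the full-screen resolution falls straight out of the standard 80x25 text grid and the 9x14 cell. A small illustrative sketch (nothing the MDA hardware itself computes, of course):

```python
# MDA text mode: an 80-column x 25-row grid of 9x14-pixel character cells.
COLUMNS, ROWS = 80, 25
CELL_W, CELL_H = 9, 14      # 7x11 glyph plus inter-character/row spacing

width = COLUMNS * CELL_W
height = ROWS * CELL_H
print(f"{width} x {height}")  # 720 x 350
```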
Hercules Computer Technology introduced their Hercules Graphics Card (HGC) in 1982 as an upgrade to IBM's MDA standard. It supported a bitmapped graphics mode in addition to the high quality text mode offered by MDA. The graphics mode was 720 x 348, and used the same horizontal and vertical scan frequencies as MDA, which meant existing MDA monitors supported the Hercules card out-of-the-box. Compatibility of HGC to MDA was so good that when running in text mode (its default on startup), a PC couldn't tell the difference and assumed it was an MDA card. Hercules HGC cards also output digital TTL, just like MDA.
You can be almost 100% sure that if your display card has a 9-pin D-SUB female connector, it's outputting digital signals. These signals use what's called TTL (Transistor-Transistor Logic), which outputs a HIGH (+5V) or a LOW (0V) depending on whether to fire the CRT beam or not. Also in the MDA video output is an intensity signal (for brightness of characters) and separate horizontal and vertical sync signals.
An MDA card's female 9-pin DSUB output connector and its pinouts (pin 1 is in the top-left of a male 9-pin connector on the cable)
On the monitor side will be either a DIN socket or a 9-pin female DSUB (just like on the display card).
A digital mono video cable that connects an MDA or Hercules graphics card to a mono monitor
Monochrome monitors displayed either green, amber, or white characters on a black background. A monitor that displayed green did so because it used green "P1" phosphor to light up each pixel on the display. Old monitors tended to have a very low refresh rate (the speed at which the entire screen's contents were redrawn with new data coming from the display card). Green phosphor had the longest "afterglow", so it remained lit on the screen for longer between refreshes, and green is also the brightest type of phosphor. This made it the cheapest type of monitor to build.
The various monochrome PC display colours
Amber monitors came a little later, and used "P3" phosphor. It was considered easier on the eyes for business use but required a faster refresh rate and so was more expensive to manufacture.
Black and white monitors displayed white or grey characters on a black background, and used "P4" phosphor. These were sometimes referred to as "paper white" displays.
Strangely, when purchasing a PC in the 1980s and early 1990s that had a monochrome display, you were rarely informed of the "colour" you would receive. Instead, it was simply referred to as a mono (or monochrome) monitor.
CGA and EGA Monitors
CGA monitors arrived on the scene in 1981, coinciding with IBM's launch of the Color Graphics Adapter (CGA) card. This permitted up to 4 colours to be displayed on-screen simultaneously (one of which is the background colour) at the low resolution of 320 x 200 pixels. It also provided the option of monochrome at 640 x 200, mimicking the MDA format but with a slightly smaller character cell size.
The EGA standard, introduced in 1984, expanded on this with up to 16 colours at a resolution of 640 x 350. Once it launched, even an inexpensive PC clone with EGA graphics could produce better graphics than rivals of the time such as the Commodore 64 and Apple II.
Both CGA and EGA send their signals as digital TTL, just like MDA and Hercules. The previously unused pins in the same 9-pin DSUB that was used by MDA/Hercules were now employed with CGA and EGA to transmit Red, Green and Blue colour and intensity information to the supporting monitor:
A CGA or EGA card's female 9-pin DSUB output connector and its pinouts
Note: Red 1, Green 1 and Blue 1 are the "Primary" colour signals, whilst Red 0, Grn 0 and Blue 0 are the "Secondary" colour signals. The primary signals provide the actual colour whilst the secondary signals provide the "intensity" of the colour.
If you connect an EGA card to a CGA monitor it should work, provided the EGA card is outputting at the same scan rate as CGA (15.75 kHz) and the monitor isn't tying pin 2 to ground. On a CGA monitor pin 2 is a ground, but on an EGA card it carries the Secondary Red signal, so grounding it could short out the EGA card and potentially damage it. All EGA modes that use 200 lines operate at this lower scan frequency of 15.75 kHz. Most EGA cards have DIP switches on the side to set the monitor type.
VGA and Beyond
With the advent of the VGA graphics standard, the output of graphics cards moved to analogue in order to support a seemingly infinite number of colours. If your graphics card has a 15-pin D-SUB female connector, it's outputting analogue RGB (Red, Green, Blue) signals. More specifically, it carries RGBHV (Red, Green, Blue, Horizontal Sync, Vertical Sync) signals. These connect to an "analogue" monitor, which accepts the analogue RGB signals and uses them to drive its electron guns directly.
It's worth mentioning that mono VGA is different - it only needs 8 or 9 wires but the signalling is completely different to MDA (Mono TTL is 5V digital whereas VGA is 1V analogue).
A VGA card's female 15-pin DSUB output connector and its pinouts
Monitors and Connectors
Monitors typically accept either digital or analogue signals - not both!
An exception to this was some of the early multisync monitors including the NEC Multisync 3D. This could switch between analogue and digital and so was able to support MDA, CGA, EGA (all digital) as well as VGA (analogue) signals coming in. Later multisync monitors supported only analogue [VGA] signals - the purpose of these was to display different screen resolutions at different refresh rates (frequencies).
Some digital TTL monitors have a DIN socket, so a cable with a 9-pin D-SUB [male] on the graphics card end goes to a DIN plug on the other.
The table below provides a list of the specifications of each display type and what the expected monitor's capabilities need to be:
Display | Signals | Connector(s) | Monitor Horizontal Frequency | Monitor Vertical Frequency | Resolutions/Colours
---|---|---|---|---|---
MDA | Digital TTL | DE-9 (9-pin DSUB) | 18.432 kHz | 50 Hz | 720 x 350
HGC | Digital TTL | DE-9 (9-pin DSUB) | 18.432 kHz | 50 Hz | 720 x 348
CGA | Digital TTL (4-bit RGBI) | DE-9 (9-pin DSUB) or RCA | 15.75 kHz | 60 Hz | 320 x 200 in 4 colours, 640 x 200 in 2 colours, 160 x 100 in 16 colours (see note 1)
EGA | Digital TTL | DE-9 (9-pin DSUB) | 15.75 kHz (200-line modes) or 21.8 kHz (350-line modes) | 60 Hz | 640 x 350 in 16 colours
MCGA | Analogue | DE-15 (15-pin DSUB) | 31.5 kHz | 50-85 Hz | 320 x 200 in 256 colours, 640 x 480 in 2 colours
VGA | Analogue | DE-15 (15-pin DSUB) | 31.5 kHz | 50-85 Hz | 320 x 200 in 256 colours, 640 x 480 in 16 colours
SVGA | Analogue | DE-15 (15-pin DSUB) | | | 800 x 600 in 256 colours
TTL stands for "Transistor-Transistor Logic" - basically a digital signalling system which sends and receives 0V or +5V signals to indicate a logic level of 0 or 1.
RGBI stands for "Red, Green, Blue and Intensity" - these define the colour palette available, based on the 3-bit palette of RGB alone, but with an added intensity bit (dark or bright) which gives 16 colours in total. If a standard RGB monitor is used with a CGA card it will only display a maximum of 8 colours, as the Intensity bit is not supported.
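One common way of turning the four RGBI bits into actual colour levels is to treat each colour bit as two-thirds brightness and the intensity bit as the remaining third. The sketch below uses those conventional values as an assumption, including the well-known adjustment most CGA monitors made to colour 6 (turning dark yellow into brown):

```python
def rgbi_to_rgb(index):
    """Decode a 4-bit IRGB colour index (bit 3 = intensity) to 8-bit RGB."""
    i = (index >> 3) & 1
    r = (index >> 2) & 1
    g = (index >> 1) & 1
    b = index & 1
    # Colour bit contributes 0xAA, intensity bit adds 0x55 to every channel.
    rgb = tuple(0xAA * c + 0x55 * i for c in (r, g, b))
    # Colour 6 (dark yellow) was shown as brown on most CGA monitors,
    # which halved the green gun for that one palette entry.
    if index == 6:
        rgb = (0xAA, 0x55, 0x00)
    return rgb

palette = [rgbi_to_rgb(n) for n in range(16)]
```

Indices 0-7 are the "dark" colours and 8-15 the "bright" ones; dropping the intensity bit (as a plain RGB monitor does) collapses the palette to 8 colours.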
Notes:
1) This extended CGA graphics mode is not very common. It was used primarily in games where the number of colours was far more advantageous than the screen resolution. PakuPaku, a Pac-Man clone, was one such game.
Multisync Monitors
Before multisync (or multiscan, same thing) CRT monitors existed, a monitor displayed just one resolution at one refresh rate. It simply wasn't changeable. An EGA monitor could only display 640 x 350 at 60 Hz. If you tried to display 320 x 200, it would be shown in the middle of the screen surrounded by black. Anything at a higher resolution (or a different refresh rate) than the monitor could handle would simply fail to display, and could even damage the monitor.
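The figure a fixed-frequency monitor really cares about is the horizontal scan rate, which is just the total number of scan lines (visible plus vertical blanking) multiplied by the refresh rate. A rough sketch, using nominal line totals as assumptions:

```python
def h_freq_khz(total_lines, v_refresh_hz):
    """Horizontal scan frequency = total scan lines x vertical refresh."""
    return total_lines * v_refresh_hz / 1000.0

# Nominal totals (visible lines plus vertical blanking) - assumed figures:
print(h_freq_khz(262, 60))   # CGA 200-line modes: ~15.7 kHz
print(h_freq_khz(525, 60))   # VGA 640 x 480: ~31.5 kHz
```

This is why a VGA card at 31.5 kHz is hopeless on a CGA monitor expecting 15.75 kHz: the monitor's deflection circuitry simply can't sweep that fast.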
A multisync monitor can display different resolutions at different refresh rates (it supports multiple synchronisation rates). Early multisync monitors were not only auto-switching for sync frequencies; they could also switch between analogue and digital. The NEC Multisync 3D (around 1988-1989) was one of the last monitors to support analogue/digital switching. Most of these monitors had a 'Mode' switch somewhere to switch between analogue and digital. Later multisync monitors are all analogue-only, but of course still support different resolutions and refresh rates.
9-pin MDA/CGA to 15-pin Monitor-end
Below is the cable wiring required to adapt a 9-pin D-SUB (MDA, CGA, EGA, and PGA/PGC) to the 15-pin D-SUB of a multisync monitor that supports digital as well as analogue signals. As mentioned above, only early multisync monitors support digital signals as well as analogue, so please check your monitor's documentation before attempting to connect a digital signal display card to it!
9-pin DSUB (graphics card end of cable) | 15-pin DSUB Male (monitor end) |
---|---|
1 | 1 |
2 | 2 |
3 | 3 |
4 | 13 |
5 | 14 |
6 | 5 |
7 | 15 |
8 | 12 |
9 | 10 |
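The wiring above can be captured as a simple lookup table, which is handy when double-checking a home-made cable with a continuity tester:

```python
# 9-pin D-SUB (card end) -> 15-pin D-SUB (monitor end), per the table above
PIN_MAP = {1: 1, 2: 2, 3: 3, 4: 13, 5: 14, 6: 5, 7: 15, 8: 12, 9: 10}

def monitor_pin(card_pin):
    """Return the 15-pin connector pin wired to the given 9-pin card pin."""
    return PIN_MAP[card_pin]
```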
I've tested this layout with both CGA and MDA cards and it works perfectly. The only special thing to note is that when using MDA, the "MODE" switch on the front panel of the monitor must be set to "ON"; for all others it is set to "OFF".
For convenience, below are a number of user manuals for multisync CRT monitors:
Frequently Asked Questions
Q) Will this monitor X work with my graphics card Y ?
A) This depends on several factors. The first, most important, question is: is the monitor analogue or digital? Remember, almost all monitors can only accept either digital signals or analogue signals. Get this wrong and you'll likely permanently damage the graphics card or the monitor. If you know this, the next question to ask is: does the monitor support the horizontal scan frequency that the graphics card is outputting? Check the table above for details, and compare it to your monitor's specifications in the back of its manual.
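That decision process boils down to two checks, signal type first and scan frequency second. A minimal sketch (the frequency ranges here are illustrative, taken from the table above, not from any particular monitor's manual):

```python
def compatible(card_signal, card_h_khz, monitor_signal, monitor_h_range_khz):
    """Rough compatibility check: matching signal type, then scan frequency."""
    if card_signal != monitor_signal:   # digital TTL vs analogue - never mix!
        return False
    lo, hi = monitor_h_range_khz
    return lo <= card_h_khz <= hi

# e.g. a VGA card (analogue, 31.5 kHz) on a fixed-frequency VGA monitor:
print(compatible("analogue", 31.5, "analogue", (31.0, 32.0)))  # True
# ...and the same card on a digital TTL EGA monitor:
print(compatible("analogue", 31.5, "digital", (15.7, 21.8)))   # False
```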