DOS Days

2D and 3D Graphics Card Technology

This topic is one that I've been meaning to write for some time. There is a bewildering collection of terminology associated with 90s graphics cards which frankly goes over my head: Z-buffer sorting, bilinear filtering, alpha blending, texture mapping, etc, etc.

In this topic I hope to demystify these terms for the non-technical reader so that you can more easily compare graphics cards of a given year - which manufacturers were leading and when, and what compromises were being made?

Early Graphics Cards

All PC video cards before IBM launched the Video Graphics Array (VGA) standard in 1987 had a number of common components. They all had a video controller chip, where the majority of display handling would take place. They also had a small amount of onboard RAM, referred to as the framebuffer, in which the display data was stored, and they had a crystal oscillator to keep the display refreshing at a constant frequency. All these pre-VGA graphics standards output their display data as a digital signal, so there was no need for a video card to have a DAC (Digital-to-Analogue Converter) to convert that signal into an analogue waveform for the monitor.

RAMDACs and the Video BIOS ROM

When VGA came along with its analogue monitor signal, graphics cards also needed a digital-to-analogue converter (DAC).

 
A RAMDAC and BIOS ROM chip

For graphics cards, the DAC is usually referred to as a RAMDAC (Random Access Memory Digital-to-Analogue Converter), since it contains some internal static memory (SRAM), plus some dynamic memory (DRAM), as well as several actual digital-to-analogue converters (DACs). The memory is used to store the colour palette, which is used to convert the digital colour 'codes' into the analogue signals that are then sent to an analogue monitor. Each colour code is made up of separate Red, Green and Blue (RGB) values, and these are sent to separate embedded DAC circuits inside the RAMDAC, each of which produces an analogue voltage on its colour signal line to represent the given colour.
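
To make the palette lookup concrete: in a 256-colour mode, each pixel value in the framebuffer is just an index, and the RAMDAC looks up the stored R, G and B values for that index before its DACs generate the corresponding analogue voltages. The sketch below shows how DOS software typically loaded one palette entry into the VGA DAC. It is only an illustration and assumes a Borland-style 16-bit DOS compiler (outportb() from dos.h); note that the VGA DAC takes 6-bit values (0-63) per channel.

    /* Minimal sketch: loading one palette entry into the VGA DAC.
       Assumes a Borland-style 16-bit DOS compiler (outportb from dos.h). */
    #include <dos.h>

    #define DAC_WRITE_INDEX 0x3C8   /* select which palette entry to write */
    #define DAC_DATA        0x3C9   /* then write R, G, B in sequence */

    void set_palette_entry(unsigned char index,
                           unsigned char r, unsigned char g, unsigned char b)
    {
        outportb(DAC_WRITE_INDEX, index);
        outportb(DAC_DATA, r);   /* red,   0-63 */
        outportb(DAC_DATA, g);   /* green, 0-63 */
        outportb(DAC_DATA, b);   /* blue,  0-63 */
    }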

Since the mid-90s, the SRAM portion of a RAMDAC has usually been bypassed in True Color modes, with display data being sent directly to the DACs; the SRAM-stored palette is retained only for backward compatibility with older software. From the early 90s, the RAMDAC itself was increasingly integrated into the video controller chip.

The other thing a VGA card has is a BIOS in a ROM chip, much like the one on your motherboard. For the original IBM video standards, MDA and CGA, the system BIOS had the code needed to understand how to drive a display. When the EGA standard was introduced in 1984, older PCs didn't have the ability to drive EGA modes via their BIOS interrupt calls, so the concept of a video BIOS was born. At system startup, this video BIOS's "extension" code would be loaded into memory so the PC could also output EGA video. This carried over into VGA cards throughout the DOS and 90s Windows era.
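
Software reached that code the same way it reached the system BIOS: through interrupt calls, in this case INT 10h. As a small illustration (assuming a Borland-style 16-bit DOS compiler with int86() from dos.h), this is how a DOS program would ask the video BIOS to switch display modes:

    /* Minimal sketch: asking the video BIOS (INT 10h) to set a display mode.
       Assumes a Borland-style 16-bit DOS compiler (int86() from dos.h). */
    #include <dos.h>

    void set_video_mode(unsigned char mode)
    {
        union REGS regs;
        regs.h.ah = 0x00;    /* BIOS function 00h: set video mode */
        regs.h.al = mode;    /* e.g. 0x13 = 320x200 in 256 colours, 0x03 = 80x25 text */
        int86(0x10, &regs, &regs);
    }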

VESA BIOS Extensions

By 1989 it was apparent that video card technology was moving forward at quite a pace. Manufacturers were releasing cards that provided video modes at ever higher resolutions and in more colours, but there was no common standard between them, and this was becoming a problem for software companies: to make use of extended video modes beyond the VGA hardware standard, a program had to be written specifically for each manufacturer's card.

This is where the VBE (VESA BIOS Extensions) standard came in. First released in 1989, VBE extended IBM's standard VGA BIOS functions to cater for higher resolutions, greater colour depths, and a standard way to access the framebuffer (basically a bitmap of a single 'frame' of information to be shown on the monitor). Graphics card manufacturers would make sure their new cards adhered to the VBE standard in order to remain compatible with the latest software. The VBE standard also defined a common way for software to request a video card's capabilities and set the current display mode accordingly.
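
For example, VBE function 4F00h ("Return VBE Controller Information") is how a program checks whether the extensions are present at all before relying on them. A rough sketch of that check, assuming a Borland-style 16-bit real-mode DOS compiler (int86x(), segread() and FP_SEG/FP_OFF from dos.h):

    /* Minimal sketch: detecting the presence of the VESA BIOS Extensions.
       Assumes a Borland-style 16-bit real-mode DOS compiler. */
    #include <dos.h>
    #include <string.h>

    static unsigned char vbe_info[512];      /* buffer for the VbeInfoBlock */

    int vbe_present(void)
    {
        union REGS regs;
        struct SREGS sregs;
        unsigned char far *buf = vbe_info;

        segread(&sregs);                     /* start from current segment registers */
        regs.x.ax = 0x4F00;                  /* VBE: return controller information */
        sregs.es  = FP_SEG(buf);             /* ES:DI -> our buffer */
        regs.x.di = FP_OFF(buf);
        int86x(0x10, &regs, &regs, &sregs);

        /* AL=4Fh means the function is supported, AH=00h means it succeeded,
           and the returned block should begin with the "VESA" signature. */
        return regs.h.al == 0x4F && regs.h.ah == 0x00 &&
               memcmp(vbe_info, "VESA", 4) == 0;
    }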

VBE versions 1.0 (late 1989), 1.1 (1990) and 1.2 (1991) were primitive, with limitations on how a card should access video card memory, and further limits on screen resolution and colour depths, but were still very useful in getting a standard going that most hardware manufacturers followed. VBE v1.1 introduced a new refresh rate standard for 640 x 480 and 800 x 600 resolutions, which was 72 Hz. This was to eliminate flicker on multi-frequency, high-resolution monitors. VBE v1.2 added a 70 Hz minimum refresh rate for 1024 x 768 resolution.

VBE version 2.0 came along in November 1994, adding support for two new colour modes: High Color (HiColor) and TrueColor. These were new 16-bit and 24-bit colour depths, offering 65,536 colours and 16.7 million colours respectively. VBE 2.0 provided standard resolutions from 320 x 200 up to 1600 x 1200 pixels.
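
In HiColor the 16 bits were most commonly split 5:6:5 between red, green and blue (2^16 = 65,536 combinations), while TrueColor stores a full 8 bits per channel (2^24 = 16.7 million). A small sketch of the packing arithmetic, assuming that common 5:6:5 layout:

    /* Minimal sketch: packing 8-bit-per-channel RGB into the two new formats.
       The 5:6:5 bit layout for HiColor is an assumption (it was the most
       common arrangement); some hardware used 5:5:5 instead. */
    typedef unsigned short u16;
    typedef unsigned long  u32;

    u16 pack_hicolor(unsigned char r, unsigned char g, unsigned char b)
    {
        /* keep the top 5, 6 and 5 bits of each channel: 5+6+5 = 16 bits */
        return (u16)(((unsigned)(r >> 3) << 11) |
                     ((unsigned)(g >> 2) << 5)  |
                      (unsigned)(b >> 3));
    }

    u32 pack_truecolor(unsigned char r, unsigned char g, unsigned char b)
    {
        /* 8 bits per channel: 2^24 = 16.7 million colours */
        return ((u32)r << 16) | ((u32)g << 8) | (u32)b;
    }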

VBE 2.0 also provided these new capabilities:

Linear frame buffer access - Enables direct frame buffer access in protected mode as one large area of memory instead of less efficient smaller chunks (there is a short mode-setting sketch after this list).
Protected mode banking - Allows access to the framebuffer from protected mode.
SVGA page flipping - Allows smooth animations for computer games and other graphics programs.
SVGA virtual screens - Allows software to set up virtual display resolutions larger than the actual displayed resolution, and smoothly scroll or pan around the larger image.
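
Putting those VBE 2.0 pieces together, a program would typically set an extended mode with function 4F02h, requesting the linear framebuffer by setting bit 14 of the mode number. A hedged sketch (0x111 is the VESA-defined number for 640 x 480 HiColor; real code should first query functions 4F00h/4F01h to confirm the card supports it):

    /* Minimal sketch: setting a VBE 2.0 HiColor mode with a linear framebuffer.
       Assumes a Borland-style 16-bit DOS compiler (int86() from dos.h).
       Real code should confirm the mode is available via 4F00h/4F01h first. */
    #include <dos.h>

    #define VBE_MODE_640x480_16BPP 0x111    /* VESA-defined mode number */
    #define VBE_LINEAR_FRAMEBUFFER 0x4000   /* bit 14: request the linear framebuffer */

    int vbe_set_mode(unsigned int mode)
    {
        union REGS regs;
        regs.x.ax = 0x4F02;                      /* VBE: set SuperVGA video mode */
        regs.x.bx = mode | VBE_LINEAR_FRAMEBUFFER;
        int86(0x10, &regs, &regs);
        return regs.h.al == 0x4F && regs.h.ah == 0x00;   /* supported and successful */
    }

Usage would simply be vbe_set_mode(VBE_MODE_640x480_16BPP); the framebuffer's physical address then comes from the mode information block returned by function 4F01h.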


Then the first 'accelerator' functions were added in 1996, including:-

Hardware cursors - a small overlay on top of the frame buffer whose position can be changed without having to redraw the image underneath.
Bit Block Transfers (BitBlt) - the ability to copy and combine rectangular blocks of graphical data in video memory.
Off-screen sprites - constructing and manipulating sprites off-screen in readiness for their use on-screen.
Hardware panning, drawing and other functions - the ability to pan around the screen and draw simple shapes.



Because these operations were performed by sending an instruction to the graphics card, which did the hard work for you, they were much faster. Previously, these operations had been done in software, taking up valuable CPU cycles on the host computer.
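
To make the contrast concrete, here is roughly what a blit looks like when the CPU has to do it in software: copying a rectangle between two 8-bit surfaces one scanline at a time. The surface layout (linear memory with a fixed 'pitch' per scanline) is only an assumption for illustration; the point is that an accelerated card performs this whole loop itself after receiving a single BitBlt command.

    /* Minimal sketch: a CPU-side (software) blit of a w x h rectangle between
       two 8-bit-per-pixel surfaces. The linear surface layout with a 'pitch'
       (bytes per scanline) is an assumption for illustration. */
    #include <string.h>

    void soft_blit(unsigned char *dst, long dst_pitch, int dx, int dy,
                   const unsigned char *src, long src_pitch, int sx, int sy,
                   int w, int h)
    {
        int row;
        for (row = 0; row < h; row++) {
            memcpy(dst + (dy + row) * dst_pitch + dx,
                   src + (sy + row) * src_pitch + sx,
                   (size_t)w);
        }
    }
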
VBE 3.0 arrived in 1998, adding the following:-

Triple buffering - Allows high-speed applications to perform multi-buffering with less screen flickering and without having to wait for the graphics controller (sketched in code after this list).
Refresh rate control - Applications and OSes can change the refresh rate in a standard way on all VBE 3.0 graphics controllers. Important for stereo applications, since when stereo is enabled, the user's effective refresh rate is cut in half.
Stereoscopic page flipping - When viewing an application using stereo glasses, software needs to page flip twice as often as normal, because it needs to generate separate images for each eye. This new feature allows stereo compatible software to display properly.
Hardware stereoscopic sync - Allows stereo software to determine if there is a connector for stereo glasses on the user's graphics card.
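
The triple buffering idea can be pictured as a simple rotation of three framebuffer pages: one is being displayed, one is queued to be flipped to, and the third is free for the program to draw into, so the renderer never has to stall waiting for the flip. The sketch below is purely illustrative: vbe_set_display_start() is a hypothetical helper standing in for the VBE display-start (page flip) call, and draw_frame() stands in for the application's own rendering.

    /* Minimal sketch of triple buffering: rotate through three pages so the
       renderer never waits on a display flip to complete.
       vbe_set_display_start() is a hypothetical wrapper around the VBE
       display-start (page flip) function; draw_frame() is the application. */

    extern void vbe_set_display_start(int page);   /* hypothetical helper */
    extern void draw_frame(int page);              /* application rendering */

    void render_loop(void)
    {
        int drawing = 0;                     /* page currently being drawn into */

        for (;;) {
            draw_frame(drawing);             /* render into the free page       */
            vbe_set_display_start(drawing);  /* queue it to be displayed        */
            drawing = (drawing + 1) % 3;     /* rotate: one page shown, one     */
                                             /* queued, one free to draw into   */
        }
    }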

 

In Part 2 we will move into the era of "Graphics Accelerators"...