BOXX builds workstations and servers purpose-built for your creative applications. As such, we are heavily involved in rendering, because everything that is created must also be rendered. Since I arrived at BOXX in 2008, I have witnessed tremendous advances in rendering, but nothing compared to the gigantic leaps made from the early days of computer graphics up to now. So…as we bravely venture forward into 2022, spoiled by photorealism, virtual reality, real-time ray tracing, and the like, it seems like a good time to look back at where we’ve been.

 

CGA

Built around the Motorola MC6845 display controller, IBM’s Color Graphics Adapter (CGA) debuted in 1981. Originally called the Color/Graphics Monitor Adapter, it was the company’s first color graphics card, as well as the first color display standard for the IBM PC.

The standard card featured 16 kilobytes of video memory and could be connected (via RCA jack) to an NTSC-compatible monitor or TV, or to a dedicated RGBI CRT monitor. The highest resolution of any mode was 640 × 200, while the highest color depth supported was 4-bit (16 colors).

By today’s standards, it was pretty crude. In its 320 × 200 graphics mode, CGA offered essentially two fixed palettes: one with red, green, and yellow, the other with cyan, magenta, and white, each alongside a selectable background color (typically black). The palettes could be shifted in intensity, but the individual colors could not be changed, so the most colors CGA could display simultaneously was four, and you couldn’t freely choose them.
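To put those specs in perspective, here’s a rough back-of-the-envelope calculation (a sketch in Python, purely for illustration, not anything resembling actual CGA programming) showing why 16 KB of video memory was enough for both of CGA’s graphics modes: four colors need only 2 bits per pixel, and the two-color high-res mode needs just 1.

```python
# Rough framebuffer math for CGA's graphics modes (illustrative only).

def framebuffer_bytes(width, height, bits_per_pixel):
    """Raw framebuffer size in bytes for a given mode."""
    return width * height * bits_per_pixel // 8

# 320 x 200 with 4 simultaneous colors -> 2 bits per pixel
low_res = framebuffer_bytes(320, 200, 2)    # 16,000 bytes

# 640 x 200 with 2 colors -> 1 bit per pixel
high_res = framebuffer_bytes(640, 200, 1)   # 16,000 bytes

print(low_res, high_res)  # both fit within the card's 16 KB (16,384 bytes)
```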

 

EGA

In 1984, the CGA was supplanted by the EGA (Enhanced Graphics Adapter), which gave IBM PCs improved capabilities. This too was IBM’s baby, yet competitors reverse-engineered it and sold clones. EGA supported four graphics modes and four text modes, with a maximum graphics resolution of 640 × 350 at 4-bit color (16 simultaneous colors selected from a 64-color palette). Most programs that supported EGA used its 320 × 200 resolution with 4-bit color and didn’t modify the default color palette. The display type remained popular for MS-DOS programs through the late 1980s.
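To illustrate where that 64-color palette comes from, here’s a small sketch (Python used only to show the arithmetic, not actual EGA register programming): EGA describes each color with 2 bits apiece for red, green, and blue, giving 4 × 4 × 4 = 64 possibilities, and software then loads any 16 of them for on-screen use.

```python
# Illustrative only: EGA's 64-color master palette arises from
# 2 bits (4 intensity levels) per color channel.

levels = [0, 85, 170, 255]  # the four per-channel levels, scaled to a 0-255 range

master_palette = [(r, g, b) for r in levels for g in levels for b in levels]
print(len(master_palette))  # 4 * 4 * 4 = 64 possible colors

# In a 16-color EGA mode, a program picks any 16 of these for display.
# (The slice below is just a placeholder; the real default palette mirrors CGA's colors.)
chosen_16 = master_palette[:16]
```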

 

VGA/SVGA

EGA popularity waned, however, with the arrival of the IBM VGA (Video Graphics Array) in 1987. Still around today, it offered a video resolution of 640 × 480 with 16 colors and up to 256 KB of video memory; Super VGA (SVGA) later extended resolutions to 800 × 600 and beyond. Because the VGA connector carries analog signals, it is limited to lower resolutions and lower image quality than modern digital interfaces. As VGA/SVGA inches toward obsolescence, it’s being replaced by DVI, HDMI, and DisplayPort.
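As a quick sanity check on that 256 KB figure (another rough sketch, assuming a plain linear framebuffer rather than the card’s actual planar memory layout), the flagship 640 × 480, 16-color mode needs only about 150 KB:

```python
# 640 x 480 pixels at 4 bits per pixel (16 colors), ignoring VGA's planar layout
bytes_needed = 640 * 480 * 4 // 8
print(bytes_needed)                # 153600 bytes, roughly 150 KB
print(bytes_needed <= 256 * 1024)  # True: well within 256 KB of video memory
```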

 

Progressive Scan

During the resolution revolution, another shift was taking place: the switch from interlaced to progressive scan. The easiest way to see the difference between, say, VHS quality (480i) and DVD quality (480p) is the number of scan lines displayed at once. Interlaced means that of the 480 scan lines, only 240 are shown in a given time slice: the odd and even lines are drawn as alternating half-frames, switching back and forth quickly enough to be almost invisible to the human eye. You’ve undoubtedly seen something akin to a flickering screen in a movie or TV show from that era whenever a computer screen appears in the shot; that flicker is the alternating display. With progressive scan, every line is displayed in every frame.
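Here’s a toy illustration (plain Python, not real video-processing code) of that idea: an interlaced source delivers alternating 240-line fields of odd and even lines, and a simple “weave” recombines two consecutive fields into one full 480-line progressive frame.

```python
# Toy model of interlaced vs. progressive scan for 480-line content.

LINES = 480

def split_into_fields(frame):
    """Split a full frame (a list of scan lines) into even and odd fields."""
    even_field = frame[0::2]   # lines 0, 2, 4, ... (240 lines)
    odd_field = frame[1::2]    # lines 1, 3, 5, ... (240 lines)
    return even_field, odd_field

def weave(even_field, odd_field):
    """Reassemble a progressive frame by interleaving the two fields."""
    frame = [None] * (len(even_field) + len(odd_field))
    frame[0::2] = even_field
    frame[1::2] = odd_field
    return frame

original = [f"scan line {i}" for i in range(LINES)]  # stand-in for 480p content
even, odd = split_into_fields(original)              # what 480i transmits, alternately
assert len(even) == len(odd) == 240                  # only 240 lines per field
assert weave(even, odd) == original                  # progressive scan shows all 480 at once
```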

 

In Part II of this blog series, we’ll take a look at standard and high definition, as well as the development timeline of computer graphics from the 1990s until now.