• @argv_minus_one@beehaw.org

    VGA was so much better.

    The composite video output commonly seen on 1980s microcomputers couldn’t display high-resolution text without severe distortion that made it unreadable, because composite crams brightness and color into a single signal with only a few MHz of bandwidth. You could see this on the IBM PCjr, for example: the digital RGB display it came with handled 80×25 text mode just fine, but if you connected a composite video display (i.e. a TV) instead, 80×25 text was a blurry, illegible mess. The digital video output was severely limited in color depth, however; it could display only a fixed palette of 16 colors, whereas the distortion in the composite video could be used to create many more colors, albeit at very low resolution.
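
    A quick back-of-the-envelope sketch of why 80-column text dies on composite (Python; the NTSC/CGA figures are round approximations, not any one machine’s spec):

    ```python
    # Rough estimate of the detail an 80-column text line needs vs. what composite can carry.
    ACTIVE_LINE_US = 52.6               # visible portion of an NTSC scanline, microseconds
    PIXELS_PER_LINE = 80 * 8            # 80 columns of 8-pixel-wide characters = 640 pixels
    NTSC_LUMA_BW_MHZ = 4.2              # approximate NTSC luminance bandwidth
    NTSC_CHROMA_SUBCARRIER_MHZ = 3.58   # color information is encoded around here

    pixel_rate_mhz = PIXELS_PER_LINE / ACTIVE_LINE_US   # ~12.2 MHz
    needed_bw_mhz = pixel_rate_mhz / 2                  # alternating on/off pixels need ~half the pixel rate

    print(f"pixel rate:        {pixel_rate_mhz:.1f} MHz")
    print(f"bandwidth needed:  ~{needed_bw_mhz:.1f} MHz")
    print(f"NTSC luma limit:   {NTSC_LUMA_BW_MHZ} MHz")
    print(f"chroma subcarrier: {NTSC_CHROMA_SUBCARRIER_MHZ} MHz")
    # ~6 MHz of fine detail doesn't fit in a ~4.2 MHz channel, and that detail lands
    # right on the color subcarrier, so the decoder turns sharp text edges into color
    # fringing -- the same effect CGA "artifact colors" exploit on purpose.
    ```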

    Then along came the VGA video signal format. This was a bit of a peculiarity: analog RGB video. Unlike digital RGB of the time, it was not limited in color depth, and could represent an image with 24-bit color, no problem. Unlike composite video, it had separate signal lines for each primary color, so any color within the gamut was equally representable, and it had enough bandwidth on each of those lines to cleanly transmit a 640×480 image at 60Hz with pretty much perfect fidelity.
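
    For a sense of what “enough bandwidth” means here, a small sketch (Python, using the standard 640×480@60 VGA timing numbers; the pixel-clock-divided-by-two bandwidth rule is just a rough approximation):

    ```python
    # Standard VGA 640x480@60 timing: 800x525 total pixels per frame, 25.175 MHz pixel clock.
    PIXEL_CLOCK_MHZ = 25.175
    H_TOTAL, V_TOTAL = 800, 525                      # active 640x480 plus blanking and sync

    h_freq_khz = PIXEL_CLOCK_MHZ * 1000 / H_TOTAL    # ~31.5 kHz line rate
    v_freq_hz = h_freq_khz * 1000 / V_TOTAL          # ~59.94 Hz refresh
    analog_bw_mhz = PIXEL_CLOCK_MHZ / 2              # rough per-channel bandwidth for pixel-level detail

    print(f"line rate:  {h_freq_khz:.2f} kHz")
    print(f"refresh:    {v_freq_hz:.2f} Hz")
    print(f"~bandwidth: {analog_bw_mhz:.1f} MHz per R/G/B line")
    # Each color line only has to carry ~12-13 MHz of detail -- easy for a short,
    # shielded analog cable, which is why 640x480 looks clean where composite falls apart.
    ```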

    However, someone at IBM was apparently a bit of a perfectionist, as a VGA cable is capable of carrying an image of up to 2048×1536 resolution at 85Hz, or, at lower resolutions, refresh rates of 100Hz or more, all with 24-bit color depth, far beyond what the original VGA graphics chips and the associated IBM 85xx-series displays could handle.
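
    To see how far beyond the original hardware that is, here’s a rough pixel-clock estimate (Python; the ~1.3× blanking overhead is my assumption of a typical CRT timing factor, not a spec value):

    ```python
    # Rough pixel-clock estimate for the high end of what a VGA cable was asked to carry.
    BLANKING_OVERHEAD = 1.3   # assumed typical CRT blanking/sync overhead, not from a standard

    def approx_pixel_clock_mhz(width, height, refresh_hz):
        """Active pixels per second times a typical blanking factor."""
        return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

    for w, h, hz in [(640, 480, 60), (1024, 768, 85), (2048, 1536, 85)]:
        clk = approx_pixel_clock_mhz(w, h, hz)
        print(f"{w}x{h}@{hz}Hz: ~{clk:.0f} MHz pixel clock, ~{clk/2:.0f} MHz per color line")
    # 640x480@60   -> ~24 MHz  (about where the original VGA chips topped out)
    # 2048x1536@85 -> roughly 350 MHz (late-'90s high-end RAMDAC territory, same cable and connector)
    ```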

    Also, the VGA cable system bundled every signal line into a single cable and connector, so no more figuring out which cable plugs in where. And because it was so future-proof, for pretty much the entire '90s you could buy any old computer display, plug it into any old computer, and it would just work.

    Pretty impressive for an analog video signal/cable/connector designed in 1987.