This article was compiled from material originally published by TechSpot.
The development of modern graphics processors began with the introduction of the first 3D add-on cards in 1995, followed by widespread adoption in 32-bit operating systems and reasonably priced personal computers.
The graphics industry that existed before then consisted primarily of 2D, non-PC architectures, with graphics chips known for their alphanumeric naming conventions and huge price tags. 3D gaming and virtualized PC graphics eventually coalesced from a variety of sources: arcade and console games, military applications, robotics, space simulators, and medical imaging.
Early consumer 3D graphics were something of a Wild West of competing ideas: how the hardware was implemented, which rendering techniques were used, their application and data interfaces, and even the exaggerated naming. Early graphics systems used a fixed function pipeline (FFP), an architecture that followed a very strict processing path, and there were almost as many graphics APIs as there were 3D chip manufacturers.
While 3D graphics turned a rather dull PC industry into a glittering show, they owe their existence to generations of innovative effort. This article delves into the history of the GPU: from the early days of consumer 3D graphics, to the 3dfx Voodoo game-changers, the industry's consolidation at the turn of the century, and today's modern GPGPUs.
Early development of consumer-grade 3D graphics cards (1976-1995)
The first true 3D graphics cards grew out of early display controllers, namely video shifters and video address generators. These acted as a pass-through between the main processor and the display, converting the incoming data stream into a serial bitmap video output: luminance, color, and vertical and horizontal composite sync, which kept the stream of pixels aligned during display generation and synchronized each successive line along with the blanking interval (the time between ending one scan line and starting the next).
The latter half of the 1970s saw the emergence of a series of designs that laid the foundation for the 3D graphics card as we know it. For example, RCA's 1976 "Pixie" video chip (CDP1861) was capable of outputting an NTSC-compatible video signal at 62×128 resolution, or 64×32 for the ill-fated RCA Studio II console.
A year later, the Pixie was followed by the TV Interface Adapter (TIA) 1A, which was integrated into the Atari 2600 (Figure 1) to generate the screen display and sound effects and to read the input controllers.
In 1978, Motorola introduced the MC6845 video address generator. This became the basis for the Monochrome (Figure 2) and Color Display Adapter (MDA/CDA) cards of the 1981 IBM PC, and provided the same functionality for the Apple II. Motorola added the MC6847 video display generator later that same year, and it made its way into a number of first-generation personal computers, including the Tandy TRS-80.
A similar solution from Commodore's MOS Technology subsidiary, the VIC, provided graphics output for Commodore home computers of 1980-83 vintage.
The following November, LSI's ANTIC (Alphanumeric TV Interface Controller) and CTIA/GTIA coprocessors (color or graphics TV interface adapters) made their debut in the Atari 400. ANTIC used direct memory access (DMA) to process 2D display instructions: like most video coprocessors, it generated the playfield graphics (background, title screens, scoring display), while the CTIA generated colors and movable objects. Yamaha and Texas Instruments (TI) supplied similar ICs to a variety of early home computer makers.
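The display-list idea behind chips like ANTIC can be sketched in miniature: the CPU deposits a small program of display instructions in memory, and the video coprocessor walks it via DMA every frame. The opcodes and layout below are invented for illustration; they are not the real ANTIC instruction set.

```python
# A toy display-list interpreter. The CPU builds a list of
# (opcode, argument) instructions; the "video chip" walks it each
# frame and produces text-mode scanlines. Opcodes are hypothetical.

def run_display_list(display_list, width=8):
    """Walk a list of (opcode, arg) pairs and build text-mode scanlines."""
    screen = []
    for opcode, arg in display_list:
        if opcode == "BLANK":      # emit `arg` empty scanlines
            screen.extend([" " * width] * arg)
        elif opcode == "TEXT":     # emit one scanline of characters
            screen.append(arg.ljust(width)[:width])
        elif opcode == "JMP":      # real hardware loops back per frame
            break
    return screen

frame = run_display_list([
    ("BLANK", 2),
    ("TEXT", "SCORE 10"),
    ("JMP", 0),
])
print(frame)
```

The point of the arrangement, then as now, was that the CPU only rebuilds the list when the screen layout changes; the coprocessor does the per-frame work.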
The next step in graphics evolution took place primarily in the professional field.
Intel's 82720 graphics chip served as the basis for the $1,000 iSBX 275 Video Graphics Controller Multimodule Board, capable of displaying eight-color data at a resolution of 256×256 (or 512×512 in monochrome). Its 32KB of display memory was enough to draw lines, arcs, circles, rectangles, and character bitmaps, and the chip also offered zooming, screen partitioning, and scrolling.
SGI soon introduced the GR1.x graphics board for the IRIS Graphics workstation, providing separate external (daughter) boards for color options, geometry cards, Z buffers, and overlays/underlays.
At the time, industrial and military 3D virtualization was relatively well developed. IBM, General Electric, and Martin Marietta (which acquired GE's aerospace division in 1992), along with a host of military contractors, technical agencies, and NASA, ran projects that demanded military and space simulation technology. As early as 1951, the U.S. Navy had developed a flight simulator using MIT's Whirlwind computer.
In addition to defense contractors, there were companies that straddled the military market with specialized graphics hardware.
Evans & Sutherland, which would later offer professional graphics lines such as Freedom and REALimage, also provided graphics for the CT5 flight simulator, a $20 million package driven by a DEC PDP-11 mainframe. The company's co-founder, Ivan Sutherland, had developed a computer program called Sketchpad in 1961, which used a light pen to draw geometric shapes and display them in real time on a CRT.
This was the origin of the modern graphical user interface (GUI).
In the personal computer world, Chips and Technologies' 82C43x series of EGA (Enhanced Graphics Adapter) chips provided competition for IBM's adapters and could be found in many PC/AT clones around 1985. That year also saw the Commodore Amiga ship with the OCS chipset. The chipset consisted of three main component chips, Agnus, Denise, and Paula, which allowed a certain amount of graphics and audio computation to proceed independently of the CPU.
In August 1985, three Hong Kong immigrants, Kwok Yuan Ho, Lee Lau, and Benny Lau, founded Array Technology Inc. in Canada; before long the company was renamed ATI Technologies Inc.
The following year, ATI launched its first product, an OEM color emulation card. It output monochrome green, amber, or white phosphor text on a black background to a TTL monitor via a 9-pin DE-9 connector. The card shipped with at least 16KB of memory and accounted for a significant portion of ATI's C$10 million in sales in its first year of operation.
This was accomplished primarily through a weekly supply of about 7,000 chips to Commodore Computers.
The advent of color monitors and the lack of standards among the many competitors eventually led to the formation of the Video Electronics Standards Association (VESA), of which ATI was a founding member, along with NEC and six other graphics adapter manufacturers.
In 1987, ATI added the Graphics Solution Plus series to its OEM product line, which used the IBM PC/XT's 8-bit ISA bus for Intel 8086/8088-based IBM PCs. The chip supported MDA, CGA, and EGA graphics modes via DIP switches. It was basically a clone of the Plantronics Colorplus board, but with room for 64KB of memory. Paradise Systems' PEGA1, 1a, and 2a (256KB), released in 1987, were also Plantronics clones.
The EGA Wonder series 1-4, available in March for $399, featured 256KB of DRAM and compatibility with CGA, EGA, and MDA emulation at up to 640×350 in 16 colors. Extended EGA was available for series 2, 3, and 4.
At the high end were the EGA Wonder 800 (Figure 3), which featured 16-color VGA emulation and 800×600 resolution support, and the VGA Improved Performance (VIP) card, essentially an EGA Wonder with a digital-to-analog converter (DAC) added to provide limited VGA compatibility. The latter cost $449, plus $99 for the Compaq expansion module.
ATI wasn’t the only one driving the wave of consumer demand for personal computing.
Many new companies and products were introduced that year: Trident, SiS, Tamarack, Realtek, Oak Technology, LSI's G-2 Inc., Hualon, Cornerstone Imaging, and Winbond were all founded in 1986-87. Meanwhile, AMD, Western Digital/Paradise Systems, Intergraph, Cirrus Logic, Texas Instruments, Gemini, and Genoa would produce their first graphics products during this period.
The next few years also saw an amazing rate of updates to ATI’s Wonder series.
1988 saw the introduction of the Small Wonder graphics solution with game controller ports and composite output options (for CGA and MDA emulation), the EGA Wonder 480 and 800+ with extended EGA and 16-bit VGA support, and the VGA Wonder and Wonder 16 with added VGA and SVGA support.
The Wonder 16 came with 256KB of memory and retailed for $499, while the 512KB version retailed for $699.
An updated VGA Wonder/Wonder 16 series arrived in 1989, including the cost-reduced VGA Edge 16 (Wonder 1024 series). New features included a bus mouse port and support for the VESA feature connector. This was a gold-fingered connector similar to a shortened data bus slot connector, which linked to another video controller via ribbon cable to bypass the congested data bus.
The Wonder series continued its rapid update cadence in 1991. The Wonder XL card added VESA 32K-color compatibility and a Sierra RAMDAC, increasing the maximum display resolution to 640×480 at 72Hz or 800×600 at 60Hz. Memory options were priced at $249 (256KB), $349 (512KB), and $399 (1MB). A low-cost version based on the previous year's Basic-16, called the VGA Charger, was also offered.
The Mach series debuted in May of the same year with the Mach8. Sold as a chip or a board, it allowed limited 2D drawing operations, such as line drawing, color fills, and bitmap combination (Bit BLIT), to be offloaded via a programming interface (API). ATI also offered a variant of the Wonder XL with a Creative Sound Blaster 1.5 chip integrated on an extended PCB. Called the VGA Stereo-F/X, it could simulate stereo from Sound Blaster mono files at something close to FM radio quality.
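At its core, a Bit BLIT is a rectangle copy from one surface to another, optionally combined with a raster operation, and this is the kind of work the Mach8 took off the CPU. The sketch below is a minimal software illustration, not ATI's interface; the surface representation and the two raster ops shown are assumptions chosen for the example.

```python
# A minimal software bit blit (BitBLT). Surfaces are plain 2D lists
# of pixel values; COPY overwrites, XOR is the classic reversible op
# once used for movable cursors and sprites.

def bitblt(dst, src, dx, dy, op="COPY"):
    """Blit all of `src` into `dst` at offset (dx, dy) using a raster op."""
    for y, row in enumerate(src):
        for x, pixel in enumerate(row):
            if op == "COPY":
                dst[dy + y][dx + x] = pixel
            elif op == "XOR":
                dst[dy + y][dx + x] ^= pixel
    return dst

surface = [[0] * 6 for _ in range(4)]   # 6x4 destination, all zeros
sprite = [[1, 1], [1, 1]]               # 2x2 block
bitblt(surface, sprite, 2, 1)           # place the block at (2, 1)
print(surface)
```

Drawing an XOR blit twice at the same position restores the destination, which is why the op was popular for cursors on hardware of this era.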
Graphics boards like the ATI VGA Wonder GT offered a 2D + 3D option, combining the Mach8 with the VGA Wonder+'s graphics core (28800-2) for its 3D capabilities. The Wonder and Mach8 drove ATI past the $100 million sales milestone that year, largely on the strength of Windows 3.0 adoption and the increased 2D workloads it brought to the platform.
S3 Graphics was founded in early 1989 and produced its first 2D accelerator chip and graphics card, the S3 911 (or 86C911), eighteen months later. The latter's key specifications included 1MB of VRAM and 16-bit color support.
That same year, the S3 911 was replaced by the 924, essentially a revised 911 with 24-bit color, which was updated the following year with the 928, adding 32-bit color, and the 801 and 805 accelerators. The 801 used the ISA interface, while the 805 used VLB. Between the introduction of the 911 and the arrival of 3D accelerators, the market was flooded with 2D GUI designs based on the original S3 design, notably from Tseng Labs, Cirrus Logic, Trident, IIT, ATI's Mach32, and Matrox's MAGIC RGB.
In January 1992, Silicon Graphics Inc. (SGI) released OpenGL 1.0, a multi-platform, vendor-agnostic application programming interface (API) for 2D and 3D graphics.
OpenGL evolved from SGI's proprietary API, called IRIS GL (Integrated Raster Imaging System Graphics Library). It was a move to keep IRIS's non-graphics functionality separate and to allow the API to run on non-SGI systems, as rival vendors were beginning to emerge with their own proprietary APIs.
Initially, OpenGL was aimed at the UNIX-based professional market, but it was quickly adopted for 3D gaming due to developer-friendly support for extended implementations.
Microsoft was developing its own competing API, Direct3D, and was in no hurry to ensure that OpenGL ran as well as it could under Windows.
Things came to a head a few years later when John Carmack of id Software, who had previously revolutionized PC gaming with the release of Doom, ported Quake to OpenGL on Windows and openly criticized Direct3D.
Microsoft's intransigence increased as it refused to license OpenGL's Mini Client Driver (MCD) on Windows 95, which would have allowed vendors to choose which features would receive hardware acceleration. SGI replied by developing the Installable Client Driver (ICD), which not only provided the same ability but did so even better: the MCD covered rasterization only, while the ICD added transform and lighting (T&L).
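The T&L work the ICD exposed to hardware can be sketched in rough terms: each vertex is multiplied by a transformation matrix, then shaded against a light source. This is a minimal illustration, not SGI or Microsoft driver code; the row-major matrix layout and single-light Lambertian model are simplifying assumptions.

```python
# Transform stage: multiply a 4x4 row-major matrix by (x, y, z, 1).
def transform(matrix, v):
    """Return the transformed (x, y, z) of vertex v."""
    x, y, z = v
    vec = (x, y, z, 1.0)
    return tuple(sum(matrix[r][c] * vec[c] for c in range(4)) for r in range(3))

# Lighting stage: Lambertian diffuse intensity, max(0, N dot L),
# assuming unit-length normal N and light direction L.
def lambert(normal, light_dir):
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, dot)

# Translate a vertex by (10, 0, 0), then light a surface that faces
# the light head-on.
translate = [
    [1, 0, 0, 10],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
]
print(transform(translate, (1.0, 2.0, 3.0)))        # (11.0, 2.0, 3.0)
print(lambert((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))    # 1.0
```

Rasterization, the part the MCD did cover, happens after these two stages, which is why offloading them as well was the bigger win.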
During the rise of OpenGL, which initially gained traction in the workstation space, Microsoft was busy courting the emerging gaming market with proprietary APIs of its own. It acquired RenderMorphics in February 1995, whose Reality Lab API was gaining popularity with developers and became the core of Direct3D.
Around the same time, Brian Hook of 3dfx was writing the Glide API, which would go on to become the dominant API for gaming, thanks in part to Microsoft's involvement in the Talisman project (a tile-based rendering ecosystem), which diluted the resources intended for DirectX.
As Windows grew more popular, D3D became widely adopted, while proprietary APIs such as S3d (S3), Matrox Simple Interface, Creative Graphics Library, C Interface (ATI), SGL (PowerVR), NVLIB (Nvidia), RRedline (Rendition), and Glide gradually lost favor among developers.
It didn't help that some of these proprietary APIs were tied to board manufacturers under increasing pressure to add to a rapidly expanding feature list: higher screen resolutions, increased color depth (from 16-bit to 24-bit, then 32-bit), and image quality enhancements such as anti-aliasing. All of these demanded more bandwidth, greater graphics efficiency, and faster product cycles.
By 1993, market turmoil had already forced many graphics companies out of business or into the arms of competitors.
The year 1993 also ushered in a wave of new graphics competitors, most notably Nvidia, founded in January of that year by Jen-Hsun Huang, Curtis Priem, and Chris Malachowsky. Huang was previously director of CoreWare at LSI, while Priem and Malachowsky both came from Sun Microsystems, where they had developed the SPARC-based GX graphics architecture.
Nvidia was soon joined in the market by fellow newcomers Dynamic Pictures, ARK Logic, and Rendition.
Casualties of the shakeout included Tamarack, Gemini Technology, Genoa Systems, Hualon, Headland Technology (acquired by SPEA), Acer, Motorola, and Acumos (acquired by Cirrus Logic).
ATI, meanwhile, continued to grow and develop.
Foreshadowing its All-In-Wonder series, ATI released the 68890 PC TV decoder chip, which debuted in the Video-It! card. With the onboard Intel i750PD VCP (Video Compression Processor), the chip could capture video at 320×240 at 15 fps or 160×120 at 30 fps, and could compress and decompress in real time. It could also communicate with the graphics board over the data bus, eliminating the need for dongles, ports, and ribbon cables.
Five months later, in March, ATI introduced a 64-bit accelerator, the Mach64.
In a highly competitive market, ATI lost C$2.7 million that year. Competing boards included the S3 Vision 968 (Figure 6), which was adopted by many board suppliers, and the Trio64, which won OEM contracts from Dell (Dimension XPS), Compaq (Presario 7170/7180), AT&T (Globalyst), HP (Vectra VE 4), and DEC (Venturis/Celebris).
The Mach64 (Figure 7), released in 1995, achieved a number of notable firsts. It became the first graphics adapter available for both PC and Mac computers, in the form of the Xclaim, and, together with S3's Trio, offered full-motion video playback acceleration.
Mach64 also ushered in ATI’s first professional graphics cards, the 3D Pro Turbo and 3D Pro Turbo + PC2TV, priced at $599 for the 2MB option and $899 for 4MB.
The following month saw the emergence of a technology startup called 3DLabs, born when DuPont's Pixel graphics division bought the subsidiary out from its parent company, along with the GLINT 300SX processor, capable of OpenGL rendering, fragment processing, and rasterization. Owing to their high prices, the company's cards were initially aimed at the professional market: the Fujitsu Sapphire2SX 4MB retailed for $1,600-$2,000, while the 8MB ELSA Gloria 8 was $2,600-$2,850. A version of the 300SX, however, was intended for the gaming market.
The gaming-oriented GLINT 300SX of 1995 carried reduced memory: 2MB in total, with 1MB for textures and the Z-buffer and 1MB for the frame buffer, plus an option to add VRAM for Direct3D compatibility for $50 more than the base price. The card failed to make inroads in a crowded market, but 3DLabs was already at work on a successor, the Permedia line.
S3 seemed ubiquitous at the time. The company dominated the high-end OEM market with its Trio64 chipset, which integrated the DAC, graphics controller, and clock synthesizer into a single chip. It also leveraged a unified frame buffer and supported hardware video overlay (a dedicated portion of graphics memory used to render video as an application requires it). The Trio64 and its 32-bit memory bus sibling, the Trio32, were available as OEM units and as standalone cards from Diamond, ELSA, Sparkle, STB, Orchid, Hercules, and Number Nine. Diamond Multimedia's prices ranged from $169 for a ViRGE-based card to $569 for the Trio64+-based Diamond Stealth64 Video with 4MB of VRAM.
The mainstream end of the market also included products from Trident, a longtime OEM supplier of no-frills 2D graphics adapters, which had recently added the 9680 chip to its lineup. The chip had most of the Trio64's features, and boards based on it were typically priced around $170-200, offering acceptable 3D performance for the bracket along with good video playback.
Other newcomers in the mainstream market included Weitek's Power Player 9130 and Alliance Semiconductor's ProMotion 6410 (often seen as the Alaris Matinee or FIS's OptiViewPro). Both offered excellent scaling in step with CPU speed, while the latter combined a strong scaling engine with anti-blocking circuitry for smooth video playback, far better than earlier chips such as the ATI Mach64, the Matrox MGA 2064W, and the S3 Vision968.
Nvidia introduced its first graphics chip, the NV1 (Figure 8), in May; it was the first commercial graphics processor capable of 3D rendering, video acceleration, and integrated GUI acceleration.
Nvidia partnered with SGS-Thomson Microelectronics, which also promoted the STG2000 version of the chip, to produce it on a 500nm process. While not a huge success, it did represent the company's first financial return. Unfortunately for Nvidia, just as the first vendor boards began shipping in September (notably the Diamond Edge 3D), Microsoft finalized and released DirectX 1.0.
The D3D graphics API was built on rendering triangular polygons, whereas the NV1 used quadratic texture mapping. Limited D3D compatibility was added via drivers that wrapped triangles as squashed quads, but the lack of games tailored to the NV1 doomed the card to be a jack of all trades and master of none.
Most of the games that did support it were ported from the Sega Saturn. The 4MB NV1, with integrated Saturn-compatible game ports (two per expansion bracket, connected to the card via ribbon cable), retailed for about $450 in September 1995.
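The driver workaround for the triangle/quad mismatch can be sketched conceptually: a triangle handed in by a triangle-based API is submitted to quad-oriented hardware as a degenerate quad with one vertex repeated. The function and vertex layout below are purely illustrative, not NV1 driver code.

```python
# Bridge a triangle-based API call onto quad-based hardware by
# repeating the final vertex, producing a "squashed" (degenerate)
# quad that covers the same area as the original triangle.

def triangle_as_quad(tri):
    """Represent triangle (v0, v1, v2) as the degenerate quad (v0, v1, v2, v2)."""
    v0, v1, v2 = tri
    return (v0, v1, v2, v2)

tri = ((0, 0), (4, 0), (0, 3))
print(triangle_as_quad(tri))
```

The cost of the trick runs the other way too: content authored for quadratic surfaces cannot be expressed exactly as triangles, which is part of why NV1-native games were so scarce.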
Microsoft's late changes to, and release of, the DirectX SDK left board manufacturers without direct access to the hardware used for digital video playback. This meant that nearly all discrete graphics cards had functionality problems under Windows 95; drivers under Windows 3.1, by contrast, were generally problem-free.
In November 1995, ATI announced its first 3D accelerator chip, the 3D Rage (also known as the Mach64 GT), which merged the Mach64's 2D core with 3D capabilities. Its first public demonstration came at the E3 video game convention in Los Angeles the following May, and the card itself became available a month later.
Late revisions to the DirectX specification meant that the 3D Rage had compatibility problems with many games that used the API, mainly due to its lack of depth buffering. With its onboard 2MB EDO RAM frame buffer, 3D modes were limited to 640×480×16-bit or 400×300×32-bit. Attempting 32-bit color at 640×480 typically produced corrupted on-screen colors, and 2D resolution peaked at 1280×1024. If gaming performance was mediocre, the full-screen MPEG playback ability at least balanced out the feature set.
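Depth buffering, the feature the 3D Rage lacked, resolves visibility per pixel by remembering the nearest depth drawn so far and rejecting anything farther away. The sketch below is a minimal, generic illustration of that test, assuming a "smaller z is nearer" convention; buffer layout and names are invented for the example.

```python
# A minimal depth (Z) buffer test: a pixel is written only if its
# depth is nearer than the value already stored at that location.

def draw_pixel(color_buf, z_buf, x, y, color, z):
    """Write the pixel only if it is nearer than what is already there."""
    if z < z_buf[y][x]:
        z_buf[y][x] = z
        color_buf[y][x] = color
    return color_buf[y][x]

INF = float("inf")
color = [[None] * 2 for _ in range(2)]   # 2x2 color buffer
depth = [[INF] * 2 for _ in range(2)]    # 2x2 depth buffer, all "far"

draw_pixel(color, depth, 0, 0, "red", 5.0)    # nearer than empty buffer
draw_pixel(color, depth, 0, 0, "blue", 9.0)   # farther, so rejected
print(color[0][0])
```

Without this per-pixel test, correct output depends on the application drawing everything in strict back-to-front order, which is exactly the burden that tripped up games on depth-buffer-less hardware.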
ATI redesigned the chip and released the Rage II in September. In addition to adding MPEG2 playback support, it corrected the first chip's D3D problems. However, the initial cards still came with 2MB of memory, which hurt performance and compromised perspective correction and geometry transformation. As the series expanded to include the Rage II + DVD and 3D Xpression +, memory options grew to 8MB.
While ATI led the way in bringing 3D graphics solutions to market, it didn’t take long for other competitors with different ideas for 3D implementation to emerge. Namely, 3dfx, Rendition and VideoLogic.
In the race to bring new products to market, 3dfx Interactive beat Rendition and VideoLogic. But the performance race was over before it began: 3dfx's Voodoo Graphics effectively wiped out all competition.