
(2) History of GPU – The 3Dfx Voodoo game-changer (1996-1999)

At Voodoo’s peak, 3Dfx was estimated to account for 80-85% of the 3D accelerator market

From Techspot

In part 1 of this GPU history, which focused on the early development of consumer 3D graphics cards, we saw how 3Dfx Voodoo Graphics effectively ended the performance race before it had even begun. Here's how the 3Dfx Voodoo changed the game.

3Dfx's Voodoo, introduced in November 1996, was a 3D-only graphics card: a VGA pass-through cable ran from a separate 2D card into the Voodoo and then on to the monitor.

Voodoo-based cards were sold by many companies. Orchid Technologies was first to market with the Orchid Righteous 3D, a $299 board noted for the mechanical relays that "clicked" whenever the chipset was engaged. Later revisions, like those from other suppliers, used solid-state relays. It was followed by Diamond Multimedia's Monster 3D, Colormaster's Voodoo Mania, the Canopus Pure3D, Quantum3D and Miro Hiscore boards, Skywell's Magic3D, and the 2theMAX Fantasy FX Power3D, all based on the Voodoo chipset.

Voodoo Graphics revolutionized 3D graphics on the personal computer almost overnight, leapfrogging a number of vendors that could offer little more than 2D. Going into 1996, S3 held roughly 50% of the graphics market; at Voodoo's peak, 3Dfx was estimated to account for 80-85% of the 3D accelerator market. That would soon change, however.

Figure 1: Diamond Multimedia Monster 3D (3Dfx Voodoo 1, 4MB PCI)

At the same time, VideoLogic developed tile-based deferred rendering (TBDR), which avoids large-scale Z-buffering of the final render (the removal of shaded but hidden pixels) by discarding all occluded geometry before textures, shading, and lighting are applied to what remains. Each frame is split into rectangular tiles, and each tile has its polygons rendered independently and sent to the output. Once the pixels needed for a tile have been determined, excess polygons are culled (Z-buffering happens only at the tile level) and rendering begins. The approach keeps computation to a minimum.
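The tile pipeline described above can be sketched in a few lines of Python. Flat per-triangle depths, a dictionary frame buffer, and bounding-box binning are illustrative simplifications, not VideoLogic's actual design:

```python
TILE = 4  # tile edge in pixels (small, for illustration)

def edge(a, b, p):
    """Signed area test: which side of edge a->b the point p lies on."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def covers(tri, x, y):
    """True if pixel center (x+0.5, y+0.5) falls inside the triangle."""
    a, b, c = tri["verts"]
    p = (x + 0.5, y + 0.5)
    w = (edge(b, c, p), edge(c, a, p), edge(a, b, p))
    return all(s >= 0 for s in w) or all(s <= 0 for s in w)

def render_tbdr(triangles, width, height):
    # Pass 1: bin each triangle into the tiles its bounding box touches.
    bins = {}
    for tri in triangles:
        xs = [v[0] for v in tri["verts"]]
        ys = [v[1] for v in tri["verts"]]
        for ty in range(int(min(ys)) // TILE, min(height, int(max(ys))) // TILE + 1):
            for tx in range(int(min(xs)) // TILE, min(width, int(max(xs))) // TILE + 1):
                bins.setdefault((tx, ty), []).append(tri)

    # Pass 2: per tile, resolve visibility first (tile-level Z-buffering),
    # then shade only the surviving front-most fragment for each pixel.
    framebuffer = {}
    for (tx, ty), tris in bins.items():
        nearest = {}  # (x, y) -> (depth, triangle)
        for tri in tris:
            for y in range(ty * TILE, min((ty + 1) * TILE, height)):
                for x in range(tx * TILE, min((tx + 1) * TILE, width)):
                    if covers(tri, x, y) and (
                        (x, y) not in nearest or tri["depth"] < nearest[(x, y)][0]
                    ):
                        nearest[(x, y)] = (tri["depth"], tri)
        for (x, y), (_, tri) in nearest.items():
            framebuffer[(x, y)] = tri["color"]  # hidden fragments are never shaded
    return framebuffer
```

Because occlusion is resolved within each small tile, the tile's working set can live in fast on-chip memory, which is where the bandwidth savings come from.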

The chips for VideoLogic's first two series were fabricated by NEC, while the Series 3 (Kyro) chips came from ST Micro. The first card was dedicated to Compaq Presario PCs and was called the Midas 3 (Midas 1 and 2 had been prototypes for arcade-based projects). The PCX1 and PCX2 followed as OEM parts.

The Series 2 chip was originally produced for Sega's Dreamcast game console. But by the time the desktop Neon 250 card went on sale in November 1999 at $169, it had been ruthlessly eclipsed by faster 32-bit color graphics cards.

Long before the Neon 250's arrival, Rendition's Verite V1000 had been the first graphics card with a programmable core, rendering 2D and 3D graphics through a MIPS-based RISC processor and a pixel pipeline. The processor was responsible for setting up and organizing the pipeline workload.

Originally developed in late 1995, the Verite 1000 became one of the boards Microsoft used to develop Direct3D. Unfortunately, because Rendition transferred data over the PCI interface this way, the card required a motherboard chipset capable of supporting direct memory access (DMA). Until Voodoo Graphics came along, the V1000 compared well against almost every other consumer graphics card, with more than twice the 3D performance in some cases. The board was relatively inexpensive for budget gamers and offered a good feature set, including edge anti-aliasing and hardware-accelerated Quake from id Software. However, the approach never caught on, and game developers quickly eschewed the DMA transfer model.

Like 1996, 1997 proved to be another busy year for the consumer graphics industry.

ATI released Rage II, followed by 3D Rage Pro in March. The latter was the first AGP 2X card and the first product from the ATI 3D engineering group, which was founded in 1995.

Figure 2: ATI 3D Rage Pro

In its 4MB configuration the Rage Pro performed almost as well as Voodoo Graphics, and it outperformed the 3Dfx card when equipped with 8MB and the AGP interface. With an expanded 4kB cache and edge anti-aliasing, the card improved perspective correction, texture processing, and trilinear filtering. A floating-point unit was also integrated to reduce CPU dependence, along with hardware acceleration and display support for DVD.

All in all, the Rage Pro greatly boosted ATI's bottom line, helping the company to a roughly C$47.7 million profit on sales of more than $600 million. Much of this success came from OEM contracts, integration with consumer and server motherboards, and mobile versions. The cards, typically sold as Xpert@Work and Xpert@Play, varied in price: $170 for the 2MB version, $200-$230 for the 4MB version, and $270-$300 for the 8MB version, while the 16MB version cost more than $400.

ATI strengthened its portfolio in December 1997 by acquiring IP from Tseng Labs for $3 million and hiring 40 of the company's engineers. It was a good deal, as Tseng's failure to integrate a RAMDAC into its cards had led to a sharp decline in sales, from $12.4 million in 1996 to $1.9 million in 1997.

In March 1997, 3DLabs released revised versions of its Permedia ("Pervasive 3D") boards, fabricated on Texas Instruments' 350nm process rather than IBM's, which had produced the earlier Permedia and Permedia NT. The first version's performance wasn't up to par, and while the NT model improved on it with an additional Delta chip providing full triangle setup and AA, it came in at $300. Permedia 2-based graphics cards began shipping by the end of the year, but instead of going head-to-head with the gaming heavyweights, they were sold as semi-professional 2D cards with moderate 3D capability.

A month after ATI and 3DLabs updated their product lines, Nvidia released the RIVA 128 (Real-time Interactive Video and Animation accelerator), which added Direct3D compatibility by rendering triangular polygons.

Nvidia partnered with ST Micro, which produced the chips on a new 350nm process and developed the RAMDAC and video converters. While there were problems with the initial drivers (especially with Unreal), the card showed enough performance in games like Quake 2 and 3 to rank high on many benchmark charts.

Figure 3: Diamond Viper V330 PCI (Nvidia RIVA 128)

This turned out to be the landmark graphics card Nvidia had been looking for since 1993. To maintain supply, which was key to its future success, Nvidia needed to look further ahead, so it signed a manufacturing agreement with TSMC to supply the Riva 128ZX alongside ST Micro. At the end of 1997, Nvidia's share of the 3D graphics market was estimated at 24%, second only to 3Dfx Interactive, largely thanks to the Riva 128/128ZX.

Although the final contract went to NEC/VideoLogic, Sega had also considered Nvidia's NV2 among the graphics chips for the Dreamcast console, which further boosted Nvidia's revenue.

Rival 3Dfx also partnered with Sega on the project and is believed to have been supplying hardware for the console until Sega terminated the contract. 3Dfx filed a $155 million lawsuit, claiming Sega had misled it into believing the console would use 3Dfx hardware while gaining access to confidential material related to its graphics IP. The parties settled out of court a year later for $10.5 million.

The Dreamcast "Black Belt" project was just one highlight of 3Dfx Interactive's year.

Figure 4: Sega Blackbelt prototype based on 3Dfx hardware

Quantum3D was spun off from 3Dfx on March 31, 1997. It partnered with SGI and Gemini Technology to develop very high-end enthusiast and professional graphics solutions using 3Dfx's new SLI (scan-line interleave) technology. This involved either a daughtercard carrying a second chipset and memory, or two or more cards connected via a ribbon cable, much as Nvidia's SLI and AMD's CrossFire later revisited the concept. When connected, each card (or each logic block on a single-board SLI card) contributes alternate scan lines of each frame to the display.

SLI also raised the maximum screen resolution from 800×600 to 1024×768 pixels. The Obsidian Pro 100DB-4440 (two single cards, each with an Amethyst daughtercard) retailed for $2,500, while single-card SLI solutions like the 100SB-4440 and 4440V cost $1,895.
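The scan-line split can be sketched in a few lines of Python. This is a toy illustration of the interleave idea only; `render_line` is a hypothetical stand-in for a full rasterizer:

```python
# Toy sketch of scan-line interleave (SLI): two render units each produce
# alternate scan lines of a frame, which are then merged in display order.

def render_line(y, width):
    """Pretend renderer: returns one scan line tagged with its row."""
    return [(x, y) for x in range(width)]

def sli_frame(width, height):
    # Unit 0 renders the even lines, unit 1 the odd lines ("in parallel").
    unit0 = {y: render_line(y, width) for y in range(0, height, 2)}
    unit1 = {y: render_line(y, width) for y in range(1, height, 2)}
    # Merge: interleave the two half-frames back into display order.
    return [unit0[y] if y % 2 == 0 else unit1[y] for y in range(height)]
```

Since each unit rasterizes only half of the lines per frame, each needs roughly half the fill rate, which is why the pairing also made higher maximum resolutions practical.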

In the summer of 1997, 3Dfx announced an IPO and launched the Voodoo Rush, an attempt to offer a single card with both 2D and 3D capabilities. However, the final product couldn't use the planned Rampage chip, and Voodoo sales declined as a result. The card's SST-1 chip handled Glide API games, while a below-standard Alliance chip and an even worse Macronix chip handled other 3D games and 2D applications. The Voodoo Rush also caused screen artifacts, because the 3Dfx chip and memory ran at 50MHz while the Alliance AT25 ran at 72MHz.

The situation was made worse by the Voodoo Rush's frame buffer being halved, shared between the 3D and 2D chips, which limited the practical resolution to around 512×384. Refresh rates also dropped dramatically, with the Alliance and Macronix chips limited to 175 and 160MHz RAMDACs, respectively.

Shortly after the Voodoo Rush, Sunnyvale-based Rendition released the Verite V2100 and V2200. These cards still couldn't match the performance of the original Voodoo and were barely competitive with the budget-oriented Rush. The company had fallen significantly behind its competitors in development, and game developers showed little interest in the cards, which became Rendition's last commercial graphics products.

Rendition also worked on various other projects, including adding the Fujitsu FXG-1 geometry processor to the V2100/V2200 in a two-chip approach while working to integrate the functionality into a single chip. In September 1998, Micron acquired the company for $93 million, hoping to combine its embedded DRAM technology with Rendition's graphics expertise. The FXG-1-driven (and gloriously named) Hercules Thriller Conspiracy card, along with the V3300 and 4400E, remained unfinished projects.

Figure 5: Rendition Verite V2200 reference board

As feature sets and performance increased, so did graphics card prices, and many vendors that couldn't catch ATI, Nvidia, and 3Dfx rushed to fill the sub-$200 market.

Matrox released the Mystique for $120-150 (limited by its lack of OpenGL support), while the basic model of the S3 ViRGE series started at around $120, with the DX and GX rising to $150 and $200, respectively. S3 diversified the line to ensure a steady flow of sales, adding mobile parts with dynamic power management (ViRGE/MX) and the desktop ViRGE/GX2 with TV-out, S-Video, and assisted DVD playback.

Cirrus Logic's Laguna 3D series, Trident's 9750/9850, and SiS's 6326 all vied for gamers' attention. But for the Laguna 3D, a cheap $99 price wasn't enough to compensate for its performance shortfalls, poor 3D image quality, and compatibility issues compared with similarly priced cards like S3's ViRGE VX.

After the Laguna 3D, Cirrus Logic quickly left the graphics card industry. Previously, it had offered a range of inexpensive 16-bit color graphics adapters around $50, most notably powering the Diamond SpeedStar series and the Orchid Kelvin 64.

Trident also launched the 3DImage 9750 for the entry level in May, followed shortly thereafter by the updated 9850, which supported the AGP 2X bus. The 9750 was a PCI or AGP 1X card with various image quality and rendering issues; the 9850 corrected some of them, but texture filtering problems remained.

SiS joined the budget 3D market in June with the 6326, typically priced between $40 and $50. The card provided good image quality, better than many other budget parts. The 6326 was never a threat in the performance arena, but some 7 million units were sold in 1998.

At the Assembly games event in June 1997, BitBoys announced their Pyramid3D graphics chip to the world. The much-ballyhooed project was a joint effort between Silicon VLSI Solutions Oy, TriTech, and BitBoys.

But the Pyramid3D never appeared: numerous tweaks and modifications delayed the project, and TriTech lost a chip patent lawsuit that ultimately bankrupted the company.

Figure 6: The realism Glaze3D cards were supposed to achieve

BitBoys went on to announce a second design, the Glaze3D chip, on May 15, 1998, promising first-class performance and a planned release by the end of 1999. As the big reveal neared, BitBoys announced a revised design at SIGGRAPH 99, eliminating the RAMBUS memory and memory controller in favor of 9MB of Infineon embedded DRAM.

Again, the project foundered on design revisions and manufacturing problems.

The company developed a reputation for missing release dates. Glaze3D was later redesigned under the codename Axe, catching up with the competition by supporting DirectX 8.1. The new chip was scheduled to debut in Avalanche3D cards in late 2001, while a third Glaze3D iteration, codenamed Hammer, was already committed to DirectX 9 support.

Prototype boards were built with the original Glaze3D chips, but everything came to a halt when Infineon stopped making embedded DRAM in 2001 due to financial losses. Lacking a manufacturing partner, BitBoys eventually abandoned desktop graphics to focus on mobile graphics IP.

Intel introduced its first (and, to date, last) commercially available discrete 3D gaming chip in January 1998. The i740 traced its origins to a flight simulator General Electric built for NASA's Apollo program; the business was later sold to Martin Marietta, which merged with Lockheed three years later. Lockheed Martin spun the program off as Real3D to sell specialized graphics products, notably the Real3D/100 and Real3D/Pro-1000; Sega's Model 3 arcade board featured two Pro-1000 graphics systems.

Lockheed Martin then entered a joint project with Intel and Chips and Technologies called Project Aurora. Intel bought a 20 percent stake in Real3D in January, a month before the i740 launched; by that stage, Intel had already acquired Chips and Technologies outright, in July 1997.

The i740 combined into one chip the resources of the R3D/100's two separate graphics and texture chips, but somewhat oddly Intel implemented AGP texturing, in which textures are uploaded to system memory (render buffers could also be stored in RAM). Other designs used the card's frame buffer to hold textures, swapping textures to system RAM only if the frame buffer became saturated or a texture was too large for local graphics memory.

To minimize latency, Intel's design used AGP's direct memory execute (DiME) capability, fetching only the textures needed for rasterization and leaving the rest in system RAM. Performance and image quality were fair, roughly comparable to the previous year's high-end products. Pricing reflected aggressive marketing by Intel: $119 for the 4MB model and $149 for the 8MB model. The i740 was sold as an Intel-branded card, the Real3D StarFighter, or the Diamond Stealth II G460.

Figure 7: Intel i740 AGP graphics card

Intel designed an updated i752 chip, but a general lack of interest from OEMs and the gaming community led the company to cancel commercial production. Some boards left the factory, and, like the i740, the design lived on in integrated graphics chipsets.

Lockheed Martin closed Real3D in October 1999 and sold the related IP to Intel; many employees then moved to Intel or ATI.

ATI introduced a revamp of the Rage Pro in February 1998, renaming it the Rage Pro Turbo and providing a set of drivers highly optimized for synthetic benchmarks. Little else changed except the price, which rose to $449, although the beta 2 drivers did later improve game performance.

ATI launched the Rage 128 GL and VR in August, the first products developed with the company's former Tseng Labs engineers. However, retail supply was poor until the new year, effectively killing any chance ATI had of making its mark in the gaming space as it had in the OEM market. Configurations included 16MB or 32MB of RAM and an efficient memory architecture that could surpass the Nvidia TNT at higher screen resolutions with 32-bit color. Unfortunately for ATI, many games and users at the time ran 16-bit configurations. In addition, image quality was broadly on par with S3's and Nvidia's mainstream competition but still lagged behind Matrox.

Nevertheless, ATI finished 1998 as the top graphics vendor, with a 27% market share and net income of C$168.4 million on sales of C$1.15 billion.

ATI announced its $67 million acquisition of Chromatic Research in October of that year; Chromatic's MPACT media processors were favored in many PC-TV solutions, especially from Compaq and Gateway. The chips offered very good 2D performance, excellent audio, and MPEG-2 playback, but limited 3D gaming performance, at a cost of around $200. In the end, insurmountable software problems limited the company to just four years of life.

Two months after the i740 made its splash in the graphics market, 3Dfx released the Voodoo 2. Like its predecessor, it was a pure 3D solution, and while impressive, it was a complex system: the boards carried two texturing ICs, the first example of multitexturing on a consumer graphics card, so it used three chips in total rather than a single chip combining 2D/3D like the competing cards.

Quantum3D's Voodoo 2 implementations included the Obsidian2 X-24, a single SLI card that could be paired with a 2D daughtercard; the single-slot SLI SB200/200SBi with 24MB of EDO RAM; and the Mercury Heavy Metal, four 200SBi boards connected via a controller board (AAlchemy) that provided SLI bridging in a multi-GPU setup.

The latter was a professional solution for visual simulators, costing up to $9,999 and requiring an Intel BX or GX server motherboard with four consecutive PCI slots.

The Voodoo Banshee was announced in June 1998 but didn't go on sale for another three months. The card combined the 2D part of the still-absent Rampage chipset with a single texture mapping unit (TMU), so 3Dfx could finally offer a single chip with both 2D and 3D capability at significantly lower production cost; in exchange, the Banshee lagged far behind the Voodoo 2's ability to render multi-textured polygons.

The revolution 3Dfx had sparked three years earlier had now run its course.

The Voodoo 2 had no competition in raw 3D performance, but the competition was expanding rapidly. Amid growing rivalry from ATI and Nvidia, 3Dfx sought higher margins by marketing and selling its own boards, a task previously handled by a long list of board partners. To that end, 3Dfx acquired STB Systems on December 15 for $141 million in stock, but the move proved a huge misstep: STB's foundry in Juarez couldn't compete on quality or cost with TSMC, used by Nvidia, nor with ATI's Taiwanese contract manufacturer, UMC.

Many of 3Dfx’s former partners switched to Nvidia.

On March 23, Nvidia launched the Riva TNT, which stands for TwiN Texel, adding to the pressure on 3Dfx. A second parallel pixel pipeline was added to the Riva design, doubling the pixel fill rate and rendering speed, along with a staggering (for 1998) 16MB of SDR memory; the Voodoo 2's 8-16MB of RAM was slower EDO memory. Strong contender though it was, its performance was undercut by its own complexity: the 8-million-transistor chip on TSMC's 350nm process couldn't run at Nvidia's intended 125MHz core/memory clock due to heat, so it shipped at 90MHz. That 28% drop was enough to let the Voodoo 2 eke out a performance lead, thanks largely to Glide.

Even with the reduced specs, the TNT was an impressive graphics card. Its AGP 2X interface allowed gaming at 1600×1200 with 32-bit color rendering and a 24-bit Z-buffer (the image's depth representation), a huge improvement over the Voodoo 2's 16-bit color and 16-bit Z-buffer. The TNT offered a better feature set, better scaling with CPU clock speed, excellent AGP texturing, and better 2D performance than the Voodoo 2 and Banshee. It didn't ship until September, and then only in small quantities.
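The value of those extra Z-buffer bits can be seen with a toy quantization. Real Z-buffers map depth non-linearly, so this linear sketch only illustrates the precision argument:

```python
# With more depth bits, two surfaces that are very close together can still
# be told apart, avoiding the shimmering artifact known as "z-fighting".

def quantize_depth(z, bits):
    """Map a depth in [0, 1) onto an integer Z-buffer value with `bits` bits."""
    return int(z * (2 ** bits))

near, far = 0.500000, 0.500004  # two surfaces only 4e-6 apart in depth

# At 16 bits the two depths collapse into the same bucket (z-fighting)...
assert quantize_depth(near, 16) == quantize_depth(far, 16)
# ...while 24 bits still distinguishes them.
assert quantize_depth(near, 24) != quantize_depth(far, 24)
```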

Not everything went Nvidia’s way, at least not at first.

SGI filed a lawsuit against Nvidia on April 9, alleging infringement of texture-mapping patents. A settlement reached in July 1999 gave Nvidia access to SGI's professional graphics portfolio, while SGI dissolved its own graphics team and transferred its mid-level graphics engineers to Nvidia. This virtual gift of IP is often cited as one of the main reasons for SGI's alarmingly rapid decline.

June and July brought signs of a shakeout in the industry, as the market's major players had dominated media coverage through the first few months of the year.

On June 16, Number Nine released their Revolution IV card.

The card couldn't match the advances Nvidia and ATI had made in 3D performance, so the company tried to strengthen its position in the 2D productivity market.

Number Nine had favored 2D performance over allocating resources to 3D, only to find itself trailing gaming cards like Nvidia's TNT in both markets. The company decided to exploit a real weakness plaguing most dedicated gaming cards: high display resolutions at 32-bit color.

To that end, Number Nine added a 36-pin OpenLDI connector to the Revolution IV-FP, connecting to an SGI flat panel bundled with the card. The 17.3″ SGI 1600SW (1600×1024) with the Revolution IV-FP package initially retailed for $2,795.

Figure 8: SGI flat panel bundled with the Revolution IV-FP

This was the last in-house card from Number Nine, which returned to selling S3- and Nvidia-based products. The company's assets were acquired by S3 in December 1999 and later sold to engineers from Number Nine's original design team, who formed Silicon Spectrum in 2002.

S3 unveiled the Savage3D at E3 in 1998, and unlike the TNT and Voodoo Banshee, the card quickly hit the retail market. The rapid introduction carried penalties, however, chiefly in the form of immature drivers. OpenGL games were particularly affected, as S3 provided only a mini OpenGL driver aimed at the Quake games.

S3's original specification called for a 125MHz clock, but production issues and heat output meant final shipping parts ran at 90-110MHz, even as many review magazines and websites received pre-production samples at the higher 125MHz clock. A 120MHz supercharged Savage3D part followed, while Hercules and STB sold the Terminator BEAST and Nitro 3200 at 120 and 125MHz, respectively. Although OpenGL emulation and DirectX performance were hampered by drivers, the reference board's sub-$100 price and acceptable gaming and video playback performance earned it some sales.

From 1997 to 1998, the number of graphics vendors leaving the industry increased. They included Cirrus Logic, Macronix, Alliance Semiconductor, Dynamic Pictures (sold to 3DLabs), Tseng Labs and Diagnostics Research (both acquired by ATI), Rendition (sold to Micron), AccelGraphics (purchased by Evans & Sutherland), and Chips and Technologies (swallowed by Intel).

The gap between the haves and the have-nots became more pronounced in 1999.

January brought the SiS 300, a graphics card for budget business machines. The SiS 300 offered minimal 3D performance even by 1999 standards, its 2D couldn't match most of SiS's retail competitors, and its single pixel pipeline told the story. Fortunately for SiS, OEMs weren't concerned, because the product ticked enough feature boxes: a 128-bit memory bus (64-bit in the SiS 305 revision), 32-bit color support, DirectX 6.0 (DX7 for the 305), multitexturing, TV output, and hardware MPEG2 decoding.

The SiS 315, following in December 2000, added a 256-bit memory bus, DirectX 8 support, full-screen AA, a second pixel pipeline, a transform and lighting engine, DVD motion compensation, and DVI support. Performance was generally in GeForce 2 MX200 territory. In September 2001, the same 315 chip formed the basis of SiS's 650 chipset for Socket 478 (Pentium 4) boards and, in 2003, the SiS 552 system-on-chip.

Beyond SiS's products, there were still plenty of options for careful budgeters. These included the Trident Blade 3D (about $65), whose 3D performance, patchy driver support aside, was generally comparable to Intel's i740.

Encouraged by this, Trident followed with the Blade 3D Turbo, which raised the clock from 110MHz to 135MHz, helping it keep pace with Intel's revised i752. Trident's involvement in the integrated graphics business came to an abrupt end when VIA acquired S3 Graphics in April 2000.

Trident's core graphics business relied heavily on high-volume, low-priced chips, mainly in the mobile space. The Blade 3D Turbo evolved into the Blade T16, T64 (143MHz), and XP (166MHz). But Trident was developing 3D at a far slower pace than the market as a whole; even a long-delayed budget part like the SiS 315 easily outpaced its new products.

Trident's graphics division was sold to XGI, a subsidiary of SiS, in June 2003.

The S3 Savage4 improved on the SiS and Trident products in terms of performance. Announced in February, it went on sale in May for $100-$130, depending on whether the onboard memory was 16MB or 32MB. The Savage4 introduced S3's texture compression, ensuring that textures up to 2048×2048 could be handled even over its limited 64-bit memory bus.
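The arithmetic behind that claim can be sketched as follows, using the DXT1 variant of S3's block compression (4×4 texel blocks stored in 8 bytes); the helper names are illustrative:

```python
# Why block texture compression mattered on a narrow 64-bit memory bus:
# DXT1 stores each 4x4 texel block as 8 bytes (two 16-bit anchor colors plus
# a 2-bit index per texel) instead of 4 bytes per texel for 32-bit textures.

def uncompressed_bytes(w, h, bytes_per_texel=4):
    """Size of a plain 32-bit RGBA texture."""
    return w * h * bytes_per_texel

def dxt1_bytes(w, h):
    """Size of the same texture under DXT1 block compression."""
    blocks = (w // 4) * (h // 4)  # one block per 16 texels
    return blocks * 8             # 8 bytes per block

full = uncompressed_bytes(2048, 2048)  # 16 MiB for a 2048x2048 32-bit texture
packed = dxt1_bytes(2048, 2048)        # 2 MiB compressed, an 8:1 saving
```

An 8:1 reduction in texture traffic goes a long way toward compensating for having half the bus width of competing cards.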

Figure 9: Diamond Viper II Z200 (S3 Savage4)

The Savage4 became the first S3 graphics card to support multitexturing and the first to support the AGP 4X interface. But even improved drivers and a reasonable feature set couldn't hide the fact that it struggled to match the previous-generation 3Dfx, Nvidia, and ATI cards. The cycle repeated with the release of the Savage 2000 at the end of the year: the card bettered the TNT2 and Matrox G400 at 1024×768 and below, but the story was different at 1280×1024 and 1600×1200.

The first of 3Dfx's Voodoo3 series arrived in March, backed by an extensive television and print advertising campaign. But the long-awaited Rampage chipset still hadn't arrived, so despite some tweaks in the Avenger chipset, these boards carried essentially the same architecture. They still relied on 16-bit color rendering, were limited to 256×256 textures, and lacked hardware transform and lighting (T&L). Those factors were becoming critical for game developers, and 3Dfx continued to disappoint by not delivering on its architectural and feature-set promises.

In what was by now a long tradition, 3Dfx blamed market conditions for its fading fortunes, though those conditions didn't seem to hurt ATI or Nvidia much. In another sign of trouble, the company announced in December that its proprietary Glide graphics API would finally be made open-source, as DirectX and OpenGL continued to win over game developers.

In March, Nvidia announced the Riva TNT2, including the first Ultra-branded board with faster core and memory speeds, while Matrox introduced the G400 series.

The TNT2 used TSMC's 250nm process and finally fulfilled Nvidia's original ambitions for the TNT. It beat the Voodoo3 across the board, except in applications that paired AMD's 3DNow! CPU instruction extensions with OpenGL. The TNT2 also kept pace with 3Dfx and Matrox on features, including DVI output for flat-panel displays.

Meanwhile, the Matrox G400 largely outperformed the Voodoo3 and TNT2, although its OpenGL support still lagged significantly. At $199-$229, the card earned its price in performance, image quality, and feature set. The ability to drive two displays through dual display controllers (called DualHead by Matrox) began the company's push toward multi-display support, though the secondary monitor was limited to 1280×1024.

The G400 also introduced environment-mapped bump mapping (EMBM), which delivered more realistic textured surfaces. For those with deep pockets, the top G400 model at around $250 was the fastest consumer graphics card on the market; it wasn't until early 2000 that GeForce 256 DDR boards such as Creative Labs' 3D Blaster Annihilator Pro hit the shelves.
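In outline, EMBM perturbs environment-map lookups with offsets read from a bump map, faking surface relief per pixel. This toy Python sketch (with illustrative maps and names, not Matrox's implementation) shows the idea:

```python
# Environment-mapped bump mapping in miniature: each bump-map texel holds a
# (du, dv) offset that displaces the texture coordinates used to sample the
# environment map, so a flat surface picks up bumpy-looking reflections.

def embm_sample(env_map, bump_map, u, v, scale=1):
    """Sample env_map at (u, v) perturbed by the bump map's (du, dv)."""
    du, dv = bump_map[v][u]                  # perturbation for this pixel
    pu = (u + scale * du) % len(env_map[0])  # offset and wrap the lookup
    pv = (v + scale * dv) % len(env_map)
    return env_map[pv][pu]
```

Because the perturbation is a cheap per-pixel coordinate offset rather than extra geometry, it suited fixed-function hardware of the era well.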

Since then, Matrox has focused on the professional market, returning briefly to gaming in 2002 with the Parhelia. Its triple-monitor support wasn't enough to offset sub-par gaming performance against the incoming tide of DirectX 9.0-capable hardware.

As the smoke cleared from the 3Dfx, Nvidia, and Matrox releases, 3DLabs weighed in with the long-awaited Permedia 3 Create!. Arriving a few months after its rivals, the card was aimed at professional users with a side interest in gaming, so 3DLabs prioritized 2D functionality over 3D, drawing on the expertise it had acquired in July 1998 with Dynamic Pictures, designer of the top-flight Oxygen series of workstation cards.

Unfortunately for 3DLabs, workstation graphics prize complex polygon modeling, often at the expense of texture fill rate. That is the opposite of what gaming cards require, where texturing and effects matter more than complex wireframe models.

Against the TNT2 and Voodoo3, which were aggressively priced and better in games, the Permedia 3's workstation leanings weren't enough to differentiate it, and it became 3DLabs' final foray into the gaming graphics space. From then on, 3DLabs focused on the Oxygen cards based on the GLINT R3 and R4, ranging from $299 for the VX-1 to $1,499 for the GVX 420. The Wildcat family (e.g., the $2,499 Wildcat II-5110) was still based on Intense3D's ParaScale graphics processors, which came with the acquisition of Intense3D from Intergraph in July 2000. 3DLabs began integrating its own P9 and P10 processors into the Wildcat line in 2002, the year Creative Technology acquired the company.

After the division merged with Creative's SoC group (renamed ZiiLabs), the company left the desktop market in 2006 and shifted its focus to media-oriented graphics IP, which was sold to Intel in November 2012.

ATI made rapid progress after the Rage 128's debut. In late 1998, the company added AGP 4X support and a clock bump to the Rage 128 to create the Pro version, which also offered video capture and TV-out options. The Rage 128 Pro's gaming performance was roughly on par with Nvidia's TNT2 but well short of the TNT2 Ultra, a gap ATI intended to close with Project Aurora.

Figure 10: ATI’s Rage Fury MAXX combines two Rage 128 Pro chips on a single board

When it became clear that ATI wasn't going to win the performance race with a single chip, the project changed tack and became the Rage Fury MAXX, which put two Rage 128 Pros on the same PCB. The spec numbers were impressive: the two chips each rendered alternate frames, halving the workload between them. In practice, while the card was better than the previous generation, it was no match for the S3 Savage 2000 and would always trail the upcoming GeForce 256 DDR, which was only slightly more expensive at $279 versus $249 for the ATI.
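The alternate-frame split can be sketched as follows. The assignment of frames by parity is the essence of the scheme; `render_frame` is a hypothetical stand-in for the real rendering pipeline:

```python
# Toy sketch of alternate frame rendering (AFR) as used on the Rage Fury
# MAXX: two chips each render every other frame, roughly halving the
# per-chip workload while the output stream stays in order.

def render_frame(n):
    """Pretend renderer: returns a label for frame n."""
    return f"frame-{n}"

def afr(frame_numbers):
    chips = {0: [], 1: []}
    output = []
    for n in frame_numbers:
        chip = n % 2            # even frames -> chip 0, odd frames -> chip 1
        chips[chip].append(n)   # record which chip got the work
        output.append(render_frame(n))
    return output, chips
```

Unlike scan-line interleave, AFR needs no pixel-level merging, but it adds a frame of latency and depends on both chips keeping pace with each other.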

Less than two months after the Rage Fury MAXX was released, Nvidia launched the GeForce 256 SDR on October 1, followed by the DDR version in February 2000, the first graphics card to use that form of RAM. The integrated 23-million-transistor chip, built on TSMC's 220nm process, was the first to be marketed as a "GPU" (graphics processing unit), on the strength of its newly added transform and lighting engine (TnL, or T&L).

The engine let the graphics chip perform the floating-point-intensive work of transforming 3D objects and scenes, and their associated lighting, into the 2D representation of the rendered image. Previously, this computation fell to the CPU, which easily became a bottleneck and tended to limit the detail available.
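As a rough illustration of the per-vertex work a T&L engine offloads from the CPU, here is a toy sketch. The 3×3 matrix and single Lambertian term are simplifications (real hardware used 4×4 matrices, perspective division, and richer lighting models):

```python
# Per-vertex "transform and lighting" in miniature: a matrix transform
# moves a vertex from object space toward screen space, and a diffuse
# term computes how strongly a light strikes the surface.

def transform(vertex, matrix):
    """Multiply a 3-vector by a 3x3 matrix (row-major)."""
    return tuple(
        sum(matrix[r][c] * vertex[c] for c in range(3)) for r in range(3)
    )

def diffuse(normal, light_dir):
    """Lambertian term: clamp(N . L, 0, 1); vectors assumed unit length."""
    d = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, min(1.0, d))
```

Done in hardware for every vertex of every frame, this is exactly the floating-point grind that had previously saturated the CPU.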

Whether the GeForce 256 was truly the first product to integrate hardware T&L into a consumer graphics chip has long been debated, because many earlier designs never got past the prototype stage (Rendition's Verite V4400, the BitBoys Pyramid3D, 3Dfx's Rampage), hovered near irrelevance (3DLabs GLINT, the WARP engines of the Matrox G400), or relied on a separate onboard chip (the Hercules Thriller Conspiracy).

None of them, however, shipped as commercially functional products. In addition, Nvidia held an inherent performance advantage over the competition thanks to its pioneering four-pipeline architecture, which, combined with the T&L engine, let the company market the GeForce 256 toward professional workstation use as well.

A month after the desktop version, Nvidia released its first products for professional workstations: the Quadro series and the GeForce 256-based SGI VPro V3 and VR3. The cards utilized SGI graphics technology that Nvidia had acquired through the cross-license agreement signed in July 1999.

Nvidia made a profit of $41 million on revenue of $374.5 million, easily surpassing 1998's $4.1 million and $158.2 million, and a huge jump from 1997's $13.3 million in revenue. Microsoft's $200 million down payment for the NV2A, the graphics core of the Xbox, boosted Nvidia's coffers, as did a $400 million secondary bond and stock offering in April.

Those numbers paled beside ATI's revenue of $1.2 billion and profit of $160 million that year, on the back of a 32 percent share of the graphics market. But ATI was on the verge of losing much of its OEM business to Intel's 815 series of chipsets with integrated graphics.
