NVIDIA
NVIDIA Corporation (NASDAQ: NVDA) is a major supplier of graphics processors (graphics processing units, or GPUs), graphics cards, and media and communications devices for PCs and game consoles such as the original Xbox and the upcoming PlayStation 3. Its headquarters are in Santa Clara, California. In 2001 it had revenue of $1.37 billion USD and net income of $177.1 million; by 2004, revenue had grown to $2.01 billion USD.
On December 14, 2005, NVIDIA acquired ULI Electronics for $52 million USD. ULI is notable for supplying third-party southbridge parts for ATI chipsets.
History
Jen-Hsun Huang, Chris Malachowsky, and Curtis Priem founded the company in January 1993 and incorporated it in California in April 1993 (later re-incorporating it in Delaware). The company remained relatively low-key until 1997–1998, when it launched its RIVA line of PC graphics processors. It went public on Nasdaq in January 1999; in May of that year it shipped its 10 millionth graphics processor. In 2000 it acquired the intellectual assets of one-time rival 3dfx, one of the biggest graphics companies of the mid-to-late 1990s. NVIDIA established close ties with many OEMs as well as with organizations such as SGI. By February 2002, NVIDIA had shipped 100 million processors.
Today, NVIDIA and ATI Technologies supply the majority of "discrete" graphics chips found in modern mainstream PCs. NVIDIA's GeForce line of graphics processors, first launched in 1999, is the company's flagship product.
As a fabless high-tech company, NVIDIA conducts chip research and development in-house but subcontracts the actual silicon manufacturing to third parties. In the past, NVIDIA has sourced silicon production capacity from STMicroelectronics, TSMC, and IBM. The production chain of a chip involves multiple third parties: the foundry makes processor wafers, the test house tests the dies for defects and sorts them by performance characterization, and the packager seals individual dies in a hardened case. In terms of inventory management, NVIDIA must place foundry orders months in advance of planned sales, then hold the produced chips in a warehouse until final delivery. This leads to occasional supply/demand imbalances.
Products
NVIDIA's product portfolio includes graphics processors, wireless communications processors, PC platform (motherboard core-logic) chipsets, and digital media player software. Within the Mac/PC user community, NVIDIA is best known for its GeForce product line, which is not only a complete line of "discrete" graphics chips found in AIB (add-in-board) video cards but also a core technology in both the Microsoft Xbox game console and nForce motherboards.
In many respects NVIDIA is similar to its arch-rival ATI: both companies began with a focus on the PC market and later expanded into chips for non-PC applications. NVIDIA does not sell graphics boards into the retail market, instead focusing on the development and manufacture of GPU chips. As part of their operations, both ATI and NVIDIA create "reference designs" (board schematics) and provide manufacturing samples to board partners such as Asus.
In December 2004, it was announced that NVIDIA would assist Sony with the design of the graphics processor (RSX) for the upcoming PlayStation 3 game console. As of March 2006, NVIDIA will deliver the RSX to Sony as an IP core, and Sony alone will be responsible for manufacturing it. Under the agreement, NVIDIA will provide ongoing support to port the RSX to Sony's fabs of choice (Sony and Toshiba), as well as die-shrinks to 65 nm. This is a departure from NVIDIA's business arrangement with Microsoft, under which NVIDIA managed production and delivery of the Xbox GPU through its usual third-party foundry contracts. (Meanwhile, Microsoft has chosen ATI to provide the IP design for the Xbox 360's graphics hardware, as has Nintendo for the console that will supersede the ATI-based GameCube.)
- "Discrete" refers to the graphic chip's boundary/proximity to other PC hardware. A discrete piece of hardware can be physically plugged/unplugged from the motherboard, the opposite term being "integrated graphics" where the piece of hardware is inseparable from the motherboard. In the PC graphics architecture, "discrete" means graphics-hardware is encapsulated in a dedicated (separate) chip. The chip's physical location, whether soldered on the motherboard PCB (as in most laptops) or mounted on an aftermarket add-in-board, has no bearing on this designation.
Graphics chipsets
- NV1 - NVIDIA's first product, based upon quadratic surfaces
- RIVA 128 and RIVA 128ZX - DirectX 5 and OpenGL 1 support; NVIDIA's first DirectX-compliant hardware
- RIVA TNT, RIVA TNT2 - DirectX 6 and OpenGL 1 support; the series that made NVIDIA a market leader
- GeForce
  - GeForce 256 - DirectX 7 and OpenGL 1 support; hardware transform and lighting; introduced DDR memory support
  - GeForce 2 - DirectX 7 and OpenGL 1 support
  - GeForce 3 - DirectX 8.0 shaders and OpenGL 1.5 support; features a memory-bandwidth-saving architecture
  - GeForce 4 - DirectX 8.1 parts, OpenGL 1.5 support, and a new budget core
  - GeForce FX series - DirectX 9 and OpenGL 1.5 support; claimed to offer 'cinematic effects'
  - GeForce 6 series - DirectX 9.0c and OpenGL 2.0 support; features improved shaders, reduced power consumption, and Scalable Link Interface (SLI) operation
  - GeForce 7 series - DirectX 9.0c and OpenGL 2.0 support; improved shading performance, Transparency Supersampling (TSAA) and Transparency Multisampling (TMAA) anti-aliasing, and SLI
- Quadro (GeForce-based professional chipsets)
- GoForce - for PDAs and smartphones
Personal computer platforms / chipsets
- nForce
  - nForce IGP (AMD Athlon/Duron K7 line)
  - nForce 2 (AMD Athlon/Duron K7 line; SPP (System Platform Processor) or IGP (Integrated Graphics Processor), plus MCP (Media and Communications Processor); also features SoundStorm)
  - nForce 3 (AMD Athlon 64/Athlon 64 FX/Opteron; MCP only)
  - nForce 4 (4X, Base, Ultra, and SLI) (PCI Express support for AMD Athlon 64 processors; SLI technology on the SLI edition)
- Xbox GeForce3-class GPU (on an Intel Pentium III/Celeron platform)
- PlayStation 3 (RSX "Reality Synthesizer")
Market History
Pre-DirectX
NVIDIA's first graphics card, the NV1, was released in 1995. It was based upon quadratic surfaces and included an integrated playback-only sound card and ports for Sega Saturn gamepads. Because the Sega Saturn was also based upon forward-rendered quads, several Saturn games were ported to the NV1 on the PC, such as Panzer Dragoon and Virtua Fighter Remix. However, the NV1 struggled in a marketplace full of competing proprietary standards.
Market interest in the product ended when Microsoft announced the DirectX specifications, based upon polygons. NV1 development subsequently continued internally as the NV2 project, funded by several million dollars of investment from Sega, which hoped that an integrated sound and graphics chip would cut the manufacturing cost of its next console. However, even Sega eventually realised that quadratic surfaces were a flawed approach, and there is no evidence the chip was ever properly debugged. The NV2 episode remains something of a dark corporate secret for NVIDIA.
Turning Over a New Leaf
NVIDIA's CEO, Jen-Hsun Huang, realised at this point that, after two failed products, something had to change if the company was to survive. He hired David Kirk, Ph.D., as Chief Scientist from software developer Crystal Dynamics, a company renowned for the visual quality of its titles. Kirk turned NVIDIA around by combining the company's 3D hardware experience with an intimate understanding of practical rendering implementations.
As part of the corporate transformation, NVIDIA abandoned proprietary interfaces, sought to fully support DirectX, and dropped multimedia functionality in order to reduce manufacturing costs. It also adopted an internal six-month product-cycle goal: the failure of any one product would no longer threaten the company's survival, since a next-generation replacement part would always be available.
However, since the Sega NV2 contract was secret and employees had been laid off, many industry observers at the time saw the company as dead in the water. So when the RIVA 128 was first announced in 1997, its specifications were hard to believe: performance superior to the market-leading 3dfx Voodoo Graphics, and a full hardware triangle setup engine. The RIVA 128 shipped in volume, and the combination of low cost and high-performance 2D/3D acceleration made it a popular choice for OEMs.
Climbing to the Top with TNT
Having finally developed and shipped a market-leading integrated graphics chipset in volume, NVIDIA set the internal goal of doubling the number of pixel pipelines in its next chip in order to realize a substantial performance gain. The TwiN Texel (RIVA TNT) engine it subsequently developed allowed either two textures to be applied to a single pixel or two pixels to be processed per clock cycle; the former improved visual quality, while the latter doubled the maximum fill rate.
New features included a 24-bit Z-buffer with 8-bit stencil support, anisotropic filtering, and per-pixel MIP mapping. In certain respects, such as transistor count, the TNT had begun to rival Intel's Pentium processors in complexity. However, while the TNT offered an astonishing range of well-integrated features, it failed to displace the market leader, the Voodoo 2, because its actual clock speed ended up at only 90 MHz, about 35% less than expected.
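Using only the figures above (two pixel pipelines, the shipping 90 MHz clock), the tradeoff between the TNT's two modes reduces to a simple fill-rate calculation:

$$\text{fill rate} = \text{pixels per clock} \times f_{\text{core}}$$

In two-pixel mode the TNT delivered $2 \times 90\,\text{MHz} = 180$ Mpixels/s with one texture per pixel; in dual-texture mode it delivered $1 \times 90\,\text{MHz} = 90$ Mpixels/s, while still applying $180$ Mtexels/s.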
This proved only a temporary respite for Voodoo, however, as NVIDIA's refresh part, the RIVA TNT2, was a die shrink of the TNT architecture from 350 nm to 250 nm. Stock TNT2s ran at 125 MHz, Ultras at 150 MHz, while the Voodoo 3 was barely any faster and lacked features such as 32-bit color. The TNT2 marked a major turning point for NVIDIA: the company had finally delivered a product competitive with the fastest on the market, with a superior feature set and strong 2D functionality, all integrated onto a single die with strong yields that ramped to impressive clock speeds.
The GeForce Era
Not content to sit back, NVIDIA released the GeForce 256 (NV10) in the fall of 1999, most notably bringing on-board transformation and lighting (T&L). The GeForce 256 ran at 120 MHz, had four pixel pipelines, and implemented advanced video acceleration, motion compensation, and hardware sub-picture alpha blending. Combined with DDR memory support, NVIDIA's technology was the hands-down performance leader.
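To make concrete what hardware T&L took over from the CPU, the following is a minimal illustrative C sketch, not NVIDIA's implementation (the types and names here are invented for this example), of the per-vertex work involved: a 4x4 matrix transform followed by a single diffuse light.

```c
#include <stdio.h>

/* Hypothetical 4-component vector type for this sketch. */
typedef struct { float x, y, z, w; } Vec4;

/* Transform a vertex by a row-major 4x4 matrix. */
static Vec4 transform(const float m[16], Vec4 v) {
    Vec4 r;
    r.x = m[0]*v.x  + m[1]*v.y  + m[2]*v.z  + m[3]*v.w;
    r.y = m[4]*v.x  + m[5]*v.y  + m[6]*v.z  + m[7]*v.w;
    r.z = m[8]*v.x  + m[9]*v.y  + m[10]*v.z + m[11]*v.w;
    r.w = m[12]*v.x + m[13]*v.y + m[14]*v.z + m[15]*v.w;
    return r;
}

/* Diffuse (Lambert) lighting term: dot(N, L) clamped to [0, 1]. */
static float lambert(Vec4 n, Vec4 l) {
    float d = n.x*l.x + n.y*l.y + n.z*l.z;
    return d < 0.0f ? 0.0f : (d > 1.0f ? 1.0f : d);
}

int main(void) {
    /* Identity transform and a light pointing along +Z, for illustration. */
    const float model_view[16] = {1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1};
    Vec4 vertex = {1.0f, 2.0f, 3.0f, 1.0f};
    Vec4 normal = {0.0f, 0.0f, 1.0f, 0.0f};
    Vec4 light  = {0.0f, 0.0f, 1.0f, 0.0f};

    Vec4 p = transform(model_view, vertex);
    printf("position (%g, %g, %g, %g), diffuse %g\n",
           p.x, p.y, p.z, p.w, lambert(normal, light));
    return 0;
}
```

Before the GeForce 256, this arithmetic ran on the host CPU for every vertex in a scene; moving it onto the GPU freed the CPU for game logic and physics.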
Basking in the success of its products, NVIDIA won the contract to develop the graphics hardware for Microsoft's Xbox, which brought a huge $200 million advance. However, the project drew away the time of many of NVIDIA's best engineers. In the short term this was of little consequence, and the GeForce 2 GTS shipped in the fall of 2000.
The GTS benefited from the fact that NVIDIA had by this time acquired extensive manufacturing experience with its highly integrated cores and was consequently able to optimise the core for clock speed. The volume of chips NVIDIA was producing also enabled it to bin-split parts, picking out the highest-quality cores for its premium range. As a result, the GTS shipped at 200 MHz. Its pixel fill rate was nearly double that of the GF256, and its texel fill rate nearly quadrupled, because multi-texturing was added to each pixel pipeline. New features included S3TC compression, FSAA, and improved MPEG-2 motion compensation.
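These fill-rate gains follow from the clocks and pipeline counts given above, assuming four pixel pipelines in both chips, with one texture unit per pipeline on the GF256 and two on the GTS:

$$\text{GF256: } 4 \times 120\,\text{MHz} = 480\ \text{Mpixels/s} = 480\ \text{Mtexels/s}$$
$$\text{GTS: } 4 \times 200\,\text{MHz} = 800\ \text{Mpixels/s}, \qquad 4 \times 2 \times 200\,\text{MHz} = 1600\ \text{Mtexels/s}$$

That is, roughly 1.7 times the pixel fill rate and 3.3 times the texel fill rate of the GF256.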
More significantly, shortly afterwards NVIDIA launched the GeForce 2 MX, intended for the budget and OEM markets. It had two fewer pixel pipelines and ran at 175 MHz (later 200 MHz). Offering strong performance at a bargain-basement price, the GeForce 2 MX is probably the most successful graphics chipset of all time. A mobile version, the GeForce 2 Go, also shipped at the end of 2000.
All of this finally proved too much for 3dfx, whose Voodoo 5 had been delayed, and its board of directors began the process of dissolving the company in what became one of the most spectacular and public bankruptcies in the history of personal computing. NVIDIA purchased 3dfx primarily for its intellectual property, which was in dispute at the time, but also acquired anti-aliasing expertise and about 100 engineers, who subsequently worked extensively on the ill-fated FX series, to some minds proving that 3dfx lacked the expertise to develop high-performance integrated cores.
A Shaky Lead
At this point NVIDIA's market position looked unassailable, and industry observers began to refer to NVIDIA as the Intel of the graphics industry. However, while the next-generation FX chips were being developed, many of NVIDIA's best engineers were working on the Xbox contract, developing the SoundStorm audio chip and a motherboard solution.
It is also worth noting that Microsoft paid NVIDIA for the chips themselves, and the contract did not allow for falling manufacturing costs as process technology improved. Microsoft eventually realised its mistake, but NVIDIA refused to renegotiate the terms of the contract. As a result, NVIDIA and Microsoft, who had previously worked very closely, fell out, and NVIDIA was not consulted when the DirectX 9 specification was drawn up. Apparently as a result, ATI designed the Radeon 9700 to fit the DirectX 9 specification: rendering color support was limited to 24 bits, and shader performance had been emphasized throughout development, since shaders were to be the main focus of DirectX 9.
In contrast, NVIDIA's cards offered 16- and 32-bit modes, forcing a choice between poor visual quality and slow performance. The 32-bit support also made them much more expensive to manufacture, requiring a higher transistor count, and shader performance was often half or less the speed of equivalent ATI parts. Having made its reputation by providing easy-to-manufacture, DirectX-compatible parts, NVIDIA had misjudged Microsoft's next standard and paid a heavy price for the error. As more and more games started to rely on DirectX 9 features, the poor shader performance of the GeForce FX series became ever more obvious. With the exception of the FX 5700 Ultra, a late revision, the FX series lacked performance compared with equivalent ATI parts.
NVIDIA became increasingly desperate to hide the shortcomings of the GeForce FX range. A notable 'FX-only' demo called Dawn was released, but its wrapper was hacked to run on a Radeon 9700, where it ran faster despite the translation overhead. NVIDIA also began to include 'optimizations' in its drivers to increase performance. While some of these were valid, hardware review sites started to run articles showing how NVIDIA's drivers auto-detected benchmarks and produced artificially inflated scores that did not reflect real-world gaming performance. Often it was tips from ATI's driver development team that lay behind these articles. As NVIDIA's drivers filled with hacks and 'optimizations', their legendary stability and compatibility began to suffer.
Furthermore, the GeForce FX series ran hot, drawing as much as twice the power of equivalent ATI parts. The NV30 became notorious for its fan noise and acquired the nickname 'Dustbuster'; although it was withdrawn and replaced with quieter parts, NVIDIA was still forced to ship large, expensive coolers on its FX parts.
As a result of the FX series' weaknesses, NVIDIA quite unexpectedly lost its market leadership position to ATI.
NVIDIA Steps Back Up
NVIDIA's fightback began with the GeForce 6 series, which addressed the two key issues that had plagued the FX series: shader performance and power consumption. By working closely with developers, especially through its 'The Way It's Meant to Be Played' program, NVIDIA renewed its determination to produce integrated, easy-to-manufacture hardware in line with industry requirements and expectations.
The results of this improved corporate focus came with the release of the GeForce 7 series. With 24 pixel pipelines, it gave NVIDIA the undisputed performance lead for the first time since the release of the ATI Radeon 9700. More importantly, the parts shipped in volume on the day the product was formally released. Pricing and availability have continued to be excellent, while ATI's comparable next-generation parts have suffered repeated delays.
Open Source development
NVIDIA provides binary GeForce graphics drivers for X11, together with a thin open-source library that interfaces between the Linux or FreeBSD kernel and the proprietary graphics software. NVIDIA's Linux support has promoted mutual adoption in the entertainment, defense, and simulation/training industries, which had traditionally been dominated by SGI, Evans & Sutherland, and other relatively costly vendors.
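As a concrete example, on Linux systems of this era the proprietary driver was typically selected in the X11 server configuration. A minimal illustrative fragment (the exact file and surrounding sections vary by distribution):

```
# XF86Config/xorg.conf fragment: select NVIDIA's proprietary "nvidia"
# driver in place of the open-source 2D-only "nv" driver.
Section "Device"
    Identifier "NVIDIA Graphics"
    Driver     "nvidia"
EndSection
```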
Because of their proprietary nature, NVIDIA's drivers are at the center of an ongoing controversy within the Linux and FreeBSD communities. Many Linux and FreeBSD users insist on using only open-source drivers and regard a binary-only driver as wholly inadequate, yet many others are pleased to have regularly updated, high-performance, officially supported drivers.
Original Equipment Manufacturers
- eVGA
- Gainward
- Inno3D
- Leadtek
- Micro-Star International (MSI)
- Palit
- XFX
- BFG
- Asus
- Albatron
- Biostar
- Gigabyte
- GRANDMARS
- PNY
- Chaintech
External links
- NVIDIA.com - Corporate Homepage
- SLIzone.com - SLI Technology Website
- nZone.com - PC and Gaming Enthusiast Website
- NVIDIA - Windows Vista Graphics
- NVIDIA - List of Windows Vista GPUs
- Tweakguides.com "nVidia Forceware Tweak Guide"
- Firing Squad: History of NVIDIA
- Omega drivers - alternative drivers (development currently on hold, not stopped)
- NVIDIA's graphics-developer website