GeForce 4 Series

The GeForce4 (codenames below) refers to the fourth generation of GeForce-branded graphics processing units (GPUs) manufactured by Nvidia. There are two different GeForce4 families: the high-performance Ti family and the budget MX family. The MX family spawned a mostly identical GeForce4 Go (NV17M) family for the laptop market. All three families were announced in early 2002; members within each family were differentiated by core and memory clock speeds. In late 2002, there was an attempt to form a fourth family, also for the laptop market, its only member being the GeForce4 4200 Go (NV28M), which was derived from the Ti line.

Nvidia GeForce 4 Series
GeForce 4 logo
Codename(s): NV17, NV18, NV19, NV25, NV28
Release date: 2002
Entry-level GPU: MX
Mid-range GPU: Ti 4200, Ti 4400
High-end GPU: Ti 4600, Ti 4800
Direct3D and Shader version: D3D 7 (MX); D3D 8.1 with Pixel Shader 1.3 & Vertex Shader 1.1 (Ti)

Contents

  • 1 GeForce4 Ti
  • 2 Performance
  • 3 GeForce4 MX
  • 4 GeForce4 model information
  • 5 GeForce4 Go driver support
  • 6 Known problems
  • 7 See also
  • 8 Notes and references
  • 9 External links

    GeForce4 Ti

    Architecture

    The GeForce4 Ti (NV25) was launched in April 2002 and was a revision of the GeForce 3 (NV20). It was very similar to its predecessor; the main differences were higher core and memory clock rates, a revised memory controller (known as Lightspeed Memory Architecture II), an additional vertex shader (the vertex and pixel shaders were now known as nFinite FX Engine II), hardware anti-aliasing (Accuview AA), and DVD playback.[1] Proper dual-monitor support was also brought over from the GeForce 2 MX.[2] The GeForce 4 Ti was superior to the GeForce 4 MX in virtually every aspect save for production cost, although the MX had the Nvidia VPE (video processing engine) which the Ti lacked.

    Lineup

    The initial two models were the Ti4400 and the top-of-the-range Ti4600. At the time of their introduction, Nvidia's main products were the entry-level GeForce 2 MX, the midrange GeForce4 MX models (released at the same time as the Ti4400 and Ti4600), and the older but still high-performance GeForce 3 (demoted to the upper mid-range or performance niche).[1] However, ATI's Radeon 8500LE was somewhat cheaper than the Ti4400 and outperformed its price competitors, the GeForce 3 Ti200 and GeForce4 MX 460. The GeForce 3 Ti500 filled the performance gap between the Ti200 and the Ti4400, but it could not be produced cheaply enough to compete with the Radeon 8500.

    In consequence, Nvidia rolled out a slightly cheaper model: the Ti4200. Although the 4200 was initially supposed to be part of the launch of the GeForce4 line, Nvidia had delayed its release to sell off the soon-to-be-discontinued GeForce 3 chips. In an attempt to prevent the Ti4200 from damaging the Ti4400's sales, Nvidia set the Ti4200's memory speed at 222 MHz on the models with a 128 MiB frame buffer—a full 53 MHz slower than the Ti4400 (all of which had 128 MiB frame buffers). Models with a 64 MiB frame buffer were set to a 250 MHz memory speed. This tactic did not work, however, for two reasons. First, the Ti4400 was perceived as good enough neither by those who wanted top performance (who preferred the Ti4600) nor by those who wanted good value for money (who typically chose the Ti4200), causing the Ti4400 to fade into obscurity. Second, some graphics card makers simply ignored Nvidia's guidelines for the Ti4200 and set the memory speed at 250 MHz on the 128 MiB models anyway.

    Then in late 2002, the NV25 core was replaced by the NV28 core, which differed only by the addition of AGP-8X support. The Ti4200 with AGP-8X support was based on this chip and sold as the Ti4200-8X. When the AGP-8X NV28 core was introduced to the remaining models, a Ti4800SE replaced the Ti4400 and a Ti4800 replaced the Ti4600.[3][4] Had the naming convention applied to the AGP-8X-capable Ti4200-8X been used consistently, these two cards would have been named Ti4400-8X and Ti4600-8X.

    The only mobile derivative of the Ti series was the GeForce4 4200 Go (NV28M), launched in late 2002.[5] It offered the same feature set as, and similar performance to, the NV28-based Ti4200, although the mobile variant was clocked lower. It outperformed the Mobility Radeon 9000 by a large margin and was Nvidia's first DirectX 8 laptop graphics solution. However, because the GPU was not designed for the mobile space, it had a thermal output similar to that of the desktop part. The 4200 Go also lacked the power-saving circuitry of the MX-based GeForce4 4x0 Go series and the Mobility Radeon 9000. This caused problems for notebook manufacturers, especially with regard to battery life.[6]

    Performance

    The GeForce4 Ti outperformed the older GeForce 3 by a significant margin.[1] The competing ATI Radeon 8500 was generally faster than the GeForce 3 line, but was overshadowed by the GeForce 4 Ti in every area other than price and more advanced pixel shader (1.4) support.[1] Nvidia, however, missed a chance to dominate the upper-range/performance segment by delaying the release of the Ti4200 and by not rolling out 128 MiB models quickly enough; the Ti4200 was, after all, cheaper and faster than the previous top-line GeForce 3 and Radeon 8500. Besides the late introduction of the Ti4200, the limited-release 128 MiB models of the GeForce 3 Ti200 proved unimpressive, letting the Radeon 8500LE and even the full 8500 dominate the upper-range performance segment for a while.[7] The Matrox Parhelia, despite having several DirectX 9.0 capabilities and other innovative features, was at most competitive with the GeForce 3 and GeForce 4 Ti 4200, yet it was priced the same as the Ti 4600 at US$399. ATI had planned to develop a refresh of the 8500 to rival the GeForce 4 Ti, the 8500XT (R250), but ended up abandoning it to concentrate on the DirectX 9.0 compliant Radeon 9700.

    ATI's resulting Radeon 9700 Pro defeated the Ti 4600 by 15–20% in normal conditions. However, when anti-aliasing (AA) and/or anisotropic filtering (AF) were enabled, the 9700 would beat the Ti 4600 by anywhere from 40% to 100%. Besides outclassing the Ti 4600 in performance, the 9700 also had a notably superior feature set with full DirectX 9.0 support.[8] The next-generation GeForce FX 5800, despite being plagued by a loud fan, taking up two slots, and performing unimpressively against the Radeon 9700, was nonetheless generally superior to the 4600/4800 and supplanted them as Nvidia's flagship product. The Ti 4600 did hold a performance lead over the slower variants in the GeForce FX and Radeon R300 series, beating the FX 5600 and Radeon 9500. However, the continued proliferation of succeeding midrange DirectX 9.0 compliant cards, with the FX 5700 able to match or exceed it, meant the obsolescence and discontinuation of the Ti 4600 by mid-2003.

    The GeForce 4 Ti4200 enjoyed considerable longevity compared to its higher-clocked peers. At half the cost of the 4600, the 4200 remained the best balance between price and performance until the launch of the ATI Radeon 9500 Pro at the end of 2002, though the 9500 was only intended as a stopgap product.[9] The Ti4200 still managed to hold its own against several next-generation DirectX 9 chips released in late 2003, beating out the lackluster GeForce FX 5200 and the midrange FX 5600 and performing at parity with the midrange Radeon 9600, which was finally ATI's cost-effective permanent answer to the Ti4200.[10][11]

    GeForce4 MX

    Architecture

    If the capabilities of the GeForce4 generation are defined by the GeForce4 Ti, then the GeForce4 MX (NV17) is a GeForce4 in name only. Many criticized the GeForce4 MX name as a misleading marketing ploy, since the chip was less advanced than the preceding GeForce 3. The feature-comparison chart between the Ti and MX lines showed that the only "feature" missing from the MX was the nfiniteFX II engine—the DirectX 8 programmable vertex and pixel shaders.[12] In reality, however, the GeForce4 MX was not a GeForce4 Ti with the shader hardware removed, as the MX's performance in games that did not use shaders was considerably behind the GeForce 4 Ti and GeForce 3. Disappointed enthusiasts described the GeForce4 MX as a "GeForce 2 on steroids".

    Though its lineage traced back to the previous-generation GeForce 2, the GeForce4 MX did incorporate bandwidth- and fill-rate-saving techniques, dual-monitor support, and a multi-sampling anti-aliasing unit from the Ti series; the improved 128-bit DDR memory controller was crucial to solving the bandwidth limitations that plagued the GeForce 256 and GeForce 2 lines. It also owed some of its design heritage to Nvidia's high-end CAD products, and in performance-critical non-game applications it was remarkably effective. The most notable example is AutoCAD, in which the GeForce4 MX returned results within a single-digit percentage of GeForce4 Ti cards costing several times as much.
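
    To put the memory-controller point in rough numbers, peak theoretical bandwidth scales with bus width, memory clock, and transfers per clock (two for DDR). The Python sketch below is illustrative only; the clock figures are approximate examples of the relevant memory types, not exact board specifications.

        # Illustrative peak-bandwidth arithmetic (approximate clocks, not exact board specs).
        def peak_bandwidth_gb_s(bus_width_bits, mem_clock_mhz, transfers_per_clock):
            """Theoretical peak memory bandwidth in GB/s."""
            return (bus_width_bits / 8) * (mem_clock_mhz * 1e6) * transfers_per_clock / 1e9

        # 128-bit SDR at ~166 MHz (roughly GeForce 256-class memory): about 2.7 GB/s
        print(peak_bandwidth_gb_s(128, 166, 1))
        # 128-bit DDR at ~200 MHz, 400 MHz effective (roughly GeForce4 MX440-class memory): about 6.4 GB/s
        print(peak_bandwidth_gb_s(128, 200, 2))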

    As the MX line was launched along with the rest of the GeForce4 in early 2002, id Software technical director John Carmack worried about the GeForce4 MX's potential success. Fearing that widespread adoption of the MX would set back the development of advanced games that used programmable DirectX 8 vertex and pixel shaders, he warned gamers not to buy the chip. Nevertheless, when id Software's Doom 3 was released in mid-2004 it supported the GeForce4 MX; notably, the MX is the only chip on the supported list without DirectX 8 vertex and pixel shaders. Doom 3 is not supported on slower GPUs with an otherwise similar feature set to the MX, such as the GeForce 2 and Radeon 7xxx series.

    Despite harsh criticism by gaming enthusiasts, the GeForce4 MX was a market success. Priced about 30% above the GeForce 2 MX, it provided better performance, the ability to play a number of popular games that the GeForce 2 could not run well and, above all else, a name that to the average non-specialist sounded like a "real" GeForce4, i.e., a GeForce4 Ti. Although it was frequently out-performed by the older and more expensive GeForce 3, many buyers were unaware of this, particularly as Nvidia did not let the GeForce 3 remain on the market for long. The GeForce 4 MX was particularly successful in the PC OEM market, and rapidly replaced the GeForce 2 MX as the best-selling GPU.

    In motion-video applications, the GeForce4 MX offered new functionality. It (and not the GeForce4 Ti) was the first GeForce member to feature the Nvidia VPE (video processing engine). It was also the first GeForce to offer hardware iDCT and VLC (variable-length code) decoding, making VPE a major upgrade from Nvidia's previous HDVP. In MPEG-2 playback, VPE could finally compete head-to-head with ATI's outstanding video engine.
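
    For context, the inverse DCT is the transform an MPEG-2 decoder must apply to every 8x8 block of dequantized coefficients in every frame, which is the kind of work a hardware iDCT unit moves off the CPU. The sketch below uses SciPy purely as a convenient software reference for the operation; real decoders use heavily optimized fixed-point kernels, and nothing here reflects how VPE itself is implemented.

        # A minimal software sketch of the 8x8 inverse DCT step in MPEG-2 decoding,
        # illustrating the workload that hardware iDCT offloads. SciPy is used only
        # as a reference implementation here.
        import numpy as np
        from scipy.fft import idct

        def idct_8x8(coeffs):
            """2-D inverse DCT ('ortho' norm) of one 8x8 block of dequantized coefficients."""
            return idct(idct(coeffs, axis=0, norm="ortho"), axis=1, norm="ortho")

        block = np.zeros((8, 8))
        block[0, 0] = 64.0          # a DC-only block decodes to a flat 8x8 patch
        print(idct_8x8(block))      # every sample comes out as 8.0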

    Lineup

    GeForce4 MX440-SE GPU

    There were three initial models: the MX420, the MX440 and the MX460. The MX420 was designed for very low-end PCs and replaced the GeForce 2 MX100 and MX200. The GeForce 4 MX440 was a mass-market OEM solution, replacing the GeForce 2 MX/MX400 and GeForce 2 Ti. The GeForce 4 MX460 was a midrange solution without a clear competitor. While the MX460 was not slow by any means, it was not priced far below the GeForce4 Ti4200, the GeForce 3 Ti200 and the Radeon 8500LE/9100 (even the full 8500 in some cases), each of which easily outperformed it and was fully DirectX 8 compliant. The end result was that the MX460 never found a foothold in the market and flopped.

    In terms of 3D performance, the MX420 performed only slightly better than the GeForce 2 MX400 and below the GeForce 2 GTS, but this was never really much of a problem, considering its target audience. The nearest thing to a direct competitor the MX420 had was ATI's Radeon 7000. In practice however, its main competitors were actually chipset-integrated graphics solutions, such as Intel's 845G and Nvidia's own nForce 2.

    Another version of MX440-SE GPU

    The MX440 performed reasonably well for its intended audience, outperforming its closest competitor, the ATI Radeon 7500, as well as the discontinued GeForce 2 Ti and Ultra. When ATI launched its Radeon 9000 Pro in September 2002, it performed about the same as the MX440, but had crucial advantages with better single-texturing performance and proper support of DirectX 8 shaders. However, the 9000 was unable to break the MX440's entrenched hold on the OEM market. The MX440 also had a derivative called the MX440-SE. This was simply an MX 420 with increased memory bandwidth. Nvidia's answer to the Radeon 9000 was the GeForce FX 5200, but despite the 5200's DirectX 9 features it did not have the performance to match the MX440 even in DirectX 7.0 games. This kept the MX440 in production while the 5200 was discontinued, which could be considered ironic because the MX440 was supposed to be replaced by the 5200.

    The GeForce4 Go was derived from the MX line and was announced along with the rest of the GeForce4 lineup in early 2002. There were the 420 Go, 440 Go, and 460 Go. However, ATI had beaten them to market with the similarly performing Mobility Radeon 7500, and later the DirectX 8.0 compliant Mobility Radeon 9000. (Despite its name, the short-lived 4200 Go is not part of this lineup; it was instead derived from the Ti line.)

    Like the Ti series, the MX was also updated in late 2002 to support AGP-8X with the NV18 core. The two new models were the MX440-8X, which was clocked slightly faster than the original MX440, and the MX440SE, which had a narrower memory bus, and was intended as a replacement of sorts for the MX420. The MX460 was never updated; in fact, it had been discontinued several months previously. Another variant followed in late 2003—the MX 4000, which was a GeForce4 MX440SE with a slightly higher memory clock.

    Surprisingly, the GeForce4 MX line received a third update in 2004, with the PCX 4300—an MX 4000 with support for PCI Express, and a wider memory bus. In spite of its new codename (NV19), the PCX 4300 is in fact simply an NV18 core with a chip bridging the NV18's native AGP interface with the PCI-Express bus.

    GeForce4 Go driver support

    This family is a derivative of the GeForce4 MX family, produced for the laptop market. Performance-wise, the GeForce4 Go family can be considered comparable to the MX line. In terms of support, however, some users have become rather irritated at an uncharacteristic lack of driver support from Nvidia: instead of supporting this family of chips, Nvidia redirects users to the laptop manufacturer's webpage.

    One possible solution to the lack of driver support for the Go family is the third-party Omega Drivers. However, installing these drivers is not recommended unless one is willing to accept the risks. Using third-party drivers can, among other things, invalidate warranties. The Omega Drivers are supported neither by laptop manufacturers and ODMs nor by Nvidia; Nvidia has also attempted legal action against a version of the Omega Drivers that included the Nvidia logo.[13] The Omega Drivers are essentially stock drivers modified to deliver performance increases of up to 30–40% without overclocking.[citation needed] The invalidation of warranties is usually seen by expert users as a corporate safety net rather than an actual warning against devices failing.

    Nvidia's own suggested solution to the problem is to try drivers from laptopvideo2go.com. This website hosts desktop display drivers that have been modified to install on a notebook. The drivers found on this website do not contain any laptop-specific modifications and thus may or may not work better than the drivers provided by the laptop's manufacturer.
