Introduction
With much of NVIDIA’s competition still scrambling to its feet after the GeForce2 GTS release, it seems that another knockout blow is about to land. NVIDIA’s flurry of product releases has kept most of the competing graphics companies rolling with the punches while it brewed up yet another surgical strike: the GeForce2 MX. This product may seem harmless at first glance, as it’s only a low-cost consumer solution, but don’t let that fool you. NVIDIA isn’t just using this GeForce2 derivative as an inexpensive stand-in for the soon-to-be-extinct TNT2 family; it wants this piece of hardware to take over that segment of the market by setting new performance standards for low-cost solutions and, on top of all this, to inject the mainstream with a GPU that is T&L ready. The big question is: does it truly perform as well as NVIDIA claims? Armed with my reference GeForce2 MX and the 5.30 Detonator drivers, I’m ready to find out.
Before we move into further detail about the GeForce2 MX, let’s take a peek at NVIDIA’s new product line-up for a better understanding of the big picture. NVIDIA has already established a strong lead with the GeForce2 GTS line of products since its release two months ago (see Tom’s Take on NVIDIA’s New GeForce2 GTS). The GeForce2 GTS holds the high-end position and will replace the GeForce line of products. Although that secured the high end, it left a big question mark over the rest of the market, which covers the majority of consumers. The TNT2 was assigned to cover this area, but it never dominated the way its big brother the GeForce2 does, leaving a huge resource untapped. This is where the GeForce2 MX comes into play: it replaces the TNT2 family of products as the mainstream solution. The chipset has various configuration options that allow manufacturers to target several key areas, ranging from an extremely low-cost board to a mid-range multimedia offering and possibly mobile products. Let’s begin our detailed analysis of this promising chipset.
The Chip
The GeForce2 MX (MX standing for Multi-transmitter) chip is based on the GeForce2 GTS but tweaked for versatility, price and the needs of mainstream consumers. It is built on the same .18-micron process as the GeForce2 GTS but is clocked slower and has some architectural differences. These differences cut power consumption and cost, but at the same time they take away from the performance of the GPU. If you take a look at both rendering engines, you’ll immediately see a big difference between the two. First, let’s look at the GeForce2 GTS.
As you already know, the GF2 GTS has four rendering pipelines, each containing two texturing units, which combine into a very powerful rendering engine. These chips are clocked at 200 MHz and in theory can achieve 800 Mpixels/sec or 1.6 Gtexels/sec.
First and foremost, this GPU does indeed have fully functional hardware T&L and will be the first mainstream chip to offer it. The T&L unit is the same one found in the GeForce2 GTS, just running at a slower speed. Taking a closer look at the GeForce2 MX rendering engine, you’ll notice that it has only two pipelines. Both are still capable of applying two texels per clock, but that alone gives it half the per-clock throughput of the GeForce2 GTS. To top that off, the GeForce2 MX will be clocked at 175 MHz compared to the 200 MHz of the GeForce2 GTS. The chip does, however, retain several of the valuable features of the GeForce2 GTS, like AGP 4X, the NVIDIA Shading Rasterizer (NSR) and the High Definition Video Processor (HDVP). It should be noted that the GeForce2 MX HDVP is slightly inferior to that of the GeForce2 GTS, as it doesn’t cover some of the higher-end formats.
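The theoretical fill rates quoted here fall straight out of the pipeline counts and clock speeds. As a quick illustrative sketch (Python used purely for the arithmetic, with the figures from NVIDIA’s published specs):

```python
def fill_rates(pipelines, textures_per_pipe, core_mhz):
    """Return (pixel fill in Mpixels/sec, texel fill in Mtexels/sec)."""
    pixel = pipelines * core_mhz          # one pixel per pipeline per clock
    texel = pixel * textures_per_pipe     # each pipeline applies N texels per clock
    return pixel, texel

# GeForce2 GTS: 4 pipelines, 2 texture units each, 200 MHz
print(fill_rates(4, 2, 200))   # (800, 1600) -> 800 Mpixels/sec, 1.6 Gtexels/sec

# GeForce2 MX: 2 pipelines, 2 texture units each, 175 MHz
print(fill_rates(2, 2, 175))   # (350, 700) -> 350 Mpixels/sec, 700 Mtexels/sec
```

Note how the MX ends up with less than half the pixel fill of the GTS: half the pipelines and a slower clock compound.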
From here things can vary, as the GeForce2 MX will be available in several memory configurations. The memory interface can range from 64-bit to 128-bit depending on the manufacturer’s needs, and either SDR or DDR memory can be used. Frame buffer sizes can vary between 8 and 64 MB, but 32 MB configurations will probably be the most common.
Costs obviously drop if the narrow 64-bit SDR memory is used, but that also greatly hurts performance. However, NVIDIA told us that 64-bit memory configurations should automatically be equipped with DDR SDRAM, whose higher effective memory clock offsets the narrower data path. Our reference board is based on 128-bit SDRAM clocked at 166 MHz (giving us a theoretical limit of 2.7 GB/sec), so we can only imagine how bad the performance could get if a 64-bit/SDR configuration were used.
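The 2.7 GB/sec figure is simply the bus width times the memory clock, doubled for DDR. A small sketch of that arithmetic, which also shows why NVIDIA’s 64-bit/DDR pairing equalizes things:

```python
def bandwidth_gb_s(bus_bits, clock_mhz, ddr=False):
    """Peak theoretical memory bandwidth in GB/sec (1 GB = 10^9 bytes)."""
    bytes_per_clock = bus_bits / 8 * (2 if ddr else 1)
    return bytes_per_clock * clock_mhz * 1e6 / 1e9

print(round(bandwidth_gb_s(128, 166), 2))           # ~2.66 -> the "2.7 GB/sec" reference board
print(round(bandwidth_gb_s(64, 166, ddr=True), 2))  # ~2.66 -> 64-bit/DDR matches it exactly
print(round(bandwidth_gb_s(64, 166), 2))            # ~1.33 -> a plain 64-bit/SDR board gets only half
```

This is why the memory interface matters so much on MX boards: a 64-bit/SDR configuration starts with half the bandwidth of our reference board before a single pixel is drawn.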
What’s New?
A couple of features were added to target the needs of mainstream users and give additional value on top of the high performance graphics engine. I’m referring to TwinView and Digital Vibrance Control.
TwinView is similar to Matrox’s DualHead concept in that it allows users to drive multiple output devices at once. Various combinations of CRT, Digital Flat Panel (DFP) and TV output are possible; the only limitation I noticed is that you cannot do dual TV output. The main claim to fame that NVIDIA boasts over Matrox is the ability to drive two digital displays simultaneously, thanks to a dual-link Transition Minimized Differential Signaling (TMDS) transmitter. Keep in mind that this feature is still limited by the outputs the given board has: if your board doesn’t have a DFP connector or a TV-out, you can’t possibly use them. This will probably be warmly welcomed by those who actually need a dual-output graphics card, as the G400 series from Matrox are nice cards but don’t necessarily meet everyone’s 3D performance standards.
Above are some examples of the TwinView configurations.
Digital Vibrance Control gives the user the ability to digitally adjust the color saturation of images, similar to the controls on your television set. I’m not really convinced that this feature adds much over the standard gamma controls we’re used to, but it may come in handy for fine-tuning visuals on flat panels.
Chipset Comparison
I’ve compiled a table comparing the features of all the available NVIDIA chipsets so that you can see the differences much more easily.
| | GeForce | GeForce2 MX | GeForce2 GTS |
| --- | --- | --- | --- |
| Manufacturing Process | .25 micron | .18 micron | .18 micron |
| Rendering Pipelines | 4 | 2 | 4 |
| Textures per Pipeline | 1 | 2 | 2 |
| Core Speed | 120 MHz | 175 MHz | 200 MHz |
| Memory Interface/Speed | 128-bit/166-300 MHz | 64- or 128-bit/166-300 MHz | 128-bit/333 MHz |
| Pixel Fill-Rate | 480 Mpixels/sec | 350 Mpixels/sec | 800 Mpixels/sec |
| Texel Fill-Rate | 480 Mtexels/sec | 700 Mtexels/sec | 1.6 Gtexels/sec |
| Polygons/sec | 15M | 20M | 25M |
| TwinView | No | Yes | No |
| Digital Vibrance Control | No | Yes | No |
| Active Cooling | Yes | No | Yes |
| HDVP | No | Yes | Yes |
| Retail Price | $179-$249 | $99-$179 | $299 |
There are a few things to note about the above table:
Memory Interface: I can’t stress enough how important it is to know what memory interface a GeForce2 MX board is based on before comparing it with anything. If you care about 3D performance, make sure you don’t get stuck with a 64-bit/SDR board or you’ll most likely regret it. Theoretically there’s also the possibility of a 128-bit/DDR solution, which would automatically score a lot better at high color depths and resolutions. However, that combination would destroy the low-cost idea behind the GeForce2 MX and is thus rather unlikely.
HDVP: The GeForce2 MX is lacking some of the high-end visual modes that the GeForce2 GTS has so keep that in mind if you are into higher end video output.
Active Cooling: Our reference board didn’t have any cooling at all, and most manufacturers will probably opt to leave it off since this is a low-cost solution. However, that doesn’t mean it cannot happen.
Retail Price: The GeForce2 MX will vary greatly in price due to configuration differences like memory interface/size/type and video outputs. I would estimate that our reference board configuration would probably sell for $129.
Test Setup
| Graphics Cards and Drivers | |
| --- | --- |
| GeForce2 MX | 32 MB SDR SDRAM, 175 MHz core clock / 166 MHz memory clock, NVIDIA Reference Driver 5.30 |
| GeForce2 GTS | 32 MB DDR SGRAM, 200 MHz core clock / 333 MHz memory clock, NVIDIA Reference Driver 5.30 |
| GeForce DDR | 32 MB DDR SGRAM, 120 MHz core clock / 300 MHz memory clock, NVIDIA Reference Driver 5.30 |
| GeForce SDR | 32 MB SDR SDRAM, 120 MHz core clock / 166 MHz memory clock, NVIDIA Reference Driver 5.30 |
| Platform Information | |
| --- | --- |
| CPU | PIII 1 GHz |
| Motherboard | Asus CUSL2 (BIOS 1000 BETA 013) |
| Memory | Crucial PC133 CAS2 |
| Network | Netgear FA310TX |
| Environment Settings | |
| --- | --- |
| OS Version | Windows 98 SE 4.10.2222 A |
| DirectX Version | 7.0 |
| Quake 3 Arena | Retail version; command line = +set cd_nocd 1 +set s_initsound 0; OpenGL FSAA set to 2x SuperSampling (FSAAQuality 1) |
| Expendable | Downloadable demo version; command line = -timedemo; D3D FSAA set to 2x SuperSampling (2nd slider in the D3D FSAA control panel) |
Performance Expectations
With the GeForce SDR and GeForce2 MX offering nearly the same theoretical memory throughput, I expect these two cards to be neck and neck, with the GeForce2 MX winning in certain circumstances thanks to its much higher clocked core. I also know that the GeForce2 MX will take a beating in high-resolution, high-color modes due to its modest memory bandwidth. It would be interesting to see how badly a 64-bit/SDR version of this board does, but at this time we don’t yet have one to show you. A 64-bit/DDR solution should score pretty much the same as our 128-bit/SDR reference board, while a 128-bit/DDR configuration should come rather close to the GeForce DDR or even outperform it.
Test Results – Quake 3 Arena Demo001
The GeForce2 MX starts off on a good note as it slightly edges out the GeForce SDR and is just under that of the GeForce DDR.
Here we have some interesting results if you compare the GeForce2 MX to the GeForce SDR. The only technical edge the GeForce SDR holds is raw pixel fill-rate, which must be what pushes it slightly ahead in this case.
Test Results – Quake 3 Arena NV15Demo
I would have guessed that the core speed would have made a bigger difference in this demo than it did. As the resolution rises, however, we do see the boards file into their typical pecking order.
In High Quality mode the fill-rate limitations of the slower boards become apparent sooner than with the Normal setting, but only by a little, as T&L is what this demo stresses.
Test Results – Quake 3 Arena FSAA
The GeForce2 MX is on par with the results we expect as it sits between the GeForce DDR and SDR boards. You’ll notice that only the GeForce2 GTS is able to handle the heavy fill-rate needs of the FSAA setting.
Things pretty much stay the same in our High Quality FSAA setting aside from the demanding software taking its toll on the GeForce2 GTS and dragging its scores down greatly.
Test Results – Expendable Demo
In this DX benchmark you can still see that the performance order hasn’t changed as the GeForce2 GTS maintains a healthy lead followed by the GeForce DDR. The GeForce2 MX and GeForce SDR continue to stay within a few hairs of each other.
The GeForce2 MX manages to get a decent performance lead over the GeForce SDR in this test at the higher resolutions.
Test Results – Expendable Demo FSAA
Fill-rate becomes a huge issue sooner than normal as we turn FSAA on. You’ll note that the performance differences become apparent much earlier than they did with it off.
From the looks of our results, memory performance limitations at the upper resolutions overshadow the demand for fill-rate.
Test Results – DMZG
At low resolution it appears that the greater T&L performance of the GeForce2-based boards (thanks to their higher clocked cores) carries them above the GeForce series. As fill-rate demands increase, the boards with lower fill-rate are overwhelmed and the line-up settles back into its usual order.
Although this benchmark stresses T&L, the fill-rate demands in 32-bit color balance out the T&L performance differences sooner than we saw in the lower color mode.
Test Results – 3DMark 2000 Pixel Fill-Rate
Although 3DMark Fill-Rate tests are synthetic, they still seem to follow the same order of leadership that our real world games followed. At the highest resolution it appears the memory performance begins to limit the GeForce2 GTS a bit.
Switching to high color really puts a thorn in the side of the cards. Each one takes a severe hit as the color depth increases the total memory bandwidth needed. 1600x1200x32 was unable to run on any of the boards.
Test Results – 3DMark 2000 Texel Fill-Rate
Nice: the multi-texturing pipelines of the GeForce2 MX board come in very handy in this test, as it managed to surpass the performance of the GeForce DDR.
3DMark 2000 stumps us yet again. Although the Texel performance of the GeForce2 MX is good, it falls back into its typical place as the color depth is pushed to 32-bit. Once again the highest setting was not an option for all our contestants.
Performance Conclusion
As I had expected, the GeForce2 MX was right there with the GeForce SDR board and in some cases even surpassed it. Although the GeForce2 MX didn’t dominate its bigger siblings, it did what most competitors couldn’t: stay within a reasonable distance most of the time. This kind of performance is unheard of for such an inexpensive solution, and I am very impressed by what this chipset can offer. I do want to note that this could easily change had the board been based on a 64-bit/SDR memory interface; with such a limitation the results would change dramatically.
To BX or Not to BX, is the Question
Although the i815-based boards will be out in force sometime soon, many of the tweakers out there trying to save a few bucks are still running their overclocked P3s on BX platforms. Knowing this, I wanted to see if the GeForce2 MX could live up to the favorable experiences that other NVIDIA-based boards have given us on this out-of-spec platform.
Typically I see problems within minutes if a board has issues running at the abnormal AGP speed so I felt that a 30-minute looped Quake 3 Arena session would be sufficient. During this testing I had no crashes or visual defects while the demos were running. Results may vary on this type of rig so I can’t guarantee that you’ll have just as much success. Your best bet is to obtain a board from somewhere you can return the product in case things don’t work out.
Conclusion
The NVIDIA GeForce2 MX chip will fill a few gaps that NVIDIA has not been able to cover completely for some time now (mobile parts and competitive low-cost 3D/video solutions, areas ATi has been known to handle well). It will serve mainstream consumers on a budget, low-cost workstations for small businesses, mobile computing and, if the rumors hold, even Apple systems down the line. This highly configurable solution can be geared towards cost, performance or video functionality. Although it has 3D performance limitations compared to the high-end parts, it does excellently in every other area, and no other solution available can offer the same for the price.
Why is this a big deal? There are a couple of interesting things to note here. First, it allows NVIDIA to crush its competition in yet another market segment, putting serious hurt on their ability to make money anywhere. It also means NVIDIA will capture more OEM design wins, as this chip fits that market perfectly thanks to its cost, its flexibility and, even more importantly, the fact that it offers more than the competition can.
The second reason this product will be a huge win for NVIDIA is that it will flood the mainstream with T&L-capable graphics. This in turn should make software developers more comfortable with creating the T&L-enhanced titles that NVIDIA so badly lacks at this point in time. It’s amazing how this budget product will soon help push the success of the flagship part.
So is this newcomer really worth buying? For the majority of you who already have a GeForce or better, I wouldn’t advise heading out to replace it with a GeForce2 MX board unless you fancy TwinView enough to trade off some 3D performance. For the rest of you who have been holding out for an inexpensive GPU, that solution has finally arrived.