For the past few months, we've been hearing all about S3's Savage2000 and how it would utterly crush the upcoming chipsets from ATI, NVIDIA, Matrox and 3dfx. Massive fill-rate and T&L performance numbers were being thrown around the net that would place this chipset beyond anything anyone else was claiming at the time. During that period, I was given the opportunity to take a sneak peek at the chipset (as reported in the S3 Savage 2000 preview), and I quickly realized that the Viper II would realistically be a competitor to an SDRAM GeForce, at best. I feel the net had blown the comparison way out of proportion with ignorant speculation about the true fill-rate performance. Well, the time has come for it to prove itself as we review S3's Diamond Viper II, based on the Savage2000 (or S2000) chipset.
What is the Viper II?
The Viper II is based on S3's latest chipset, a 0.18-micron part that is supposed to offer on-board Transform & Lighting (T&L), hardware texture compression (S3TC), single-pass quad-texture blending, hardware-assisted DVD playback (through motion compensation and 16-tap upscaling/downscaling) and a reasonable price tag of $199.00 (USD). The board runs at a 125 MHz core and 155 MHz memory clock, much lower than the anticipated 200 MHz speeds that were rumored.
Although this card is S3's flagship video solution at the moment, that doesn't necessarily mean it has the ability or the intention to compete with the high-end solutions now available on the market. Keep in mind that this card is nowhere near the price range of something like a $300 (USD) GeForce DDR board.
3dfx won't have its newest line-up available anytime soon, so the VD3 3000 and VD3 3500 are its two contenders. Although neither card can offer 32-bit color, both are decently fast at 16-bit 3D and have very good driver support. The 3500 is slightly more expensive but offers a TV tuner and superior video options to those of the Viper II. Also keep in mind that the VD3 series of cards does not have hardware T&L.
ATI really doesn't have a card that can compete with the Viper II just yet but we'll be taking a peek at the ATI Rage Fury MAXX as soon as it makes its way into the lab. The MAXX will be in a similar price range, offering awesome fill-rate performance but doesn't come with hardware T&L. Keep your eyes peeled for something about this, very soon.
Matrox is currently shipping the G400 series cards that will be competing at and above the price range of the Viper II. The G400 will be at the same price while the G400 MAX will be slightly more expensive. The G400 cards offer great visual quality, decent fill-rates, good video performance and feature environment bump mapping. Keep in mind the G400 solutions don't come with hardware T&L.
NVIDIA is offering two products that will end up competing with the Viper II: the TNT2 Ultra and the GeForce SDR. The TNT2 Ultra is slightly cheaper than the Viper II, while the GeForce board is slightly more expensive. In theory, the Viper II should offer better performance than the TNT2 Ultra while offering a relatively competitive solution to the SDR GeForce boards. The GeForce has a fully functioning T&L unit and currently offers the best real-world fill rate we've seen to date.
Once again the term Mtexel has come to haunt us, as people have abused the word and started comparing single-textured fill rates against multi-textured fill rates. The pixel fill rate of the Viper II is 250 Mpixels/s, and it has the ability to apply up to four textures per pixel in a single pass. Let's review a few terms before we get deeper into this discussion.
Fill-rate - the rate at which pixels are drawn to video memory.
Pixel - short for "picture element", a pixel is a single point in a graphic image. Monitors display pictures by segmenting the screen into hundreds of thousands (even millions) of pixels.
Texel - short for "texture element", similar to a pixel, but texels are textured pixels on a 3D surface.
Now the confusion doesn't end here either: you can apply multiple textures in a single pass (through hardware) or in multiple passes (through software). You can even have different filtering methods that eat up more bandwidth, but we'll save that discussion for another time. The main idea is that texel fill rate can be measured with one or more textures applied in various situations. This allows companies to play with the "Mtexels/sec" fill-rate measurement and market their product in its best light.
For example, a card based on the GeForce 256 can push four single-textured pixels in a single pass, while a card based on the S2000 can only push two pixels per pass (each with up to two textures, or one pixel with four textures). It's obvious who is faster in a single-textured scenario, so how does S3 market the Viper II as 20 Mtexels/sec faster than a GeForce? Easy: they take a multi-texture situation and start multiplying numbers. The Viper II graphics core runs at 125 MHz and can dual-texture two pixels per pass (or quad-texture one pixel per pass), so you end up with the following:
125,000,000 (125 MHz core speed) * 4 (one quad textured or two dual textured pixels) = 500Mtexels/sec
If we look at the GeForce 256, we will see:
120,000,000 (120 MHz core speed) * 4 (four single textured pixels or two dual textured pixels) = 480 Mtexels/sec
So why on earth would a GeForce be faster than a Viper II in fill-rate tests or actual applications? First off, keep in mind that most games do not use multi-texturing the way we'd hope they would. They resort to multi-pass multi-texturing to support legacy hardware (older generations of hardware). This means that in a single-textured application the GeForce performs at its maximum fill rate while the Viper II works at half its possible Mtexel rate, since its texture units aren't being used efficiently. In a dual-texture scenario, things would change and the Viper II would, in theory, actually be faster. Note I said in theory; there are even more factors that come into play, such as memory bandwidth, filtering methods and T&L bandwidth.
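The marketing arithmetic above can be sketched in a few lines of Python. This is purely illustrative: the clock speeds and pipeline/texture-unit configurations are the ones quoted in this article, and real-world rates are always lower due to memory bandwidth, filtering and T&L overhead.

```python
# Theoretical texel fill rate = core clock * pixels per clock * textures per pixel.
# These are peak paper numbers, not real-world throughput.

def mtexels(core_mhz, pixels_per_clock, textures_per_pixel):
    """Peak texel rate in Mtexels/sec for a given texturing workload."""
    return core_mhz * pixels_per_clock * textures_per_pixel

# Savage2000: two pipelines, each able to apply two textures per pixel per pass.
viper2_multi  = mtexels(125, 2, 2)   # dual-texturing: 500 Mtexels/sec
viper2_single = mtexels(125, 2, 1)   # single-texturing: only 250 Mtexels/sec

# GeForce 256: four pipelines, one texture unit each, so it hits the same
# peak whether the game single-textures or dual-textures.
geforce = mtexels(120, 4, 1)         # 480 Mtexels/sec either way

print(viper2_multi, viper2_single, geforce)  # 500 250 480
```

This makes the asymmetry obvious: the Viper II only beats the GeForce on paper when every pixel carries at least two textures, which most games of the day do not guarantee.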
Next time you pick up a retail box and check for the specified fill-rate, keep in mind the other factors involved that you should note before deciding a card has a superior fill-rate. You just might be surprised.
Transform and Lighting Engine
Transform and lighting (or T&L) has been the latest buzzword after NVIDIA announced and then released the GeForce GPU to consumers this year. With the promise of greater 3D quality due to the newfound ability to have high polygon scenes and realistic lighting without great performance loss, competitors quickly shuffled to release their own T&L solution or explain why they felt it was still too early to release such a product. S3 happened to be one of the companies that embraced this new feature (keep in mind this is not new to the workstation market) and incorporated it into their latest product, the Viper II.
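For those wondering what a T&L engine actually computes, here is a minimal, purely illustrative software transform-and-light step in Python. The function names and the simple Lambertian lighting model are my own sketch, not S3's or NVIDIA's implementation; the point is that this per-vertex work (a matrix multiply plus a lighting calculation, repeated for every vertex of every frame) is exactly what a hardware T&L unit takes off the CPU.

```python
# Minimal per-vertex transform and diffuse lighting, the kind of work a
# hardware T&L unit offloads from the CPU. Illustrative sketch only.

def transform(vertex, matrix):
    """Apply a 4x4 row-major transform matrix to an (x, y, z) vertex."""
    x, y, z = vertex
    v = (x, y, z, 1.0)
    return tuple(sum(v[i] * matrix[i][j] for i in range(4)) for j in range(3))

def diffuse(normal, light_dir, intensity=1.0):
    """Simple Lambertian diffuse term: clamp(N . L, 0) * light intensity."""
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(dot, 0.0) * intensity

# A surface facing the light is fully lit; one facing away receives nothing.
identity = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
print(transform((1.0, 2.0, 3.0), identity))       # (1.0, 2.0, 3.0)
print(diffuse((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))  # 1.0
```

Multiply this by tens of thousands of vertices per frame and it's clear why offloading it to dedicated silicon frees the CPU for game logic.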
So how fast is the Viper II T&L engine? I have no idea. Why? The current drivers do not have hardware T&L enabled yet. S3 has officially stated that they put all their efforts into making available the Viper II features that can be taken advantage of today. This meant putting T&L aside, because they ran into driver issues that couldn't reasonably be fixed in time for release. They also went on to let me know that the merger between Diamond and S3 made things a bit more difficult. I don't see how that's possible, since two combined companies working on one product line should mean more manpower, not less, to develop stable, fully functional drivers.
So when will a driver with T&L enabled be available? In the middle of January an OpenGL ICD will be released with it enabled, and towards the end of Q1 a DirectX driver with T&L will make its way onto the web. S3 claims that T&L isn't such a big issue right now because there really isn't any software that uses it yet, and that by the time their drivers are released, T&L games will just be starting to trickle out. Although I didn't care to see T&L unavailable in the released drivers, it is true that currently there isn't much software that takes advantage of it. However, who really wants to buy a card that can't do what it claims right out of the box? And with S3's recent driver problems with legacy video cards, who really wants to trust them? This is something that you, the consumer, will decide.
Memory bandwidth is quickly becoming a very important factor in 3D accelerators as we transition into high resolutions and high color depths. The more complex we make 3D games and graphics applications, the more memory bandwidth we need. Depending on the architecture of the chip, one card may need more than another. For example, GeForce-based boards need a great amount of bandwidth to push such insane fill rates with high resolution, high color, T&L and various filtering modes; this is why we're seeing DDR-based graphics boards pop up to help alleviate the problem for the GeForce. In the future, 3dfx and ATI are going to take yet another route, where "brute force" is used by allocating a set amount of memory per graphics chip and dividing work between the two. This theoretically doubles the memory bandwidth. S3 is using SDR for the Viper II, clocked at 155 MHz, giving them about 2.5 GB/sec of memory bandwidth. With the use of texture compression and efficient driver tweaks, S3 must not feel the need for greater memory bandwidth solutions just yet.
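The quoted figure works out if you assume a 128-bit memory bus on the Savage2000 (an assumption on my part, though it is the standard width for this generation of chips): peak bandwidth is simply the memory clock times the bytes moved per transfer.

```python
# Peak memory bandwidth = memory clock * bus width in bytes * transfers per clock.
# The 128-bit bus width is my assumption; it matches the quoted ~2.5 GB/sec.

def bandwidth_gb(clock_mhz, bus_bits, transfers_per_clock=1):
    """Peak memory bandwidth in GB/sec (1 GB = 10^9 bytes here)."""
    bytes_per_clock = (bus_bits // 8) * transfers_per_clock
    return clock_mhz * 1e6 * bytes_per_clock / 1e9

print(round(bandwidth_gb(155, 128), 2))     # Viper II SDR: 2.48
print(round(bandwidth_gb(150, 128, 2), 2))  # a DDR board at 150 MHz: 4.8
```

The second line shows why DDR matters so much to the GeForce: double the transfers per clock, nearly double the bandwidth, with no change to the bus itself.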
S3 Texture Compression
S3 texture compression (or S3TC) has been around since the birth of the Savage 3D and has been trying to get its foot in the door with software developers ever since. Although the feature hasn't received as much support as S3 would like, developers seem to be slowly coming around. We're seeing game developers like Epic (Unreal), id (Quake 3 Arena), Monolith (Shogo 2) and Raven (Soldier of Fortune) providing support for S3TC in their upcoming titles. S3TC not only offers higher visual quality but also accelerates performance in scenes with large textures. Those instances may be few right now, but with 3dfx and other big graphics players pushing texture compression, you can be sure it will become a standard feature.
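To see where the savings come from, consider S3TC's basic opaque mode (assuming the familiar DXT1-style layout): each 4x4 texel block is stored as two 16-bit reference colors plus sixteen 2-bit indices, 8 bytes per block, or 4 bits per texel.

```python
# S3TC basic (DXT1-style) mode: a 4x4 texel block is encoded as two 16-bit
# reference colors + sixteen 2-bit lookup indices = 8 bytes per block.

def texture_bytes(width, height, bits_per_texel):
    """Uncompressed texture size in bytes."""
    return width * height * bits_per_texel // 8

def s3tc_bytes(width, height):
    """Compressed size: 8 bytes per 4x4 block, i.e. 4 bits per texel."""
    blocks = (width // 4) * (height // 4)
    return blocks * 8

raw32 = texture_bytes(256, 256, 32)  # 262144 bytes uncompressed 32-bit
comp = s3tc_bytes(256, 256)          # 32768 bytes compressed
print(raw32 // comp)                 # 8:1 versus 32-bit source textures
```

An 8:1 reduction against 32-bit textures (4:1 against 16-bit) is exactly why compression stretches a modest 2.5 GB/sec of memory bandwidth so much further.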
With ATI standing alone in the consumer high-performance video arena, only S3 is beginning to offer any real challenge. The Viper II features enhanced DVD capabilities that improve visual quality: motion compensation and 16-tap upscaling and downscaling greatly improve software DVD playback. While not offering the best DVD performance, the Viper II is a very good runner-up. Software DVD playback on our P!!! 550 test system was smooth, and the visual quality differences between ATI and S3 were very difficult to discern.
Before we take a look at the drivers that S3 provided with the Viper II, I wanted to address a topic that many of you have e-mailed me about in recent weeks. It seems that S3 has been underachieving when it comes to supporting the drivers of their legacy products, and many of you are very upset about this. I understand the pains of dealing with "not-so-refined" video drivers, as we have to use them periodically for reviews. I will say that aside from some problems with 3DMark 2000, there were no real issues with the drivers that I saw. I've confronted S3 about this, and they assure me that they are trying their best to support their legacy hardware owners and that they will hold the Viper II's drivers to a high standard of quality. Let's take a look at the drivers now.
Here we have the basic color correction adjustment window that lets us set "schemes" for various users or application settings.
Here we have the D3D properties window, which adjusts settings per application. I personally was annoyed by this, but it has its advantages when you need to toggle certain settings for a given application. I still feel there should be a general tab for universal settings.
Here we have additional D3D properties that you can adjust to your liking. The mandatory use of per application settings is still applicable here.
Here you can select your video output as well as define some video-out parameters. Flat panel options appear grayed out, as the card only provides TV output. The advanced button leads to some minor TV adjustments like sizing, positioning, brightness and a flicker filter.
The Diamond taskbar option offers you a quick way to change resolution or the ability to get to various properties windows in a few clicks.
Looking at what the Viper II hardware is capable of in theory, I would assume that in single-textured 3D benchmarks the Viper II probably won't do so hot. I also think that in low-resolution modes the card will probably lag behind the competition a bit, as the drivers aren't very mature. When we first previewed the Viper II, the software engineer I was working with mentioned that S3 was very careful about how their drivers were geared to preserve memory bandwidth. If this is the case, we should see the Viper II do well under 32-bit testing.
Test System
- Motherboard (BIOS rev.): ABIT BX6 2.0 (BIOS date 7/13/99)
- Memory: 128 MB Viking PC100 CAS2
- Video cards: Diamond Viper II, reference NVIDIA GeForce/TNT2 Ultra drivers, 3dfx Voodoo3 3500, Matrox G400 MAX (4.11.01.1410 w/TurboGL 1.00.001)
- Operating system: Windows 98 SE 4.10.2222 A
Quake 3 Arena
- Command line = +set cd_nocd 1 +set s_initsound 0
- Advanced Settings = disable sound, disable music, disable movies, disable joysticks, enable optimized surfaces, enable triple buffering, enable single-pass multi-texturing
- High Detail Settings = enabled
Descent 3
- Settings = -nosound -nomusic -nonetwork -timetest
- 16-bit settings = 16-bit textures, 16-bit Z-buffer, triple buffering
- 32-bit settings = 32-bit textures, 24-bit Z-buffer, triple buffering
At low resolution most of the cards are even with the Viper II slightly behind the pack and the G400 MAX trailing off even farther back. There is only one card that doesn't reach the very good rating of over 60 frames per second (FPS).
As we hit the medium resolution in our testing, the Viper II doesn't seem to do well although this is a multi-texturing game. My guess is that the Viper II drivers may need some additional tuning for performance. I really expected the card to take charge under our multi-texturing tests.
Ouch, the Viper doesn't seem to like playing Shogo much as it still doesn't surpass even the TNT2 Ultra board. Note that the GeForce SDR board nearly doubles the framerate of the Viper II. Also keep in mind that the VD3 3500 doesn't run the demo properly as it's missing textures all over the place.
There are several reasons why the Viper II is so far behind everyone else in this test. It is most likely a combination of driver immaturity (clearly visible at low resolutions) and the fact that D3 isn't using multi-texturing.
As we turn the resolution up, things begin to look better for the Viper II but it's still lagging behind the pack.
Nothing much changes as we pump the resolution up to the top. The Viper II doesn't perform too well in D3 without multi-texturing in use.
Here we switch to the OpenGL API, and things obviously get worse, as the ICD isn't as speedy as its DirectX counterpart. 46 FPS isn't horrible, but it's not what you'd expect from a video card in this price range at this resolution and scene complexity. I've omitted the G400 MAX results for D3 because they were horribly low due to driver issues with this game. I'll have to give the newest Descent 3 patch a whirl in our next round of testing.
Things get much worse as we raise the resolution mode to 1024x768. The Viper II OpenGL ICD must be optimized for Quake 3 Arena and not for anything much else, yet.
At Descent3's highest resolution setting, the Viper II comes 8 FPS short of the standard 30 FPS minimum framerate. Its competition doesn't have much of a problem, as each NVIDIA card is able to clear it. I would gamble that with refined drivers this would be a different story.
Quake3 Arena, Normal
The Viper II pulls a 3rd place victory in our round-up as the two GeForce boards keep a respectable lead. The drivers must be optimized pretty well to score this high at low resolution.
Although we turn the resolution up a tad, the results show us the same information. The Viper II is doing quite well with an impressive 53 FPS.
Pressing into our toughest 16-bit test in Quake3, only one card really makes it over our 30 FPS limit. The SDR GeForce comes just shy of the mark followed by the Viper II.
Quake3 Arena, High Quality
Now the games begin as we step into 32-bit color and higher complexity settings. The GeForce cards steamroll ahead with the Viper II nipping at their heels.
Well look what we have here! The Viper II blows past the SDR GeForce that is held back by its poor memory bandwidth. The DDR GeForce still takes a commanding lead however. With an efficiently tuned OpenGL ICD for Quake 3, the Viper II doesn't seem to be bothered by the massive demands of the 32-bit color and higher resolution.
Although none of the cards make the 30 FPS cut, we can see that the Viper II still has the SDR GeForce in its rear view mirror while trailing the DDR GeForce by a larger margin.
3D Mark 2000 3DMarks
As we're still getting a feel for 3DMark, we continue to stay with the 1024x768 resolution. The Viper II edges past the TNT2 Ultra while trailing a ways behind the GeForce cards. Keep in mind that the 3dfx VD3 3500 doesn't have a score because the drivers still aren't working with the benchmark just yet.
Contrary to the Quake 3 results, the Viper II doesn't gain any ground when we turn 32-bit color mode on. It actually loses ground to the TNT2 Ultra.
3D Mark 2000 Fill Rate (single texture)
The Viper II doesn't care for this single-texture test, as it falls behind every card capable of running the benchmark. We'll have to see what happens when we jump into the multi-texturing test.
The Viper II gains a bit of ground this time when we switch the test into 32-bit color. Check out that G400 MAX putting some ground between it and the SDR GeForce.
3D Mark 2000 Fill Rate (quad texture)
I was very upset that the Viper II failed this test, as it would have provided some awesome data for analysis. Unfortunately, the drivers have once again held the card back from proving itself. I hope S3 can get these driver issues out of the way sooner rather than later, for their own sake. No one wants to deal with performance problems due to sub-par drivers. Note how well the GeForce boards do, by the way: they can't apply four textures in a single pass, but they seem to handle the situation just fine.
Even though we aren't getting quad-texture data on the Viper II, we do make an interesting find: when we went to 32-bit color, the SDR GeForce dropped dramatically in performance while the DDR took only a small hit and kept pushing onward.
3D Mark 2000 High Polygon Count
The results in this test are showing me something that I hoped wouldn't happen. The video cards without T&L must rely on the speed of the CPU (which uses KNI in DirectX 7) to calculate all the T&L information. The CPU should do well, but it shouldn't match the performance of the GeForce GPU with its built-in T&L. Yet as you can see from the results, the boards without T&L come very close to the GeForce cards. I will be investigating this further, but from what I've found out so far, the 3DMark benchmark doesn't use DirectX 7 alone but mixes in custom code. This also makes us realize it's very possible that game developers may do something similar to support a wide array of cards, which would hurt the performance of the GeForce boards.
Here we have more data to support what I mentioned above. Notice that only the GeForce cards took a penalty when switching to 32-bit color mode. The video cards relying on the CPU for T&L can split the workload between the graphics card and CPU (so you'll see little to no performance drop) while the GeForce cards take a slight hit in performance because they're having to deal with the 32-bit color stealing a heavy amount of bandwidth.
Overall, my feelings about the Viper II's actual performance are mixed. I think the Viper has shown us great performance potential in multi-texturing situations. The drivers are going to play a huge part in this as well. We can see that the drivers were tuned very well for Quake 3, but the poor performance in OpenGL Descent 3 proves that the OpenGL ICD still needs some tuning. You can also see that without the comfort of multi-texturing being used, the Viper II has a difficult time keeping up with the more flexible GeForce cards.
S3 has done a good job with the Viper II as it's a huge leap over the Savage4 series cards that, to be frank, were not cutting it. The Viper II offers decent performance and at times saddles up with the big boys when things jump into 32-bit situations. The respectable video capabilities of the Viper II give it added value that is only bested by that of ATI. I see great promise if S3 can get their drivers fine-tuned and games continue to transition into multi-texturing engines. Without multi-texturing, the Viper II is going to be sub-TNT2 Ultra performance and if the drivers don't shape up quickly, no one is going to want to deal with this card. The past shows a not so hot track record for S3 right now so as a consumer I would be scared to trust them.
Competition from NVIDIA and Matrox is already here and is making it very tough for the Viper II to fit into the market as these cards have had time to develop drivers as well as prove themselves to the marketplace. Not only does stiff competition exist, new competition from ATI is on its way this month as well by means of the Rage Fury MAXX. Although T&L isn't offered, it will sport extremely high fill-rate potential.
There is something disingenuous about Diamond-S3's claim to fame, its Transform and Lighting support, which is practically non-existent in the current product. It may be true that you will hardly find any 3D games able to take advantage of it right now, but Diamond-S3 comes close to cheating its Viper II customers by promoting the card with this shiny feature anyway.
At this point in time, I can't give my blessing on purchasing a Viper II with so many other great cards out there. The Viper II promises T&L and nice fill-rate performance, but as we've seen already, the drivers are really holding it back. Does this mean the hardware isn't good? My answer is 'no' in terms of fill-rate performance, but 'possibly yes' if you are asking about T&L. It does mean that the end product is going to come up short. We should not forget that it's easy to promise that 'future drivers will support' a feature; once the customer has bought the product, the manufacturer has made its money. The extremely long delay of a decent OpenGL ICD for Matrox's G200 should not be forgotten. Some people had to wait almost a year. Do you want to wait a year until S3-Diamond releases drivers that enable T&L? I really hope S3 can get things pulled together for the Viper II, because it really does have the potential to be a contender. With refined drivers, the Viper II could put some serious heat on the SDR GeForce. At $199 (USD), the Viper II is set at a decent price, but it doesn't currently offer enough to keep me from justifying spending a few more dollars to get an SDR GeForce. In a month or two this could change, but until then, I'll stick with the reliable competition.