
ATI Rage Fury MAXX Review
Article summary: The full review of the newly released ATI Rage Fury MAXX. We compare ATI's latest fill-rate monster against the current top sluggers from 3dfx, Matrox, NVIDIA and S3.


THG Editorial, December 30, 1999


ATI Rage Fury MAXX

It's finally here: the brute-force powerhouse that ATI has been hiding behind closed doors for last-minute tweaks, driver enhancements and tuning, the Rage Fury MAXX. In November we brought you a bit of a teaser (see Preview of the Double Whopper - ATI's Rage Fury MAXX) as we took a close look at the MAXX using a beta board running early drivers, and things seemed to be going very well. In the ongoing war between fill-rate and T&L, ATI has taken the old, proven and easier road of fill-rate, packing in two Rage 128 PRO graphics chips along with a massive 64MBs of memory. This is quite a brute-force approach to high fill-rate and memory bandwidth, basically doubling the hardware on the board, although it doesn't go quite as far as the desperate attempt of 3dfx with their cost-intensive two- to four-chip 'Voodoo4' or 'Voodoo5' solution. Now that we have a released board, we are no longer held back by any rules: no beta hardware, no early drivers, no limited testing and, of course, no mercy anymore. This time around, it's no holds barred.

The Beast

For those unfamiliar with the Rage Fury MAXX, it literally is two Rage 128 PRO chips, each with its own isolated 32MBs of memory. That's right: two graphics processors and 64MBs of memory on one board. Unlike the first consumer brute-force approach taken by 3dfx using SLI, the ATI product uses AFR Technology to tap the power of both graphics processors. In essence, AFR Technology lets the graphics card alternate the rendering of each frame between the two Rage 128 PRO chips. AFR gives ATI fill-rates never before seen in their products, as well as broad memory bandwidth that can deal with memory-intensive factors like 32-bit textures and high resolutions. The Rage Fury MAXX also sports the ATI Rage Theatre chip, which enhances DVD playback and offers impressive video encoding. For greater detail on the Rage 128 PRO chip, check out our ATI Rage Fury Pro Review.
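To make the AFR idea concrete, here is a minimal sketch of how alternate-frame scheduling splits work between two chips. The names and logic are entirely hypothetical, written for illustration only; ATI's actual dispatch implementation is not public.

```python
# Hypothetical sketch of Alternate Frame Rendering (AFR) dispatch.
# Each incoming frame is assigned to one of the two Rage 128 PRO
# chips in strict alternation, so both chips can render in parallel
# and the theoretical fill-rate doubles.

def afr_schedule(frame_ids, num_chips=2):
    """Map each frame number to a chip index by simple alternation."""
    return [(frame, frame % num_chips) for frame in frame_ids]

# Frames 0..5 on the MAXX's two chips: chip 0 gets the even frames,
# chip 1 gets the odd ones.
assignments = afr_schedule(range(6))
```

The even/odd split also explains why each chip needs its own full 32MBs: since every frame is rendered entirely by one chip, both memory pools must hold a complete copy of the textures.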

Fill-Rate and Memory Bandwidth

From the first time we had our hands on this board, we knew it would be laying down some serious numbers in both fill-rate and memory bandwidth. The original reference boards we tested were clocked at 125 MHz core/143 MHz memory, but the released board has been bumped a reasonable amount to 135 MHz core/155 MHz memory! So how does all this measure up against the GeForce and Savage 2000? Let's take a look at the table below.

Graphics Card          | Fill-Rate       | Memory Bandwidth
ATI Rage Fury MAXX     | 540 Mpixels/sec | 4.96 GB/sec
NVIDIA GeForce 256 DDR | 480 Mpixels/sec | 4.8 GB/sec
NVIDIA GeForce 256 SDR | 480 Mpixels/sec | 2.656 GB/sec
S3 Savage 2000         | 250 Mpixels/sec | 2.48 GB/sec

Keep in mind that the numbers above are all theoretical. They do not account for architecture or driver efficiency, and each of those factors can degrade the actual "real world" output of every card. You can see that, in theory, the ATI Rage Fury MAXX has superior fill-rate and memory bandwidth numbers, but we'll hold our judgment until we take a look at the real-world performance.
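The table's theoretical figures can be reproduced from the clock speeds with simple arithmetic. The sketch below assumes two pixel pipelines per Rage 128 PRO, a 128-bit memory bus per chip, and a 166 MHz memory clock for the SDR GeForce; those per-chip details are our assumptions for illustration, not vendor specifications.

```python
# Back-of-the-envelope reconstruction of the theoretical numbers.
# Assumed: 2 pixel pipelines per Rage 128 PRO, 128-bit memory bus
# per chip, 166 MHz SDR clock on the GeForce (all assumptions).

def fill_rate_mpixels(core_mhz, pipes_per_chip, chips):
    """Peak fill-rate in Mpixels/sec: one pixel per pipe per clock."""
    return core_mhz * pipes_per_chip * chips

def bandwidth_gb(mem_mhz, bus_bits, chips):
    """Peak memory bandwidth in GB/sec summed across all chips."""
    return mem_mhz * 1e6 * (bus_bits // 8) * chips / 1e9

maxx_fill = fill_rate_mpixels(135, 2, 2)  # 540 Mpixels/sec, as tabled
maxx_bw = bandwidth_gb(155, 128, 2)       # 4.96 GB/sec, as tabled
sdr_bw = bandwidth_gb(166, 128, 1)        # 2.656 GB/sec, as tabled
```

The doubling from two chips is exactly where the MAXX's paper advantage comes from: each individual Rage 128 PRO is well behind a GeForce on both metrics.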


Transform & Lighting

With all its mighty fill-rate, the Rage Fury MAXX still doesn't have the brains to calculate the complicated floating-point math needed to process transform and lighting information like the dedicated T&L engine in the GeForce 256. This means that in software that takes advantage of hardware T&L, the Rage Fury MAXX is going to run into a bit of trouble: it will have to rely solely on the performance of the CPU for all transforms and lighting. Things might not be so bad with a high-speed Athlon or Pentium III, but even then it won't match a dedicated geometry engine. However, T&L hasn't raged onto the software scene just yet, and ATI plans to have a T&L version out within the next few months, when T&L games do start to appear. At that point you should have three T&L cards to choose from: NVIDIA's GeForce 256, S3's Diamond Viper II (hopefully S3 will have a working, T&L supporting driver by that time) and ATI's next generation of the Rage Fury MAXX that will include hardware T&L.
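To give a feel for the floating-point work involved, here is a minimal sketch of what a software T&L path must do per vertex on the CPU. This is illustrative code only, not ATI's or NVIDIA's implementation: a 4x4 matrix transform plus a single diffuse light, repeated for every vertex in the scene.

```python
# Illustrative per-vertex transform-and-lighting work that a card
# without a geometry engine pushes onto the CPU. The GeForce 256
# performs the equivalent math in dedicated hardware.

def transform(matrix, vertex):
    """Multiply a 4x4 row-major matrix by a homogeneous vertex."""
    return [sum(matrix[r][c] * vertex[c] for c in range(4))
            for r in range(4)]

def diffuse(normal, light_dir):
    """Lambertian diffuse term: N . L clamped to [0, 1]."""
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, min(1.0, dot))

# A vertex transformed by the identity matrix is unchanged, and a
# surface facing straight into the light gets full intensity.
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
moved = transform(identity, [1.0, 2.0, 3.0, 1.0])
lit = diffuse([0.0, 0.0, 1.0], [0.0, 0.0, 1.0])
```

Multiply that 16-element matrix product and dot product by tens of thousands of vertices per frame and it's clear why a fast CPU helps but still trails a dedicated geometry engine.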


We've talked a little about AFR having possible issues on the technical side of things, and now we were finally able to test the board in real-world benchmarks and applications. During the testing I decided to see how the latencies felt while playing a 3D shooter. If you're into any 3D shooter (especially online), you know that shooting a split second too late means missing or possibly dying. Our theory was that AFR might add game-play latency that hardcore gamers would not find acceptable. So to put AFR to the test, I decided to fire up Quake 3 Arena and deathmatch against a few bots. I limited myself to the railgun and checked for any difference in timing against a GeForce 256 equipped system with the same setup. After playing back and forth on each system for 30 minutes, I came to the conclusion that there was no noticeable latency. If something does exist, it's smaller than I could possibly notice, and although I'm not Thresh, I consider myself a seasoned deathmatch player.
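A quick calculation backs up that subjective result. If AFR adds at most one extra frame of buffering before a frame reaches the screen (a simplifying assumption on our part about how the alternation pipelines frames), the worst-case added input lag is one frame time:

```python
# Worst-case extra input lag if AFR buffers one additional frame:
# the penalty is one frame time at the current frame rate.

def added_latency_ms(fps, extra_frames=1):
    """Added lag in milliseconds for a given number of buffered frames."""
    return extra_frames * 1000.0 / fps

lag_60 = added_latency_ms(60)  # roughly 16.7 ms at 60 FPS
lag_30 = added_latency_ms(30)  # roughly 33.3 ms at 30 FPS
```

Even at 30 FPS the hypothetical penalty is about a thirtieth of a second, which is consistent with not being able to feel any difference during the railgun test.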

The Competition

With ATI taking the plunge into the high-performance hardware realm, it has to deal with a different crowd than ever before: gamers, tweakers and power users. The competition in the high-performance market is much different from the OEM business ATI knows. Where OEMs care mostly about price/performance, the high-performance market cares about - you guessed it - speed. Let's take a look at the competition from each company.

The current competitive product from 3dfx is the aging Voodoo3 3500, which offers good 16-bit performance, a TV tuner, Glide and competitive driver support. However, it lacks the 32-bit 3D color and T&L of the upcoming competition. Also from 3dfx, we're still waiting for their next generation chipsets that will surface around March sporting "untouchable" fill-rates, 32-bit 3D color support and insane power-user configurations (4 graphics chips on a board with memory sizes up to 128 MBs). I don't care much for speculating this early on their next generation chipsets, but I do know that they're not resting on their laurels.

With the G400 and G400 MAX, Matrox has been offering a competitive solution at a slightly cheaper price range than the Rage Fury MAXX. The G400 series cards offer great visual quality, good video performance and decent fill-rate performance (especially in 32-bit 3D applications). Another note I'd like to make about Matrox is their dedication to driver support in "other" operating systems. I'm not too sure of their future plans for next generation products but their current product line offers a couple of sturdy performers.

Our current 3D leader is the GeForce 256 by NVIDIA. If price isn't an issue, there really isn't a better all-around solution right now. The GeForce offers excellent fill-rate, T&L and superb OpenGL support. It's not the perfect card, but it's the best solution currently available for the high-end consumer. Next generation products from NVIDIA, and we are not talking about a long time here, will improve on the GeForce's performance considerably as well.

We recently took a peek at the Diamond Viper II from S3 that thankfully adds some competition to the higher end of the market. While this card didn't receive my blessing over the GeForce solution, it still offers good fill-rate, S3TC and quad texture support. This solution is still unproven, being that it's extremely new to the market, and we're still waiting to see how the T&L driver development turns out. It could very well offer stiff competition depending on their upcoming drivers.


DVD Playback

For a long time now we've considered ATI to be the best video solution for DVD playback, video encoding and video software quality. We will be introducing a much more in-depth DVD playback reviewing method sometime soon; currently my testing is limited to playing a couple of DVDs and subjectively analyzing them, nothing much more than that. I plan to work detailed encoding, decoding and TV-out quality tests in later reviews. Until then, we'll stick to a more general summary of how the card performed.

DVD playback was smooth with no frames being dropped and without any flaws that I could easily pick out. The DVD software was nothing special but easy to use nonetheless. Another thing to note about the Rage Fury MAXX on the video side of things is that there is no video-out option for those who like to play DVD movies through a PC onto their television set. This is a trivial thing but to some people this might be an issue.


Drivers

After dealing with the Rage Fury MAXX from early on, I watched its drivers emerge from very alpha quality to the robust ones I received in the retail package for review. The MAXX ran our full test suite without any issues, which is rare. The only oddity I found was with 3DMark 2000: many of the performance scores didn't seem to make sense when put in contrast with how the card performed in real-world applications. That's fine with me, being that 3DMark is a synthetic test, and the real world is where things really count. Here is what you can expect when checking out the ATI drivers for the MAXX.

device properties

I normally don't get into the device driver details of most cards but the MAXX has some unique things to it that I felt you might want to see. Notice that the driver actually detects both of the onboard graphics chips.

device properties details

When you check into the properties of the device driver, you will also find a few extra options. The information provided is mostly general information but is a welcome addition for people who are troubleshooting or just like to peek into what's going on.

device properties diagnostics

Clicking to the next tab, you'll find another feature unique to the Rage Fury MAXX. You are given the ability to low-level test some of the basic components on your video card, from the video memory to the DAC. Not too shabby.


Here is a very general monitor information and adjustment property window. The synchronization option isn't for disabling v-sync; that's in another window.

Drivers, Continued


Here is the standard color adjustment window that we see on basically all new video cards. This one isn't as confusing as some of the others I've seen: it has a single brightness slider to play with, versus others that offer brightness, contrast and gamma. Some might prefer the added control, but most people just want to adjust a basic brightness setting.

opengl options

Here we have all of our OpenGL settings in a window. Note that they've inserted a wait for vertical sync option.

D3D options

The D3D options window is very basic but covers a few of the basic things you may be interested in, especially wait for vertical sync.


For whatever reason, ATI gives you a rather large option screen that seems a bit overkill for one checkbox and a few lines of information. A bit of a waste if you ask me.


Another huge screen with a single option, one that is rather important for those having problems with their MAXX card. My guess is that this feature is available in case an issue arises with some software. Worst case, you'll still be able to use the software, although you'll probably give up some performance for the functionality.

Overall I would say the ATI drivers are adequate. You're not going to sit in the drivers section most of the time anyhow but when you do need to make an adjustment, things should be pretty painless. They offer most of the functionality needed in today's world of techies and tweakers. The only gripe I have about the driver is the lack of an overclocking utility but more on that later as to why it's probably not there.


Overclocking

Using PowerStrip I was able to play a bit with overclocking the MAXX but ran into some interesting issues. After a few minutes of testing, the card ran into frame-syncing problems no matter what setting I used. The memory seemed to handle the overclocking just fine, but adjusting the core speed caused this unfortunate problem. I guess we'll have to see what ATI does to fix it. Although they don't condone overclocking, I'm sure they won't want to upset the customers who do want to push the limits of their card. I'll have to talk with them about this and get back to you. Who knows, maybe ATI will release a driver with an overclocking utility that adjusts everything necessary to keep things "in sync" while overclocking. Of course, there's another possible explanation for this problem: it could be that PowerStrip is only overclocking one chip instead of both. Maybe Ashley, the magical man behind the great PowerStrip, can shed some light on this issue for us as well.

Benchmark Expectations

I expect to see some pretty kick-butt numbers in any of the tests at the higher resolutions due to the raw fill-rate power of the MAXX. Unfortunately, good performance at low resolutions won't be seen until ATI has a little more time to optimize their driver. High-color, high-resolution settings are where I expect the MAXX to pull the rug out from under the SDR GeForce. In theory the DDR GeForce should fall to the MAXX as well, but my gut feeling is that it won't. One last note: keep in mind that in T&L tests, the MAXX will only do as well as the CPU it's coupled with. In our tests we still stick with a very common CPU, the PIII 550, which won't fare well against the dedicated T&L engine in the GeForce cards. Let's put the MAXX to the test.

Benchmark Setup

Hardware Information
Motherboard (BIOS rev.) ABIT BX6 2.0 (BIOS date 7/13/99)
Memory 128 MB Viking PC100 CAS2
Network Netgear FA310TX
Driver Information
ATI Rage Fury MAXX 4.11.7925
3dfx Voodoo3 3500
Diamond Viper II
Reference NVIDIA GeForce/TNT2 Ultra drivers
Matrox G400 MAX with TurboGL 1.00.001
Environment Settings
OS Version Windows 98 SE 4.10.2222 A
DirectX Version 7.0
Quake 3 Arena Retail version
command line = +set cd_nocd 1 +set s_initsound 0
Shogo V2.14
Advanced Settings = disable sound, disable music, disable movies, disable joysticks,
enable optimized surfaces, enable triple buffering, enable single-pass multi-texturing
High Detail Settings = enabled
Fortress Demo
Descent III Retail version
Settings = -nosound -nomusic -nonetwork -timetest
3DMark 2000 16-bit settings = 16 bit textures, 16-bit Z-buffer, triple buffering
32-bit settings = 32-bit textures, 24-bit Z-buffer, triple buffering
TreeMark Simple = 35,000 polygons/4 lights
Complex = 129,000 polygons/6 lights

Benchmark Results - Shogo

Shogo - FORTRESS - 640x480x16 - DirectX 7

At 640x480 in Shogo, all the boards are about even. Although we see a few frames difference between the boards, they're all cruising along at this easy setting.

Shogo - FORTRESS - 1024x768x16 - DirectX 7

As we push things a bit harder, the Viper II starts to drop out of the competition but the MAXX keeps on the tail of the SDR GeForce board.

Shogo - FORTRESS - 1600x1200x16 - DirectX 7

Unfortunately the MAXX only keeps pace with the SDR GeForce and isn't able to pass at the highest Shogo resolution of 1600x1200. When compared to the DDR board, the MAXX is coming up short big time.

Benchmark Results - Descent 3 DirectX

Descent3 SECRET2.DEM - 640x480x16 - DirectX 7

Impressive: the MAXX nearly catches the GeForce boards even at this low resolution, where its newly released drivers should have one of their biggest problem areas. You'll also note that the TNT2 Ultra has the highest score. This is quite possible for two reasons: one, the TNT2 driver has been optimized heavily since it's been out longer, and two, we have a percent or two margin of error in these particular (low resolution) tests.

Descent3 SECRET2.DEM - 1024x768x16 - DirectX 7

Stepping the resolution up a notch, the MAXX still keeps a close pace behind the SDR GeForce board. I'm still a bit surprised that the MAXX didn't actually catch or pass it.

Descent3 SECRET2.DEM - 1600x1200x16 - DirectX 7

Now we're in Descent 3's highest DirectX setting and the MAXX still does fairly well. The thing to note here, however, is that the MAXX does not outperform the SDR GeForce. We're watching the GeForce competition closely because the SDR board is cheaper and the DDR board about the same price.

Benchmark Results - Descent 3 OpenGL

Descent3 SECRET2.DEM - 640x480x16 - OpenGL

Descent3 OpenGL is an interesting test because most cards seem to drop off very hard here. Most companies optimize their ICD for Quake 3 Arena and nothing else and that makes things very ugly for any other software. As you can see in the chart above, the MAXX does an excellent job even in OpenGL. It doesn't pass the GeForce competition but does much better than any other card in the line-up.

Descent3 SECRET2.DEM - 1024x768x16 - OpenGL

Things seem to get a little interesting as we press the resolution upward and the SDR GeForce loses some ground. You'll notice the DDR GeForce laughs at the changes and barely loses performance.

Descent3 SECRET2.DEM - 1600x1200x16 - OpenGL

Now that we've moved Descent 3 into its highest OpenGL setting, the MAXX pulls out a small victory over the SDR based GeForce. However, it still doesn't even touch the DDR GeForce solution.

Benchmark Results - Quake3 640x480

Quake3 Arena - Q3DEMO1 - Normal 640x480x16 - OpenGL

Ouch, even the much lower priced Viper II spanks the MAXX in this test. I know S3 has done a very good job at optimizing their drivers so maybe ATI can work some of the same magic. We'll see how things progress in higher colors and resolutions.

Quake3 Arena - Q3DEMO1 - High Quality 640x480x32 - OpenGL

Switching the video mode to higher quality 32-bit color didn't change things much other than the SDR GeForce taking a bit of a hit.

Benchmark Results - Quake3 1024x768

Quake3 Arena - Q3DEMO1 - Normal 1024x768x16 - OpenGL

As we begin to climb the resolution tree, the MAXX starts gaining some ground on its competition even in 16-bit mode.

Quake3 Arena - Q3DEMO1 - High Quality 1024x768x32 - OpenGL

Wow, look at that! The MAXX manages to jump past the Viper II and SDR GeForce board. Although it's still not as close as I'd expect it to be to the DDR GeForce, the MAXX does well.

Benchmark Results - Quake3 1600x1200

Quake3 Arena - Q3DEMO1 - Normal 1600x1200x16 - OpenGL

Things get a bit interesting as we hit the highest normal setting resolution. The MAXX basically ties the SDR GeForce and almost meets the 30 FPS barrier that I consider a minimum. Only the mighty DDR GeForce manages to surpass that barrier.

Quake3 Arena - Q3DEMO1 - High Quality 1600x1200x32 - OpenGL

These results are only good for analysis because none of the cards were able to clear the bare minimum of 30 FPS. You can see that the MAXX and Viper II have both skipped past the SDR GeForce. This shows how well the MAXX and Viper can deal with high color and resolution.

Benchmark Results - 3DMark 2000 3DMarks

I am very close to yanking our only synthetic benchmark from future reviews because every hardware company has issues with it. I get answers that make no sense at certain times, and when I check into the issues with the various hardware companies, they all blame the benchmark. One graphics company tells me one thing and another tells me something else, but it still isn't clear what's really going on. I can compare the results I see in this benchmark to real-world tests and they still make no sense. I would love to tell you the real story here, but I don't think these hardware companies would appreciate it. I feel like a parent that has to take away a marketing tool, I mean benchmark, because no one is playing fair. I've decided to show the results, but next time around I may not.

3DMark 2000 3DMarks - 1024x768x16 - DirectX 7

Clearly the GeForce cards are dominating the game benchmarks in 3DMark. At 16-bit color this isn't surprising, but the margin by which the GeForce cards lead is. I'm sure things will change when we peek at 32-bit color next.

3DMark 2000 3DMarks - 1024x768x32 - DirectX 7

Turning up the heat on our video cards by flipping on 32-bit textures and the higher 24-bit Z-buffer has changed things quite a bit. You'll notice that the Viper II took a huge hit, as did the SDR GeForce. The DDR GeForce, however, remained fairly unscathed.

Benchmark Results - 3DMark 2000 Fill-Rate (single texture)

3DMark 2000 Fill-Rate (single texture) - 1024x768x16 - DirectX 7

Now this is a test that I feel has some issues. I don't believe the MAXX fill-rate is as horrible as shown in this chart. S3 had some issues with 3DMark 2000 and gained huge performance numbers with an upgraded driver; I expect something similar is happening with the MAXX. We'll probably see more of the same when we switch to 32-bit mode.

3DMark 2000 Fill-Rate (single texture) - 1024x768x32 - DirectX 7

Things look a bit better, but still odd, in this set of results. One thing to note, though, is how thoroughly the DDR GeForce dominates everyone. I have a feeling the MAXX has some issues with these fill-rate tests.

Benchmark Results - 3DMark 2000 Fill-Rate (quad texture)

3DMark 2000 Fill-Rate (quad texture) - 1024x768x16 - DirectX 7

Here we have just more proof that something is goofed up. There is one nice little thing to point out about the GeForce boards however. Notice how both boards are pretty even. Watch what happens when we switch to 32-bit mode.

3DMark 2000 Fill-Rate (quad texture) - 1024x768x32 - DirectX 7

Notice how hard the SDR GeForce stumbles when we switch to 32-bit textures. It's unfortunate that we couldn't really get much information from the fill-rate tests, but I hope to see ATI work something out in the next driver release.

Benchmark Results - 3DMark 2000 High Polygon Count (4 lights)

3DMark 2000 High Polygon Count (4 lights) - 1024x768x16 - DirectX 7

Yet another headache is the polygon tests in 3DMark 2000. I'm going to have to take time to figure out what's going on here. Keep in mind that all the non-T&L video cards are using the CPU to crunch the numbers, while the GeForce boards have a much faster geometry engine inside. Many people have speculated that the GeForce T&L unit isn't as fast as it's made out to be, but I totally disagree that it should be anywhere near the performance of a standard Viper II or TNT2 Ultra board. Something is fishy here. Let's take a look at the 32-bit side of things.

3DMark 2000 High Polygon Count (4 lights) - 1024x768x32 - DirectX 7

As we saw in the previous chart, the same issues are still there. I can see the MAXX and Viper II picking up some ground but without a T&L unit, these cards should stand no chance.

Benchmark Results - TreeMark

NVIDIA TreeMark - Simple - 35,000 polygons / 4 lights

Here we have NVIDIA's OpenGL benchmark, TreeMark. TreeMark is a high polygon count test that stresses the T&L capabilities of a graphics card or CPU. I ran this test to show you the difference in performance you might see if a game is highly tuned for T&L but you have a high fill-rate video card that depends on the CPU to handle the T&L requirements.

Here you can see that the GeForce cards obviously have the advantage. I understand that this is a bit extreme being that NVIDIA created this benchmark to show what T&L can do versus non-T&L cards but you can get an idea of what might happen in the future with upcoming T&L enhanced titles. You'll notice that the MAXX must have a pretty optimized driver to pull off one of the higher scores in the bench. Although it didn't come close to the SDR GeForce, it manages the 3rd highest score.

NVIDIA TreeMark - Complex - 129,000 polygons / 6 lights

Now we're making things really tough. Even the GeForce cards bite hard at this test, scoring only about 12 FPS, while the MAXX and the rest of the competition sit in the 2 FPS or less area.
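Converting those frame rates into polygon throughput makes the T&L gap concrete, using the scene size from the test settings and the approximate frame rates quoted above:

```python
# Rough polygon throughput implied by the complex TreeMark scene
# (129,000 polygons per frame, frame rates as quoted in the text).

def polys_per_sec(scene_polys, fps):
    """Polygons processed per second at a given frame rate."""
    return scene_polys * fps

geforce_rate = polys_per_sec(129_000, 12)  # about 1.55 million polys/sec
cpu_rate = polys_per_sec(129_000, 2)       # about 258,000 polys/sec
```

Roughly a sixfold difference in geometry throughput, even with a fast PIII 550 doing the math for the non-T&L boards.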

My overall hardware performance opinion of the MAXX (aside from some of the goofball data from 3DMark) is that with its current drivers, it's an SDR GeForce contender at best. It shines in 32-bit, high-resolution situations and falls short in low-resolution 16-bit areas. With improved drivers, I feel this card could possibly surpass the SDR GeForce. Until that point, however, the MAXX remains on par with, if not slightly behind, the SDR GeForce based board.


Conclusion

The ATI Rage Fury MAXX will be shipping in limited quantities by the time you read this, selling for $299 (USD) with a $30 mail-in rebate, for an effective $269. The card offers solid fill-rate matched by only a few of its competitors, a solid driver that shows promise, quality DVD playback and a name that many have grown to trust over the years. The MAXX has a few factors that will keep it from selling like hot cakes, however.

At this point in time, ATI must deal with some incredible competition, especially at the price they're asking for the MAXX. Not only are they deep into the SDR GeForce price range, they're also nestled right below the DDR GeForce based boards. This means the MAXX had better offer high fill-rates, T&L or something else compelling to make consumers choose the ATI Rage Fury MAXX over a GeForce. ATI has many faithful customers, and I'm sure the initial shipment will sell well on that account, but to continue selling boards I feel they must decrease the price and pump up those drivers to match, if not beat, the SDR based GeForce.

My final verdict on this card is that there are better choices at and below the price you'll pay for this product. If you want something a tad more expensive, you can go with one of the few boards based on the DDR GeForce solution and you'll be very happy. If you are a starving student who cares about price, then you're better off saving your lunch money for a TNT2 Ultra or SDR GeForce card. The MAXX comes up a little too short at a little too much money for me to give it my blessing.
