The great graphics showdown

Edit 19/12/16: I eventually managed to get a standard CPU result with the Matrox G100. With that, the score goes all the way up to 560! Amazing (not really)! However, for some reason, the CPU still runs slower at times. Occasionally, restarting the computer fixes it, but not every time. I’m not sure what causes it, but for fairness’ sake (and also laziness), I’ll leave the old results untouched.

Original: It took a while, but my 3DMark 99 marathon is finally over. I thought I would share the results (for posterity, you know) and also drop in a few comments about each card, seeing as how data doesn’t mean much without explanations.

Let me preface this wall of text by mentioning my system specs:

– Pentium III 450MHz
– 128MB SDRAM PC-100
– Windows 98

And the default settings for 3DMark 99 are:

– 800×600 screen resolution
– 16-bit color depth
– 16-bit Z-buffer
– Triple Buffering

These settings require 3.67MB of video memory (0.92MB for each buffer, including depth), so you’ll need at least a 4MB card to run the default test. And even then, you won’t have a lot of memory left for textures. That’s why some of the tests are run at 640×480 with double buffering. That way, you only need 1.77MB, and even a 2MB card can make it… however badly.
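For the curious, the arithmetic behind those numbers can be sketched like this (a 16-bit pixel takes 2 bytes; 3DMark’s own rounding comes out a hundredth higher in places):

```python
def buffer_mib(width, height, bits_per_pixel):
    """Size of a single color or depth buffer in MiB."""
    return width * height * (bits_per_pixel // 8) / (1024 * 1024)

# Default settings: 800x600, 16-bit color, 16-bit Z, triple buffering
# -> 3 color buffers + 1 depth buffer, all the same size.
per_buffer = buffer_mib(800, 600, 16)   # ~0.92 MiB each
default_total = 4 * per_buffer          # ~3.66 MiB -> needs a 4MB card

# Fallback settings: 640x480, double buffering -> 2 color + 1 depth.
fallback_total = 3 * buffer_mib(640, 480, 16)  # ~1.76 MiB -> fits in 2MB
```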

I also realize that my CPU is both overpowered for some of the oldest cards in here, and underpowered for some of the more powerful ones. Ideally, you’d want at least a good 600MHz to take advantage of a Voodoo 3 and G400, not to mention the GF2 GTS, which would be best served by at least 900MHz. I still have a faster P3-700 tucked away. But I didn’t want the CPU to influence the results too much on the slower cards’ side, so I thought this was best.

But enough of my yapping. Let’s have some results.

Two days’ work. And at least 3DMark built the graphs for me. Quake 2 and Forsaken won’t make it so easy.

Well, there are certainly a few weird things here. Even at a glance, you might notice some decidedly unexpected results. Let’s take a look at each, starting from the bottom.

S3 Virge/DX 2MB

The good old graphics decelerator, is there any way in which it won’t disappoint us? Well, admittedly, the feature set is fairly complete, including mipmaps and trilinear filtering. The lack of score is not due to the horribly low results, but because 3DMark would crash whenever I tried to run the 2nd test (I guess 2MB is really not enough…), forcing me to disable it, thus invalidating the score. But judging from other intermediate results, I’d be surprised if this card went above 150 or so. In truth, I’m amazed that textures are actually displayed correctly in the first game test. Most 4MB cards would simply blank them. But what’s the point, when it ends up running at roughly 2fps?

S3 Trio 3D 4MB

I’ve already mentioned this card before, and now you see the numbers. There are actually two results here, one at default settings and one at 640×480 Double Buffered (like the Virge), just to see if it made a difference. As you can see, it did – in an unexpected manner. Since the card doesn’t even bother trying to render most of the tests in 800×600, the score turns out higher. The dithering is some of the worst I’ve ever seen, too.

OK, the bottom two are pretty bad, but for the record, not even Matrox could reach reference quality until the G400.

Ati 3D Rage IIc 8MB

Yes, this is actually an AGP card. No, I don’t think it has anything other than electrical compatibility. It gets some of the worst results alongside the Virge and Trio. Bilinear filtering is ridiculously inefficient (see below) and looks terrible (see above). Mipmapping doesn’t work, so of course there’s no real trilinear support. At least the on-board 8MB allows it to run textures at their highest resolution. That much memory feels wasted on this kind of card, really.

Aside from S3 and its general issues, the Rage IIc really can’t seem to do bilinear filtering without losing a lot of performance. I tested it in Forsaken too.

Matrox G100 Productiva 4MB

What a strange card. It lacks alpha blending, subpixel precision is horrible, and so is bilinear filtering. The chip seems derived straight from the Mystique, but overall improved AND gimped. It’s faster, textures actually work most of the time, and it supports bilinear and mipmapping, however badly. On the other hand, alpha stippling is perhaps even worse than the Mystique’s, and overall it’s still pretty crappy. Strangely, the card scores higher in multitexturing, but since the Mystique and a few other old cards (such as the Virge) do the same, I’d rather think of it as a bug. After all, even the G200 didn’t have it. I should also mention another thing: this is the only card to get a lower score in the CPU test. It doesn’t make any sense, because that test doesn’t depend on the graphics card. Every other card obtains the same score (roughly 7000), give or take a few points. Why does the G100 only get 5200 points? I really can’t say.

Unlike the Mystique, the G100 at least manages to (almost) properly render textures. Never mind the lack of alpha blending. Also look at the bottom left, and you will see a curious lack of mipmapping from the Rage LT Pro.

Ati Rage LT Pro 8MB

We’re finally leaving Horribleville to get into something decent. The LT Pro was still almost a mobile chipset, so scores are overall very low, but the tests all run properly. With one caveat: mipmapping doesn’t seem to work. I could expect that from the Rage IIc, but from something off the Pro series? How strange. The card even gets a small boost with multitexturing. Bilinear quality is also a notch above the Rage IIc.

Matrox Mystique 170 4MB

A step to the side. I said we were out of the horrible territory? Sorry, there was still this one last obstacle. This card utterly fails almost every test, rendering a black screen instead. Hence, it gets fast framerates. Hence, it gets a high score. And that’s why you shouldn’t trust synthetic benchmarks too much. Textures are completely absent in the game tests, so why do we get 4x the score in multitexturing? Who knows. At this point, I’m convinced it’s simply a bug.

S3 Savage 3D 8MB

Okay, now it gets better for real. The Savage 3D is quite good, although at the time it was hampered by poor yields and immature drivers. Whichever version you choose, you are bound to have problems. I’m using an engineering release from 1999, which does help in OpenGL games, but also causes issues in other games like Forsaken. 3DMark works though, so there’s that. And it aces every quality test too: that’s quite something. On the other hand, 3D scores are low across the board, it lacks multitexturing, and texturing speed drops to an utter crawl when the card runs out of local and non-local memory. Really, other cards don’t drop as much. I wonder why it’s so bad here. But overall, a somewhat usable card for the time, if you didn’t mind juggling drivers for every game.


The Savage 3D renders textures quickly, as long as it doesn’t run out of memory. When it does, it’s not a pretty sight. Never mind the Mystique and Trio scores, since they respectively show a black screen and render only a small part of the texture.

Intel 740 8MB

Ah, Intel’s famous failed experiment. Use the on-board memory as framebuffer alone, and rely on system memory for textures! What could possibly go wrong? It at least ended up as the base for their GMA line, so that’s not too bad… wait, the GMA line sucked. Oh well. Framerates are bad and fillrate is really low, 48MT/s, comparable to the Rage LT Pro with multitexturing (which the I740 doesn’t have) – and don’t forget the LT Pro was supposed to be a mobile chipset. But image quality is great. For some reason there’s no 32-bit support at all, but given the overall speed, it would have been unusable anyway.

Matrox G200A 8MB

The G200 was the first actually decent card by Matrox. It supported OpenGL for a start, though speed was still the worst, and I saw some bugs too (performance in Quake 2 suddenly drops by about 25-30% after a few repeated tests, and I don’t think it’s throttling). But at least it’s kinda fast. Bilinear is not that great yet, but it’s a step above the G100 for sure, and at the time only S3 and 3dfx did better. Fillrate is lower than the Savage 3D, but game speed is faster. Trilinear is somewhat slow though. Not a bad choice overall. The G200A, which I have, is simply a die-shrink which allowed the card to run without a heatsink. But it gets kinda hot, I must admit.

Nvidia TNT 8MB (Vanta?)

The most mysterious among my cards, this one is recognized by both Powerstrip and the system itself (including the BIOS) as a regular TNT. But it’s too slow, and its clock is a mere 80MHz. However, it clearly supports multitexturing, and not badly either (almost doubling from 60MT/s to 110MT/s). Overall, I’m not sure what it is. I have a feeling it might be either a downclocked OEM version, or some kind of Vanta. But did it really need active cooling? Well, image quality is great at least.

Matrox G250 8MB

Not much to say here. It’s an overclocked G200A, no more, no less. With frequencies at 105/140MHz, compared to its predecessor’s 84/112MHz, scores should be roughly 25% higher across the board – and lo, they are. In spite of the higher clocks, it doesn’t have a heatsink. According to Wikipedia, the card itself should be clocked at a maximum of 96/128MHz, but my model defaults higher. That’s why I’ve just edited the page. Hehehe.
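The 25% figure checks out exactly from the clocks alone (numbers as reported by my cards):

```python
# Core/memory clocks in MHz: G200A at stock, G250 as this card reports them.
g200a_core, g200a_mem = 84, 112
g250_core, g250_mem = 105, 140

core_gain = g250_core / g200a_core - 1  # exactly +25%
mem_gain = g250_mem / g200a_mem - 1     # exactly +25%
```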

Matrox G550 32MB

Wow, that’s quite the jump. I wanted to talk about the G400 before this one, since the G550 is essentially a die shrink with an extra TMU per pixel pipeline, and DDR-64 memory instead of SDR-128. Sounds better? Well, the DDR memory actually makes it slower overall, and the extra TMU per pixel pipeline (for a total of 4 TMUs) doesn’t help all that much. I can’t check the frequencies because Powerstrip doesn’t recognize the card, but looking at that single-texturing fillrate, they should be the same as the G400. Multitexturing only gains about 20%. Quality is great, but overall this card seems pretty unstable under Windows 98, at least on the latest drivers, so I wouldn’t bother.

Nvidia Geforce 2 GTS 32MB

Holy crap. Why is this one so low? Various test results are ridiculously high, multitexturing fillrate is almost 3 times better than the second highest score (freaking 840MT/s!). So why? Well, it seems that 3DMark 99 places a bit too much emphasis on the game tests. And for some reason, the Geforce 2 doesn’t do all that well in those two tests, even though it stomps all over the other ones. Nothing we can do about that, I’m afraid, other than shake our heads in disappointment at MadOnion (Futuremark now). Quality is expectedly great.


Compare the top image to the bottom one, and see how some graphics cards supported multi-texturing to increase fillrate. The scaling was rarely perfectly parallel, with a few exceptions. But the GTS almost doubles its score with its 8 (!) TMUs over 4 pixel pipelines. Don’t dwell too much on the bottom cards, I bet they are bugged. There’s no way a Virge/DX supports multitexturing.
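Theoretical peak fillrate is just clock × pipelines × TMUs per pipeline, which also shows how far measured numbers fall short of paper specs. The 200MHz GTS core clock below is the commonly quoted reference figure, not something 3DMark reports:

```python
def peak_fillrate_mt(clock_mhz, pipelines, tmus_per_pipe):
    """Theoretical peak texel fillrate in megatexels per second."""
    return clock_mhz * pipelines * tmus_per_pipe

# GeForce 2 GTS: 4 pixel pipelines with 2 TMUs each.
gts_peak = peak_fillrate_mt(200, 4, 2)  # 1600 MT/s on paper
# 3DMark 99 measures ~840 MT/s here, roughly half the theoretical peak.
```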

3dfx Voodoo 3 3000 16MB

A pretty common card for several reasons: fast, good quality overall, Glide support and non-horrible OpenGL. Still, it has its issues. According to 3DMark (yeah, yeah) its texture memory is limited to 5MB. Probably wrong – it must be checking the maximum available framebuffer size for my monitor (1280×1024) and compensating for it. Admittedly however, I’ve seen in the past that the card suffers from some strangely blurry textures, and Tonic Trouble doesn’t look its best either. Multitexturing on the Voodoo 3 is still limited to games that specifically support the feature: otherwise your fillrate is cut in half. And yes, single-texturing is indeed slower than the G400. Quality is great, except for textures, and trilinear filtering is still pretty slow. Anyway, one has to wonder if 3dfx shouldn’t have at least featured AGP memory. Too late now.

When your powerful card draws worse textures than the Rage IIc, you know there’s a problem. Parity with the Trio 3D sounds about fine… not.

Ati Rage 128 GL 16MB

I acquired this card just recently, and it was the main reason for attempting this whole endeavor in the first place. It is both good and bad. Ati still stuck with their bad bilinear filtering implementation (in 1999?), but at least performance doesn’t suffer as much as the older ones. It’s pretty fast overall, but fillrate is not that good, and multitexturing is also pretty inefficient. Not a bad choice, but why use this when there’s better stuff around? Still, probably Ati’s first truly viable chip.

Nvidia TNT2 M64 32MB

One of the most common graphics cards around, and thus one of the easiest to find on auction sites and at garage sales. Quality is good, as expected of something that came out fairly late, and overall it doesn’t lack anything, although it doesn’t excel at anything either. Quite the “boring” card for sure. Multi-texturing could be more efficient.

Matrox G400 32MB

The current champion, at least according to this test. It has great image quality (including good bilinear from Matrox for the first time), it’s fast, contains lots of memory on board and even some extra non-local memory, it’s better than the DDR-64 version, and its parallel pixel pipelines with one TMU each meant there was no performance deficit if the game didn’t support multi-texturing, as the card simply single-textured at twice the rate. Impressive. Even OpenGL support wasn’t all that bad. Of course, the Geforce 2 destroys it in real life scenarios, and the Voodoo 3 is a bit faster in truth. But we’re talking about 3DMark here, and the G400 wins. Not to mention, the Geforce 2 doesn’t run as well with DOS games. So there you go.

That’s a lot of cards. Too bad I don’t have a PowerVR anymore, and my Voodoo is broken, and so are my Voodoo 2 and Banshee… damn, old stuff really is fragile. And expensive. Did you know a Voodoo 4 will easily run you over 100 euros? I’m afraid it’s true. So I need to aim lower. For my next card, I’d like to get a proper TNT2 Pro, not an M64 this time… and maybe also something older like a Trident or SiS.