
S3 Trio3D/2X – The Cream of the Crop?

Thanks to an auction which ended very well for me (no doubt nobody else wanted this piece of crap), I managed to get my hands on an S3 Trio3D/2X AGP 4MB. This was pretty much the last card from the Virge/Trio line that I wanted. I’m still missing some Virge models; however, I do have the original and the DX, and the other ones are really just overclocks and whatnot. On the other hand, with the 3D/2X, S3 apparently made a few changes from the previous chip. But just how many? And does it even matter, when the starting point was so bad?

Monolith tells it like it is.

At least it has one thing going for it. This is the first card among the old S3 line-up to complete a 3DMark 99 run without any particular issues. Sure, the framerate is abysmal, but everything looks as it should. Well, almost everything: 3-pass embossing only shows a white texture, and there’s still a strange pattern in the filtering test. But at least it works.

Motoracer looks decent, but there are some transparency issues. At least it runs a lot faster than the Virge. Until you brake.

And since it works, it’s the first time I can actually trust its final score. Alas, with a miserable 218, it’s the lowest score yet. I assume only a few 1st-gen accelerators could go lower. Even the Rage IIc is a lot faster, and that card was terrible – I’ll take its perspective correction issues over the Trio any day. And let’s not even bother with the 6326: comparing the Trio to that is like bringing a knife to a gunfight.

In general, it seems the card is more compatible than the previous Trio, and it also sports better AGP features – if it’s able to show all the textures in 3DMark, there must be some kind of AGP texturing (as also shown by Final Reality), although its performance seems to indicate it’s very slow. Then again, maybe an 8MB model would have been just as bad. Can’t say.

Dash, what happened to your face?

Surprisingly, however, it appears to be a little slower than the Trio. Despite the improved architecture, my framerates appear to be roughly 10-15% lower. The difference decreases as we go up to 800×600. I imagine that, if the architecture wasn’t made any faster but simply more compatible, then having to render all the effects properly would take a toll on the card.

Another strange thing is trilinear filtering – the original Trio actually tried to do it, although it cut speed in half (and bilinear already cut speed by almost two thirds, so trilinear ran at roughly a sixth of the unfiltered rate). The 2X doesn’t even bother: it’s technically supported, but it looks exactly like bilinear and there’s no speed drop anymore.
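
To spell out that math, here’s a back-of-the-envelope sketch in Python (using only the rough ratios quoted above – these are not measured figures):

    unfiltered = 1.0            # point-sampled throughput, as the baseline
    bilinear = unfiltered / 3   # "almost two thirds" slower: about a third of baseline
    trilinear = bilinear / 2    # trilinear halved it again on the original Trio
    print(trilinear)            # ~0.17, i.e. roughly a sixth of unfiltered speed

Which makes sense: trilinear is essentially two bilinear samples (one per mipmap level) blended together, so doubling the work and halving the rate is exactly what you’d expect.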

So to answer my first question: does it even matter if the architecture has been improved? I guess not. This card was the equivalent of installing some snazzy rear-view mirrors and windscreen wipers on the original Fiat 500. It wasn’t gonna make it any faster, and you were still stuck with an antique that would have made any Homo erectus proud.


The Bilinear Dream

I’ve recently acquired an S3 Virge. Yes, that’s right, the very first Virge. While I already had the DX version, I know that the 3D engine was improved for that card, so I wanted to see what kind of framerates I could really get from the original decelerator. Also, as my DX has merely 2MB of video memory, I went out of my way to find a 4MB Virge this time, so I could test a few games all the way up to 800×600 (why I would want to suffer like that, nobody knows), or at least run certain tests without the constant threat of dropped textures.

Can’t win them all, of course. In Half-Life, the card outright refuses to texture the environment, even at 320×240… so I just took this picture at 640×480 for the extra crispness.

It should be pointed out that its popular nickname might be something of a misnomer, as the card did indeed “accelerate” rendering. Back then, it was sometimes hard to get even 320×240 to run well in software mode. If you stuck a Virge in there, kept the resolution low, and didn’t bother with filtering or anything else, your game was gonna look a lot like software rendering… but faster.

The only way to run Blood 2 properly: at 320×240 on Low settings, with lightmaps disabled. Even the manual says so.

Unfortunately, many S3-specific games wanted to run at 512×384 or higher, with bilinear filtering. Texture filtering! On a card that, by most accounts, needs 8 cycles for bilinearly filtered texels! Those engineers must have seriously overestimated their chances. It would be a bit like trying to run a 4K game on a regular Xbox One.
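
For context, here’s what bilinear filtering actually asks of the hardware: four texel fetches plus three interpolations per output pixel. A minimal sketch in Python (illustrative only, with a single-channel texture – certainly not S3’s actual pipeline):

    def bilinear_sample(tex, u, v):
        # tex: 2D list of texel values; u, v: texel-space coordinates
        x0, y0 = int(u), int(v)
        x1, y1 = x0 + 1, y0 + 1
        fx, fy = u - x0, v - y0
        # four texel fetches -- the memory traffic that ate the Virge's cycles
        t00, t10 = tex[y0][x0], tex[y0][x1]
        t01, t11 = tex[y1][x0], tex[y1][x1]
        # three linear interpolations: two horizontal, one vertical
        top = t00 + (t10 - t00) * fx
        bottom = t01 + (t11 - t01) * fx
        return top + (bottom - top) * fy

    # e.g. sampling halfway between four texels of a 2x2 texture
    print(bilinear_sample([[0, 10], [20, 30]], 0.5, 0.5))  # 15.0

Four reads and a pile of multiplies for every single pixel, on a mid-90s budget chip. No wonder it crawled.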

Tough luck, CPU guys: Turok requires a 3D card. It runs decently at 320×240, as long as you disable texture filtering and fog and fancy stuff and… just about everything else… you know, maybe you should just take your chances with an N64.

Enabling bilinear filtering incurs a lower framerate (those numbers up there don’t tell the whole story anyway), and also an unneeded look at S3’s dithering solution. Hardly the worst I’ve ever seen, at least.

Steer clear of any fancy effects, and your games might actually run a bit faster than before. After all, I have a P3, but most people at the time were stuck with, er, 133MHz? A Virge was bound to help a little bit in D3D-compatible games. Alas, many D3D games didn’t give you many settings to play around with, either.

It’s surprising how much the extra memory helps. Some games actually run better than on the DX, despite the less optimized (eheh) architecture, because the card isn’t choking under the weight of the textures.

Dithering was annoying, but you had to deal with it, at least in 16-bit mode. Worth noting that, since the Virge actually renders in 15-bit color (5:5:5), there’s a bit more color banding than in the average 16-bit game.
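
To see where the banding comes from, here’s a quick sketch of the quantization involved (hypothetical code, just to illustrate the arithmetic):

    def to_rgb555(r, g, b):
        # keep only the top 5 bits of each 8-bit channel: 32 shades instead of 256
        return (r >> 3, g >> 3, b >> 3)

    # a smooth 256-step gradient collapses into far fewer distinct values
    levels = {to_rgb555(r, 0, 0)[0] for r in range(256)}
    print(len(levels))  # 32

With only 32 intensity levels per channel, smooth gradients turn into visible stripes, and dithering is the usual trick to hide them.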

If you have the guts to enable 24-bit rendering on your damn Virge, quality improves considerably. Of course, the card was already slow enough. But in truth, I didn’t see all that much of a difference at 320×240. Then again, very few of these older games let you pick anything higher than 16 bits anyway.

Could it have been better? Oh yes. But if you consider the average 3D card at the time (did someone say NV1, AT3D and 3D Rage?), it was actually okay-ish. I like that it tries to go for high-fidelity bilinear filtering, although it was probably a bad idea performance-wise. A lower-quality solution like Ati’s might have been better in retrospect, though it should be noted that Ati’s own bilinear incurred even worse performance drops, even on the Rage II.

I’m going to try a few S3-specific games next, although some of them were only available on DOS, so it will be hard to get pictures.

The great graphics showdown

Edit 19/12/16: I eventually managed to get a standard CPU result with the Matrox G100. With that, the score goes all the way up to 560! Amazing (not really)! However, for some reason, the CPU still runs slower at times. Occasionally, restarting the computer fixes it, but not every time. I’m not sure what causes it, but for fairness’ sake (and also laziness), I’ll leave the old results untouched.

Original: It took a while, but my 3DMark 99 marathon is finally over. I thought I would share the results (for posterity, you know) and also drop in a few comments about each card, seeing as how data doesn’t mean much without explanations.

Let me preface this wall of text by mentioning my system specs:

– Pentium III 450MHz
– 128MB SDRAM PC-100
– Windows 98

And the default settings for 3DMark 99 are:

– 800×600 screen resolution
– 16-bit color depth
– 16-bit Z-buffer
– Triple Buffering

These settings require 3.67MB of video memory (0.92MB for each buffer, including depth), therefore you’ll need at least a 4MB card to run the default test. And even then, you won’t have a lot of memory left for textures. That’s why some of the tests are run at 640×480 with double buffering. That way, you only need 1.77MB, and even a 2MB card can make it… however badly.
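
As a sanity check on those figures, the arithmetic in sketch form (assuming each 16-bit buffer takes width × height × 2 bytes, which matches the totals above):

    def vram_needed_mb(width, height, color_buffers, bytes_per_pixel=2):
        # color buffers plus one Z-buffer, all at the same bit depth
        buffer_mb = width * height * bytes_per_pixel / 1024**2
        return (color_buffers + 1) * buffer_mb

    print(vram_needed_mb(800, 600, 3))  # triple buffering: ~3.66 MB
    print(vram_needed_mb(640, 480, 2))  # double buffering: ~1.76 MB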

I also realize that my CPU is both overpowered for some of the oldest cards in there, and underpowered for some of the more powerful ones. Ideally, you’d want at least a good 600MHz to take advantage of a Voodoo 3 or G400, not to mention the GF2 GTS, which would be best served by 900MHz at least. I still have a faster P3-700 tucked away. But I didn’t want the CPU to influence the results too much on the slower cards’ side, so I thought this was best.

But enough of my yapping. Let’s have some results.

Two days’ work. And at least 3DMark built the graphs for me. Quake 2 and Forsaken won’t make it so easy.

Well, there are certainly a few weird things here. Even at a glance, you might notice some decidedly unexpected results. Let’s take a look at each, starting from the bottom.

S3 Virge/DX 2MB

The good old graphics decelerator: is there any way in which it won’t disappoint us? Well, admittedly, the feature set is fairly complete, including mipmaps and trilinear filtering. The missing score is not due to the horribly low results, but to 3DMark crashing whenever I tried to run the 2nd test (I guess 2MB is really not enough…), forcing me to disable it, thus invalidating the score. But judging from the other intermediate results, I’d be surprised if this card went above 150 or so. In truth, I’m amazed that textures are actually displayed correctly in the first game test. Most 4MB cards would simply blank them. But what’s the point, when it ends up running at roughly 2fps?

S3 Trio 3D 4MB

I’ve already mentioned this card before, and now you see the numbers. There are actually two results here, one at default settings and one at 640×480 double-buffered (like the Virge), just to see if it made a difference. As you can see, it did – in an unexpected manner. Since the card doesn’t even bother trying to render most of the tests at 800×600, the score turns out higher. The dithering is some of the worst I’ve ever seen, too.

Ok, the bottom two are pretty bad, but for the record, not even Matrox could reach reference quality until the G400.

Ati 3D Rage IIc 8MB

Yes, this is actually an AGP card. No, I don’t think it has anything other than electrical compatibility. It gets some of the worst results alongside the Virge and Trio. Bilinear filtering is ridiculously inefficient (see below) and looks terrible (see above). Mipmapping doesn’t work, so of course there’s no real trilinear support. At least the on-board 8MB allows it to run textures at their highest resolution. That much memory feels wasted on this kind of card, really.

Aside from S3 and its general issues, the Rage IIc really can’t seem to do bilinear filtering without losing a lot of performance. I tested it in Forsaken too.

Matrox G100 Productiva 4MB

What a strange card. It lacks alpha blending, its subpixel precision is horrible, and so is its bilinear filtering. The chip seems derived straight from the Mystique, but overall improved AND gimped. It’s faster, textures actually work most of the time, and it supports bilinear and mipmapping, however badly. On the other hand, alpha stippling is perhaps even worse than on the Mystique, and overall it’s still pretty crappy. Strangely, the card scores higher in multitexturing, but since the Mystique and a few other old cards (such as the Virge) do the same, I’d rather think of it as a bug. After all, even the G200 didn’t have multitexturing. I should also mention another thing: this is the only card to get a lower score in the CPU test. It doesn’t make any sense, because that test doesn’t depend on the graphics card. Every other card obtains the same score (roughly 7000), give or take a few points. Why does the G100 only get 5200 points? I really can’t say.

Unlike the Mystique, the G100 at least manages to (almost) properly render textures. Never mind the lack of alpha blending. Also look at the bottom left, and you will see a curious lack of mipmapping from the Rage LT Pro.

Ati Rage LT Pro 8MB

We’re finally leaving Horribleville to get into something decent. The LT Pro was still almost a mobile chipset, so scores are overall very low, but the tests all run properly. With one caveat: mipmapping doesn’t seem to work. I could expect that from the Rage IIc, but from something off the Pro series? How strange. The card even gets a small boost with multitexturing. Bilinear quality is also a notch above the Rage IIc.

Matrox Mystique 170 4MB

A step to the side. I said we were out of the horrible territory? Sorry, there was still this one last obstacle. This card utterly fails almost every test, rendering a black screen instead. Hence, it gets fast framerates. Hence, it gets a high score. And that’s why you shouldn’t trust synthetic benchmarks too much. Textures are completely absent in the game tests, so why do we get 4x the score in multitexturing? Who knows. At this point, I’m convinced it’s simply a bug.

S3 Savage 3D 8MB

Okay, now it gets better for real. The Savage 3D is quite good, although at the time it was hampered by poor yields and immature drivers. Whichever version you choose, you are bound to have problems. I’m using an engineering release from 1999, which does help in OpenGL games, but also causes issues in other games like Forsaken. 3DMark works though, so there’s that. And it aces every quality test too: that’s quite something. On the other hand, 3D scores are low across the board, it lacks multitexturing, and texturing speed drops to an utter crawl when the card runs out of local and non-local memory. Really, other cards don’t drop as much. I wonder why it’s so bad here. But overall, a somewhat usable card for the time, if you didn’t mind juggling drivers for every game.

The Savage 3D renders textures quickly, as long as it doesn’t run out of memory. When it does, it’s not a pretty sight. Never mind the Mystique and Trio scores: the former shows a black screen, and the latter renders only a small part of the texture.

Intel 740 8MB

Ah, Intel’s famous failed experiment. Use the on-board memory for the framebuffer alone, and rely on system memory for textures! What could possibly go wrong? It at least ended up as the base for their GMA line, so that’s not too bad… wait, the GMA line sucked. Oh well. Framerates are bad, and the fillrate is really low: 48MT/s, comparable to the Rage LT Pro with multitexturing (which the i740 doesn’t have), and don’t forget that was supposed to be a mobile chipset. But image quality is great. For some reason there’s no 32-bit support at all, but given the overall speed, it would have been unusable anyway.

Matrox G200A 8MB

The G200 was the first actually decent card from Matrox. It supported OpenGL for a start, though its OpenGL speed was still the worst of the bunch, and I saw some bugs too (performance in Quake 2 suddenly drops by about 25-30% after a few repeated tests, and I don’t think it’s throttling). But at least it’s kinda fast. Bilinear is not that great yet, but it’s a step above the G100 for sure, and at the time only S3 and 3dfx did better. Fillrate is lower than the Savage 3D’s, but game speed is faster. Trilinear is somewhat slow though. Not a bad choice overall. The G200A, which I have, is simply a die shrink that allowed the card to run without a heatsink. But it gets kinda hot, I must admit.

Nvidia TNT 8MB (Vanta?)

The most mysterious among my cards, this one is recognized by both Powerstrip and the system itself (including the BIOS) as a regular TNT. But it’s too slow, and its clock is a mere 80MHz. However, it clearly supports multitexturing, and not badly either (almost doubling from 60MT/s to 110MT/s). Overall, I’m not sure what it is. I have a feeling that it might be either a downclocked OEM version, or some kind of Vanta. But did it really need active cooling? Well, image quality is great at least.

Matrox G250 8MB

Not much to say here. It’s an overclocked G200A, no more, no less. With frequencies at 105/140MHz, compared to its predecessor’s 84/112MHz, scores should be roughly 25% higher across the board – and lo, they are. In spite of the higher clocks, it doesn’t have a heatsink. According to Wikipedia, the card itself should be clocked at a maximum of 96/128MHz, but my model defaults higher. That’s why I’ve just edited the page. Hehehe.

Matrox G550 32MB

Wow, that’s quite the jump. I wanted to talk about the G400 before this one, since the G550 is essentially a die shrink with an extra TMU per pixel pipeline, and 64-bit DDR memory instead of 128-bit SDR. Sounds better? Well, the DDR memory actually makes it slower overall, and the extra TMU per pipeline (for a total of 4 TMUs) doesn’t help all that much. I can’t check the frequencies because Powerstrip doesn’t recognize the card, but looking at that single-texturing fillrate, they should be the same as the G400’s. Multitexturing only gains about 20%. Quality is great, but overall this card seems pretty unstable under Windows 98, at least on the latest drivers, so I wouldn’t bother.

Nvidia Geforce 2 GTS 32MB

Holy crap. Why is this one so low? Various test results are ridiculously high, multitexturing fillrate is almost 3 times better than the second highest score (freaking 840MT/s!). So why? Well, it seems that 3DMark 99 places a bit too much emphasis on the game tests. And for some reason, the Geforce 2 doesn’t do all that well in those two tests, even though it stomps all over the other ones. Nothing we can do about that, I’m afraid, other than shake our heads in disappointment at MadOnion (Futuremark now). Quality is expectedly great.

Compare the top image to the bottom one, and see how some graphics cards supported multi-texturing to increase fillrate. The scaling wasn’t perfect, except in a few cases. But the GTS almost doubles its score with its 8 (!) TMUs over 4 pixel pipelines. Don’t dwell too much on the bottom cards, I bet they are bugged. There’s no way a Virge/DX supports multitexturing.
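
For the curious, the theoretical ceiling behind those numbers: peak texel fillrate is just core clock × pixel pipelines × TMUs per pipeline. A quick sketch (the 200MHz core clock and 4×2 pipeline layout are the commonly quoted GTS specs, not something I measured):

    def texel_fillrate_mt(clock_mhz, pipelines, tmus_per_pipe):
        # theoretical peak texel throughput, in megatexels per second
        return clock_mhz * pipelines * tmus_per_pipe

    print(texel_fillrate_mt(200, 4, 2))  # GeForce 2 GTS: 1600 MT/s on paper

Against a paper figure of 1600MT/s, the 840MT/s measured up there shows how far real-world multitexturing fell from the peak.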

3dfx Voodoo 3 3000 16MB

A pretty common card for several reasons: fast, good quality overall, Glide support and non-horrible OpenGL. Still, it has its issues. According to 3DMark (yeah, yeah) its texture memory is limited to 5MB. Probably wrong – it must be checking the maximum available framebuffer size for my monitor (1280×1024) and compensating for it. Admittedly however, I’ve seen in the past that the card suffers from some strangely blurry textures, and Tonic Trouble doesn’t look its best either. For multitexturing, the Voodoo 3 is still limited to games that specifically support the feature: otherwise your fillrate is cut in half. And yes, single-texturing is indeed slower than on the G400. Quality is great, except for textures, but trilinear filtering is still pretty slow. Anyway, one has to wonder if 3dfx shouldn’t have at least featured AGP memory. Too late now.

When your powerful card draws worse textures than the Rage IIc, you know there’s a problem. Parity with the Trio 3D sounds about fine… not.

Ati Rage 128 GL 16MB

I acquired this card just recently, and it was the main reason for attempting this whole endeavor in the first place. It is both good and bad. Ati still stuck with their bad bilinear filtering implementation (in 1999?), but at least performance doesn’t suffer as much as on the older chips. It’s pretty fast overall, but the fillrate is not that good, and multitexturing is also pretty inefficient. Not a bad choice, but why use this when there’s better stuff around? Still, probably Ati’s first truly viable chip.

Nvidia TNT2 M64 32MB

One of the most common graphics cards around, and thus one of the easiest to find on auction sites and at garage sales. Quality is good, as expected of something that came out fairly late, and overall it doesn’t lack anything, although it doesn’t excel at anything either. Quite the “boring” card for sure. Multi-texturing could be more efficient.

Matrox G400 32MB

The current champion, at least according to this test. It has great image quality (including good bilinear from Matrox for the first time), it’s fast, carries lots of memory on board and even some extra non-local memory, it’s better than the DDR-64 version, and its parallel pixel pipelines with one TMU each mean there’s no performance deficit if the game doesn’t support multi-texturing, as the card simply single-textures at twice the rate. Impressive. Even OpenGL support wasn’t all that bad. Of course, the Geforce 2 destroys it in real-life scenarios, and the Voodoo 3 is a bit faster in truth. But we’re talking about 3DMark here, and the G400 wins. Not to mention, the Geforce 2 doesn’t run as well with DOS games. So there you go.

That’s a lot of cards. Too bad I don’t have a PowerVR anymore, and my Voodoo is broken, and so are my Voodoo 2 and Banshee… damn, old stuff really is fragile. And expensive. Did you know a Voodoo 4 will easily run you over 100 euro? I’m afraid it’s true. So I need to aim lower. For my next card, I’d like to get a proper TNT2 Pro, not an M64 this time… and maybe also something older like a Trident or SiS.

Filtering textures like it’s 1996

While still working on-and-off on my years-long project, I came across an S3 Trio 3D. A fairly common graphics card, though not as common as the Virge (which I have too, unluckily for me). Veterans will probably remember the Virge line as the decelerator of its time, and the Trio wasn’t much better.

General agreement is that the Virge was trying to accomplish too much with too little. PC players at the time were barely playing with perspective correction, and that was already something: PS1 and Saturn didn’t even support it. Still, the Trio is more modern. You’d think it could have done better. It didn’t.

A typical 3DMark99 results screen. Part of it, anyway. Running at 800×600 actually results in a higher score… because the card doesn’t even bother rendering some of the tests.

Well, it didn’t, but what a strange failure it was. It’s really kind of impressive to see a 4MB, very cheap 2D/3D card properly supporting not just bilinear filtering, but also mipmapping and trilinear. And it does so quite well too, unlike other budget cards of the era. As a matter of fact, if you don’t mind playing at 320×240, you’ll get decent speed with many more features compared to software rendering. Well, if you can deal with the 15-bit color and the resulting banding.

Video memory is especially tight, though. And with S3 assigning a mere 1.5MB to textures, you can forget about great results. But let’s face it, the card is way too slow to run anything above 320×240 anyway. I used 640×480 only for testing, but it was clearly not meant for that kind of hi-res.

With 3DMark99 set that high, weird things start to happen. Texturing Speed is especially strange. Don’t be fooled by the results up there: the card is only rendering a small part of the test plate. Other cards usually drop to a crawl, but still pass the test. It seems like the Trio doesn’t even try to swap, simply omitting the repeated texture instead.


The filtering test also shows something interesting. There isn’t nearly enough memory to properly render the graphics, but notice how mipmaps are working, and even trilinear is supported normally. What were they trying to do, supporting this stuff in a low-budget card? You can even see in the results screen that texture filtering comes with some of the heftiest performance drops I’ve seen this side of the ATI Rage II.

Who knows what they were thinking. But it’s quite impressive nonetheless. Games run (or rather, don’t) like molasses, textures aren’t rendered, the extra features are way too slow… but the filtering itself seems to work fine. Mind, it was 1998, so perhaps not supporting those features would have made S3 a laughing stock in the market. But they pretty much were one already, so why bother?

My tests aren’t over, so I’m sure I’ll find other interesting results. I’ve seen a few already (how did the G100 support multitexturing when even the G200 didn’t?), and it can only get better. For example, I have that Matrox Mystique waiting…

Virge, I choose you!

I finally found a use for the Virge DX. I honestly thought I would never actually use it, outside of benchmarking. But no, there’s a real reason to put it in the PC now: it’s the only graphics card I have which doesn’t create jerky scrolling in Commander Keen. I’m amazed.

After all, why else would anyone want to use a Virge? Although, for all its framerate woes, compatibility was actually somewhat decent. I’ll let these captures speak for themselves. All at native resolution… I’m afraid. Pixel-doubling them seemed to show the dithering in an even worse light than it already was on my screen, so I kept them at their original size. You may need glasses.

The textures in Blood 2 work, and that’s more than I can say for the Mystique. At 320×240 with minimum, bottom-of-the-barrel details, the game runs decently enough, but the framerate drops frequently. As the developers themselves say in the options screen, “performance is questionable”.

Ever wondered what Rogue Squadron would have looked like on the PS1? This is probably a decent approximation. Except for the filtering, but I imagine a hypothetical PS1 version would have run a lot better than this.

Terrain issues make Mechwarrior 2 something of a mess. The only solution seems to be disabling ground textures, but then you might as well play the Pentium edition.

Turok, as one of the granddaddies of Direct3D benchmarks, deserves a few more screenshots.

This is what happens if you run the game with everything on. Clearly the mipmapping causes trouble. Not that it would be playable anyway, at barely 1fps.

On the other hand, disabling everything will make the game run okay-ish. There are still visual glitches, perhaps some issues with Z-buffering.

When I de-selected “fancy sky”, I was hoping I would still get something a bit better than this. Nintendo 64 owners must have been laughing in our faces. Well, joke’s on them, we had a mouse and keyboard.

Dithering is an issue. What’s the use of gratuitous blood if it looks this bad?

Terracide has a specific S3 setting. A nice surprise, since the regular D3D mode was slow as molasses.

Running the game at 512×384 with low or medium textures and no filtering will yield the most playable results. It doesn’t mean it has to look good though.

By 1997, I guess very few people actually still used a Virge for games. With good reason. At least the colored lighting works.

DOS games coded for the specific S3 API tend to fare a little better, but just slightly (photos resized to 640×480).

Terminal Velocity is stuck at SVGA resolution and high textures on the Virge. Bad choice. Even with no filtering, the game isn’t smooth at all. Also, judging from those textures, I’d say the regular version looks better.

Screamer runs very well at 320×200, but the dithering is overkill, so you might want to set it to 320×400, as shown here. It’s still smooth enough, surprisingly, and doesn’t look nearly as bad. Unfortunately, Screamer is still a terrible game.

S3 eventually made other accelerators, including the decent Savage line. But those won’t work with the S3 API, and even the Trio3D has trouble with many titles (I only managed to make Croc run). So it’s a Virge or nothing.

But hey, all of the above becomes meaningless if you consider the smooth scrolling in Commander Keen. I had forgotten how good it looked. So, let’s raise a glass to S3, if just this once.

Croc – Legend of the 3D Accelerators

The mid-90s are a proper treasure trove. Just like Tomb Raider, other games had to include support for multiple APIs in order to run properly on various graphics cards (or they could fall back on software mode, but… ehhh).

Croc is one such game, although unlike Tomb Raider, it supports 3D accelerators right out of the box. Specifically, the game supports the omnipresent Glide, a special Voodoo Rush mode (according to the Vogons Wiki, among the Rush’s various shortcomings there was also an incompatibility with certain Glide games), the good old Matrox Mystique API, the S3 API because your game can never be slow enough, and CIF for the two people who had a Rage II+ for games.

Since it’s a Windows game, I actually managed to take screenshots this time. With two exceptions. Software mode gives out screenshots with garbled colors, so it’s out. And the Voodoo, well, I have some driver issues with my Voodoo 3 which I still haven’t managed to solve, so I couldn’t even get it to run. The game doesn’t support Direct3D or OpenGL, so unless you have one of these cards, it will run in software mode.

But let’s start now, as usual, with my favorite card (for nostalgic reasons only), the Matrox Mystique. Click on the screenshots for the full image!

No filtering, no alpha blending – it’s a Mystique alright.

I said I couldn’t take screenshots of the software mode, but this is pretty close. Well, as expected of the Mystique. The texture pixels are as big as my hand, and the shadows are pixelated. It doesn’t look like an improvement over software rendering.

Mystique owners had to get used to the lack of proper transparencies. These waterfalls actually look a little better than usual – you don’t want to see Forsaken and its light effects (although that game still impressed me in terms of speed on the Mystique).

But you know what, it’s not that bad. It looks like software mode, but good luck running that at any decent speed at 640×480. The Matrox, on the other hand, will deliver similar quality (in fact better, since the image somehow looks cleaner) at much improved framerates. The game is perfectly playable on the Matrox, running smoothly and with no hiccups.

You only need to deal with the aforementioned filtering and alpha problems. But what if you don’t want to? Well, we have reserves. Enter… the Virge.

We traded giant pixels for terrible dithering. Good job, S3.

I don’t see an improvement here. Quite the opposite. We have filtering now, but the sampling seems to be really spotty, with color stains on the grass that make me think of some kind of disease, and dithering that looks like it crawled straight out of the PSP. I don’t know what’s up with the terrible filtering, but it almost appears as if the green gives it trouble. The cobblestone and well look better, but the dithering still ruins it.

Besides, even if I could accept this image quality, I certainly wouldn’t be able to accept the framerate. The last time I played such a jerky platformer was Evil Twin on PS2 (I just remembered a game I had buried real deep… I’ll need to look for it on PC). 640×480 is nowhere near playable. 512×384 starts getting a little better, but in the end you’ll probably have to go all the way down to 320×240.

That’s if you want to torture yourself and use a Virge. Why not use a Rage 2 instead?

Something is amiss here… the CIF seems to have trouble with Croc.

Marvel at Ati’s prowess. No more dithering! Yay! Well, actually, not so fast. There are a bunch of other issues here. For starters, the texture filtering is as bad as on the Virge, which seems to imply a bad implementation within the game itself. It’s unfortunate that I couldn’t run the game on the Voodoo, or we could see if at least the Glide version fixed that.

I hope you like polygon seaming issues because you’ll see them a lot.

The other big problem is polygon seaming. You could already see it on the well in the previous screenshot, but this one shows it a lot better: pretty horrible lines where polygons meet. Even in the skybox. The readme mentions this problem with CIF cards, but only offers one solution: disabling texture filtering. This is a doubly good idea since, as I have noticed in other games, the Rage II takes a pretty hefty performance hit with bilinear filtering – the framerate is definitely worse than the Mystique’s.

I liked those pixels better on the Mystique.

With texture filtering disabled, the seaming issues all but disappear and the framerate gets much better – wait, what’s up with those ground textures? If you thought the pixels on the Mystique were bad enough, just wait until you see these ones. It’s like the ground has measles. What’s more, the smoothness still isn’t up to par with Matrox. You will need to go down to 512×384, which makes it pretty smooth overall, but it won’t look as good as 640×480.

So what’s the best choice? Well, if you have a Voodoo and it actually works, just go with that, I guess. Otherwise it’s a victory for Matrox – it has the highest speed of the bunch, coupled with the best textures (because they are unfiltered, go figure) and no glaring image artifacts. I don’t know about you, but I’ll take pixelated shadows over dithering everywhere. And I’ll take smaller pixels over green stains.

The speed factor is actually a victory for Ati, however, as the CIF was supported up to the Rage Pro, meaning you could use that card to get better results. Indeed, my Rage LT Pro ran the game very smoothly even with the max resolution of 864×480! But you’ll still get the seams or the giant pixels, so in the end I’d rather go with Matrox.

It’s a matter of compromises, and for once, the Mystique proves itself the most useful. At least until I get my hands on that 3dfx Velocity.

Tomb Raider: 3D patches on test

Having recently acquired a copy of Tomb Raider (the 1996 one), and with all those 3D accelerators gathering dust in my cupboard, an idea flashed through my mind: why not try and see how the game looks on different cards?

Tomb Raider was one of those mid-90s games which required a different executable for each card it needed to run on. Sounds like a pain, but I guess people back then didn’t really have a choice. Oh, of course, most people could just run the game in software mode, but that wasn’t very good. You needed a very strong CPU to run in high resolution (such CPUs usually weren’t even available yet), while the low resolution mode looked straight out of the Saturn, except without the higher quality cutscenes.

Of course all that changed when the 3D patches were released. But which was the best one? We’ll have to see, though many could take an educated guess and give it to the Voodoo. But who knows. These tests are run on a P3 450MHz with 64MB RAM, a Soundblaster 16, and Windows 95. Unfortunately, I can’t take screenshots of DOS games, so you’ll have to bear with some crappy camera photos. I’ve tried to take detailed shots whenever needed.

The good old Matrox Mystique. My 4MB edition allows me to play with an impressive 640×480.

So we start with one of Matrox’s most famous cards. Shut up, it was just misunderstood, okay? This is the Matrox Mystique MGA 170, which usually came with 2MB of video memory. Tomb Raider will default to 512×384 on such versions. My own card has 4MB, so I get to play at 640×480. Envy me.


Not too horrible. The Mystique didn’t support bilinear filtering, as you can probably see from the floor in the first screenshot, but there is proper perspective correction on the polygons. The framerate is not very good though: acceptable, but not quite there.


Some details. The cages in the first one show the limits of nearest-neighbor texture sampling, while the second one makes the dithered shadow very noticeable. The Matrox was famous for its lack of alpha blending, so this was the best they could do with alpha stippling, short of a simple black circle. It’s already quite something that they put in shadows at all, as we’ll see later. For now let’s move on to something else.

I had promised to myself I would never take the Virge out again… but promises are made to be broken.

The infamous Virge. No, this one wasn’t misunderstood, it was just bad. This particular model is a S3 Virge/DX with 2MB of video memory. A very standard graphics decelerator. Let’s see how it compares.

The Matrox version didn’t even have a graphics options screen, so this is a step forward. But that 640×480 is a pipe dream.


Welcome to the world of S3. Now, it looks okay, but that’s all it does. Not that you’ll really notice when the game is running under 10fps. On the plus side, it has everything, from bilinear filtering to perspective correction to even high resolution. But turning the bilinear filtering off doesn’t seem to have an appreciable impact on performance (amazingly), so you are stuck with terrible framerates either way. There are also some strange graphical artefacts, and notice the lack of shadow under Lara’s feet. Below, some more images showing how to try and improve the situation.

A weird mosaic effect can be seen on the side.

You can try disabling perspective correction, if you don’t mind playing in an Escher painting. You won’t gain much though.

The best way to get some more frames out of the old Virge is, as you may have guessed, to lower the resolution. You could also try disabling bilinear filtering or perspective correction, but for some reason they don’t have that much of an impact. Now, as for resolution, the game gives you a choice of 640×480 (ahahah), 512×384, and 320×200.

512×384 is probably your best bet. It gets a little choppy in heavier scenes, but overall it’s quite playable and doesn’t look bad at all. Also notice Lara has a shadow now. The mosaic pattern is still visible though.

320×200 is the smoothest option you have, and boy, it is smooth. It looks very close to the regular software version though; the filtering is not easy to notice.

Well, anyone could have expected the Virge to be the worst version. It actually is quite full of features, but with that kind of speed, nobody will want to play this. Not even Turok 2 fans. Maybe we should get something else.

Ati Rage IIc. I don’t actually remember where I got it; a very unremarkable card, especially considering its quite impressive 8MB and AGP connection. Before Mantle, there was CIF.

Quite interesting to know that Tomb Raider even had an Ati version. Ati had its own API back then, CIF, which very few games used. In this particular case, Tomb Raider actually becomes a native Windows application when you apply the Ati patch. And you get a separate executable, which is handier than renaming Tomb.exe all the time.


Bilinear filtering, perspective correction, a resolution choice up to 800×600, and even a framerate counter. Very impressive, despite the lack of shadows. The framerate itself is not so impressive though. 10fps? 13fps? Better than the Virge, but just barely, and considering the card is more recent, quite a poor showing. Of course you can still lower the resolution to get better results. These are my counts, taken in the heaviest area (the one with the cages) and one of the lightest (the starting room).

800×600: 9-15 fps
640×480: 14-19 fps
512×384: 19-25 fps
320×200: 30 fps

At 320×200 it’s pretty much locked at 30 fps, but so was the Virge. Then again, it’s worth noting that, unlike on the Virge, disabling texture filtering has a strong effect on the framerate: give or take, you can add 5 more fps to all those results (still, the game will never go above 30 fps). The card seems especially slow with filtering, an effect I had also noticed while testing Forsaken. So if you don’t mind playing with poorer filtering, the Rage IIc will give you a very smooth 512×384 pretty much all the time.

Ati Rage LT Pro. Now we are getting a bit closer to dream material.

But the Rage II was quite poor. Surely the CIF supported better cards. Well, not that many actually, and I don’t quite have a 3D Rage Pro yet… but the LT Pro was at least a decent substitute. This card once again has 8MB and AGP, but it’s a lot faster than the IIc.

800×600, all details on, 30 fps, and shadows to boot? Hold me.

Compared to the IIc, it was a bit harder to get the game running at all. I had to revert to an older set of drivers. Once started, though, we have some great results. The game runs at 800×600 with all the effects on, and it holds 30fps most of the time, with drops down to 25fps in the cages area. That’s still extremely playable, but if you want that extra smoothness, you can drop to 640×480, or just disable bilinear filtering.

3dfx Voodoo 3. Spoiler: it’s pretty fast.

Most people are probably familiar with the 3dfx edition, but let’s talk about it anyway. This board in particular is a Voodoo 3, so not quite something from the company’s peak days. It’s still a very fast card, and I use it regularly in my Win95 PC due to its Glide support for many old titles. Of course, Tomb Raider is one such game, so let’s take a look.

Well, it doesn’t get much better than this, but where’s Lara’s shadow?


The only graphics choice in the game is between mipmapping and no mipmapping, which the other cards don’t offer. But unlike on the other cards, you can’t disable perspective correction or filtering. Considering the game runs at 30fps literally all the time, this is hardly a problem. It’s a little weird that Lara doesn’t have a shadow though. This might be an issue with the Voodoo 3 in particular, as the game itself was made to run on the first Voodoo board, and there are other games which show issues on newer cards.

Disabling mipmapping will have some strange effects on the water. And I still wouldn’t enable it, as mipmapping without trilinear will result in those ugly detail jumps, as you can see on the right.

So there we are. For my money, I’m actually surprised to say that the best version is the Ati one: assuming you have a supported card (as there’s probably no wrapper for CIF) and a decent CPU, it gives you the best features of the four, and the highest resolution. I have read that the PowerVR version is even better, but unfortunately I don’t have the means to try it. Maybe one day.

To finish, three actual screenshots. The software and 3dfx ones are taken from the GOG version of the game, while the Ati one was screencapped from my old PC (since the Windows executable allows you to print the screen, unlike all the other DOS-based versions). Click on the screens for their native resolution.

Look at that garish color palette. It’s not even very smooth. But what did you expect from software mode?

The Glide version here actually has shadows, unlike on the Voodoo 3. I suppose that means it really is a problem with that particular card.

Ati Rage LT Pro. There are some color banding issues here, but you can’t really notice them in-game. If I do replay Tomb Raider, I shall put this card to use.