
Rage 128 Pro: Rage is right

Once again, I’m reminded of why I hate Ati cards.

Looks like an Ati. Trouble will be forthcoming.

I recently received a Rage 128 Pro. This is supposedly not a bad card for its time; it should sit somewhere around the TNT2 and G400. I already had a Rage 128 Pro Ultra, but that was a 64-bit bus width model, so I was eager to see what kind of difference the extra bandwidth would make. Everest says they are both running at 120c/120m, which means 1920MB/s of memory bandwidth against 960MB/s.
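
Just for reference, here is the back-of-the-envelope math behind those numbers, as a tiny Python sketch (SDR memory, so one transfer per clock; clocks as reported by Everest):

# Rough SDR memory bandwidth: clock (MHz) * bus width (bits) / 8 bits-per-byte -> MB/s
def sdr_bandwidth_mb_s(clock_mhz, bus_bits):
    return clock_mhz * bus_bits / 8

print(sdr_bandwidth_mb_s(120, 128))  # Rage 128 Pro, 128-bit bus: 1920.0 MB/s
print(sdr_bandwidth_mb_s(120, 64))   # Rage 128 Pro Ultra, 64-bit bus: 960.0 MB/s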

Of course, I'd need to get it running first. I tried to install my usual 7192 drivers, which had worked just fine on the Ultra, but the card wasn't recognized. Apparently, Ati decided that OEM vendors modifying the device ID was not a significant issue, so you'll encounter this problem a lot. This is what happened over the next hour:

Drivers 7192 (latest Ati): card is not recognized
Drivers 610: installs, but card doesn’t work
Drivers 7087: installs, Windows hangs, must remove from safe mode

At this point I was kinda annoyed. Luckily, I seemed to hit the jackpot with the next attempt, drivers 654 beta. About time, too.

Somehow I expected more.

Initial results were somewhat disappointing. 3DMark shows no real difference outside of texture rendering speed, which is almost doubled. I should mention that the drivers are of course different, which might have an impact, but I'm not convinced that's the only reason. Could it be that the Rage 128 Pro doesn't really suffer from low bandwidth outside of very high resolutions? I've heard of its performance feats in 32-bit mode. Game tests will hopefully clarify things. Or at least, they should. Unfortunately, just like on most Ati cards of the era, disabling Vsync in Direct3D is not an option.

(well, technically it is an option, it just doesn’t do anything)

Nevermind. OpenGL is still our friend, and even in D3D I can still compare the card with other similarly vsynced models. A good point of comparison would be provided by cards that used either two pipelines or two TMUs. My favorite kind of architecture, really. As usual, the tests are running on a P3-450mhz and 128MB SDR PC-100. Here we go:

Poor Trident… uh, I mean… as we can see, the Voodoo 3 is clearly ahead of the pack, and it should be, given that Quake 2 takes perfect advantage of its dual-TMU design, plus it has a far higher clock speed than any other card here (166MHz, while the others range between 90MHz and 125MHz). The Kyro 1 and Oxygen GVX1 have some bottlenecking issues somewhere – either that or bad drivers. But from this chart, it seems that cards with a 128-bit data bus have little trouble powering through 800×600, while 64-bit models already start choking. The only exception is the Rage 128 Pro Ultra, which hangs on – could it truly be? Let's see another game.

Newer is slower. Trident redeems itself! A bit, anyway. We can see here that none of these cards goes past 48fps. I'd be hard-pressed to call it a real CPU bottleneck, given that more powerful cards can reach 60fps (and T&L models even 70fps). But it's still something to consider. Things are more uneven here, but overall it seems that 64-bit cards still struggle more. Not surprising. Even the Pro Ultra falters this time; its clocks are a bit higher than the Vanta's, so only matching its results is not great. But hey, wanna talk about the Oxygen GVX1 and its 128-bit memory? I thought not.

The cards for Incoming have all been hand-picked to only show you the ones limited by vsync. Be grateful. The G550 is actually an exception to my previous rule, since it's a 2×2 design, but all my tests, and other tests on the internet, show that it's usually as fast as a G400, if not even a bit slower. And since, unlike the G400, it seems to be plagued by vsync, it's a better pick here. Unfortunately the Rage 128 Pro doesn't have a good showing, even next to its older Rage 128 GL sibling. I blame the drivers. The situation is better in Forsaken, but I forgot to fix a few things, so you won't get a chart. I will tell you that at 1024×768 the Pro reaches 47.2fps, better than the Ultra (38.8fps) and the GL (37.2fps). Be grateful.

At a glance, it does seem to me that the Rage 128 Pro architecture is slightly less reliant on memory bandwidth than its competitors. However, given that it scores lower across the board, that might not matter much – why pick a Rage 128 Pro Ultra when you could have a TNT2 M64? Let's face it, the drivers were also going to be a million times easier to install. The dithering was better on the Nvidia card too (the Rage 128 Pro uses a very noisy diamond pattern – who knows why anybody thought it was a good choice), which is kinda important, because I can't imagine people playing in 32-bit mode on budget cards.

I also got a SIS 315 Pro, which I thought was an actual SiS 315. Turns out it's just a 315E. As usual, my little SiSter keeps disappointing me. Some things I discovered, though: the card actually goes up to AGP 4x (even though my motherboard can't support it), while the 315L only goes up to 2x. And… that's just about it? Yeah, not much. I'll need to check out those lot auctions a little more carefully next time.

The Trident Blade T64: one prong for each bit

A couple years ago, I learned an important lesson: you aren’t going to have an easy time in auctions. Rare cards, if cheap, are snagged almost immediately. If they are expensive, well, you probably won’t get close to them in the first place. What’s a collector to do? Spending a lot of money is an option, of course, but consider other routes. For example, there are many people who sell graphics cards by the lot. Most of these are unspecified in auctions, and you can only rely on photos to tell what you are actually looking at. With some luck, a rare find can be made, one that others won’t know about. This is how I got my super cheap Geforce 256 SDR.

And now, I got a very cheap Trident Blade T64. This might be an even rarer find. I had never even seen any of the Blade XP series on sale, and getting one for cheap seemed beyond hope. That alone is impressive enough. But what's even more impressive is its performance, although not in a good way.

A graphics card in 2000 without even a heatsink. This bodes well. There are mounting holes in the PCB, so I was afraid someone had removed one, but the surface of the chip is smooth and the card seems stable.

We should probably look at some specs first. Data is scarce, as befits a card nobody had, but I was certainly surprised to see that even Everest gives up entirely – no specs are shown at all. Powerstrip, on the other hand, makes a token effort and tells me that the memory is clocked at 166MHz and the data bus is 64 bits wide. While that software is notoriously finicky, this is not too difficult to believe, since Trident listed the same specs on their old archived website.

Look at these figures and play bingo: cross one out every time your head tilts.

According to Trident's own words, this card is a dual-pipeline design. I'll just ignore that 1328MT/s texel rate – at 166MHz, it would imply a monstrous 2×4 design, and there's no way in hell that's true. Now, Wikipedia says it's a 2×2 design. But this seems wrong. A core clock of 166MHz and a peak fillrate of 332MP/s, and it would be 2×2? Mmh. Let's look elsewhere. Tom's Hardware has an old review of the Blade XP, and according to them the T64 is the same architecture, but with a 64-bit data bus and lower clocks. That's possible, and since it was written way back in 2001, it's more reliable than recent data. Trident's old website also says that the T64 worked at "up to 166mhz", meaning it could be lower – 143MHz is possible, then. Given those numbers, it sounds like a 2×1 design. This is more believable.
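
To make the spec bingo easier to follow, here's a small Python sketch that works out the theoretical peak rates for the configurations being discussed – pure clock × pipelines × TMUs arithmetic, nothing else, with the 143MHz figure being my own assumption from above:

# Theoretical peaks: pixel rate = clock * pipelines, texel rate = clock * pipelines * TMUs
def peak_rates(clock_mhz, pipelines, tmus_per_pipe):
    return clock_mhz * pipelines, clock_mhz * pipelines * tmus_per_pipe  # (MP/s, MT/s)

configs = {
    "2x4 @ 166MHz": (166, 2, 4),  # what the quoted 1328MT/s would actually require
    "2x2 @ 166MHz": (166, 2, 2),  # Wikipedia's claim: 332MP/s, but only 664MT/s
    "2x1 @ 166MHz": (166, 2, 1),  # matches the quoted 332MP/s pixel rate
    "2x1 @ 143MHz": (143, 2, 1),  # Tom's "lower clocks" reading
}
for label, cfg in configs.items():
    mp, mt = peak_rates(*cfg)
    print(f"{label}: {mp} MP/s, {mt} MT/s")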

However, Trident themselves say that it can process "up to 4 textures per clock", and specifically call it a "dual pixel, quad texture rendering engine". This is harder to explain, but I still won't believe in a 2×2 design, because it makes no sense here. Perhaps it's using a trick similar to the Savage 4, which merged textures to apply them in a single pass? That's possible too. Admittedly the easiest answer would be 2×2, but there's so much that seems wrong with this idea that I can't just accept it yet. You'll see.

Trident claimed single-pass anisotropic filtering. Reality check: trilinear incurs a hefty performance drop, and anisotropic isn't even supported.

Now, I'm the first to admit that 3DMark isn't the most reliable benchmark around. But those fillrate numbers don't look like 2×2 to me. Even worse, the results line up with the Blade 3D in a bad way. The B3D ran at 100MHz, so if we assume the T64 is at 143MHz, that means a 43% increase in single-pipeline performance. Add in the faster memory, and a 50% increase sounds likely. But if so, I'm just not seeing the impact of the second pipeline here.

Well, look at that. Environmental Bump Mapping is supported. So this card does have something going for it.

A similar story repeats with 3DMark 2000: again, the same results between single texturing and multitexturing. In order to get some kind of definitive answer, I even tried 3DMark 2001. At default settings, I get around 78MT/s single and 83MT/s multi. At my wit's end, I tried setting it to the minimum of 640x480x16. Finally a difference: 110MT/s single, 160MT/s multi. Managing to break the 143MT/s barrier is proof that the card can't be a mere 1×1, at least. But we're still far from the fabled 2×2 design, if you ask me. Even if a gap between single and multi texturing would normally imply more than one TMU per pipeline, these numbers – and Trident's own sheets – are simply too low to support that claim.
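
Spelled out, that reasoning looks like this (a Python sketch, assuming the 143MHz core clock from earlier; the measured numbers are my own 3DMark 2001 results at 640x480x16):

clock_mhz = 143                    # assumed core clock
ceiling_1x1 = clock_mhz * 1 * 1    # a 1x1 design could never exceed 143 MT/s
ceiling_2x2 = clock_mhz * 2 * 2    # a full 2x2 would top out at 572 MT/s

measured_single, measured_multi = 110, 160  # MT/s, single vs multi texturing
print(measured_multi > ceiling_1x1)   # True: so it can't be a mere 1x1...
print(measured_multi / ceiling_2x2)   # ...but ~0.28 of the 2x2 ceiling is hard to swallow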

By the way, every version of 3DMark says that the card can apply up to 3 textures per pass. Strange, but they said the same thing about the G450, which is definitely a 2×1 card, so I'm not gonna put too much stock in it.

GL_ARB_multitexture
GL_SGIS_multitexture not found
Make up your mind!

Quake 2 says that there's some kind of multitexturing extension, but then doesn't actually find it. And unlike with the TNT2, I couldn't find any registry key to enable it. Theory number two: maybe the card was supposed to be a 2×2 design, but the second texture unit in each pipeline was disabled for some reason? Or maybe it can only be enabled by a DX8.0 program, such as 3DMark 2001? After all, the spec sheet says the card is a DirectX 7.0 part (no T&L, by the way) but has "software interfacing" with 8.0, whatever that means. I think I'm making stuff up at this point. It's just annoying.

Performance could be a good way to debunk all these myths, but unfortunately the card is crippled by vsync, much like the Blade 3D before it. There's no way to disable it, not even with Powerstrip. Besides, my Pentium 3 450MHz is probably a bit too weak for a card from 2000, even one such as this. Here are a few numbers though:

All these cards had vsync. Level field for sure.

Vsync makes things hard to discern, but overall I see too many similarities between the Blade 3D and the T64. Any time the T64 is ahead, it's always by less than 50%, which is more easily explained by the higher clocks (the only outlier is MDK2, but the Blade 3D had some big CPU frametime spikes in that game, so only 800×600 is reliable there). At this point I'm starting to wonder whether the second pipeline itself was gimped on the T64, making it effectively a 1×1 design. The lack of a heatsink is certainly strange – maybe a gimped chip had no need for one? Of course, there's also the possibility of CPU bottlenecks. But that looks unlikely. Or horrible drivers. That sounds less unlikely.

There are other issues worth mentioning, even if they're not related to performance. The card has the same kind of color trouble in certain side-scrolling DOS games as the Voodoo 3 did. So in the end, it's just not a great card. But an interesting one, for sure. It's always difficult to explain why some data doesn't line up with official specs. It's usually bad drivers, but who knows, maybe sometimes it's just bad specs? I don't think I'll ever know.

HIS sold a Blade XP, and this was the official data sheet. Their numbers are the same as Trident’s, meaning they don’t make much sense.

Possibly even stranger: if you take a look at Jaton's archived webpage for the Trident Blade T16, it has the exact same data as the T64. It even calls it a 9970! But the T16 is actually a 9960 – there are pictures of the card on VGAMuseum. I smell a copy-paste job on Jaton's part. I can only wonder how the T16 was gimped, since the T64 was already gimped enough. Maybe they removed the effectively non-functional second texture unit in each pipeline. What a loss.

From the user manual. The 3DForce G-16 is supposed to be the T16. But apparently it’s based on the 9970 T64. But apparently it’s a 9960 T16. Excuse me while I blink endlessly.

Theory number three: maybe Trident made such a mess with their models that everyone called their chips however they liked. Maybe the T16 is actually a T64, just with lower clocks and less memory. Maybe the entire Blade line is just an overclocked Blade 3D.

Maybe some things are best left undiscovered, for sanity’s sake.

Edit 02/01/20: I've since discovered (you learn something new every day) that GL_SGIS_multitexture was already an obsolete extension by 1999, hence Quake 2 looking for that one over GL_ARB_multitexture. Of course, every other card still supports it. But with Trident not being aimed at gamers, I guess they didn't care about legacy extensions. Their loss. The results are still bad either way.

Making lemonade out of Radeons

Don’t you just hate bugs? Even if you file a report, who knows how long they will take to fix. After all, there’s no telling how high in the priority list they are. In some situations, it might just be better to… turn a bug into a feature. How? There are ways.

Let's go back many months. I purchased Assassin's Creed Origins on sale way back in 2018. But as it turned out, my measly Ryzen 1500X was simply not good enough to run the game at 60fps. Lowering the settings helped, but a stable 60fps was still a utopia. Well, nevermind that, let's just cap the framerate at 30fps and pretend that we're playing on consoles. Right?

Welcome to incorrect vsync hell.

Maybe not. As it turns out, the in-game 30fps framerate limiter actually operates at 31fps. I’m not sure if this is a widespread problem – I was unable to find anyone else with the same issue. What it means though, is that the game stutters constantly and without fail.

I tried a few possible fixes. What if I disabled vsync and kept the 31fps limit? Well, that would just create a nice tearing line throughout the game. Someone suggested limiting the framerate with RTSS. This almost worked, but the image still looked more unstable than it had any right to. I even attempted to use Frame Rate Target Control in the AMD software, but as expected, that is just a target and therefore doesn't work too well. Eventually I just played with RTSS, then dropped the game after a while because it was getting on my nerves.

Now let's fast forward several months. While playing around with System Shock, I discovered an interesting bug: when I try to play at 1280×1024 displayed at its native size, the framerate drops hard. Some time later, while discussing the issue, a pattern emerged – it turns out that my RX480 engages a 30fps Vsync cap whenever it needs to render any letterboxing in a Direct3D game. The most common case, of course, is playing a game with GPU Scaling set to Center, for example because I want to avoid any non-integer scaling. Reading around the net, this bug seems to happen on Polaris cards hooked up to TVs with HDMI Scaling turned on. But this is a monitor, and there is no such option. Strange.

That counter in the top right… it looks like some very stable 30fps.

The most annoying consequence of this bug, for me, is that it makes the new Integer Scaling option – introduced for Polaris cards with the new Adrenalin software – completely useless, since many old games ran at 640×480, and scaling that gives me 1280×960, triggering the bug. Pretty much no old game used 540p (960×540) as a resolution, so I simply can't avoid it.
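
To put numbers on that, here's a quick Python sketch; I'm assuming a 1920×1080 panel, which is what the 540p remark implies:

# Largest integer factor that fits a source resolution inside the panel, plus the leftover borders
def integer_scale(src_w, src_h, panel_w=1920, panel_h=1080):
    k = min(panel_w // src_w, panel_h // src_h)
    out_w, out_h = src_w * k, src_h * k
    return k, out_w, out_h, panel_w - out_w, panel_h - out_h

print(integer_scale(640, 480))  # 2x -> 1280x960, 640+120 pixels of borders: letterboxed, bug triggered
print(integer_scale(960, 540))  # 2x -> 1920x1080, no borders: but nothing old runs at 960x540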

But why not take advantage of this long-running bug, and use it to… maybe, just maybe, fix another long-running bug? Let’s go back to Assassin’s Creed Origins.

Yes, you can pet the cat.

The game only has a few selectable resolutions, but going lower than 1080p is possible. There's 1600×900, of course, but I get more screen usage overall if I select 1680×1050 instead, even though I lose a bit of extra picture on the sides. After that, I just have to set GPU Scaling to Center (which the new software luckily lets me configure per game) and then enable Vsync in game.
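
For what it's worth, the screen-usage argument is just pixel counting (same 1920×1080 panel assumption as before); either way there is letterboxing, which is exactly what triggers the 30fps cap I'm after:

panel = 1920 * 1080  # assumed native resolution
for w, h in [(1600, 900), (1680, 1050)]:
    print(f"{w}x{h} centered: {w * h / panel:.0%} of the panel used")  # ~69% vs ~85%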

The sweet taste of properly capped victory.

Success. I can now enjoy the game at a stable 30fps. And one advantage of this console-style framerate is that I can set the graphics to High and still not get any drops. Now I don’t want this bug fixed anymore. At least until I’m done with the game.

As for the future, I have an interesting card on the way. My favorite holiday present.

Put some bandwidth in your dynamite

Long-time readers of this blog (who?) might remember that for a long time I have had a strange Riva TNT, one that gives laughably bad scores in any benchmark. They are so bad, in fact, that I’ve often wondered if it was truly a TNT. At higher resolutions, it seems more like a Riva 128 with multitexturing. I can confirm the multitexturing part because Quake 2 uses it. But still.

The culprit. As you can see, the outside doesn’t show any indication that you have a blazing fast Nvidia Riva TNT ™. Fishy to say the least.

The BIOS POST screen also left little doubt, but who knows? These strings can be changed relatively easily, I think. The drivers, though, leave even less room for doubt: the card will only accept TNT drivers. So that's it. From that perspective, there's no doubt.

The chip itself also would rather you didn’t get its name wrong. Although, that fan was so hard to pull out, I wonder if it was put on it to hide the chip. Well, I’ll have to look into swapping it with a heatsink.

The first idea that sprang to mind was that maybe I had a 64-bit bus width card. However, Powerstrip said it was 128-bit, so for a long time I trusted it and didn't think much more of it. Recently I've learned that Powerstrip is unreliable at best, and I've also read around that Nvidia released some incognito TNT cards with half the bus width and half the memory size. Interesting. Let's try something else, then?

Everest to the rescue.

And so the truth is revealed. The card indeed has a 64-bit data bus. And not just that: the memory is also clocked at a measly 80MHz. That gives me a bandwidth of 640MB/s. For comparison, a regular TNT had its RAM clocked at 110MHz on a 128-bit bus, for a total of 1760MB/s. So even though the core clock is the same 90MHz on both cards, mine has barely more than a third of the memory bandwidth. Oof.
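
Same back-of-the-envelope SDR math as usual, with this card's numbers plugged in (a quick Python sketch):

def sdr_bandwidth_mb_s(clock_mhz, bus_bits):
    return clock_mhz * bus_bits / 8  # MB/s

gimped = sdr_bandwidth_mb_s(80, 64)     # this card: 640.0 MB/s
regular = sdr_bandwidth_mb_s(110, 128)  # regular TNT: 1760.0 MB/s
print(gimped / regular)                 # ~0.36: barely more than a third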

Well, that explains a lot, certainly. If I must tell the truth, I was initially unwilling to admit this because there was no mention of this kind of card on VGAMuseum or anywhere else, aside from that small blurb on Wikipedia. I even started thinking: maybe this is actually based on the TNT2 M64, rather than the original TNT? That seemed unlikely, of course. The Vanta line was a real thing, so to speak, with a properly named chip and everything, even the Vanta LT wasn’t incognito. This, on the other hand, is just called a TNT. So, is it?

I do have a TNT2 M64 in my personal collection, of course, given that those things on auction sites are more common than TIE Fighters in a galaxy far far away. I do not have a Vanta LT, but I can easily “simulate” it by bringing down the clocks to 105c/100m. The architecture is the same after all. The only difference would be the extra memory size (32MB, while the Vanta LT was only 16MB), but for these old tests it shouldn’t make any difference. As usual, it’s on a P3-450mhz.

Quake 2, 16-bit (multitexturing enabled from registry)
Forsaken, 16-bit
Incoming, 16-bit

The scores shouldn’t leave much to the imagination. The Vanta LT would have been massively underpowered next to a regular TNT2, due to its lower clocks and narrower data bus, only somewhat holding up at 640×480. But my TNT is also massively underpowered next to the simulated Vanta LT, even though the bus width is the same, and core and memory frequencies aren’t too different (my TNT is 90c/80m). In all cases, the same drivers were used. Improved architecture then. I can’t see any other explanation.
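
The reasoning in numbers, as a small sketch: both cards are on a 64-bit SDR bus, so only the clocks differ, and the clocks alone can only explain a small gap.

# Simulated Vanta LT (105c/100m) vs my mystery TNT (90c/80m), both with a 64-bit bus
core_ratio = 105 / 90                     # ~1.17: at most ~17% from the core clock
bandwidth_ratio = (100 * 64) / (80 * 64)  # 1.25: at most ~25% from memory bandwidth
print(core_ratio, bandwidth_ratio)
# Anything "massive" beyond ~25% in the charts has to come from the TNT2-generation
# core improvements, not from the clocks.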

I do find it interesting that nobody really seems to have any info on this old, gimped TNT. Is it so rare that nobody among enthusiasts even acknowledges its existence? That sounds hard to believe – cards like this must have been sold to OEMs a dime a dozen. The 64-bit variant of the Riva 128ZX is known, so this is strange.

Or maybe it’s all Occam’s Razor and people simply forgot about it.

S3 Savage IX on trial

While browsing for old cards (as usual), my eyes landed on something called the Savage IX, the first of S3's attempts to break into the laptop market. While I had read about the IX in the past, there was little to no info about it, other than some very basic specs, which might or might not be correct. The card itself was cheap enough, so I thought, why not see if I can clear up the confusion? So I went and ordered it. As it turns out, things are never that easy.

Here's the culprit. Notice the lack of a heatsink; this was supposed to be a laptop part, after all. The memory is integrated into the package itself. The backside looks very simple, presumably because of that.

The card was shipped in its box. As you might expect from an old laptop product, the packaging was the tackiest thing ever, and I especially loved the manual in broken English that contained information about every possible card, except the one you just bought. Of course. I don’t know who the vendor was, but even they didn’t want people to know they had just bought a Savage. Figures.

Trouble started immediately, by the way. Upon installing the drivers and rebooting, Windows 98 would seemingly become unresponsive. After a long time spent scratching my head and deleting drivers from safe mode, I finally noticed that Windows wasn't unresponsive; rather, it was in extended desktop mode. Yes, for whatever reason my monitor was set as the secondary monitor (even though the card has no TV-out…), and even reinstalling the drivers didn't change it. Laptop heritage? Either way, I was ultimately able to fix it from the registry. In case anyone is actually interested in testing this card, I'll save you the trouble and tell you what you need to change. Go to "HKEY_LOCAL_MACHINE\Config\0001\Display\Settings", then to the device subfolder, and set the string AttachToDesktop to 0. And just to be sure, change the string AttachToDesktop in the main folder too.
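
If it helps anyone, here's the same fix as an importable registry file (Win9x REGEDIT4 format). The per-device subkey under Settings needs the same value too, but its name varies from card to card, so only the main key is listed here:

REGEDIT4

; Put the display back on the main desktop (Windows 98, Savage IX secondary-monitor issue)
[HKEY_LOCAL_MACHINE\Config\0001\Display\Settings]
"AttachToDesktop"="0"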

So let's finally check out the card. One final thing before we start: results have sometimes been inconsistent. At one point 3DMark 2000 would give me a fillrate of 90MT/s, then the next time it would drop to 62MT/s. I thought maybe the chip was overheating, but it's not even warm to the touch, despite the lack of a heatsink. I can only assume the drivers are terrible like that. Also, vsync can be disabled from S3Tweak, but the framerate still seems capped, and the resulting flickering is reminiscent of the Savage 3D, albeit not as bad. Just for the sake of testing, I disabled it.

And of course, the specs: Pentium 3 450MHz, 128MB PC100 SDRAM, i440BX-2, Windows 98, 60Hz monitor.

Powerstrip first. Well, it kinda bugs out: the core clock is reported as 14MHz and the memory clock isn't shown at all. It also says the data bus is 64 bits wide. That sounds more believable. I had read some claims on the internet that the data bus could be 128 bits wide, thanks to the memory being integrated into the package, but based on my results I don't believe that. Or perhaps it's true but makes no difference. Now, S3Tweak is slightly better, since it correctly recognizes the card and says the memory clock is 100MHz. Maybe, maybe not. Unfortunately there's no indication of the core clock, but we'll get to that. MeTaL support is apparently present with version 1.0.2.5, but Adventure Pinball reverts to software rendering when I attempt to enable it. I'm going to assume it's non-functional, like many other things on this card.

Check out that botched texture filtering. No surprise that it runs horribly in software mode; maybe they should have just stuck to point sampling.

Checking out System Info in 3DMark reveals a horrible truth: the card doesn't support multitexturing at all, and the Z-buffer is only 16 bits. Doesn't look like a Savage 4 to me, then. What's worse, it doesn't even support S3TC. Why they would take out that kind of feature, I don't know. It doesn't support edge anti-aliasing either, but that doesn't surprise me. Still, for a laptop card that was supposedly aimed at gamers, you'd think they would try to attract attention with some of their more popular features.

Some interesting results here. The card seems definitely closer to a Savage 3D than a Savage 4. The fillrate makes me think the SIX is clocked at 100MHz, just like its memory. Nevermind the high texture rendering speed on the S3D; that was probably due to the vsync issues that caused heavy flickering. Polygon scores are higher on the SIX – maybe some architectural improvement? But trilinear rendering is a lot slower. Now, granted, many cards of the time weren't doing real trilinear. The SIX, however, apparently isn't even attempting it, and is doing something else entirely. Luckily that only happens in 3DMark, and the games seem trilinearly filtered.
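
As for that clock guess, it's nothing fancier than this (a sketch, assuming the SIX is a single-pipeline, single-TMU design like the Savage 3D, so its theoretical fillrate in MT/s roughly equals the core clock in MHz, and that 3DMark usually lands a bit below the theoretical peak):

measured_fillrate = 90  # MT/s, the better of my inconsistent 3DMark runs
for clock_mhz in (90, 100, 125):
    print(clock_mhz, measured_fillrate / clock_mhz)  # efficiency: 1.00, 0.90, 0.72
# Hitting 100% of theoretical would be suspicious and ~72% would be poor,
# so a core clock around 100MHz - same as the memory - fits the measurement best.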

Mipmapping? Where have you gone? At a glance, this is just bilinear filtering without mipmaps, which explains the suddenly lower score.

As you can see, rendering quality is much closer to the Savage 3D than the Savage 4. There are however some issues with textures or maybe lightmaps. Luckily I haven’t noticed them anywhere else.

So much for 3DMark 99. Some quick info about 3DMark 2000: scores are very close to the Savage 3D (again), but actually higher all around. My S3D is probably an old model for OEM sellers, presumably running at 90MHz. I'm sure it's nowhere close to those 120MHz Hercules chips. If the SIX is truly running at 100MHz, then the higher results check out. Final Reality isn't really worth checking out either, especially because I lost my detailed S3D scores, but I can say that, just like that card, the SIX has some serious issues with the 2D transfer rate.

And did I say that 3DMark 2000 at 1024x768x16 gives higher scores than the Savage 3D? Well, the PC Player D3D Benchmark running at 1024x768x32 gives me 28.3fps, far lower than the S3D at 35.9fps or the S4 at 41.5fps. That 64-bit data bus is looking right at the moment.

But enough with the synthetic benchmarks; let's have ourselves some game tests. The cards I included were as follows: the Savage 3D and Savage 4, for obvious reasons, plus the Savage 2000, just for fun. Two more cards round things out: the first is the Rage LT Pro, a common laptop card for the time, albeit obviously older. The other is an Ati Rage 128 Pro Ultra. See, I kinda wanted to see how the Savage IX compared to the Mobility 128, another potential laptop card from the same year. Unfortunately that one is quite expensive, so I went for the next best thing: looking at the specs, the Rage 128 Pro is too fast thanks to its 128-bit data bus, but the Ultra variant for the OEM market is only 64-bit, which matches the Mobility. Everything else is very close, except for the 16MB of memory, but for my tests that shouldn't count for much. As a whole, it should be a decent surrogate, I hope.

Sorry about the terrible chart. My OpenOffice skills are showing again.

If nothing else, the SIX supports OpenGL, and it's even the same driver version as the S3D and S4, so we can try Quake 2 and MDK2. The results don't look flattering to start with, however. Without multitexturing, there's only so much the card can do. At 1024×768 it even falls behind the S3D, despite its supposedly higher clocks. The Ultra is looking quite fast at the moment, and considering the Mobility 128 came out not long after the Savage IX, that wasn't good news for S3. Maybe it was more expensive, though. The Savage 2000 is in a league of its own, as expected.

Exciting times for S3. In MDK2, the SIX actually manages to beat the S3D by a sizeable margin, more than the higher clock alone would suggest. I'll chalk that up to architectural improvements. But the drop at high resolutions is steep, which further convinces me that the data bus can't be 128 bits. At any rate, the Ultra isn't far ahead here, but the Savage 4 can't even be approached. Check out the bottom… poor Rage LT Pro.

Things get muddier in Direct3D because, as I said before, disabling vsync can cause some strange flickering, and it doesn't even seem to work all that well, as the framerate in Incoming is still capped at 60fps. However, the average is higher, and it doesn't appear to be double buffered. I've kept it disabled for the time being; meanwhile, let's cross our fingers.

Nevermind the Turok results, because T-mark was always an incredibly unreliable benchmark in my experience (a Matrox G200 beating the Geforce 2 GTS, really?). Let’s take a look at Incoming instead. The three bottom cards are stuck with vsync, so the results are probably not good for a comparison. Even so, we can take a look at the Ultra and the SIX. At 640×480, there is too much flickering, and I’m afraid it’s actually impacting the results. At 800×600 and 1024×768, things are much more bearable. Still, seeing the Ultra beaten by the SIX despite its dual texturing engine is quite the sight, and a far cry from the OpenGL results. In fact, even the Rage Pro can actually compete with the Ultra at 640×480.

I decided to spare the Rage LT Pro the embarrassment here. Or maybe I just forgot. What would you expect anyway?

Switching to 32-bit rendering, things get a bit muddier, especially at higher resolutions where vsync doesn't factor in as much. The SIX actually falls to the bottom of the pack, while the Rage 128 architecture proves itself more efficient than the competition. Either way, even with vsync disabled, I don't think the SIX could have beaten the S3D.

Again, the SIX and Rage LT are vsynced. However, as far as I could tell, the 128 Ultra was not. Despite this, you can see that it doesn’t perform very well. Maybe there are some issues with D3D, since in OpenGL the card is comfortably ahead of the old Savage line. Comparisons between the SIX and the Savage cards are impossible due to the refresh rate lock, which is a shame.

Somehow I don’t think vsync is disabled.

Here's Shadows of the Empire too, just for kicks. None of the cards manage to render the fog properly, which is quite an achievement. Even then, the Ultra is clearly punching below its weight. What's going on? Some early bottleneck in D3D rendering?

That's all? No, there's another small surprise. Let's move to a DOS environment and try a few different resolutions in Quake. Mmh, what's this? No matter which resolution you use, the monitor gets set to 640×480 at 60Hz. Even 320×200, which every single other card I own renders as 720×400 at 70Hz, as is proper. In fact, setting a resolution higher than 640×480 will cut off part of the image. I have no idea what could be causing this. Maybe it's yet more laptop heritage? Or maybe it has somehow decided that my monitor is actually an NTSC TV. This happens even on real DOS, without loading any drivers. Whatever the reason, it means the card is no good for DOS games. When I try to play Doom, it's not as smooth as on other cards that run it properly at 70Hz; it's more like playing one of those console ports. Maybe it would be good for Normality, which switched between different resolutions for gameplay and the inventory. But I can't think of much else.

Well, there we have it. Conclusions? This looks like a slightly overclocked Savage 3D, yet there are clearly some architectural improvements – it fixes some of the Savage 3D's bugs, after all. I'm still very hesitant to call it an off-shoot of the Savage 4, as I've read around: it lacks too many features, it's slow, and there was no reason for S3 to drop the texture merging feature of their more successful chip. The lack of S3TC is equally disappointing for a card supposedly aimed at the gaming market, even if we're talking about its budget-conscious end. I also don't believe the data bus is really 128 bits wide.

On the other hand, even with vsync enabled, the card does look like it would have outperformed the Mobility 128 in D3D, despite its lack of multitexturing. OpenGL tells a different story, however, and the Ati architecture is indeed stronger on paper. It's still weird that the card manages to be slower than the S3D in Quake 2, and drops further at higher resolutions in MDK2. That's hard to explain. Maybe the memory bandwidth is lower. I should mention that Powerstrip set itself to 60MHz when I tried to check the default clock, but I didn't want to risk applying it. Besides, the fillrate score from 3DMark 99 doesn't seem to support that idea.

I wish I could have disabled vsync properly, since all other Savage cards let me do it. But I suppose things never go as planned. This was still an interesting card to test, with a few surprises along the way (and some swearing too). I don’t think I would have wanted to find one in my laptop though.

Bits and Pieces

It's been a while, but work still hasn't quite let up yet. I expect it will keep me super busy until at least the end of July. And while that's great, it means I have time for little else. But that doesn't mean I can't make a few random purchases once in a while. Here's what I got.

Now you are playing with… very limited power, even for 2002.

First up, a Geforce 4 MX 420. This is actually beyond my usual period range for graphics cards, having been released in 2002. Normally, I wouldn't consider anything beyond a Geforce 2 for my benchmarks. But this specific card was the bottom of the barrel of the Geforce 4 MX line, which itself was already the red-headed stepchild of the Geforce 4 line, missing its shader units and looking more like a souped-up Geforce 2 MX. So I thought it was worth trying.

Unfortunately, that won't happen – the card seems to be broken. One long beep, three short beeps, and you know there's nothing more to be done. A shame, because I wanted to see how it compared to my MX440SE – essentially the same card, but with half the bandwidth due to the use of 64-bit memory. It would have been nice to see how the limited bandwidth impacted performance. Oh well, it was cheap enough that it's worth keeping in my drawer anyway.

Italian language manual (with various strange mistakes) included. The rest of the box and game is entirely in English. Judging from the customs company sticker, this entire release seems to have been imported.

My second purchase had a happier ending. Blood Omen on PC is a game I have long been pining for, and I finally found a boxed copy for relatively cheap. It's somewhat beat up on the right side, but nothing too noticeable. Much more surprising was that the game itself seemed brand new – it was still shrinkwrapped, and even the jewel case inside was definitely new. The auction did say "new", but I assumed it was the usual crap. I wonder if this box had been left buried somewhere under other things, and then someone found it at random and decided to put it on eBay. Either way, the first thing I did was to… remove the shrinkwrap. I can already hear the collectors crying in pain, but I prefer to actually play my games.

Sits well enough with my few other PC game boxes. Love the old Crystal Dynamics logo – it almost makes you think someone would actually remember Gex today. Hey, now there's an idea for a remaster.

The last purchase is a copy of Splinter Cell for the Xbox. With the announcement that the original trilogy was to be released on Xbox One with resolution and framerate enhancements, I thought I could finally use my own review disc to play it. Of course, that wasn't to be: the disc isn't recognized by the console. So I ordered a cheap replacement. Worth it overall, given that Splinter Cell is among my favorite games of the PS2/Xbox generation. Time to play the waiting game now.

What? You mean my promotional-use-only, not-final-code disc is not recognized by the Xbox One? Surely you jest! In my defense, it did run on the Xbox 360. False hope is all the rage today.

What else awaits? As usual, nobody can tell yet. I do have one hope though: now that Blood Omen is out of the way, there is only one PC game left that I absolutely need. Should I ever find it cheap enough (unlikely, I know) it will be mine. As for which one it is, that will be revealed in due time.

Tactical Espionage Graphics

I remember Sons of Liberty. It had one of the weirdest plots I've ever seen, at least for its time – nowadays it's been surpassed by many other games. But without letting the mumbo jumbo confuse us, it was still a competent action-stealth game. If you were playing on PC, though, Solidus Snake wasn't your only problem: you also had to deal with a bunch of technical issues. I was able to play it relatively well a few years ago, but the missing sea was especially baffling – it made the Big Shell feel like some kind of airbase. Can't forget that.

Luckily those genome soldiers with their highly developed senses of hearing and vision can’t see my arm sticking out

By most accounts, MGS1 wasn’t as bad. Of course, it does get a bit worrying when you open the readme file for the demo to see the “list of cards with known issues” – and notice that it includes just about every graphics card available at the time. Nevermind that their standard answer to every issue was to just update your video drivers. I wonder how well that worked for everyone.

On my part, I was able to try the game with a number of graphics cards, and didn’t notice anything particularly bad.

Geforce 2 GTS

Voodoo 3

Savage 3D

STB Velocity 128

Aside from some gamma differences, there is nothing really setting the first three cards apart. The Riva 128, on the other hand, has clear polygonal issues, and Snake’s shadow is also bad – although I would worry about his missing forehead first. It’s also a bit too slow. The V3 and GF2 have no trouble, while the Savage 3D can drop frames. I was also able to try a SiS 6326 and it was about as bad as you’d expect. Image quality was okay, but 320×240 was required for a playable experience.

The screenshots also highlight the dithering methods used by the different cards. Unsurprisingly, the Geforce 2 comes out on top, while the Riva 128 hands in the worst result. The S3D and V3 seem to trade blows on paper, but in truth, 3dfx's higher-precision internal rendering gives the image much better quality that simply isn't visible in screenshots.

Unfortunately, the game is prone to crashing when you try to change the visual options, so I'll probably end up playing at 640×480, despite the Geforce 2 being able to power through 1024×768 just fine, even with 4x antialiasing enabled from the GPU control panel. Not that I really wanted to use that, since the antialiasing employed by Nvidia on the Geforce 2 seemed to cause a few issues.

1024×768: no AA, no problem

1024×768: AA 4X apparently causes what seems to be a reduction in color depth, with more obvious dithering: not a good tradeoff for some slightly smoother lines

Metal Gear Solid was clearly not a heavy game, being a PS1 port after all (on that note, I noticed some slightly wonky perspective correction no matter which card I used, perhaps a remnant of its PS1 roots). Sure, a SiS 6326 is outmatched, but what doesn’t outmatch a 6326? Yet even a semi-budget card like the Savage 3D turns in a well playable experience. Technically the game requires a DirectX 7 card, but that’s obviously not the case once you start playing, and the 4MB VRAM listed among the minimum requirements gives it away as well, since I’m pretty sure no DX7 card ever had less than 32MB.

The only other issue I noticed is a distinct lack of correct frame pacing during cutscenes. And for a game with a lot of cutscenes, that's no small problem. Oh well. It might be worth playing again anyway, if only to see how they handled Psycho Mantis without two gamepad ports…