Tag Archives: games

We need to go deeper

I’ve talked in the past about how installing the latest available drivers for any given old graphics card can occasionally be a bad idea.

Today I decided to test my cards again with a new game for the benchmark suite – MDK2. It is one of the most demanding tests included, especially since the only other OpenGL game I’ve got is Quake 2, which is not that difficult to run nowadays. It is, in fact, demanding enough that on default settings you need 15MB just for textures. Most old cards won’t have that much, but AGP texturing is obviously an option. Except when it’s not… and also when it should be, but it isn’t.

Crappy photo time! Look at that frametime, just look at it. You might also notice the color banding, but that should probably be the least of your worries.

My first test was benchmarking the STB Velocity 128, using the proprietary drivers with OpenGL support. As it turns out, it wasn’t a good idea. After one hour of suffering, I got my results: 0.36fps. Clearly something was wrong. Trying again with the newest universal drivers, I got a far more acceptable 27.8fps, albeit with some transparency issues. Most likely, the older drivers were unable to use system memory for textures. The score was effectively the same on the 128ZX in spite of its halved bus width and SDRAM, proving that the game wasn’t really using the card’s onboard memory at all. Though I suspect something else was at play too – other cards similarly unable to use system memory still got better results (for example, the SiS 6326 8MB reached 2.2fps even though it should be slower than a Riva).

Look, I know zero frames per second isn’t exactly playable, but the overall image quality was actually higher than those newer drivers! That’s gotta count for something!

Moving on to the oft-maligned Trident Blade 3D, I initially noticed similar behavior – horrendously slow framerates. This didn’t make sense, since I knew the Blade 3D was supposed to support AGP texturing just fine. I once again tried reverting to an older driver set, which wasn’t so easy to find. Just about every site around will offer only the 6.50.5452-95 drivers, which are the latest ones. After a while, I was able to dig out these 6.50.5452-73 ones, which may sound similar but are really a year older. And now MDK2 works. Well, somewhat. While it’s now acceptable at 800×600 (though an average of 19fps isn’t anything to write home about), other resolutions cause huge CPU frametime spikes for no apparent reason. Anyway, still better than 0.3fps. The older drivers also gave me slightly better framerates in Quake 2, even though the included OpenGL ICD was supposedly still in beta, and solved some picture quality issues in Final Reality and Shadows of the Empire!

It’s a shame these drivers were hidden in the depths of the internet. The most commonly found ones tend to have more issues. I don’t think many people care about old video cards anymore, but just in case there are other weirdos like me around, they oughta remember not to stop at what they see on the surface.

In other news, I’ve just ordered an Oxygen GVX1, which of course will need to go through the entire benchmark suite. Let’s hope I can do that without swapping drivers between tests.


Older But Faster

Life goes on as it has always done, and so does my testing. Occasionally, I manage to find something interesting, such as this Riva 128 which had been abandoned in a waste management center. Might as well give it a new home.

Gotta love the ridiculously small heatsink. At first I thought it was a bad patch-job by the card’s previous owner, but checking VGAMuseum proved me wrong.

I already have a Riva 128ZX, but it was disappointing. Bad results in Direct3D and even in OpenGL left me somewhat cold. Some research, however, showed that the ZX came in two flavors: slower 64-bit SDRAM and faster 128-bit SGRAM. The former was much more common, meaning you are more likely to find it around today. For comparison, the original Riva 128 was limited to 4MB, but always SGRAM. That alone wouldn’t explain the quality issues, though, since my tests were way worse than I expected. So I thought it could be a driver issue, and tried the universal drivers available on the Nvidia website. Indeed, same problems. But here comes STB to the rescue.

As it turns out, this specific Riva 128 is an STB Velocity 128, a fairly common model in its time. And STB made its own drivers for the card, which I managed to find. I forget the website, alas, but I’ve already downloaded them, so they are safe for posterity. After some extensive testing, I did find one useful set: drivers 0132 give me the best Direct3D quality and speed overall.

Only one problem then: OpenGL support is not available. For that, you gotta use the newer 0166 drivers, and those have the exact same issues in D3D as the latest drivers on Nvidia’s website, if not worse. Anyway, OpenGL works well with those, so I was able to test Quake 2 and still get much better framerates than the ZX, which at this point I’m assuming is because of the latter’s SDRAM memory. Unfortunately, with just 4MB of video memory, 1024×768 is never an option. But arguably, this wasn’t the kind of card you’d have wanted to run at anything higher than 800×600 anyway.
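The 4MB limit is easy to sanity-check with some quick arithmetic (a rough sketch; it assumes 16-bit color, a 16-bit z-buffer, and double buffering, which were typical settings for cards of this era):

```python
# Rough VRAM budget for a 4MB Riva 128. Assumptions: 16-bit color,
# 16-bit z-buffer, double buffering (front + back + z = 3 surfaces
# at 2 bytes per pixel each).
def framebuffer_kb(width, height, bytes_per_pixel=2, surfaces=3):
    return width * height * bytes_per_pixel * surfaces / 1024

vram_kb = 4 * 1024
for w, h in [(640, 480), (800, 600), (1024, 768)]:
    need = framebuffer_kb(w, h)
    print(f"{w}x{h}: {need:.0f} KB for buffers, "
          f"{vram_kb - need:.0f} KB left for textures")
```

At 1024×768 the three surfaces alone come to about 4.6MB, more than the whole card has, so that resolution is off the table before a single texture is even loaded.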

Specs: i440BX2, Pentium III 450MHz, 128MB PC100

I wish I could test the ZX with those same 0132 drivers, but there’s a hardware check: the installer let me proceed, but then Windows 98 refused to boot due to “wrong hardware”. Crafty. I had to remove the drivers from safe mode. Either way, I’m guessing the scores would have been appreciably lower, just like in Quake 2.

On the plus side, the ZX’s 8MB gives you better textures. With its smaller amount of memory, the Velocity has some trouble with distant mipmaps, as seen on the barman in Jedi Knight here. AGP texturing, if available at all, must not have been very good.

One weird thing: the newer drivers show multitexturing among the available features, although obviously the card cannot support it. With the STB drivers, this wasn’t the case. Perhaps Nvidia added it at the very end of the Riva’s life to make it more appealing to the average consumer. Or maybe the drivers will manage textures to reduce the workload for the CPU. I’m not sure.

Dithering is a sore spot. The Riva 128 shows an overly grainy image, and while noise isn’t quite as irregular as with the Radeon VE, it’s much more prevalent. At least Nvidia got better later on.

And while the STB drivers don’t have the same issues with subpixel accuracy and mipmapping, filtering is still lacking and trilinear is of course absent. So the card has its limits, to be sure. But at least this experience goes to further show that, on older graphics cards, drivers could make all the difference.

Now, what is my next card going to be? Maybe another trip to the waste management center tomorrow will show me the way…

Variks would be really bad at Minesweeper

It’s not a widely known fact, but I was one of those who bought Destiny at launch. Yes, the beta was good enough to convince me to preorder, something I do quite rarely. Did the full game meet my expectations? It sorta did, though for a particular reason: I had to leave for a traineeship in Spain at the end of September, so I only got to play for about three weeks. By the time I got back, I had other stuff to play.

Of course, I knew all about the grind. Who didn’t? It was the biggest story of September 2014. But then I missed all the old DLC, eventually passed over The Taken King (trivia: I got together with a random group to finish the Vault of Glass just before the Year 2 reset), and even Rise of Iron. I thought my Destiny days were over, but I eventually bought The Taken King on a whim because it was really cheap. And from there, I was hooked again. Compared to what the game used to be (remember the Spinmetal runs?), there’s just so much stuff to do now.

I thought the Warden wanted me to kill this guy. Why is he sending mines for me to dismantle?

Unfortunately, matchmaking is still missing from many activities, one of them being Skolas’s Revenge. Nowadays people only want to play the highest-level stuff, and this guy is old news, so good luck finding someone to play with. But he’s still a massive pain. I had to do it solo, and it took some 15 attempts. When I finally got lucky with the mine placement, the guy went down easily enough.

So I got some items which are near useless today (thanks for the free Judgment reputation though)… and this nifty emblem. This is all I ever wanted. Oh, and that’s my PSN name, I guess.

It does exemplify the basic problem with Destiny though – the lack of matchmaking in several activities can hurt badly. But hey, I’m done with this arena, I can finally let my PS+ expire with no qualms (it will expire next week).

I got Rise of Iron too, so I’m sure there’s lots of other stuff to do. Even if most of it seems to require cooperation… oh well. I don’t really feel like renewing, so I’ll do the single player stuff. Just like old days. And maybe, if I can finally get rid of this addiction, I can start taking care of my backlog again.

Leo’s Toy Store

By now everyone knows, I’m certain, that 428 is going to be localized by Spike Chunsoft next year. This event, the magnitude of which has been unseen in decades, has twofold consequences for me. First, it gives me a chance to experience the sequel to my personal game of 2016 (yes, Machi came out in 1998, I know). Second, and most importantly, it means I can drop my plans to use the on-screen translator on the emulated Wii version.

On the left, the ultimate villain of Machi (probably not featuring in 428)

I’m somewhat worried, though. I can’t help but think that the sheer amount of effort required to play Machi while translating on the fly, trying to interpret those words the OCR couldn’t recognize, and overall spending a very long time with the game, has been instrumental in my enjoyment of it. Now that 428 is going to be easily available, that crucial element is going to be missing. Perhaps… perhaps I should translate it myself again, instead?

Another thing that required a lot of effort was War and Peace, which I finally finished a few days ago. Started in December, finished in March… not bad. At least now I know that Tolstoy was totally a Napoleon hater and a Kutuzov fanboy. The first half was really good, some of the best writing I’ve ever read. But the second half, when the war starts, quickly goes downhill. Even the prose falls in quality, with Tolstoy often repeating himself five or six times in the same paragraph. It doesn’t even feel like the same book. Too bad, because it had started so well. Oh well, the hype can’t always be real.

Of course, there’s a new book in the pipeline already. Did you doubt it?

Memory Goes Here, Performance Goes There

Another failure? At least an interesting one, this time.

A whole 8MB on a single stick. Only in 1998, folks.

Just a few days ago, I found a cheap 8MB SGRAM expansion for the Matrox G200 series. Yes, it’s a memory expansion for real this time. It was supposed to bring my G250 all the way up to 16MB. In itself, it’s already a useless experiment – the G400 32MB has more memory, is faster in everything, and has literally the same compatibility (including the same issues). While I was sure it wouldn’t make any difference in lower resolutions, I was thinking that perhaps you could see an effect once the local memory was entirely filled up by the framebuffer.

What I didn’t know was that the memory expansion would actually decrease the card’s default memory and core clocks.

You don’t have to worry about higher resolutions if your monitor is crap.

I said in the past that my G250 seems a bit different from the specs originally mentioned on Wikipedia: the core runs at 105MHz and the memory at 140MHz. That’s pretty high for its time, but I tested the veracity of Powerstrip’s claims by running a few games and noticing that framerates scaled almost linearly against the G200A (which runs at 84/112MHz). It doesn’t even seem like an anomalous overclock, since scores stay up no matter how long I keep the tests running, and there are no artifacts in sight.
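That linear scaling is easy to check on paper. Both clock domains scale by the same factor, so if a game is fully GPU-bound, its framerate should scale by roughly the same factor too – which is what the benchmarks showed:

```python
# Clock ratios between my G250 (as reported by Powerstrip) and a
# stock G200A. All figures in MHz, taken from the text above.
g250_core, g250_mem = 105, 140
g200a_core, g200a_mem = 84, 112

core_scale = g250_core / g200a_core
mem_scale = g250_mem / g200a_mem

# Both come out to exactly 1.25: core and memory were raised together,
# so a GPU-bound game should run ~25% faster on the G250.
print(core_scale, mem_scale)  # → 1.25 1.25
```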

But after installing the memory daughterboard, the clocks suddenly went down to 90/120MHz. Attempting to overclock the card back up to the original values produced slight artifacts, so I didn’t push any further. And sure enough, testing the card showed a sizeable decrease from the original framerates. The Forsaken test is particularly telling: the framerate tracks the core clock almost exactly, showing that, at least on a P3-450MHz, the game is completely bound by the graphics card.

The complete set. Now with automatic downclocking.

I made two mistakes: I thought there would be no difference at lower resolutions, but there was. And I thought there might be a difference at high resolutions, but there wasn’t much of one. Even with something like 1024×768×32 in Incoming, which should fill the card’s local memory almost entirely, the framerate delta is still effectively the same. 3DMark 99 does show a slight proportional increase at 1280×1024, but the difference is pretty small. I suppose the G200 series was really good at AGP texturing. It had DiME support, like the i740, whereas many AGP cards of the era stopped at DMA.
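To see how close 1024×768×32 gets to filling the stock card, here is a quick worked estimate (an assumption on my part: 32-bit front and back buffers plus a 16-bit z-buffer; the exact surface formats the driver picks could differ):

```python
# Local memory eaten by 1024x768x32 on the stock 8MB G250.
# Assumed layout: two 32-bit color buffers (front + back) and one
# 16-bit z-buffer.
def surface_mb(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel / (1024 * 1024)

w, h = 1024, 768
used = 2 * surface_mb(w, h, 4) + surface_mb(w, h, 2)
print(f"{used:.1f} MB")  # → 7.5 MB
```

That is 7.5MB of the stock 8MB gone on buffers alone, leaving almost nothing for local textures – yet with the 16MB expansion freeing up 8.5MB for them, the framerates barely moved, which is why AGP texturing looks like the explanation.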

So what happened? Well, I have a theory. The expansion module was made for the old G200, which only ran at 84/112MHz (just like the later G200A die shrink). So they didn’t bother making memory chips that could run much faster than that, since they weren’t expecting people to clock the card any higher – after all, the G200 wasn’t even quite a gamer’s card to begin with. And since the G200 seems to always run with a 3:4 ratio between core and memory, if you add slower memory, the core has to go down too. Bummer, huh?

Thank god my paycheck came in just a few days ago.

So that was an interesting experiment, but it could have gone better. Lately, none of my experiments have gone so well; perhaps it’s a sign that my benchmarking days are over? Time will tell. At least the rest of my haul from yesterday wasn’t bad, as you can see. I expect to start Barrow Hill pretty soon, perhaps over the weekend (still playing Claw)… while the Zork book will have to wait until War and Peace is finished, which might take a little while.

Oh, and the SiS 6326 is a C3 revision with just 4MB of memory. Even worse than expected. I’ve never seen such horrible perspective-correction issues in texturing. Another one for the shelf.

My Matrox Mystake

That was a failure.

See, a few days ago, I bought myself something that seemed real nifty – a Matrox Mystique 220 with memory expansion add-on. Just imagine, an 8MB MGA! Think of the possibilities! (well, mostly, playing games in 1024×768 really slowly)

See that bump on the left? It looks exactly the same as an MGA chip. Why have a second chip on a memory expansion module? The answer will be revealed soon.

So it arrived. At first it looked real nice, but then I noticed something amiss: the expansion module is too big, for starters. Also, it had what looked like a second MGA chip on it. And finally, the PC still only saw it as a 4MB card. A quick search for the MYST/RRSTI code revealed the truth: it’s actually a Rainbow Runner Video add-on card, essentially an AVI and MPEG-1 encoder/decoder. Useful, I guess, but not for my purposes.

Oh well, at least I got a Mystique 220 out of it, right? Well, that was something of a disappointment too. The chip is an MGA-1164 (compared to my old Mystique, which was an MGA-1064), and overall compatibility is actually a lot lower. Most tests and games will freeze at random, if they even start at all. The only solution I could find was disabling bus mastering, but that reduces performance severely.

Apparently, with bus mastering, the card can start working on polygons while the CPU is still busy. Or maybe it’s the other way around? Anyway, Turok has absurd slowdowns, 3DMark is a lot slower, sound in Forsaken seems slowed down, and overall stuff doesn’t run as well. At least it didn’t crash anymore, but at that point I might as well have used the older Mystique instead. Surprisingly, Quake even shows that the card doesn’t support the same 256-color resolutions as its older brethren: nothing higher than 640×480 is offered.

At least it looks cool, I guess? It’s like a sandwich card. Two slices of PCB with a filling of memory and graphics chips. The unfiltered taste and stippling aftertaste might not be to everyone’s liking, though.

So yeah, a big disappointment overall. I need to check my sources better. Can’t really fault the seller, because he probably didn’t know what he was actually selling. A lesson for me, I guess. The next card coming, and possibly the last one (I know, I always say that…), is a SiS 6326 AGP. Not sure how much memory, but the memory chips make it look like a 4MB model. Strange, since I thought 8MB would be more common for the AGP model. Either way, it should be better than this mistake here.

Revenge is a dish best served old

In these days of strange inaugurations and general protests, there are many things to complain about. Not least of them is the fact that Nvidia lies to us. See, the latest Geforce 2 driver available on the site is 71.84, which is essentially the last one to support many of their older cards. Sounds good, right? Except those drivers don’t really work all that well. Several games had unwelcome issues or were slower than expected. Removing Vsync was a pain. Don’t even get me started on Final Reality suddenly glitching the whole framebuffer.

So I had to cycle through a few older drivers. I tried going back a year… nothing. Another year… nothing. In the end, I settled on the 12.84 drivers, which came out long ago, in 2001. And what do you know, they work much better. No more issues, faster results, anisotropic filtering is now supported, and even 3DMark 99 shows improvements.


TNT2 M64: 3953
Geforce 2 GTS: 3303


TNT2 M64: 4349
Geforce 2 GTS: 4266

Nvidia strikes back! And let’s not even go into my Forsaken results. The GTS goes from a disappointing 132fps at 640×480 (which made it slower than the Voodoo 3) to a much more satisfying 200fps. And it holds at 190fps in 1024×768, too.

Oh, and being unified drivers, they work with literally everything from the TNT onwards. Well… they should. The Geforce 2 MX, TNT2 and TNT2 M64 all work fine. But the old TNT didn’t want to be recognized. One more mystery with this card… I had to revert further to 6.47, and even then, it’s still a horrible model. Who knows what’s up with it.

I’ve always wanted an Elsa card. It’s so going to become my new main.

Did I say TNT2 up there? That’s right, I got a TNT2 recently. While the world of online auctions is inundated with M64 models, plaguing our searches like locusts during a swarm, I was eventually able to procure a basic TNT2 for cheap. And it does its job well (using those 12.84 drivers, of course). Similar results to the M64 at 640×480, but it keeps up much better at higher resolutions.

What your money could buy in 1999. The 128 GL doesn’t really belong, but I’m afraid I don’t have its more powerful Pro cousin. And look at that trilinear filtering speed on the Nvidia card!

I had wanted to make the Radeon VE my main, as it runs Blood 2 better than most other cards. But its bad 16-bit dithering was very annoying. The TNT2 takes its spot instead, as its twin pixel pipelines seem to work better than the “triple TMU on a single pipeline” setup of the ATI card. I do question its ability to multitexture well, though. Despite enabling the option in Quake 2 via registry edit (for some reason the game doesn’t use multitexturing in 16-bit color by default on the TNT series), my results didn’t improve at all. Mmh.
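The pipeline layouts go some way toward explaining the difference. A rough fill-rate comparison (the clocks are my assumption, using the commonly quoted reference figures of 125MHz for the vanilla TNT2 and 183MHz for the Radeon VE; board vendors varied):

```python
# Theoretical fill rates from pipeline layout and clock speed.
# Assumed reference clocks: 125MHz TNT2, 183MHz Radeon VE.
# TNT2: 2 pipelines x 1 TMU; Radeon VE (RV100): 1 pipeline x 3 TMUs.
def fill_rates(clock_mhz, pipelines, tmus_per_pipe):
    pixel_rate = clock_mhz * pipelines        # Mpixels/s
    texel_rate = pixel_rate * tmus_per_pipe   # Mtexels/s
    return pixel_rate, texel_rate

tnt2 = fill_rates(125, 2, 1)  # (250, 250)
ve = fill_rates(183, 1, 3)    # (183, 549)
print(tnt2, ve)
```

In single-textured games the TNT2’s higher raw pixel rate wins out; the VE’s extra TMUs only pay off when a game actually multitextures, and its noisy 16-bit output doesn’t help its case either.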

Well, still looking for other cards, I guess. But the number keeps getting smaller. I suppose a SiS 315 could be nice to check out, but it would be a mild distraction at best. I think a 6326 would be too slow to be really interesting. Geforce 256s are apparently rarer than a Voodoo 4, and you can’t get close to PowerVR prices. Perhaps my collecting days are over… but I always keep an eye open on eBay. You never know when someone will drop that cheap Fury MAXX you are looking for.