
Bits and Pieces

It’s been a while, but work still hasn’t quite let up yet. I expect it will keep me super busy until at least the end of July. And while that’s great, it means I have time for little else. But that doesn’t mean I can’t make a few random purchases once in a while. Here’s what I got.

Now you are playing with… very limited power, even for 2002.

First up, a Geforce 4 MX 420. This is actually beyond my usual period range of graphics cards, having been released in 2002. Normally, I wouldn’t consider anything beyond a Geforce 2 for my benchmarks. But this specific card was the bottom of the barrel for the Geforce 4 MX line, which itself was already the red-headed stepchild of the Geforce 4 family, missing its shader units and looking more like a souped-up Geforce 2 MX. So I thought it was worth trying.

Unfortunately, it won’t happen – the card seems to be broken. One long beep, three short beeps, and you know there’s nothing left to do. A shame, because I wanted to see how it would compare to my MX440SE – essentially the same card, but with half the bandwidth due to its 64-bit memory bus. It would have been nice to see how the limited bandwidth impacted performance. But that won’t happen. Oh well, it was cheap enough that it’s worth keeping in my drawer anyway.

Italian-language manual (with various strange mistakes) included. The rest of the box and the game itself are entirely in English. Judging from the customs company sticker, this entire release seems to have been imported.

My second purchase had a happier ending. Blood Omen on PC is a game I had long been pining for, and I finally found a boxed copy for relatively cheap. It’s somewhat beat up on the right side, but nothing too noticeable. Much more surprising was that the game itself seemed brand new – it was still shrinkwrapped, and even the jewel case inside was definitely new. The auction did say “new”, but I thought that was the usual crap. I wonder if this box had been left buried somewhere under other things, and then someone found it by chance and decided to put it on eBay. Either way, the first thing I did was to… remove the shrinkwrap. I can already hear the collectors crying in pain, but I prefer to actually play my games.

Sits well enough with my few other PC game boxes. Love the old Crystal Dynamics logo – makes you wonder whether anyone would actually remember Gex today. Hey, now there’s an idea for a remaster.

The last purchase is a copy of Splinter Cell for the Xbox. With the announcement that the original trilogy was to be released on Xbox One with resolution and framerate enhancements, I thought I could finally use my own review disc to play it. Of course, that wasn’t to be: the disc isn’t recognized by the console. So I ordered a cheap replacement. Worth it overall, given that Splinter Cell is among my favorite games from the PS2/Xbox generation. Time to play the waiting game now.

What? You mean my promotional-use-only, not-final-code disc is not recognized by the Xbox One? Surely you jest! In my defense, it did run on the Xbox 360. False hope is all the rage today.

What else awaits? As usual, nobody can tell yet. I do have one hope though: now that Blood Omen is out of the way, there is only one PC game left that I absolutely need. Should I ever find it cheap enough (unlikely, I know), it will be mine. As for which one it is, that will be revealed in due time.


Tactical Espionage Graphics

I remember Sons of Liberty. It had one of the weirdest plots I’ve ever seen, at least for its time – nowadays, it’s been surpassed by plenty of other games. But without letting the mumbo jumbo confuse us, it was still a competent action-stealth game. If you were playing on PC, though, Solidus Snake wasn’t your only problem: you also had to deal with a bunch of technical issues. I was able to play it reasonably well a few years ago, but the missing sea was especially baffling – it made the Big Shell feel like some kind of airbase. Can’t forget that.

Luckily those genome soldiers with their highly developed senses of hearing and vision can’t see my arm sticking out

By most accounts, MGS1 wasn’t as bad. Of course, it does get a bit worrying when you open the readme file for the demo, see the “list of cards with known issues”, and notice that it includes just about every graphics card available at the time. Never mind that their standard answer to every issue was to just update your video drivers. I wonder how well that worked out for everyone.

On my part, I was able to try the game with a number of graphics cards, and didn’t notice anything particularly bad.

Geforce 2 GTS

Voodoo 3

Savage 3D

STB Velocity 128

Aside from some gamma differences, there is nothing really setting the first three cards apart. The Riva 128, on the other hand, has clear polygonal issues, and Snake’s shadow is also bad – although I would worry about his missing forehead first. It’s also a bit too slow. The V3 and GF2 have no trouble, while the Savage 3D can drop frames. I was also able to try a SiS 6326, and it was about as bad as you’d expect: image quality was okay, but 320×240 was required for a playable experience.

The screenshots also highlight the dithering methods used by the different cards. Unsurprisingly, the Geforce 2 comes out on top, while the Riva 128 hands in the worst result. The S3D and V3 seem to trade blows on paper, but in truth, 3dfx’s higher-precision internal rendering gives the image a much better quality that simply isn’t visible in screenshots.

Unfortunately, the game is prone to crashing when you try to change the visual options, so I’ll probably end up playing at 640×480 despite the Geforce 2 being able to power through 1024×768 just fine, even with 4x antialiasing enabled from the driver control panel. Not that I really wanted to use that, since the antialiasing employed by Nvidia on the Geforce 2 seemed to cause a few issues.

1024×768: no AA, no problem

1024×768: 4x AA seems to cause a reduction in color depth, with more obvious dithering: not a good tradeoff for some slightly smoother lines

Metal Gear Solid was clearly not a heavy game, being a PS1 port after all (on that note, I noticed some slightly wonky perspective correction no matter which card I used, perhaps a remnant of its PS1 roots). Sure, a SiS 6326 is outmatched, but what doesn’t outmatch a 6326? Yet even a semi-budget card like the Savage 3D turns in a perfectly playable experience. Technically the game lists a DirectX 7 card among its requirements, but that’s obviously not a hard requirement once you start playing, and the 4MB of VRAM listed among the minimum specs gives it away as well, since I’m pretty sure no DX7 card ever shipped with that little memory.

The only other issue I noticed is a distinct lack of proper framepacing during cutscenes. And for a game with a lot of cutscenes, that is no small problem. Oh well. It might be worth playing again anyway, if only to see how they handled Psycho Mantis without two gamepad ports…

We need to go deeper

I’ve talked in the past about how installing the latest available drivers for any given old graphics card isn’t always the best idea.

Today I decided to test my cards again with a new game for the benchmark suite – MDK2. It is one of the most demanding tests included, especially since the only other OpenGL game I’ve got is Quake 2, which is not that difficult to run nowadays. It is, in fact, demanding enough that on default settings you need 15MB just for textures. Most old cards won’t have that much, but AGP texturing is obviously an option. Except when it’s not… and also when it should be, but isn’t.

Crappy photo time! Look at that frametime, just look at it. You might also notice the color banding, but that should probably be the least of your worries.

My first test had me benchmarking the STB Velocity 128, using the proprietary drivers with OpenGL support. As it turns out, that wasn’t a good idea. After one hour of suffering, I got my result: 0.36fps. Clearly there was something wrong. Attempting again with the newest universal drivers, I got a far more acceptable 27.8fps, albeit with some transparency issues. Most likely, the older drivers were unable to use system memory for textures. The score was effectively the same on the 128ZX in spite of its halved bus width and SDRAM, suggesting that the game wasn’t really relying on the card’s onboard memory at all. Though I suspect something else was at play too – other cards similarly unable to use system memory still got better results (the SiS 6326 8MB, for example, reached 2.2fps even though it should be slower than a Riva).

Look, I know zero frames per second isn’t exactly playable, but the overall image quality was actually higher than with those newer drivers! That’s gotta count for something!

Moving on to the oft-maligned Trident Blade 3D, I initially noticed similar behavior – horrendously slow framerates. This didn’t make sense, since I knew the Blade 3D was supposed to support AGP texturing just fine. I once again tried reverting to an older driver set, which wasn’t so easy to find. Just about every site around will only offer the 6.50.5452-95 drivers, which are the latest ones. After a while, I was able to dig out these 6.50.5452-73 ones, which may sound similar but are really a year older. And now MDK2 works. Well, somewhat. While it’s now acceptable at 800×600 (though an average of 19fps isn’t anything to write home about), other resolutions cause huge CPU frametime spikes for no apparent reason. Anyway, still better than 0.3fps. The older drivers also gave me slightly better framerates in Quake 2 – even though their OpenGL ICD was supposedly still in beta – and solved some picture quality issues in Final Reality and Shadows of the Empire!

It’s a shame these drivers were hidden in the depths of the internet, since the most commonly found ones tend to have more issues. I don’t think many people care about old video cards anymore, but just in case there are other weirdos like me around, they oughta remember not to stop at what they see on the surface.

In other news, I’ve just ordered an Oxygen GVX1, which of course will need to go through the entire benchmark suite. Let’s hope I can do that without swapping drivers between tests.

My little SiS (6326) can’t be this fast…er

Remember when I tried the SiS 6326 C3 and it sucked? Well, I ultimately decided to test the C5 revision as well. I actually found one still in its box. The stuff people keep around these days… it still has its driver disc (which only includes version 1.23, so pretty useless today) and even a small user manual.

Never even heard of this brand. But the side of the box implies that they might have made an Intel i752. Now that would be a rare find.

So let’s look at the facts. The C3 revision had its fair share of issues, chiefly some horrible perspective correction and warping polygons. After completing my tests, I can say that these are mostly gone here. I say mostly, because it’s still not as good as some other cards… but considering the price, it’s not that bad.

One quick look at the manual shows the line “Supports 4MB SGRAM memory configurations”. Suspicious. My model is supposed to have 8MB. And it does. Except that, as it turns out, no game will run at anything above 800×600, choking on Out Of Memory errors. Digging around the net, I discovered that some people believe the SiS 6326 can’t actually use more than 4MB as framebuffer. That seems to be true: everything beyond that amount becomes texture memory instead. My tests, again, seem to corroborate this theory. Having an 8MB card is still useful though, because unlike my 4MB card, it doesn’t drop any textures. And let’s face it, the 6326 is slow as a snail, so you wouldn’t really wanna run at 1024×768 anyway, even in the simplest games.
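If you’re wondering why 800×600 would be the ceiling, some quick back-of-the-envelope math makes it plausible. This is my own sketch, assuming 16-bit color, double buffering and a 16-bit Z-buffer – nothing from SiS documentation:

```python
# Rough framebuffer math for a card that can only address 4MB as framebuffer.
# Assumption (mine): 16-bit color, double buffering, plus a 16-bit Z-buffer,
# i.e. three buffers at 2 bytes per pixel.
def framebuffer_mb(width, height, bytes_per_pixel=2, buffers=3):
    return width * height * bytes_per_pixel * buffers / (1024 * 1024)

for w, h in [(640, 480), (800, 600), (1024, 768)]:
    print(f"{w}x{h}x16: {framebuffer_mb(w, h):.2f} MB")

# 640x480x16:  1.76 MB -> fits in 4 MB
# 800x600x16:  2.75 MB -> fits in 4 MB
# 1024x768x16: 4.50 MB -> over 4 MB, hence the Out Of Memory errors
```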

I was also able to find a specific “High Angle” driver (yes, that’s the version, it doesn’t have a number) that manages to support OpenGL in Quake 2 (about as badly as you’d expect) and somehow allows 3DMark 99 to run at its default settings. Speed is just as bad as always – perhaps slightly faster than before, but not in any appreciable manner. It’s really all about the improved image quality. Still, OpenGL manages to make performance even worse.

The Quake 2 demo1.dm2 hall of shame (i440BX, P3-450MHz, 128MB PC100). I guess the Riva and i740 don’t really belong in there, but that performance drop at 1024×768 is quite ugly.

Strangely, the card seems to use the same refresh and resolution timings as the newer SiS 305, rather than those used by the previous 6326 model. Maybe some things were changed in between.

With its outstanding issues fixed, the 6326 is a little bit faster than a Rage IIc and even a bit more reliable. For 1998, however, one year after the original (buggy) 6326 was released, it was just too slow, no matter how small your budget may have been. It sure sold a lot, though. I wonder how many people bought one, just to discover that it was the true successor to the graphics decelerator? Good times indeed.

This is why we don’t do 16 bits anymore

A new year arrives, and with it, new resolutions! New goals in life! Nah, not really, it’s still the same as always. Somebody has to think about old graphics cards, and that somebody is obviously me. I even got several more cards in the last couple of months. I’m reaching a point where my collection is nearly complete, although some expensive models are obviously missing. I doubt I’ll ever find a PowerVR for cheap. Oh, why did I sell mine years ago? I’m so dumb.

One thing I’ve been noticing lately is how every chip maker had their own approach to dithering. Back in the day, fillrate was obviously a precious resource, and 32-bit rendering was essentially out of the question for all but the simplest of games. This generally changed with new techniques like Ati’s Hyper-Z, but until then, most people with a mid-range or low-end graphics card were likely going to play games at 16-bit color depth. And that meant either color banding, or some kind of dithered approach. Or maybe neither – Matrox was pretty good at this – but a few other cards apparently weren’t.

32 bits. Incoming! The orange sky is really great for spotting bad dithering artifacts. Not here, though.

I fear WordPress compression will ruin the images, so I linked them on Imgur. Just click on them for the full PNG versions.

What you see above is a fine example of 32-bit rendering (taken on a SiS 315L, though it makes no real difference which card). See any color banding? Any dithering? Of course you don’t. There are 16 million colors up there after all, alpha channel not included.

But alas, that many colors aren’t easy to sustain. I guess we’ll have to settle for 65 thousand colors instead.
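For the record, here’s where those two numbers come from, plus a toy illustration of what squeezing a smooth gradient into 16 bits actually does. This is my own sketch, not how any particular card’s dithering hardware works:

```python
# Color counts: 8 bits per R/G/B channel vs the usual 5/6/5 split of 16-bit modes.
print(2 ** 24)   # 16,777,216 colors ("32-bit", ignoring the alpha channel)
print(2 ** 16)   # 65,536 colors (RGB565)

def to_rgb565_and_back(r, g, b):
    """Quantize an 8-bit-per-channel color down to 5/6/5 bits, then expand again."""
    return ((r >> 3) << 3, (g >> 2) << 2, (b >> 3) << 3)

# A smooth sky gradient in 8-bit green collapses into visible steps (banding):
print([to_rgb565_and_back(0, g, 128)[1] for g in range(100, 110)])
# -> [100, 100, 100, 100, 104, 104, 104, 104, 108, 108]
```

Dithering trades those visible steps for high-frequency noise by nudging neighboring pixels onto different steps – which is exactly the speckle you’ll see in the 16-bit shots below.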

So you wanted a budget experience. Have a budget experience!

I suppose there is only so much you can do without MILLIONS of colors. Even so, the Kyro 1 turns in a decent effort. Dithering is visible in the upper right corner, and a bit on the ground too, but this isn’t too shabby. If all cards looked like this, it would be great.

Now, keep in mind the Kyro could easily power through true color in an old game like Incoming, but this is just science. Sorry Kyro, you get the same playing field as everybody else. And I hate that your drivers make my PC crash half the time and that you can’t even run certain DOS games without fatal errors. So there.

Not all is right in Noise Land.

This second picture comes from a Radeon VE, quite a popular budget card at the time, which somehow had three TMUs even though most games never needed more than two textures at once. Fine engineering choice there. Anyway, that wasn’t its only questionable aspect, because as you can see from the screenshot above, its 16-bit rendering wasn’t especially nice either.

The noise is evident everywhere, but especially on the metal plate and the portion of sky in the upper right. The distant mountains even show some clear color banding. I would almost say this could be running in 256 colors, but that isn’t the case. I’ve tried a couple of different driver sets, and tinkered with all the options I could find in the settings panel, but to no avail.

I should mention that the Radeon VE came out in 2001, so it’s not even that ancient. The Radeon SDR (which I’ve acquired recently) seems to have the same problem too, even though it’s technically a different chip. The Rage 128, on the other hand, was better. Weird. This kind of thing would have been unacceptable in 2000 already, so what’s up with it? No old review seems to make any mention of this problem – were they all playing in 32 bits? I find that hard to believe, especially for a budget card.

Brace yourselves, for the worst is yet to come.

But as bad as the Radeon was, it still wasn’t as bad as the SiS 315L (yes, the same card I used to take the 32-bit picture – funny what a difference a few bits can make). Alright, so very few people actually played games on this card, although the number of units available on eBay would easily make you think otherwise. Maybe a lot of people had them in office computers… but it’s the same thing, really.

Anyway, if anyone actually played on this card – and I doubt they were able to use 32-bit color, given its relatively measly specs for its age (just two pipelines, and even trilinear filtering incurs a substantial drop in speed) – then they would have been greeted by the mess you see above. Vertical lines everywhere – and color banding too? Look at the sky and weep.

Worth noting that, of the three cards on test today, the SiS 315L is the only one unable to really sustain true color rendering. It’s a cheaper variant of the 315, which already wasn’t a speed monster in itself: Incoming falls to an average of 44fps at 800x600x16 (it would be more without Vsync, but I couldn’t find a way to disable it, and besides, it’s not like you actually want tearing while playing). But even the Radeon shouldn’t be excused, since several old games only ever supported 65k colors – and with those, there’s no escaping the dithering patterns.

It wasn’t really something I ever tried to notice in my tests, but now that something as evident as these two cards has come along, I can’t unsee it anymore. It will be something to watch for in future cards as well.

My little SiS (6326) can’t be this fast

I remember when I first tried the SiS 6326. My first thought was something along the lines of “what the hell is this piece of crap?”. A few months later, I have to revise my opinion, if only somewhat.

Resident Evil doesn’t seem to like the old 6326 much. This is, however, a masterpiece of pop-art that should be cherished. Perhaps in a modern art museum.

One thing I should mention is that my card is merely a C3 revision. From what I can gather, the original C1 model was terrible, and from my tests, the C3 is not that much of an improvement. Games will occasionally show warped polygons, and in general, texture perspective correction is botched to the point of not working, at least in some of the titles in my benchmark suite. 2D elements seem to have trouble displaying correctly. Texture resolution is often abysmal (the card’s paltry 4MB and very simple AGP compliance don’t help). Mipmapping apparently turns on and off at random. And the lack of OpenGL is just icing on the sewage cake – not that the 6326 would be able to run Quake 2 or even GLQuake decently, even if it were supported. Mind, the newest drivers even claim that an OpenGL ICD is included… but I haven’t managed to run anything with it.

Crank that Blood 2 all the way up to 640×480. The SiS can run it. At low detail, of course. If you want proper lighting, then you gotta go down to 512×384… and possibly 400×300.

So it doesn’t sound great today, but having used it a bit more, and considering the circumstances, there’s something to like here too. Image quality is actually okay for a cheap 1997 card – there was far worse around, usually for a higher price. A common Virge might look generally better, but it also runs slower, and was more expensive anyway. It should also be mentioned that the later C5 revision solved most of the 6326’s quality issues, making it far more attractive. If it had come out earlier, it would have been even better… but you can’t have everything.

Rogue Squadron doesn’t support anything lower than 640×480, so you get a choppy framerate, no two ways around it. But it’s playable, and the Nintendo 64 version wasn’t very smooth either. For a 1998 game on a 1997 budget card, I’d call it good enough.

A basic Voodoo 1 was another world, of course. But it was also $300. Bragging to your console friends was a somewhat expensive proposition at the time. The SiS 6326, though, was just $50… and for that, you got a decent card that could accelerate games in 16-bit color fairly smoothly at 400×300 or even 512×384. Of course it supported higher resolutions as well, but people weren’t fans of slideshows even back then.

If I can find it cheap (and it should be easy, since these cards sold quite a lot), I would like to get my hands on the AGP C5 revision, to see how it compares. I also wonder how it compares to a Nintendo 64… maybe I will try a few conversions.

Memory Goes Here, Performance Goes There

Another failure? At least an interesting one, this time.

A whole 8MB on a single stick. Only in 1998, folks.

Just a few days ago, I found a cheap 8MB SGRAM expansion for the Matrox G200 series. Yes, it’s a memory expansion for real this time. It was supposed to bring my G250 all the way up to 16MB. In itself, it’s already a useless experiment – the G400 32MB has more memory, is faster in everything, and has literally the same compatibility (including the same issues). While I was sure it wouldn’t make any difference at lower resolutions, I was thinking that perhaps you could see an effect once the local memory was entirely filled up by the framebuffer.

What I didn’t know was that the memory expansion would actually decrease the card’s default memory and core clocks.

You don’t have to worry about higher resolutions if your monitor is crap.

I’ve said in the past that my G250 seems a bit different from the specs originally mentioned on Wikipedia: the core runs at 105MHz and the memory at 140MHz. That’s pretty high for its time, but I tested the accuracy of Powerstrip’s claims by running a few games and noting that framerates scaled almost linearly against the G200A (which runs at 84/112MHz). It doesn’t even seem like an anomalous overclock, since scores stay consistent no matter how long I keep the tests running, and there are no artifacts in sight.

But after installing the memory daughterboard, I suddenly found the clocks going down to 90/120MHz. Attempting to overclock the card all the way back up to the original values produced slight artifacts, so I didn’t make any further attempts. And sure enough, testing the card showed a sizeable decrease from the original framerates. The Forsaken test is particularly telling: the framerate drop tracks the core clock almost exactly, which shows that, at least on a P3-450MHz, the game is completely bound by the graphics card.

The complete set. Now with automatic downclocking.

I made two mistakes: I thought there would be no difference at lower resolutions, but there was. And I thought there might be a difference at high resolutions, but that didn’t quite pan out. Even with something like 1024x768x32 in Incoming, which should fill the framebuffer almost entirely, the framerate delta stays effectively the same. 3DMark 99 does show a slight proportional increase when running at 1280×1024, but the difference is pretty small. I suppose the G200 series was really good at AGP texturing – it had DiME support, like the i740, whereas many AGP cards of the era stopped at DMA.
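For what it’s worth, my own rough math does back up the idea that 1024x768x32 nearly fills the 8MB of local memory (assuming double buffering; the Z-buffer depth is a guess on my part):

```python
# Back-of-the-envelope framebuffer sizes at 1024x768 with 32-bit color,
# double buffered. The Z-buffer depth is an assumption on my part.
def fb_mb(width, height, color_bytes, z_bytes):
    return width * height * (2 * color_bytes + z_bytes) / (1024 * 1024)

print(fb_mb(1024, 768, 4, 2))  # 7.5 MB with a 16-bit Z -> nearly fills 8 MB
print(fb_mb(1024, 768, 4, 4))  # 9.0 MB with a 32-bit Z -> would already spill over
```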

So what happened? Well, I have a theory. The expansion module was made for the old G200, which only ran at 84/112MHz (just like the later G200A die shrink). So they didn’t bother using memory chips that could run much faster than that, since they weren’t expecting people to clock the card any higher – after all, the G200 wasn’t even quite a gamer’s card to begin with. And since the G200 seems to always run with a 3:4 ratio between core and memory, if you add slower memory, the core clock has to come down too. Bummer, huh?
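The numbers do line up with that theory – a quick sanity check of my own, taking Powerstrip’s readings at face value:

```python
# Core/memory ratios for the clocks I observed with Powerstrip.
clocks = {
    "G200 / G200A (stock)":        (84, 112),
    "G250 (stock)":                (105, 140),
    "G250 + memory daughterboard": (90, 120),
}
for name, (core, mem) in clocks.items():
    print(f"{name}: {core}/{mem} MHz -> ratio {core / mem:.2f}")
# All three work out to 0.75, i.e. 3:4 - slower memory drags the core clock down with it.
```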

Thank god my paycheck came in just a few days ago.

So that was an interesting experiment, but it could have gone better. None of my experiments have gone particularly well lately – perhaps it’s a sign that my benchmarking days are over? Time will tell. At least the rest of my haul from yesterday wasn’t bad, as you can see. I expect to start Barrow Hill pretty soon, perhaps over the weekend (still playing Claw)… while the Zork book will have to wait until War and Peace is finished, which might take a little while.

Oh, and the SiS 6326 is a C3 revision with just 4MB of memory. Even worse than expected. I’ve never seen such horrible texture perspective issues. Another one for the shelf.