Another failure? At least an interesting one, this time.

Just a few days ago, I found a cheap 8MB SGRAM expansion for the Matrox G200 series. Yes, it’s a memory expansion for real this time. It was supposed to bring my G250 all the way up to 16MB. In itself, it’s already a useless experiment – the G400 32MB has more memory, is faster in everything, and has literally the same compatibility (including the same issues). While I was sure it wouldn’t make any difference at lower resolutions, I was thinking that perhaps you could see an effect once the local memory was entirely filled up by the framebuffer.
What I didn’t know was that the memory expansion would actually decrease the default memory and core clocks on the card.

I said in the past that my G250 seems a bit different from the specs originally listed on Wikipedia: the core runs at 105MHz, and the memory at 140MHz. That’s pretty high for its time, but I tested the veracity of Powerstrip’s claims by running a few games and noticing that framerates scaled almost linearly against the G200A (which runs at 84/112MHz). It doesn’t even seem like an anomalous overclock, since scores stay up no matter how long I keep the tests running, and there are no artifacts in sight.
But after installing the memory daughterboard, the clocks suddenly dropped to 90/120MHz. Attempting to overclock the card back up to the original values produced slight artifacts, so I didn’t push any further. And sure enough, testing the card showed a sizeable drop from the original framerates. The Forsaken test is particularly telling: the framerate tracks the core clock almost exactly, showing that, at least on a P3-450MHz, the game is completely bound by the graphics card.
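If you want the back-of-the-envelope version of that scaling argument, here it is as a minimal Python sketch, using only the clock values Powerstrip reports (no framerates hardcoded, since those depend on the game and the rest of the system):

```python
# Clock figures as reported by Powerstrip (MHz, core values only).
g200a_core = 84           # G200A runs at 84/112
g250_core = 105           # my G250, stock: 105/140
g250_expanded_core = 90   # my G250 with the 8MB daughterboard: 90/120

# In a fully GPU-bound game, framerate should scale roughly with core clock.
print(f"stock G250 vs G200A:    {g250_core / g200a_core:.2f}x")          # ~1.25x
print(f"expanded vs stock G250: {g250_expanded_core / g250_core:.2f}x")  # ~0.86x
```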

I made two mistaken assumptions: that there would be no difference at lower resolutions (there was), and that there might be one at high resolutions (there wasn’t, really). Even with something like 1024x768x32 in Incoming, where the framebuffer should fill local memory almost entirely, the framerate delta is still effectively the same. 3DMark 99 does show a slight proportional gain when running at 1280×1024, but the difference is pretty small. I suppose the G200 series was really good at AGP texturing: it had DiME support, like the i740, whereas many AGP cards of the era stopped at DMA.
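For reference, here’s the buffer arithmetic behind that “fills local memory” idea, as a sketch that assumes double buffering plus a 32-bit Z-buffer (the Matrox drivers may allocate things differently):

```python
# Rough VRAM budget for the on-screen buffers: front + back color
# buffers plus a Z-buffer, all at the same bit depth. An assumption,
# not necessarily what the Matrox drivers actually allocate.
def framebuffer_mb(width, height, bpp, buffers=3):
    return width * height * (bpp // 8) * buffers / 2**20

print(framebuffer_mb(1024, 768, 32))   # 9.0 MB: overflows 8MB, fits in 16MB
print(framebuffer_mb(1280, 1024, 32))  # 15.0 MB: only the 16MB card fits it
```

If 3DMark 99 was running anywhere near those depths, that would also fit with 1280×1024 being the one place the extra memory registers at all.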
So what happened? Well, I have a theory. The expansion module was made for the old G200, which only ran at 84/112MHz (just like the later G200A die shrink). So they didn’t bother sourcing memory chips that could run much faster than that, since they weren’t expecting anyone to clock the card higher – after all, the G200 wasn’t even quite a gamer’s card to begin with. And since the G200 seems to always run with a 3:4 ratio between core and memory, adding slower memory drags the core down with it. Bummer, huh?
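The ratio itself is easy to check against every clock pair mentioned so far:

```python
# Every G200-series clock pair seen here keeps the core at 3/4 of memory.
for core, mem in [(84, 112), (90, 120), (105, 140)]:
    assert core * 4 == mem * 3  # exact 3:4 ratio

# So 120MHz memory chips pin the core at 120 * 3/4 = 90MHz.
print(120 * 3 // 4)  # 90
```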

So that was an interesting experiment, but it could have gone better. Lately, none of my experiments have gone so well; perhaps it’s a sign that my benchmarking days are over? Time will tell. At least the rest of my haul from yesterday wasn’t bad, as you can see. I expect to start Barrow Hill pretty soon, perhaps over the weekend (still playing Claw)… while the Zork book will have to wait until War and Peace is finished, which might take a little while.
Oh, and the SiS 6326 is a C3 revision with just 4MB of memory. Even worse than expected. I’ve never seen such horrible perspective correction issues in texturing. Another one for the shelf.