
My little SiS (6326) can’t be this fast…er

Remember when I tried the SiS 6326 C3 and it sucked? Well, I ultimately decided to test the C5 revision as well. I actually found one still in its box. The stuff people keep around these days… it has its driver disc (which only includes version 1.23, so pretty useless today) and even a small user manual.

Never even heard of this brand. But the side of the box implies that they might have also made an Intel i752 board. Now that would be a rare find.

So let’s look at the facts. The C3 revision had its fair share of issues, chiefly some horrible perspective correction and warping polygons. After completing my tests, I can say that these are mostly gone on the C5. I say mostly because it’s still not as good as some other cards… but considering the price, it’s not that bad.

One quick look at the manual shows the line “Supports 4MB SGRAM memory configurations”. Suspicious. My model is supposed to have 8MB, and it does. Except that, as it turns out, no game will run at anything above 800×600, choking on Out Of Memory errors. Digging around the net, I discovered that some people believe the SiS 6326 can’t actually address more than 4MB as framebuffer. That seems to be true: everything beyond that amount can only be used as texture memory. My tests, again, corroborate this theory. Having an 8MB card is still useful, though, because unlike my 4MB card, it doesn’t drop any textures. And let’s face it, the 6326 is slow as a snail, so you wouldn’t really want to run at 1024×768 even in the simplest games.
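If the 4MB framebuffer theory holds, the arithmetic lines up neatly with my results. Here’s a back-of-the-envelope check, assuming double buffering plus a 16-bit Z-buffer (my guess at the setup, not anything SiS documents):

```python
# Rough framebuffer budget in 16-bit color, assuming front buffer,
# back buffer and a 16-bit Z-buffer all live in the 4MB window.
def framebuffer_mb(width, height, bytes_per_pixel=2, surfaces=3):
    return width * height * bytes_per_pixel * surfaces / 2**20

for w, h in [(640, 480), (800, 600), (1024, 768)]:
    print(f"{w}x{h}x16: {framebuffer_mb(w, h):.2f} MB")

# 640x480:  1.76 MB -> fits
# 800x600:  2.75 MB -> fits
# 1024x768: 4.50 MB -> over the 4MB limit, hence the Out Of Memory errors
```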

I was also able to find a specific “High Angle” driver (yes, that’s the version name; it doesn’t have a number) that manages to support OpenGL in Quake 2 (about as badly as you’d expect) and somehow allows 3DMark 99 to run at its default settings. Speed is just as bad as always – perhaps slightly better than before, but not in any appreciable manner. It’s really all about the improved image quality. Still, OpenGL manages to make the speed even worse.

The Quake 2 demo1.dm2 hall of shame (i440BX2, P3-450MHz, 128MB PC100). I guess the Riva and i740 don’t really belong in there, but that performance drop at 1024×768 is quite ugly.

Strangely, the card seems to use the same refresh and resolution timings as the newer SiS 305, rather than those used by the previous 6326 model. Maybe some things were changed in between.

With its outstanding issues fixed, the 6326 is a little faster than a Rage IIc and even a bit more reliable. For 1998, however – one year after the original (buggy) 6326 was released – it was just too slow, no matter how small your budget may have been. It sure sold a lot, though. I wonder how many people bought one, only to discover that it was the true successor of the graphics decelerator? Good times indeed.


This is why we don’t do 16 bits anymore

A new year arrives, and with it, new resolutions! New goals in life! Nah, not really, it’s still the same as always. Somebody has to think about old graphics cards, and that somebody is obviously me. I even got several more cards in the last couple of months. I’m reaching a point where my collection is nearly complete, although some expensive models are obviously missing. I doubt I’ll ever find a PowerVR for cheap. Oh, why did I sell mine years ago? I’m so dumb.

One thing I’ve been noticing lately is how every chip maker had their own approach to dithering. Back in the day, fillrate was a precious resource and 32-bit rendering was essentially out of the question for all but the simplest of games. This generally changed with new techniques like ATI’s HyperZ, but until then, most people aiming for a mid-range or low-end graphics card were likely going to play games in 16-bit color. And that meant either color banding or some kind of dithering. Or maybe neither – Matrox was pretty good at this, but a few other cards apparently weren’t.
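To make the tradeoff concrete, here’s a minimal sketch of ordered dithering from 24-bit color down to RGB565 – purely my own illustration, since each vendor baked its own matrix and thresholds into fixed-function hardware:

```python
# Minimal 4x4 ordered (Bayer) dither from 8-bit channels to RGB565.
BAYER4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_channel(value, x, y, bits):
    # Add a position-dependent threshold before truncating, so the
    # quantization error turns into fine noise instead of banding.
    step = 256 >> bits                            # size of one output step
    threshold = BAYER4[y % 4][x % 4] * step // 16
    return min(255, value + threshold) >> (8 - bits)

def rgb888_to_rgb565(r, g, b, x, y):
    return (dither_channel(r, x, y, 5) << 11 |
            dither_channel(g, x, y, 6) << 5 |
            dither_channel(b, x, y, 5))
```

How coarse that pattern is, and how cleanly it’s applied, is exactly where the three cards below part ways.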

Incoming! The orange sky is really great for spotting bad dithering artifacts. Not here, though.

I fear WordPress compression will ruin the images, so I linked them on Imgur. Just click on them for the full PNG picture.

What you see above is a fine example of 32-bit rendering (taken on a SiS 315L, but it makes no real difference). See any color banding? Any dithering? Of course you don’t. There are 16 million colors up there, after all, alpha channel not included.

But alas, that many colors aren’t easy to sustain. I guess we’ll have to settle for 65 thousand colors instead.

So you wanted a budget experience. Have a budget experience!

I suppose there is only so much you can do without MILLIONS of colors. Even so, the Kyro 1 turns in a decent effort. Dithering is visible in the upper right corner, and a bit on the ground too, but this isn’t too shabby. If all cards looked like this, there would be little to complain about.

Now keep in mind the Kyro could easily handle true color in an old game like Incoming, but this is just science. Sorry Kyro, you get the same playing field as everybody else. And I hate that your drivers make my PC crash half the time, and that you can’t even run certain DOS games without fatal errors. So there.

Not all is right in Noise Land.

This second picture comes from a Radeon VE, quite a popular budget card at the time, which somehow had three TMUs even though most games never needed more than two textures at once. Fine engineering choice there. Anyway, that wasn’t its only oddity, because as you can see from the screenshot above, its 16-bit rendering wasn’t especially nice.

The noise is evident everywhere, especially on the metal plate and the portion of sky in the upper right. The distant mountains even show some clear color banding. You could almost believe this is running in 256 colors, but that isn’t the case. I’ve tried a couple of different driver sets, and tinkered with all the options I could find in the settings panel, but to no avail.

I should mention that the Radeon VE came out in 2001, so it’s not even that ancient. The Radeon SDR (which I’ve acquired recently) seems to have the same problem too, even though it’s technically a different chip. The Rage 128, on the other hand, was better. Weird. This kind of output would have been unacceptable in 2000 already, so what’s up with it? No old review seems to make any mention of this problem – were they all playing in 32-bit? I find that hard to believe, especially for a budget card.

Brace yourselves, for the worst is yet to come.

But as bad as the Radeon was, it still wasn’t as bad as the SiS 315 (yes, the same card I used to take the 32-bit picture – funny what a difference a few bits make). Alright, so very few people actually played games on this card, although the number of units available on eBay would easily make you think otherwise. Maybe a lot of them ended up in office computers… but it’s the same thing, really.

Anyway, if anyone actually played on this card – and I doubt they were able to use 32-bit color, given its relatively measly specs for its age (just two pixel pipelines, and even trilinear filtering incurs a substantial drop in speed) – then they would have been greeted by the mess you see above. Vertical lines everywhere – and color banding too? Look at the sky and weep.

Worth noting that, of the three cards on test today, the SiS 315L is the only one unable to really sustain true color rendering. It’s a cheaper variant of the 315, which already wasn’t a speed monster in itself: Incoming already falls to an average of 44fps at 800x600x16 (it would be more without Vsync, but I couldn’t find a way to disable it, and besides, it’s not like you actually want tearing while playing). But even the Radeon shouldn’t be excused, since several old games only ever supported 65k colors – and there, there’s no escaping the dithering patterns.

It wasn’t really something I ever tried to notice in my tests, but now that something as evident as these two cards has come along, I can’t unsee it anymore. It will be something to watch for in future cards as well.

My little SiS (6326) can’t be this fast

I remember when I first tried the SiS 6326. My first thought was something along the lines of “what the hell is this piece of crap?”. A few months later, I have to revise my opinion, if only somewhat.

Resident Evil doesn’t seem to like the old 6326 much. This is, however, a masterpiece of pop-art that should be cherished. Perhaps in a modern art museum.

One thing I should mention is that my card is merely a C3 revision. From what I can gather, the original C1 model was terrible, and from my tests, the C3 is not that much of an improvement. Games will occasionally show warped polygons, and in general, texture perspective correction is botched to the point of not working, at least in some of the titles in my benchmark suite. 2D elements have trouble displaying correctly. Texture resolution is often abysmal (the card’s paltry 4MB and very basic AGP support don’t help). Mipmapping apparently turns on and off at random. And the lack of OpenGL is just icing on the sewage cake – not that the 6326 could run Quake 2 or even GLQuake decently even if it were supported. Mind you, the newest drivers even claim an OpenGL ICD is included… but I haven’t managed to run anything on it.
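In case the term is unfamiliar: perspective correction means interpolating texture coordinates divided by depth and dividing back per pixel, instead of interpolating them linearly in screen space. A rough sketch of the difference (my own illustration, nothing SiS-specific):

```python
# Texture coordinate u across a span between two endpoints at
# depths w0 and w1, with t running from 0 to 1 in screen space.
def affine_u(u0, u1, t):
    # What botched/missing perspective correction effectively does:
    # linear in screen space, so textures warp on angled polygons.
    return u0 + (u1 - u0) * t

def perspective_u(u0, w0, u1, w1, t):
    # Interpolate u/w and 1/w linearly, then divide per pixel.
    inv_w    = (1 / w0) + ((1 / w1) - (1 / w0)) * t
    u_over_w = (u0 / w0) + ((u1 / w1) - (u0 / w0)) * t
    return u_over_w / inv_w
```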

Crank that Blood 2 all the way up to 640×480. The SiS can run it – in low detail, of course. If you want proper lighting, then you’ve got to go down to 512×384… and possibly 400×300.

So it doesn’t sound great today, but as I tried using it a bit more, and considering the circumstances, there’s something to like too. Image quality is actually okay for a cheap 1997 card – there was far worse around, usually for a higher price. A common ViRGE might look generally better, but it also runs slower, and was more expensive anyway. And it should be mentioned that the later C5 revision solved most of the 6326’s quality issues, making it far more attractive. If it had come out earlier, it would have been even better… but you can’t have everything.

Rogue Squadron doesn’t support anything lower than 640×480, so you get a choppy framerate, no two ways about it. But it’s playable, and the Nintendo 64 version wasn’t very smooth either. For a 1998 game on a 1997 budget card, I’d call it good enough.

A basic Voodoo 1 was another world, of course. But it was also $300 – bragging to your console friends was a somewhat expensive proposition back then. The SiS 6326, though, was just $50… and for that, you got a decent card that could accelerate games in 16-bit color fairly smoothly up to 400×300 or even 512×384. Of course it supported higher resolutions as well, but people weren’t fans of slideshows even back then.

If I can find one cheap (and it should be easy, since these cards sold quite a lot), I would like to get my hands on the AGP C5 revision, to see how it compares. I also wonder how it stacks up against a Nintendo 64… maybe I will try a few conversions.

Memory Goes Here, Performance Goes There

Another failure? At least an interesting one, this time.

A whole 8MB on a single stick. Only in 1998, folks.

Just a few days ago, I found a cheap 8MB SGRAM expansion for the Matrox G200 series. Yes, it’s a memory expansion, for real this time. It was supposed to bring my G250 all the way up to 16MB. In itself, it’s already a useless experiment – the G400 32MB has more memory, is faster in everything, and has literally the same compatibility (including the same issues). While I was sure it wouldn’t make any difference at lower resolutions, I was thinking that perhaps you could see an effect once local memory was entirely filled up by the framebuffer.

What I didn’t know was that the memory expansion would actually decrease the card’s default memory and core clocks.

You don’t have to worry about higher resolutions if your monitor is crap.

I said in the past that my G250 seems a bit different from the specs originally mentioned on Wikipedia: the core runs at 105MHz, and the memory at 140MHz. That’s pretty high for its time, but I tested the veracity of PowerStrip’s claims by running a few games and noticing that framerates scaled almost linearly against the G200A (which runs at 84/112MHz). It doesn’t even seem like an anomalous overclock, since scores stay up no matter how long I keep the tests running, and there are no artifacts in sight.

But after installing the memory daughterboard, I suddenly found the clocks going down to 90/120MHz. Attempting to overclock the card back up to the original values produced slight artifacts, so I didn’t make any further attempts. And sure enough, testing the card showed a sizeable decrease from the original framerates. The Forsaken test is particularly telling: the framerate tracks the core clock almost exactly, and shows that, at least on a P3-450MHz, the game is completely bound by the graphics card.
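The ratios bear this out, assuming the card really is the bottleneck. A quick check against PowerStrip’s reported core clocks (my arithmetic, with the stock G250 as the baseline):

```python
# Expected relative framerate in a fully GPU-bound game,
# taking the stock G250 core clock as the baseline.
clocks = {"G200A": 84, "G250 stock": 105, "G250 + expansion": 90}
stock = clocks["G250 stock"]
for name, mhz in clocks.items():
    print(f"{name:18} {mhz:3} MHz -> {mhz / stock:.0%} of stock speed")

# G200A: 80%, G250 stock: 100%, with the daughterboard: ~86% --
# about what the Forsaken results show.
```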

The complete set. Now with automatic downclocking.

I made two mistakes: I thought there would be no difference at lower resolutions, but there was. And I thought there might be a difference at high resolutions, but there barely was one. Even with something like 1024x768x32 in Incoming, which should fill the framebuffer almost entirely, the framerate delta stays effectively the same. 3DMark 99 does show a slight proportional increase when running at 1280×1024, but the difference is pretty small. I suppose the G200 series was really good at AGP texturing: it had DiME support, like the i740, whereas many AGP cards of the era stopped at DMA.
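For reference, here’s why 1024x768x32 nearly exhausts the stock 8MB card and pushes textures out over AGP – assuming front and back color buffers plus a 16-bit Z-buffer, which is my guess for Incoming:

```python
# Local memory eaten by the framebuffer at 1024x768x32.
w, h = 1024, 768
color_buffers = w * h * 4 * 2 / 2**20   # two 32-bit color buffers: 6.0 MB
z_buffer      = w * h * 2 / 2**20       # 16-bit Z-buffer:          1.5 MB
print(f"{color_buffers + z_buffer:.1f} MB of the 8 MB on board")   # 7.5 MB
```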

So what happened? Well, I have a theory. The expansion module was made for the old G200, which only ran at 84/112MHz (just like the later G200A die shrink). So they didn’t bother sourcing memory chips that could run much faster than that, since they weren’t expecting people to clock the card any higher – after all, the G200 wasn’t even quite a gamer’s card to begin with. And since the G200 seems to always keep a 3:4 ratio between core and memory clocks (84/112, 90/120 and 105/140 are all exactly 3:4), adding slower memory drags the core down too. Bummer, huh?

Thank god my paycheck came in just a few days ago.

So that was an interesting experiment, but it could have gone better. None of my experiments have gone so well lately – perhaps it’s a sign that my benchmarking days are over? Time will tell. At least the rest of my haul from yesterday wasn’t bad, as you can see. I expect to start Barrow Hill pretty soon, perhaps over the weekend (still playing Claw)… while the Zork book will have to wait until War and Peace is finished, which might take a little while.

Oh, and the SiS 6326 is a C3 revision with just 4MB of memory. Even worse than expected. I’ve never seen such horrible texture perspective issues. Another one for the shelf.