While still working on and off on my years-long project, I came across an S3 Trio 3D. A fairly common graphics card, though not as common as the Virge (which I have too, unluckily for me). Veterans will probably remember the Virge line as the decelerator of its time, and the Trio wasn’t much better.
General agreement is that the Virge was trying to accomplish too much with too little. PC players at the time were barely playing with perspective correction, and that was already something: PS1 and Saturn didn’t even support it. Still, the Trio is more modern. You’d think it could have done better. It didn’t.
Well, it didn’t, but what a strange failure it was. It’s really kind of impressive to see a very cheap 4MB 2D/3D card properly supporting not just bilinear filtering, but also mipmapping and trilinear. And it does so quite well too, unlike other budget cards of the era. As a matter of fact, if you don’t mind playing in 320×240, you’ll end up getting decent speed with many more features compared to software rendering. Well, if you can deal with the 15-bit color and the resulting banding.
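The banding follows straight from the arithmetic: 15-bit color leaves 5 bits per channel, so each channel has only 32 levels instead of 256. A quick illustrative sketch of that quantization (not anything the card actually runs, just the math):

```python
# Quantize an 8-bit channel value down to 5 bits (as a 15-bit RGB555
# framebuffer effectively does), then expand it back to 8 bits.
def to_rgb555_and_back(value_8bit: int) -> int:
    value_5bit = value_8bit >> 3                   # keep the top 5 bits: 0..31
    return (value_5bit << 3) | (value_5bit >> 2)   # expand back to 0..255

# A smooth 256-step gradient collapses to just 32 distinct steps,
# which is exactly the banding you see on screen.
gradient = [to_rgb555_and_back(v) for v in range(256)]
print(len(set(gradient)))  # → 32
```

With only 32 levels per channel, any slow gradient (sky, fog, lens flares) shows visible steps.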
Video memory is indeed especially tight. And with S3 assigning a mere 1.5MB of it to textures, you can forget about great results. But let’s face it, the card is way too slow to run anything above 320×240 anyway. I used 640×480 only for testing, but it was clearly never meant for this kind of hi-res.
With 3DMark99 set that high, weird things start to happen. Texturing Speed is especially strange. Don’t be fooled by the results up there: the card is only rendering a small part of the test plate. Other cards usually drop to a crawl, but still pass the test. It seems like the Trio doesn’t even try to swap textures in and out of memory, simply omitting the repeated texture instead.
The filtering test also shows something interesting. There isn’t nearly enough memory to properly render the graphics, but notice how mipmaps are working, and even trilinear is supported normally. What were they trying to do, supporting this stuff in a low-budget card? You can even see in the results screen that texture filtering comes with some of the heftiest performance drops I’ve seen this side of the ATI Rage II.
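That performance drop isn’t surprising once you consider what trilinear actually does: it takes a bilinear sample from each of two adjacent mipmap levels and blends them, so it needs twice the texture reads of plain bilinear, painful on a bandwidth-starved card. A minimal 1-D sketch of the idea (real hardware filters 2-D texels, of course; names here are my own):

```python
# 1-D illustration of trilinear filtering: linearly filter within two
# adjacent mipmap levels, then blend between them by the fractional LOD.
def linear_sample(texels, u):
    """Linear filter within one mip level (the 1-D analogue of bilinear)."""
    x = u * (len(texels) - 1)
    i = int(x)
    frac = x - i
    j = min(i + 1, len(texels) - 1)
    return texels[i] * (1 - frac) + texels[j] * frac

def trilinear_sample(mip0, mip1, u, lod_frac):
    """Blend samples from two mip levels: twice the reads of single-level filtering."""
    return (linear_sample(mip0, u) * (1 - lod_frac)
            + linear_sample(mip1, u) * lod_frac)

mip0 = [0.0, 0.5, 1.0, 0.5]   # full-resolution level
mip1 = [0.25, 0.75]           # half-resolution level
print(trilinear_sample(mip0, mip1, 0.5, 0.5))  # → 0.625
```

The extra fetches and the extra interpolation are exactly where the frame rate goes.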
Who knows what they were thinking. But it’s quite impressive nonetheless. Games run (or rather, don’t) like molasses, textures aren’t rendered, extra features are way too slow… but filtering itself seems to work fine. Mind, it was 1998, so perhaps not supporting those features would have made S3 a laughing stock in the market. But they pretty much were one already, so why bother?
My tests aren’t over, so I’m sure I’ll find other interesting results. I’ve seen a few already (how did the G100 support multitexturing when even the G200 didn’t?), and it can only get better. For example, I have that Matrox Mystique waiting…