Following up on my previous article on Tonic Trouble (sorry, but when I end up playing something this obscure after so many years, I want to talk about it in more than a couple of paragraphs), there’s something interesting I wanted to say about its technical side.
The game uses the same engine as Rayman 2, but it appears to handle some things better, which is a bit odd considering Tonic Trouble should actually be the older game. A look at the options screen will show what I mean.
The first thing to notice is that the game allows you to change the primary display driver. Compare that to Rayman 2, which forced you to choose your display driver at install time and then only installed that version: you had to reinstall the game to switch to Direct3D or Glide. Yuck. Speaking of Glide, Tonic Trouble doesn’t seem to support it. While my card is erratic, Rayman 2 works and uses the same engine, so I can only assume Glide support was added when the engine was modified for Rayman 2.
Tonic Trouble also allows you to select your resolution from a list, set the detail level, and even offers the choice between Double and Triple Buffering (disabling Vsync doesn’t seem to be possible from this panel). Rayman 2 only let you choose between Low, Medium and High resolution (really 640, 800 and 1024).
But the coolest option is texture memory management.
Tonic Trouble will let you choose how much memory you want to dedicate to the textures. You can choose whether to use all the available memory, limit it to a certain amount, or even if you want to use all memory excluding AGP. This could make for some really interesting experiments. The setup will also take into account the framebuffer size, so the amount of available memory will change on the fly depending on the selected resolution and buffering mode.
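To get a feel for the numbers involved, here is a rough back-of-the-envelope sketch of how framebuffer size eats into texture memory. Everything here is my own assumption (16-bit color and depth buffers, one Z buffer), not anything the game documents, and the real setup clearly reserves extra overhead on top of this naive estimate:

```python
def framebuffer_kb(width, height, color_buffers, bytes_per_pixel=2, z_buffer=True):
    """Memory taken by the framebuffer in KB, assuming 16-bit color and depth."""
    buffers = color_buffers + (1 if z_buffer else 0)
    return width * height * bytes_per_pixel * buffers // 1024

# 1024x768 with Triple Buffering: 3 color buffers plus a 16-bit Z buffer.
fb = framebuffer_kb(1024, 768, color_buffers=3)
print(fb)  # 6144 KB

# Naive texture memory left on an 8 MB card like the Savage 3D:
print(8 * 1024 - fb)  # 2048 KB
```

The setup on the Savage 3D actually reports 1096KB in that configuration, so the game seems to set aside roughly another megabyte beyond the raw buffers; the point is just that the on-the-fly numbers track resolution and buffering mode as you’d expect.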
On the Voodoo 3, of course we are never going to have any problems, but I still noticed something interesting. Now, I know this card doesn’t support AGP texturing. However, there still seemed to be some effect. With “all available memory” selected, on 1024×768 with Triple Buffering, I was left with 6492KB for texture memory (as you can see in the setup screen). Selecting “No AGP” would instead give me 8412KB. I wonder why the amount increased.
A second test was made with my old and buggy S3 Savage 3D (it’s so buggy, Windows won’t even start with the newest drivers. I had to use some old engineering drivers from 1998). I know this card supports AGP texturing. And indeed, despite its onboard 8MB, the setup shows about 17MB of available memory. Good. But if AGP memory is disabled, then with 1024×768 and Triple Buffer, texture memory becomes a mere 1096KB!
Therefore, we can be sure that the game is using AGP memory here (if selected). In fact, with all available memory in use, the game runs properly, and even with higher texture quality than on the Voodoo 3 (that is something I have in mind for the next blog post, though). Performance doesn’t seem to drop, which is a good sign because I expected texture thrashing. For the record, overall performance with the Savage 3D is not bad.
I would like to try an AGP card with 4MB onboard, which would effectively force the game to use AGP for all of the texturing at 1024×768 Triple Buffered. Unfortunately, the only such card I have is the Matrox G100, which doesn’t seem to support AGP texturing – the game doesn’t show any difference between those options.
Despite the game listing 4MB as a minimum requirement, all attempts to play it on cards with only 4MB of onboard memory and no AGP texturing ended pretty badly.
Other things are not related to graphics, but still amusing. Music volume is set to zero by default. I didn’t find the music grating or anything, so I don’t know what to make of that. Maybe the developers didn’t like their own music.