Well, I guess it had to happen eventually. I’ve retired my old PC.
There are reasons, of course. The main excuse was that it simply wasn’t working anymore – clicking sounds all over and it just wouldn’t boot to Windows. But that’s hardly a good reason, because at a glance, it just needed a new PSU. When I removed the CD-ROM molex cable, it booted just fine, so it’s likely that the PSU simply wasn’t giving enough power to keep both drives running at the same time. And given the components, even a cheap one would likely do just fine.
I guess, then, that it was simply a matter of shifting priorities. When I first built this old PC, around eight years ago, the landscape was very different from today. While there was already software like ScummVM and DOSBox, and GOG had already taken a foothold in the marketplace, the fact remained that running your own old games was still pretty hard, and usually didn’t work that well.
Things have changed for the better. There are more options to buy old games. Source ports exist for the major titles, and even some minor ones. Some highlights: ScummVM now supports The Neverhood. System Shock has been transferred very well to modern Windows. And Blood Omen now has an OpenGL port.
The other half of the equation was testing old graphics cards. But even that has more or less come to an end, now that I’ve bought all the cards I could afford. Well, technically I’m still missing the 3DImage 9750… but I consider that dodging a bullet.
What will the future bring then? The usual, I suppose. As I said, running old games is easier than ever before. You don’t need a real old PC for a good experience anymore. So you should do that. Play some old games!
Murder on the Eurasia Express left too much of a bitter taste in my mouth, and with the lockdown still going strong, it was only a matter of time before I tried my hand at System Sacom’s more famous (and localized) game, Lunacy. In this specific case, the interval was roughly two hours. Why do tomorrow what you can do today?
Of course, this being about a story-heavy game, expect some spoilers!
Meet Fred the Traveler. I’m using a capital T there, because he sure loves introducing himself as “Fred the Traveler”, almost as if that were his family name. Of course that’s impossible, because Fred is not even necessarily his real name: see, he’s one of those amnesiac protagonists you might have heard about. They are usually stuck in JRPGs, but once in a while someone manages to escape into another genre.
Anyway, he finds himself inexplicably in Misty Town (which is not Silent Hill), where he’s strong-armed by the local ruler, an evil guy with a penchant for evil laughs, into finding the mysterious City of Moons! And so your adventure begins. You’ll be going around, asking people to “please tell me everything you know about the City of Moons!”, which Fred does make sound like a threat. Not that you’ll be asking a lot, since there are maybe seven people in the whole town – perhaps it’s locked down because of the virus?
Worry not, because the second half of the game does bring you to the amazing City of Moons! Which is also empty, and actually quite small. There are ghosts and stuff. And plot twists. Also people die in hilarious manners, some of which can be brought about by your own hand. Unexpectedly for the type of game and the era when it came out, you don’t get a good or bad ending depending on how many people you saved, so go nuts if you wish – though you will get some extra scenes during the credits if Fred saves those people instead. The game does borrow from the LucasArts style by offering no dead ends and no game overs, and there’s no time limit to contend with either, which makes it more approachable for the casual player.
Gameplay is exactly the same as Mansion of Hidden Souls: look around, zoom on relevant items to get them automatically, use items from your inventory when you need to. That’s all, very basic. But I’m afraid it inherits similar flaws as well. Many items can’t be interacted with until you actually need them, which is not so uncommon in adventure games, but this game does sometimes go one step further: for example, at one point a character gives you an oil can. Later, you find a rusted door. You might think of using the oil, but it doesn’t work. Instead, you have to look for that character again, to trigger a second scene where he will gloat about juuuuust how good that oil is for rusted doors. Only once you have heard his boasting will the oil work. Maybe Fred didn’t know how to apply it properly?
Some gripes aside, impressions are certainly much more positive than for the last game I played. The videos look much more detailed and very smooth next to the Mega CD, and actually run a bit better than the Eurasia game. It again uses the same trick where the stills are merely video frames, and not higher quality as in Riven, so there is no jarring quality drop when you go from standing still to moving around. And the town looks quite lovely. It is a maze, though – it will take a while before you remember where everything is.
Once again there are no subtitles, and one annoying problem is that the music is sometimes a bit too loud next to the voices, occasionally drowning them out. Luckily there’s an easily accessible in-game video player where you can watch any cutscene again. Considering this game is 100% cutscenes, it means you’ll never miss anything.
It’s an FMV game from the 90’s, so expect cheesy CG animations and mostly unacceptable voices. And lip syncing? What lip syncing? I won’t lie, it is pretty jarring at times. The script doesn’t help either, you’ll often get gems like “maybe this perfume will protect me from this KILLER CARPET!”. No joke.
I was surprised to discover during the game that there are connections to Mansion of Hidden Souls. The theme of butterflies as human souls is once again present, although it has a more tangential role this time around. As I said, the gameplay is also exactly the same. Overall, it feels like it was made by the same team – unlike, say, the actual Mansion of Hidden Souls sequel on the same platform. Strange how that works.
I said that it looks like it could have been made by the same team, but a cursory look at the credits shows that it is not so. In fact, the only returning lead is the composer. I wasn’t expecting this. Also a shame that System Sacom doesn’t seem to be mentioned anywhere during the game. All credits reference single individuals, Sega or Atlus (who published it in the USA).
But know what’s even more unexpected? Apparently the director of Lunacy, Hiroyuki Maruhama, also directed Deep Fear. I never even knew that System Sacom co-developed that game. And after playing this one, I now feel quite interested in Deep Fear too. Since everybody is going to be busy with Resident Evil 3, I might as well get my own dose of survival horror too.
See? I already have my next goal. Getting through the lockdown is going to be easier than ever.
You know, normally at this time I’d be working. But not right now, for obvious reasons (that being the coronavirus: I’m just saying this in case someone, perhaps even myself, finds this page in ten years’ time and wonders what the reason could possibly be). If nothing else, I have had a lot of time to play games. And between grinding for pinnacle weapons in Destiny 2, or getting through the Game Pass backlog, there’s never been a dull moment.
Okay, I’m kidding. Dull moments abound. That’s why I’m still making an effort to check out some real obscure games. After my short Nekozamurai escapade from some months ago, it’s now time for another visual novel. And what better choice than another of System Sacom’s games, specifically their final one? Murder on the Eurasia Express!
Alas, it’s not a visual novel. I need to check my sources better next time. It is, instead, a full motion video game, not unlike Mansion of Hidden Souls. But this one is shot entirely in live action! The video quality is higher than the super choppy Mega CD game! And there are idols!… probably. Looking at the girls, I’d guess so. At least your sidekick, Tsubasa, is definitely played by an idol – it’s pretty obvious. I’d guess the others are too.
I won’t bore you with the details (mostly because I also couldn’t grasp much of it), but anyway, you are on the titular Eurasia Express train with a bunch of schoolgirls, and one of the teachers is murdered. For whatever reason, you only have two hours to solve this mystery, possibly because the train is going to get to the next station? It’s honestly hard to tell without at least subtitles.
Yep, since it’s not a visual novel, everything is voiced. And this was before people started noticing that subtitles in games are actually pretty useful. Unless you have a great grasp of Japanese, you won’t be playing this one for long. I know I didn’t.
It doesn’t help that the game also seems pretty… boring. You go around asking questions to the various characters, but the questions are always the same: what do you know about this incident? Do you know anything about this threatening letter that was sent to my sidekick? Did you notice anything weird? Those are literally the three questions you get to ask everyone. You will occasionally receive items, which range from photos to memos to really long booklets that describe the train and European cities (now I’m also left wondering if the Eurasia Express actually exists and this game was an advertising vehicle for it too).
A quick look at Wikipedia doesn’t say much. It does confirm that these girls are idols, and looking at the cast list, it seems that most of them also appeared in early J-Horror movies such as Ring and Ju-On. Was this a trend? Maybe horror movies went with idols in order to attract a bigger audience? I’ll admit, the acting in this game was F-grade material. I guess everyone has to start somewhere though.
In a move that shouldn’t surprise me, you can focus on three different views for each character: the face, the chest and the legs. I know it’s a detective game, and you could argue that the guys get the same treatment, but it still seems questionable at best. Ah, idols.
Anyway, System Sacom’s last game is disappointing. Maybe there’s actually an amazing storyline in there. I doubt it though. And even if it had subtitles, I doubt I could play this game for more than ninety minutes, I’m afraid. That amounted to 30 in-game minutes. I guess time pauses often. So that’s all I can say, and I guess we’ll never know who the murderer was… although the pig-tailed girl looked suspicious to me. The honor student always has something to hide. Don’t argue with my logic.
I’m not quite ready to give up on System Sacom yet, though. Who knows, perhaps Lunacy will offer something more substantial? At least it won’t be in Japanese.
I recently received a Rage 128 Pro. This is supposedly not a bad card for its time – it should sit somewhere around the TNT2 and G400. I already had a Rage 128 Pro Ultra, but that was a 64-bit bus model, so I was eager to see what kind of difference the extra bandwidth would make. Everest says they are both running at 120c/120m, so that means the memory bus is 1920MB/s against 960MB/s.
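The back-of-the-envelope math behind those figures is simple enough – peak bandwidth is just the memory clock times the bus width in bytes. A quick sketch, using the 120MHz clock that Everest reported for both cards:

```python
def bandwidth_mb_s(mem_clock_mhz, bus_width_bits):
    # MB/s = memory clock (MHz) x bus width in bytes (SDR: one transfer per clock)
    return mem_clock_mhz * bus_width_bits / 8

# Rage 128 Pro (128-bit bus) vs Rage 128 Pro Ultra (64-bit bus)
print(bandwidth_mb_s(120, 128))  # 1920.0 MB/s
print(bandwidth_mb_s(120, 64))   # 960.0 MB/s
```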
Of course, I’d need to get it running first. I try to install my usual 7192, which worked just fine on the Ultra, but the card is not recognized. Apparently, Ati decided that OEM vendors modifying the ID was not a significant issue, so you’ll encounter this problem a lot. This is what happened in the next hour:
Drivers 7192 (latest Ati): card is not recognized
Drivers 610: installs, but card doesn’t work
Drivers 7087: installs, Windows hangs, must remove from safe mode
At this point I was kinda annoyed. Luckily, I seem to hit the jackpot with the next attempt, on drivers 654 beta. About time, too.
Initial results were somewhat disappointing. 3DMark shows no real difference outside of texture rendering speed, which is almost doubled. I should mention that the drivers of course are different, which might have an impact, but I’m not convinced that’s the only reason. Could it be that the Rage 128 Pro doesn’t really suffer from low bandwidth outside of very high resolutions? I’ve heard of its performance feats in 32-bit mode. Game tests will hopefully clarify things. Or at least, they should. Unfortunately, just like on most Ati cards of the era, disabling Vsync in Direct3D is not an option.
(well, technically it is an option, it just doesn’t do anything)
Never mind. OpenGL is still our friend, and even in D3D I can still compare the card with other similarly vsynced models. A good point of comparison would be provided by cards that used either two pipelines or two TMUs. My favorite kind of architecture, really. As usual, the tests are running on a P3 450MHz with 128MB of PC-100 SDR. Here we go:
Poor Trident… uh, I mean… as we can see, the Voodoo 3 is clearly ahead of the pack, and it should be, given that Quake 2 takes perfect advantage of its dual TMU design, plus it has a far higher clock speed than any other card (166MHz, while the others range between 90MHz and 125MHz). The Kyro 1 and Oxygen GVX1 have some bottlenecking issues somewhere, either that or bad drivers. But from this chart, it seems that 128-bit data bus cards have little trouble powering through 800×600, while 64-bit models already start choking. The only exception is the Rage 128 Pro Ultra, which hangs on – could it truly be? Let’s see another game.
Newer is slower. Trident redeems itself! A bit, anyway. We can see here that no card can go past 48fps. I’d be hard pressed to call it a real CPU bottleneck, given that more powerful cards seem able to reach 60fps (and T&L models even 70fps). But it’s still something to consider. Things are more uneven here, but overall, it seems that 64-bit cards still struggle more. Not surprising. Even the Pro Ultra falters this time. Its clock speeds are a bit higher than the Vanta’s, so obtaining similar results is not great. But hey, wanna talk about the Oxygen GVX1 and its 128-bit memory? I thought not.
The cards for Incoming have all been hand-picked, to only show you the ones limited by vsync. Be grateful. The G550 is actually an exception to my previous rule, since it’s a 2×2 design, but all my tests and other tests on the internet show that it’s usually as fast as a G400, if not even a bit slower. And since, unlike the G400, it seems to be plagued by vsync, it is a better pick. But unfortunately the Rage 128 Pro doesn’t have a good showing here, even next to its older Rage 128 GL sibling. I blame drivers. The situation is better in Forsaken, but I forgot to fix a few things so you won’t get a chart. I will tell you that at 1024×768, the Pro reaches 47.2fps, better than the Ultra (38.8fps) and the GL (37.2fps). Be grateful.
At a glance, it seems to me that the Rage 128 Pro architecture is indeed slightly less reliant on memory bandwidth than its competitors. However, given that it scores lower across the board, it might not be that important – why pick a Rage 128 Pro Ultra, when you could have a TNT2 M64? Let’s face it, the drivers were going to be a million times easier to install too. The dithering was also better (the Rage 128 Pro uses a very noisy diamond pattern, who knows why anybody thought it was a good choice), which is kinda important because I can’t imagine people playing in 32-bit mode with budget cards.
I also got a SIS 315 Pro, which I thought was an actual SiS 315. Turns out it’s just a 315E. As usual, my little SiSter keeps disappointing me. Some things I discovered though: the card actually goes up to AGP 4x (even though my motherboard can’t support it), while the 315L only goes up to 2x. And… that’s just about it? Yeah, not much. I’ll need to check out those lot auctions a little better next time.
A couple years ago, I learned an important lesson: you aren’t going to have an easy time in auctions. Rare cards, if cheap, are snagged almost immediately. If they are expensive, well, you probably won’t get close to them in the first place. What’s a collector to do? Spending a lot of money is an option, of course, but consider other routes. For example, there are many people who sell graphics cards by the lot. Most of these are unspecified in auctions, and you can only rely on photos to tell what you are actually looking at. With some luck, a rare find can be made, one that others won’t know about. This is how I got my super cheap Geforce 256 SDR.
And now, I got a very cheap Trident Blade T64. This might be an even rarer find. I had never even seen any of the Blade XP series on sale, and getting one for cheap seemed beyond hope. That is already impressive enough. But what’s even more impressive is performance, although not in a good way.
We should probably look at some specs first. Data is scarce, as befits a card nobody had, but I was certainly surprised to see that even Everest gives up entirely – no specs are shown at all. Powerstrip on the other hand makes a token effort, and tells me that the memory is clocked at 166MHz and the data bus is 64 bits wide. While that software is notoriously finicky, this is not too difficult to believe, since Trident mentioned these same specs on their old archived website.
According to Trident’s own words, this card is a dual pipeline design. I’ll just ignore that 1328MT/s texel rate – at 166MHz, it would imply a monstrous 2×4 design, and there’s no way in hell that’s true. Now, Wikipedia says it’s a 2×2 design. But this seems wrong. A core clock of 166MHz with a peak fillrate of 332MP/s, and it would be 2×2? Mmh. Let’s look elsewhere. Tom’s Hardware has an old review of the Blade XP, and according to them, the T64 is the same architecture but with a 64-bit data bus and lower clocks. That’s possible, and since it was written way back in 2001, it’s more reliable than recent data. Trident’s old website also says that the T64 worked at “up to 166mhz”, meaning it could be lower. 143MHz is possible then. Given those numbers, it sounds like a 2×1 design. This is more believable.
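To make the arithmetic behind these guesses explicit: peak pixel rate is core clock times pipelines, and peak texel rate is that times TMUs per pipeline. A small sketch of my own reasoning (these configurations are the candidates discussed above, not confirmed specs):

```python
def fillrate(core_mhz, pipelines, tmus_per_pipe):
    # returns (peak MP/s pixel rate, peak MT/s texel rate)
    return core_mhz * pipelines, core_mhz * pipelines * tmus_per_pipe

# Trident's claimed 1328MT/s only works out as a 2x4 design at 166MHz:
print(fillrate(166, 2, 4))  # (332, 1328) - matches the sheet, but absurd
# Wikipedia's 2x2 at 166MHz doesn't match the 332MP/s + 1328MT/s pairing:
print(fillrate(166, 2, 2))  # (332, 664)
# A 2x1 at 143MHz, as the Tom's Hardware description implies:
print(fillrate(143, 2, 1))  # (286, 286)
```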
However, Trident themselves say that it can process “up to 4 textures per clock”, and specifically calls it a “dual pixel, quad texture rendering engine”. This is harder to explain, but I still won’t believe in a 2×2 design, because it makes no sense here. Perhaps it’s using a trick similar to the Savage 4, which merged textures to apply them in a single pass? This is possible too. Admittedly the easiest answer would be 2×2, but there’s so much that seems wrong with this idea, I can’t just accept it yet. You’ll see.
Now, I’m the first to admit that 3DMark isn’t the most reliable benchmark around. But those fillrate numbers don’t look like 2×2 to me. Even worse, the results line up with the Blade 3D in a bad way. The B3D was 100mhz, while if we assume that the T64 is 143mhz, that means a 43% increase in single pipeline performance. Add in the faster memories, and a 50% increase sounds likely. But if so, I’m just not seeing the impact of the second pipeline here.
A similar story is repeated with 3DMark 2000. Again, same results between single texturing and multitexturing. In order to try and get some definitive answer, I even tried 3DMark 2001. At default settings, I get around 78MT/s single, 83MT/s multi. At my wit’s end, I tried setting it to the minimum of 640x480x16. Finally a difference: 110MT/s single, 160MT/s multi. Managing to break the 143MT/s barrier is proof that the card can’t be a mere 1×1, at least. But we’re still far from the fabled 2×2 design, if you ask me. Even if a difference between single and multi texturing would imply there’s more than one TMU per pipeline, these numbers and Trident’s own sheets are simply too low to support that claim.
By the way, all the 3DMarks say that the card can apply up to 3 textures per single pass. Strange, but they said the same thing about the G450, which is definitely a 2×1 card, so I’m not gonna put too much stock into it.
Quake 2 says that there’s some kind of multitexturing extension, but doesn’t actually find it. And unlike the TNT2, I couldn’t find any registry key to enable it. Theory number two: maybe the card was supposed to be a 2×2 design, but the second texture unit in each pipeline was disabled for some reason? Or maybe it can only be enabled by a DX8.0 program, such as 3DMark 2001? After all, the sheet says that the card is a 7.0 piece (no T&L by the way), but has software interfacing with 8.0, whatever that means. I think I’m making stuff up here. It’s just annoying.
Performance could be a good way to debunk all these myths, but unfortunately the card is crippled by vsync, much like the Blade 3D before it. There’s no way to disable it, not even with Powerstrip. Besides, my Pentium 3 450MHz is probably a bit too weak for a 2000 card, even one such as this. Here are a few numbers though:
Vsync makes things hard to discern, but overall I see too many similarities between the Blade 3D and the T64. Any time the T64 is higher, it’s always by less than 50%, which is more easily explained by the higher clocks (the only outlier is MDK2, but the Blade 3D had some big CPU frametime spikes in that game, and only 800×600 is reliable as a result). At this point I’m starting to wonder if perhaps the second pipeline was gimped in the T64 as well, making it effectively a 1×1 design. The lack of a heatsink certainly is strange – maybe a gimped card had no need of it? Of course, there’s also the possibility of CPU bottlenecks. But that looks unlikely. Or horrible drivers. That sounds less unlikely.
There are other issues that are worth mentioning, even if not related to performance. The card has similar color trouble when playing certain side-scrolling DOS games as the Voodoo 3 did. So in the end, it’s just not a great card. But an interesting one for sure. It’s always difficult to explain why some data doesn’t line up with official specs. It’s usually bad drivers, but who knows, maybe sometimes it’s just bad specs? I don’t think I’ll ever know.
Possibly even stranger, if you take a look at Jaton’s archived webpage for the Trident Blade T16, it has the exact same data as the T64. It even calls it a 9970! But the T16 is actually a 9960 – there are pictures of the card on VGAMuseum. I smell a copy-paste job on Jaton’s part. I can only wonder how the T16 was gimped, since the T64 was already gimped enough. Maybe they removed the effectively non-functional second texture unit in each pipeline. What a loss.
Theory number three: maybe Trident made such a mess with their models that everyone called their chips whatever they liked. Maybe the T16 is actually a T64, just with lower clocks and less memory. Maybe the entire Blade line is just an overclocked Blade 3D.
Maybe some things are best left undiscovered, for sanity’s sake.
Edit 02/01/20: I’ve since discovered (you learn something new every day) that GL_SGIS_multitexture was an obsolete extension by 1999, hence Quake 2 looking for that over GL_ARB_multitexture. Of course, every other card supports it still. But Trident not being aimed at gamers, I guess they didn’t care for legacy extensions. Their loss. Results are still bad either way.
Don’t you just hate bugs? Even if you file a report, who knows how long they will take to fix. After all, there’s no telling how high in the priority list they are. In some situations, it might just be better to… turn a bug into a feature. How? There are ways.
Let’s go back many months. I purchased Assassin’s Creed Origins on sale way back in 2018. But as it turned out, my measly Ryzen 1500X was simply not good enough to run the game at 60fps. Lowering the settings helped, but a stable 60fps was still a pipe dream. Well, never mind that, let’s just cap the framerate at 30fps and pretend that we’re playing on consoles. Right?
Maybe not. As it turns out, the in-game 30fps framerate limiter actually operates at 31fps. I’m not sure if this is a widespread problem – I was unable to find anyone else with the same issue. What it means though, is that the game stutters constantly and without fail.
I tried a few possible ways to fix it. What if I disabled vsync and kept the 31fps limit? Well, that would just create a nice tearing line throughout the game. Someone suggested limiting the framerate with RTSS. This almost worked, but the image overall still looked more unstable than it had any right to. I even attempted to use Frame Rate Target Control in the AMD software, but as expected, that is just a target and therefore doesn’t work too well. Eventually, I just played with RTSS, but then dropped the game after a while because it was getting on my nerves.
Now let’s fast forward several months. While playing around with System Shock, I discover an interesting bug: when I’m trying to play at 1280×1024 in native resolution, the framerate drops hard. Some time later, while discussing this issue, a pattern emerges – it turns out that my RX480 engages Vsync at 30fps whenever it needs to render any letterboxing in a Direct3D game. The most common case, of course, is when trying to play a game with GPU Scaling set to Center, for example because I want to avoid any non-integer scaling. Reading around the net, this bug seems to happen on Polaris cards on TVs with HDMI Scaling turned on. But this is a monitor, and there is no such option. Strange.
The most annoying result of this bug, for me, is that it makes the new Integer Scaling option – introduced on Polaris cards with the new Adrenaline software – completely useless, since many old games ran at 640×480, and scaling that would give me 1280×960, triggering the bug. Pretty much no old games had 540p as a resolution, so I simply can’t avoid it.
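The geometry here is easy to check: integer scaling picks the largest whole-number multiple of the source resolution that still fits the panel, and anything short of a perfect fit means letterboxing – which is exactly what triggers the bug. A quick sketch of that logic (my own illustration, not how the driver actually computes it):

```python
def integer_scale(src_w, src_h, dst_w, dst_h):
    # largest whole-number multiple of the source that still fits the display
    factor = min(dst_w // src_w, dst_h // src_h)
    return factor, src_w * factor, src_h * factor

# 640x480 on a 1080p panel: 2x gives 1280x960, leaving letterbox bars
print(integer_scale(640, 480, 1920, 1080))  # (2, 1280, 960)
# only a 960x540 source would fill the screen exactly at 2x
print(integer_scale(960, 540, 1920, 1080))  # (2, 1920, 1080)
```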
But why not take advantage of this long-running bug, and use it to… maybe, just maybe, fix another long-running bug? Let’s go back to Assassin’s Creed Origins.
The game only has a few selectable resolutions, but going lower than 1080p is possible. There’s of course 1600×900, but I’d get more screen usage overall if I select 1680×1050 instead, even though I’d lose a bit of extra picture on the sides. After that, I just have to set the GPU Scaling to Center (which the new software lets me set up for the individual game, luckily) and then enable Vsync in game.
Success. I can now enjoy the game at a stable 30fps. And one advantage of this console-style framerate is that I can set the graphics to High and still not get any drops. Now I don’t want this bug fixed anymore. At least until I’m done with the game.
As for the future, I have an interesting card on the way. My favorite holiday present.
Long-time readers of this blog (who?) might remember that for a long time I have had a strange Riva TNT, one that gives laughably bad scores in any benchmark. They are so bad, in fact, that I’ve often wondered if it was truly a TNT. At higher resolutions, it seems more like a Riva 128 with multitexturing. I can confirm the multitexturing part because Quake 2 uses it. But still.
The BIOS POST screen also left little doubt, but who knows? These things can be changed relatively easily, I think. But even less doubt is left by the drivers. The card will only take TNT drivers. So that’s it. From that perspective, there’s no doubt.
The first idea that sprang to my mind was that maybe I had a 64-bit bus card. However, Powerstrip said it was 128-bit. So for a long time, I trusted it and didn’t think much of the situation anymore. Recently, I’ve learned that Powerstrip is unreliable at best, and I’ve also read around that Nvidia released some incognito TNT cards with half the bus width and half the memory size. Interesting. Let’s try with something else?
And so the truth is revealed. The card indeed has a 64-bit data bus. But not just that, the memory is also clocked at a measly 80MHz. That gives me a bandwidth of 640MB/s. For comparison, a regular TNT had its RAM clocked at 110MHz on a 128-bit bus, for a total of 1760MB/s. Even though the core clock is the same 90MHz on both cards, that’s barely more than a third of the memory bandwidth. Oof.
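Working it through with the same clock-times-bus-width arithmetic as before (my own quick check, using the clocks reported for my card):

```python
def bandwidth_mb_s(mem_mhz, bus_bits):
    # MB/s = memory clock (MHz) x bus width in bytes
    return mem_mhz * bus_bits / 8

gimped = bandwidth_mb_s(80, 64)     # my card: 640.0 MB/s
regular = bandwidth_mb_s(110, 128)  # a standard TNT: 1760.0 MB/s
print(round(gimped / regular, 2))   # 0.36 - barely more than a third
```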
Well, that explains a lot, certainly. If I must tell the truth, I was initially unwilling to admit this because there was no mention of this kind of card on VGAMuseum or anywhere else, aside from that small blurb on Wikipedia. I even started thinking: maybe this is actually based on the TNT2 M64, rather than the original TNT? That seemed unlikely, of course. The Vanta line was a real thing, so to speak, with a properly named chip and everything, even the Vanta LT wasn’t incognito. This, on the other hand, is just called a TNT. So, is it?
I do have a TNT2 M64 in my personal collection, of course, given that those things on auction sites are more common than TIE Fighters in a galaxy far far away. I do not have a Vanta LT, but I can easily “simulate” it by bringing down the clocks to 105c/100m. The architecture is the same after all. The only difference would be the extra memory size (32MB, while the Vanta LT was only 16MB), but for these old tests it shouldn’t make any difference. As usual, it’s on a P3 450MHz.
The scores shouldn’t leave much to the imagination. The Vanta LT would have been massively underpowered next to a regular TNT2, due to its lower clocks and narrower data bus, only somewhat holding up at 640×480. But my TNT is also massively underpowered next to the simulated Vanta LT, even though the bus width is the same, and core and memory frequencies aren’t too different (my TNT is 90c/80m). In all cases, the same drivers were used. Improved architecture then. I can’t see any other explanation.
I do find it interesting that nobody really seems to have any info on this old, gimped TNT. Is it so rare that nobody among enthusiasts even acknowledges its existence? That sounds hard to believe. Cards like this must have sold to OEMs for a dime a dozen. The 64-bit variant of the Riva 128ZX is known, so this is strange.
Or maybe it’s all Occam’s Razor and people simply forgot about it.