There’s been much talk about The Division lately: after an amazing reveal at E3 2013, it had completely fallen off the radar for many people, but the recent beta events have reignited the flame. For my part, I’ve preordered the game on console. Nonetheless, since there’s an open beta going on right now, I thought it would at least be interesting to see whether the old rig could run this game in any acceptable way.
I’m immediately greeted with the text box in the picture above. Fun. But we all know system requirements are a fickle thing, born more out of publishers’ need to ensure nobody will complain than out of an actual attempt to determine the minimum configuration that would run the game acceptably. How do you even define acceptable today? 720p30? Or perhaps at least 1080p30? In the face of such a dilemma, it’s best that people try things themselves when they have the chance.
In my case, with the usual i5-2400, an HD7850 1GB (poor me) and 8GB of memory, I get a fairly good 40–50fps running at 1366×768 on medium or high details. Unfortunately, the in-game 30fps limiter creates constant judder, no different from what I’d seen in The Vanishing of Ethan Carter. Except in this case, even RadeonPro doesn’t work at all. That makes things a bit annoying, to be sure.
Most importantly, the hard disk is continuously grating. There’s no texture setting in the options, so you have to make do with what the game throws at you – and my 1GB card just can’t contain it all. Once again, I see the folly of my old choice.
Overall, it’s a good thing I preordered this on console, because my PC just doesn’t cut it. Next GPU? Well, we’ll see what Polaris and Pascal have to offer, but it will have to be at least 4GB.