Today, Ubisoft is launching its long-awaited Watch Dogs, a 3D open-world game in which the goal is to hack computer networks and take control of data nets to solve puzzles and track down story objectives instead of simply gunning down everyone you meet. Watch Dogs has been held up as a visually stunning title that takes full advantage of modern GPU capabilities, and, notably, it makes prominent use of Nvidia's GameWorks technology. For those of you who haven't followed the GameWorks saga, GameWorks is a set of proprietary libraries, distributed by Nvidia, used to implement common DirectX 11 functions. Developers that use these libraries cannot share the code with AMD for optimization and, in some cases, cannot see the source code at all. The result has been a competitive landscape that has often been significantly tilted against AMD.
Over at Forbes, reviewer Jason Evangelho has tested the R9 290X against the Nvidia GTX 770, with full results in a range of configurations coming today. His preliminary results show exactly the kind of skewed pattern I'd previously been concerned about, with the GTX 770 pulling ahead of the more expensive R9 290X at lower detail levels and slipping only barely behind at the highest 1080p settings. I spent a significant chunk of Monday attempting to replicate his results, but was unable to do so. My R9 290X results were somewhat faster than his, but his GTX 770 was turning in performance 15-20% faster than my own. His second, more comprehensive GPU review shows similar patterns.
A quick check of the other Watch Dogs reviews popping up across the net shows a different (and more conventional) performance distribution, with the R9 290X outperforming the GTX 770 by 18-20% at High detail in 1080p. That's a bit lower than the typical margin of roughly 25% in a non-GameWorks title, but it's not ridiculously low the way we've seen in some other GameWorks games like Arkham Origins. The other thing they reveal, and something I can vouch for myself in the small amount of time I had to run my own tests, is that this game is in pretty wretched shape.
Even on a GTX 770 with "High" texture quality (recommended for GPUs with a 2GB frame buffer), the game tends to skip and stutter with unexplained performance drops. Rotate the camera quickly, and you'll see the game engine stutter as it tries to keep up. This occurs on both AMD and Nvidia hardware, but the problem honestly seems worse on the Nvidia side of the fence. Meanwhile, in the absence of an official timedemo, reviewers were free to create their own test runs, and many, including Guru3D, complained that the game's performance was so erratic that they had to kill their attempt to test multiple cards due to high run-to-run variation.
Here’s what I saw between the R9 290X and the GTX 770 in a three-minute timedemo run through an early part of the game. I tested four different configurations with Texture Details set to “High.”
I ran the game through the same test area under a variety of settings, mostly adjusting the Ambient Occlusion and anti-aliasing options, since those are the areas where GameWorks is employed. HBAO+ is the type of ambient occlusion that Nvidia recommends using, while the MSAA test increases VRAM requirements and puts more pressure on the memory bandwidth and fill rates. When I tested the GTX 770 in Ultra detail with Ultra textures, as Forbes did, my own GTX 770 struggled to maintain 45 FPS. Forbes’ results show it at 56 FPS — nearly matching the R9 290X in that mode.
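To give a rough sense of why MSAA drives up VRAM requirements, here is a back-of-the-envelope sketch of render-target memory at a few resolutions and sample counts. The buffer formats (4-byte RGBA color, 4-byte depth/stencil) and the assumption of a single color plus depth target are illustrative only; the actual engine's buffer layout is not documented here.

```python
# Rough estimate of render-target memory at various MSAA levels.
# With multisampling, every sample of every pixel is stored, so a
# 4x MSAA target is roughly 4x the size of a non-AA one.

def render_target_mb(width, height, msaa_samples, bytes_per_pixel=4):
    """Memory (MB) for one color + one depth buffer at the given MSAA level."""
    samples = width * height * msaa_samples
    color = samples * bytes_per_pixel   # e.g. RGBA8 color buffer
    depth = samples * 4                 # e.g. D24S8 depth/stencil buffer
    return (color + depth) / (1024 ** 2)

for res in [(1920, 1080), (2560, 1600), (3840, 2160)]:
    for samples in (1, 4, 8):
        mb = render_target_mb(res[0], res[1], samples)
        print(f"{res[0]}x{res[1]} @ {samples}x MSAA: {mb:.0f} MB")
```

The render targets alone are only part of the story; textures, geometry, and intermediate buffers dominate the totals, which is why the game's enormous texture sets push cards past their frame-buffer limits well before the MSAA surfaces do.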
Because my own test version of the game was based on a pre-release copy (and Forbes' wasn't), and because I had very little time to actually evaluate the situation, I've got to put more weight on Forbes' results. One common characteristic multiple websites note is that the RAM requirements in Watch Dogs are absolutely enormous. GPUs running at 2560×1440 and 2560×1600 slam into the 3GB memory barrier on the Radeon 7970 or GTX 780 family, while 4K is impossible on anything but a Titan Black. The R9 290 family is better positioned in this regard; its 4GB buffer is large enough to allow the game to stretch its legs.