22 Comments
mino - Saturday, April 8, 2006 - link
Hi Josh, could you consider a review of the (forever postponed) S.T.A.L.K.E.R. game?
I think it is, as of now, the most visually appealing, realistic, and demanding game. IMHO this game is the way to test GPU performance on future titles.
I'm sure the moment you check it you will understand.
Josh Venning - Tuesday, April 11, 2006 - link
We will definitely consider reviewing this game after it's released, whenever that may be.

Kremy - Thursday, April 6, 2006 - link
Just wanted to add another vote for some Oblivion testing, and ALSO an inclusion of ATI's X800 and X850 series video cards. For the record, I'm playing Oblivion on an X850 XT running at 540/580 (PE speeds), and it's running fine on high settings, 1024x768, no AA, full distance. Great game...

AdamK47 3DS - Wednesday, April 5, 2006 - link
Why is there an article about it now when this patch has been out for so long?

bupkus - Wednesday, April 5, 2006 - link
Maybe to be fair to ATI.

AdamK47 3DS - Wednesday, April 5, 2006 - link
I suppose there could have been some pressure from ATI to post this article in order to vindicate themselves. AnandTech did like to use the previous version of B&W2 for performance testing, and people were probably quick to blame ATI for the poor performance. I doubt this whole article would ever have been written had there not been some sort of outside influence.

JarredWalton - Wednesday, April 5, 2006 - link
Actually, AFAIK, Josh just wanted to write about this subject. He's also the one that has done some of the regression testing (i.e., using old drivers). It's all in the search for knowledge. As far as the patch goes, I'm *sure* that ATI helped Lionhead make some optimizations. Okay, that's a guess, but I would be amazed if they didn't. So, file this one under the heading of, "why is it that we need to make specific optimizations to games and drivers?"

Oblivion is even worse right now. With SLI you have to create a custom profile and manually enable AFR2 rendering for best performance (apparently). For ATI CrossFire support, you actually have to rename the executable. So much for the multi-GPU out-of-box experience! Not that SLI/CF aren't faster, but they are frequently a hassle to deal with.
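[Editor's note: the CrossFire workaround mentioned above boiled down to copying the game binary to a name the Catalyst driver associates with an AFR profile. A minimal sketch, assuming the profile name "AFR-FriendlyD3D.exe" commonly cited on forums at the time (verify against your driver's release notes; the install path is hypothetical):]

```python
# Hedged sketch of the Oblivion CrossFire workaround: the Catalyst driver
# picks its multi-GPU rendering mode by executable name, so players copied
# the game binary to a name tied to a known AFR profile. The profile name
# below is the one commonly cited on forums, not an official ATI mechanism.
import shutil
from pathlib import Path

def apply_cf_workaround(game_dir: str,
                        profile_name: str = "AFR-FriendlyD3D.exe") -> Path:
    """Copy Oblivion.exe to a profile-matching name, keeping the original."""
    src = Path(game_dir) / "Oblivion.exe"
    dst = Path(game_dir) / profile_name
    shutil.copy2(src, dst)  # copy, not rename, so patchers still find the exe
    return dst
```

[Launching the copied executable then lets the driver match the AFR profile; deleting the copy undoes the hack.]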
spinportal - Wednesday, April 5, 2006 - link
Why doesn't any site test a 7900 GTX clocked down to 7900 GT core/memory speeds and see how it performs? I have a feeling there could be a US$400 market for such a tweaked GT with 512MB, in between a GT (256MB) and a GTX (512MB). Where, oh where?

Araemo - Wednesday, April 5, 2006 - link
"Unfortunately, one of the problems with this game has been that it tends to favor NVIDIA graphics cards over ATI cards, despite the ATI splash screen at the game's startup."

Well, humorously enough, there has been at least one "NVIDIA: The way it's meant to be played" game that ran better on my 9700 Pro than on my friends' 5xxx- and 4xxx-series NVIDIA cards. :) Most dev houses are against making their game specifically more playable on one type of hardware than another, even with branding payments. ATI and NVIDIA are pretty even as far as gamer-level market share goes, so they're not going to fubar half their audience on purpose, and some games just run better on one architecture than another. Nothing really surprising, except that ATI and NVIDIA think that is worth spending their money on. :)
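[Editor's note: as a rough aside on spinportal's 7900 GTX-as-GT question a couple of comments up, the clock gap alone can be ballparked. The reference speeds below are assumptions for illustration (GTX 650/800 MHz, GT 450/660 MHz), and real game performance does not scale linearly with either number:]

```python
# Back-of-envelope clock gap between a 7900 GTX and a 7900 GT. Reference
# clocks are assumed values for illustration, not measured ones; a GTX
# downclocked this far would still keep its 512MB of memory.
GTX_CORE, GTX_MEM = 650, 800  # MHz, assumed 7900 GTX reference clocks
GT_CORE, GT_MEM = 450, 660    # MHz, assumed 7900 GT reference clocks

core_deficit = 1 - GT_CORE / GTX_CORE  # fraction of core clock given up
mem_deficit = 1 - GT_MEM / GTX_MEM     # fraction of memory clock given up
print(f"core clock deficit: {core_deficit:.1%}")    # roughly 31%
print(f"memory clock deficit: {mem_deficit:.1%}")   # roughly 18%
```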
Warder45 - Wednesday, April 5, 2006 - link
Let's see some Elder Scrolls Oblivion testing.

Lifted - Wednesday, April 5, 2006 - link
This appears to be more of an ad for B&W 2 than a serious review of the video cards. Since when does AT publish articles that test the latest video cards on a single six-month-old game that nobody really cares about anyway? The first one was enough, ughh.

Cybercat - Wednesday, April 5, 2006 - link
Well, uh... why the hell not? Not like everyone feels the same way you do about this game. Some people actually, you know... play it.

Calin - Thursday, April 6, 2006 - link
I don't know how many people play B&W, but I would be willing to give it a try... I gave The Sims a try, and it was nice... too bad I wasn't able to advance very far in it :(

Lonyo - Wednesday, April 5, 2006 - link
Oblivion testing would be much better than bloody B&W2 updates.

microAmp - Wednesday, April 5, 2006 - link
Another vote for TES: Oblivion testing.

Goi - Wednesday, April 5, 2006 - link
It's a pity no X800 series cards were included in the midrange category. Many AGP systems are still based on these cards, as well as on the 6800 GT/Ultra cards (which were represented by the 6800 GS).

Brian23 - Wednesday, April 5, 2006 - link
I concur. It was only a few months ago I put together an awesome AMD X2 system with one of those ATI X800 GTO² cards that overclocks like crazy. I'd like to see how it compares to the most recent stuff.

whitelight - Thursday, April 6, 2006 - link
Another vote to include an X800/X850 card that has 16 pipes and is clocked around 550/550.

Howard - Wednesday, April 5, 2006 - link
I can't access the 2nd page.

JarredWalton - Wednesday, April 5, 2006 - link
I've tested with both Firefox 1.07 and IE6, and everything seems fine. Are you still having problems?

Howard - Wednesday, April 5, 2006 - link
Nope, fixed. Must have been an early-article glitch of some sort.

mikecel79 - Wednesday, April 5, 2006 - link
I can't get to page 5 but I can get to page 6.