1. Let me help you out. More people have 1280x1024 monitors, and those who do have widescreen flat panels are most likely stuck at 1680x1050, because the 1920-wide monitors command a premium.
So MOST GAMERS are far below 2560, and 1920, and even 1680, and some can't even run 1280x1024.
A common game resolution is 1024x768, and 800x600 is also still used on all the high-end games - both especially by gamers with brand-name store-bought systems - we all know the big names (not the multi-thousand-dollar gaming brands - that's one in 50 gamers!). When you're stuck in a lab with $2,000 monitors and then travel to check out the CeBIT babes, staying in touch with the average gamer is difficult, to say the least.
2. Even though the 260 passes 20 of 21 tests, and the 4850 passes FEWER, Derek the red just HAS to state that the Sapphire passed every test they threw at it. Now the upper number doesn't jibe with that - the one showing the 4850 worked in FEWER situations than the GTX260 -
BUT THE RED RAGE FUD NEVER ENDS.
(Obviously more than one 4850 brand was in play - NEVERTHELESS - that is the type of CONSTANT red slant that is all over EVERY SINGLE PAGE.)
I found an oddity. This article's posted power consumption doesn't correspond to other articles' findings; yes, I know I can't cross-reference data across articles. In this article:
http://www.anandtech.com/video/showdoc.aspx?i=3437... the 9800GTX+ uses less energy than the HD4850, and in this multi-GPU article it's the other way around. I know I can't compare total system power findings, but I'd think that the cards would at least be in the same order.
Can someone PLEASE clear this up for me? I'm very confused as to which card is more power efficient as the findings have varied so heavily across different articles.
It would probably be a good idea to list the power consumption of only the GPU, instead of the whole system. Doing that would remove variables, which is, of course, a goal of any scientific experiment.
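Even with only wall readings, you can get a rough per-card number by subtracting a baseline run - something like this sketch (the wattages and PSU efficiency are made-up placeholders, not figures from any article):

    # Rough per-card power estimate from total-system wall measurements.
    # All numbers below are made-up placeholders.
    PSU_EFFICIENCY = 0.85  # assumed efficiency at these load levels

    def card_power(system_watts, baseline_watts):
        # Card-only estimate: the rise in wall draw over the baseline run,
        # scaled by the assumed PSU efficiency.
        return (system_watts - baseline_watts) * PSU_EFFICIENCY

    baseline = 180.0   # same system and load with a low-power reference GPU
    card_a = 310.0     # wall draw with card A installed
    card_b = 295.0     # wall draw with card B installed

    print("Card A ~%.0f W, Card B ~%.0f W" % (card_power(card_a, baseline),
                                              card_power(card_b, baseline)))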
I just wanted to say, this is a great article. I'm glad you guys finally wrote an article comparing pretty much every card out there that people will consider buying, though you could have included some lower-end cards and resolutions. But hey, it takes a lot of time to do that many tests accurately; I appreciate all that you did do. It really was a well-written and interesting article to read; good job, AnandTech.
P.S. Any news on when new GPUs will be coming from Nvidia/AMD? I've noticed a LOT of price drops and mail-in rebates on GPUs recently.
"In general, more than one GPU isn't that necessary for 1920x1200"
Depending on the definition of "that" :) I disagree; gameplay is so much smoother with two cards on a 24" monitor. I tried playing with one 285 versus two, and while I don't have any real numbers, the difference is very noticeable. Crysis never dips below 40fps. This is my first real SLI since Voodoo II and I'm very glad I did it. You can always buy one card and decide if you want the second one later. I paid under $700 for two 285s and also got a copy of World at War and Far Cry 2 with the two cards (a fluke maybe) - that's almost $100 worth of games :)
I know it's 2 years old, but it is still a relatively taxing game for its age and is still relevant with the Tales of Valor expansion coming out... so could you PLEASE add Company of Heroes back into your reviews? I would just ask for Dawn of War 2 benchmarks, but it's the same engine and they didn't include a nice performance test. As it stands there are no RTS titles in your hardware reviews. I'd just like to see a broader spectrum of games represented.
I gotta commend you on your graphs, very clean and very easy to understand, unlike some other sites that have confounding line graphs. I especially love how I can choose between different resolutions to see the difference instantly. Kudos to you guys. :)
I upgraded my GPU a while back and got an MSI 4870 1GB (ATI) card to replace my old MSI 7600GT 256MB (Nvidia). However, although I can get better-quality graphics with the ATI card, the Nvidia card is actually faster when playing Vanguard: SoH... The game loads slower and takes more time shifting graphics (graphic lag), but the graphics look better and I can turn up settings a little higher. Any ideas on what's going on?
Early Core 2 Duo E6600
Early Abit AB9 Pro motherboard
All drivers up to date
Any time you increase graphics settings in a game, you increase the amount of information that needs to be offloaded to and from the HDD, CPU, RAM, and GPU, so this is likely your reason for slower load times.
But the 7600 GT is not actually "faster" than the 4870; in actuality the 4870 is probably 2-3x better at rendering than the 7600 across all games.
Well, the problem is that I tried it at the same settings too, and the 7600GT was still faster. The difference is so great that it takes about 15s for the 4870 to even process a change in settings, while with the 7600GT it's practically instant.
Is bolting two video cards together really necessary? I've played Call of Duty at max settings on my two-year-old midrange card and it ran beautifully. These cards seem to be designed for games not of today or tomorrow, but rather for games that may never exist. Few storefronts sell PC games today, and many of the games produced today are not terribly graphics-intensive. Also, most popular PC games are available on consoles, which are much more practical. I know this sounds negative, but the truth is that the video card manufacturers are just ignoring the current PC game market.
Geeze dude, there are hundreds of PC games out every year. I suppose the couch console has a wider selection, but it's not like we're dead space here. (hahah)
I guess it's easier to sell to Santa for the little curtain climbers when even Daddy has a resistance problem. (Believe me, it's a standing joke amongst friends.)
Then we have the kiddies screwing up the computer issue - not a problem when the household heads do so... the console keeps the PC functional, so to speak.
But really, there are lots of game choices for the PC - the biggest retailer in the USA - you know who - has loads of PC games. Ever heard of Best Buy - aww, just forget it.
Just go whine somewhere else, would ya?
^ Go take a 2nd look at the 2560x1600 res graphs. In almost every case the single cards are struggling to keep their heads above water and in a few cases failing miserably. Ideally, minimum framerates would be >60fps in all of today's games for that buttery-smooth feel but we are far from that with even dual graphics cards.
I agree with you that dual graphics cards are not needed by most gamers but to say that there is no need period is ignoring reality.
I'm confused about your opinion on AMD drivers. You made this comment in this article:
"Because of AMD's driver issues, we often have to wait when new games come out to enjoy proper CrossFire scaling with them. And when hotfixes come out it takes more than a month (usually more like two) to fully integrate changes into a WHQL driver."
However, you've also made comments in multiple articles similar to this one from GPU Transcoding Throwdown: Elemental's Badaboom vs. AMD's Avivo Video Converter on December 15th, 2008:
"The train wreck that has been the last few months of Catalyst has happened before and it will happen again as long as AMD puts too many resources into pushing drivers out every month and not enough into making sure those drivers are of high enough quality."
So on the one hand, it seems you are criticizing AMD for not releasing WHQL drivers soon enough, while on the other hand you seem to want them to take more time with their drivers. Anandtech is not the kind of site to post contradicting opinions on a whim, so I have to assume that your opinion simply hasn't been stated clearly (or I just missed it).
I remember a comment from an article to the effect that nVidia has a good driver release model. It is interesting to note that nVidia's driver release schedule over the last 6+ months hasn't been all that different from AMD's schedule.
nVidia's driver release schedule:
182.06 February 18, 2009
181.22 January 22, 2009
181.20 January 8, 2009
180.48 November 19, 2008
178.24 October 15, 2008
178.13 September 25, 2008
177.41 June 26, 2008
177.35 June 17, 2008
182.05b February 10, 2009
181.22b January 16, 2009
While nVidia doesn't have a specific release date, they have on average put an update out nearly every month. Being a user of more nVidia hardware than AMD hardware, I know that nVidia uses beta drivers in a similar fashion to how AMD uses hotfixes.
In my opinion, AMD needs to ditch the set release date. They should try to target updates about once a month, but release them only when they are ready. I do think their driver team is likely understaffed, but I think nVidia's The Way It's Meant to Be Played program plays a significant role in why nVidia can support certain games well at launch and AMD can't. Though, I can't say whether it's simply good nVidia developer relations or forced negligence towards AMD. I don't really have a problem with them releasing hotfixes, especially given that AMD has had issues supporting some games at launch, but if their driver team and/or developer relations were better, they wouldn't need them as often.
I would like to hear more clearly stated opinions from Derek and Anand on this subject. Thanks.
What's the word on overclocking these multi-gpu setups? I can only speak for the 9800 GX2 but it's a very good OCer. EVGA GX2's lowest end model is 600/1500/1000. The SSC edition is 675/?/1100. Same card, just factory overclocked.
I was running the lowest-end model at a stable 715/1787/1050. Currently testing out 720/1800/1065 (with the GX2's cover removed) with good results.
I think two GTX 260s in SLI are a good value right now... that setup comes in at the top of the charts most of the time, competing with the 280 and 285 SLI setups, and right now can be had for less than $400 at Newegg after rebate ($380ish I believe), which is just as cheap as one GTX 285 and $120 cheaper than a 295... I don't think you can go wrong with that right now... and I can't wait to see 3 of those bad boys in tri-SLI :)
Very few people (like around 1%) that read this article play games at 2560x1600, because that resolution requires a CRT monitor or a super high end LCD monitor (not even sure where to get an LCD that goes that high). I realize this article wanted to push the SLI and crossfire video card configurations and see what kind of FPS they would get at that resolution, but the FPS graphs in this article should be set at 1920x1200 by default, not 2560x1600, since that resolution is useless to almost every gamer.
I think the point was to show that UNLESS you are running >24" monitors there is no REASON to have CF/SLI. I would normally agree with you but in this particular review I agree with the writer.
I still think the broken line graphs showing all resolutions on the same x/y axis was more beneficial.
As usual, this isn't an article meant for the proles, but for the aforementioned 1%.
I wouldn't read it if they used 8 GPUs and quadrupled that resolution...
Really, what do you wanna prove, Anandtech?
That you can test things that don't matter to anyone?
That you can test things, because you can test things?
Come on!
Get something with a BIT more information value for the general crowd out here.
For example:
A lot of us still have old GPUs in use. But if any piece of hardware is older than 2 months, you erase it from the benchmarking process, effectively annihilating any comparisons that could have been made.
So 66% are at 1680 and below, and we know people like to exaggerate on the internet, so the numbers are actually HIGHER in the lower end.
So let's round up to 75% at 1680, 1280, and 1024.
They didn't offer the 1440x900 monitor rez that's all over Wal-Mart - not to mention 1280x800 laptops (which await a $ tran$fer to a game rig).
Yes, perhaps not 1% but 3% isn't much different - this site would COLLAPSE INSTANTLY without the 75% - not true the other way around.
Frankly, I'd like to see the list of cards from both companies that do SLI or Crossfire - I want to see just how big that list is - and I want everyone else to see it.
1920x1200 and 1680x1050 are definitely in more use, but this article is also useful for those users.
This article demonstrates the lack of necessity and value in multiGPU solutions at resolutions below 2560x1600 in most cases. This is important information for gamers to consider.
Any chance we could get some figures on average performance per dollar from the whole suite you threw at it? And performance per power consumption figures would be awesome, too.
Just some suggestions that would benefit your readers.
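Something as simple as this would do for a first pass (just a sketch - the card names, prices, wattages, and framerates below are placeholders, not numbers from the review):

    # Composite performance per dollar and per watt from placeholder data.
    cards = {
        # name:        (avg_fps, price_usd, load_watts)
        "Card A":      (58.0, 250.0, 320.0),
        "Card B":      (66.0, 330.0, 370.0),
        "Card C SLI":  (95.0, 500.0, 520.0),
    }

    for name, (fps, price, watts) in cards.items():
        print("%-12s %.3f fps/$   %.1f fps/100W" % (name, fps / price, 100.0 * fps / watts))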
I have a hard time not seeing the value in the 295. It's much closer performance-wise to 2x280/285 than one 280/285, yet costs much less, doesn't require an SLI motherboard, and consumes much less power at load and at idle.
It seems that you get a considerably larger performance boost for your money with the 295 than is traditional with the fastest graphics card available. Remember the 8800 Ultra? How much faster was it than the GTX, and how much price difference was there? The 9800GX2 was much worse: $600 for the same performance as two $200 8800GTS cards, and not much better power consumption numbers either - a very bad buy.
And just because most games out now run fine on a 260 at 1920x1200 and don't need any more power, some of the value in buying the higher end is longevity. One of my friends actually bought an Ultra nearly two years ago. He's still using it and hasn't needed an upgrade nearly as often as I have, as I usually go the midrange route. I'm always more tempted to upgrade as new things come out because of how much better they usually are than my older midrange hardware.
Overall, an excellent article. But I found it a bit 'cluttered' with all of the bar graphs in three different formats. Perhaps a line graph with all formats might be just as cluttered... Hmm. Maybe one button to change the default resolution for the article instead of the one selection / graph might help? And possibly another button to look at bar or line graphs? Food for thought... My thought. Could I get a CSV file of the raw data?
Great article Derek. I've been waiting for an update on the state of the multi gpu tech. Thank you for taking the time to include the 3 different resolutions and range of cards. Can't wait to see Tri and Quad gpu setups. Please keep the 1920x1200 resolutions in your upcoming article!
Great read. The single GPU/multiple GPU option is always a tough decision.
On paper the 4850X2 2GB is awesome. Amazon was selling these for $260 AR a while back. Although I was in the market for a new GPU, I didn't buy one. If you read the Newegg reviews, 20% of buyers give it 1 or 2 stars. Issues include heat, noise, and poor driver support. The card is also 11.5" long. I'd have to mangle my hard drive cage to make it fit. At the end of the day, I'd rather spend another $50 (GTX 280) and get a card that runs quiet, cool, and just works without headaches.
Nvidia cards do not perform AA correctly, or at all. This has been a problem since the 1xx.xx version drivers were released, right up through the latest 182.06 drivers. 9x.xx drivers and prior do not have this problem. This can easily be reproduced by using a 6, 7, or early 8 series card and swapping between a 9x.xx driver and any 1xx.xx driver. This test can't be done on newer cards because 9x.xx drivers do not support the hardware.
Best case, AA is only acting on objects close to your in-game viewpoint; anything farther away gets no AA at all. Worst case, AA does not function at all. This happens using the AA settings in-game or through the driver itself. I find this problem most noticeable in racing games, as there are lots of straight objects at a distance. ATI cards do not have this problem in my testing.
Nvidia has known about this problem nearly forever. I would guess it's by design. Doing full-screen AA takes horsepower, so if they limit or eliminate AA their cards will bench faster.
What really sucks is that review sites seem not to care about image quality, only FPS. While I'm on the subject, what about 2D image quality and performance? Some of the newer cards just plain suck as far as 2D performance goes.
Now you may think I'm anti-Nvidia; well, I'm not - I'm running an 8800 GT in the box I'm typing this from. I tend to buy whatever gives me the most bang for the buck, though the next card I buy will have working AA, if you get the idea.
So, AnandTech, please start comparing 3D image quality in all reviews. While you're at it, test basic 2D image quality and 2D performance. PerformanceTest would be a good measure of 2D performance, BTW.
Have you ever used a tool or edited the game profile yourself?
I had an 8800GTS 320MB that I used with AA extensively (also with 3D stereoscopic), and I was told on a forum to use nHancer to modify the profile into a specific mode of anti-aliasing; I am pretty sure it worked. It was the beta 162.50 Quadro drivers, I believe - you can just put your card's ID into the .inf and they install and work great.
It is possible the drivers work great and the control panel/GUI is piss-poor (a theory that may hold water).
I wish that nVidia would open up the drivers a little so that control freaks like myself could really tweak the settings to where I want them.
Yeah, in my main rig right now I have an i7 920 with two 1GB 4850s. I recently bought a third 4850 and installed it. There was some funky flickering that I think was driver-related in BF2 and HoI2 in 3-way mode, but most games seemed okay. Funny thing is... the same thing happened when I tried a 3870X2 & 3870 in 3-way on my older X38 Core 2. I am really hoping these next articles will come with some additional commentary on image quality.
To the person who stated that the 9800GTX+ was comparable to the 4850X2: what are you thinking???
I have never really had a problem with any CrossFire setups before except with 3-way, and I wonder if it is the odd GPU count and if 4 would eliminate some issues. Looking forward to the upcoming articles; this is mostly a teaser with information many already knew.
I agree that the new format for graphs looks good - line graphs are crap visually - but I think the default should be the 1920x1080/1200 that most people are interested in, based on your survey data :)
THANK YOU!
"Yeah, in my main rig right now I have an i7 920 with two 1GB 4850s. I recently bought a third 4850 and installed it. There was some funky flickering that I think was driver-related in BF2 and HoI2 in 3-way mode, but most games seemed okay. Funny thing is... the same thing happened when I tried a 3870X2 & 3870 in 3-way on my older X38 Core 2. I am really hoping these next articles will come with some additional commentary on image quality."
________
Another PERFECT REASON to not mention "image quality" - the red fanboy wins again - assist +7 by Derek!
Amazing.
Thank you.
Have you tried forcing on transparency super-sampling? If you don't, edges defined by transparency in the texture won't get anti-aliased. By default, Nvidia (ATI?) only anti-aliases edges defined by depth differences.
I've seen one review on that, with the blown-up edge images, and the ATI cards don't smooth and blur as well - they have more jaggies - so they HAVE to leave that out here - 'cause you know Derek loves that red 4850 and all the red cards -
Or WHY the GTX260 isn't praised to the stars for running 20 of 21 tests successfully - taking THE WIN!
I guess it doesn't matter when a gamer spends hundreds and hundreds on their dual-GPU setup and then it epic fails at games... gosh, that wouldn't be irritating, would it?
Amazing red bias... chizo pointed out the Sapphire 4850 / other 4850 driver issues, thankfully, while Derek has a special place in his heart for the blue-backed red card, and says so in the article - then translates that to ALL 4850s.
DREAM ON if you think that would happen with ANY GREEN card Derek has ever tested!
I'd like to see an article that rates overall systems on price-to-performance. Try to get as high an fps as possible for the least amount of money spent.
As one reader mentioned, a frame rate below 15 fps doesn't count because it's unplayable, so just pick a number between 10 and 15 and subtract it from the fps. Maybe vary it by game. Frame rates over 60fps shouldn't count either, because most monitors can't even show that.
This would be interesting because even small tweaks would make a difference, e.g. adding a $60 sound card might get you 4 or 5 fps in a few games and might pay for itself.
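In other words, score each result with something like this (a sketch of the idea - the exact floor and cap values are just assumptions to tune per game):

    # "Useful fps": ignore anything below a playability floor and don't
    # reward anything past a 60fps cap. Floor and cap values are assumptions.
    def useful_fps(fps, floor=12.0, cap=60.0):
        if fps < floor:
            return 0.0           # unplayable results count for nothing
        return min(fps, cap) - floor

    # e.g. a $60 sound card that lifts a game from 43 to 48 fps
    print(useful_fps(43.0), useful_fps(48.0))   # 31.0 vs 36.0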
It doesn't look like the GTX 260 Core 216 provides much, if any, tangible benefit over the GTX 260 according to these tests. Sure, it had some wins, but they weren't very big ones, and it also had some losses - albeit not very big ones either. One would be tempted to just get a GTX 260 or 4850 and wait to upgrade until the next generation of cards comes out this summer. The time is getting close, anyways.
Good call.
Even the 4830 or the 9800GT twice either, or the 9800GTX, GTS 250, 9600GT, or 9600GSO twice each - or the ATI... the ATI... uhh... uh... do the reds have their "midrange" filled up? Uh... the 4670?
LOL
Yeah, nvidia needs more midrange - right?
LOL
THE RED LIARS ARE SOMETHING ELSE!
Oh, I'm sorry, go to that new-technology red series, the 3000 series, and get that 3870... that covers that GAP the reds have that they constantly WHINE nvidia has but DOES NOT.
Yes, go back a full GPU gen for the reds' midrange card...
GOOD GOLLY - how big have the reddies been lying?!?
HUGE!
I know you're trying to isolate and rate only the video cards, but the fact of the matter is, if you spend $200 on a video card that bottlenecks a $3000 system, you have made a poor choice. By your metrics, integrated video would "win" a number of tests because it is more or less free.
You should add a chart where you include the cost of the system. Also feel free to use something besides the i7 965. Spending $1000 on a CPU in an article about value seems wrong.
I'm not sure that the article was trying to demonstrate how these cards compare in a $3000.00 system, as much as it was trying to eliminate any possibility of a CPU bottleneck.
What about micro-stuttering in multi-GPU configurations?
I just bought an HD4870 1GB today; the only reason I didn't choose a multi-GPU configuration was all the talk about micro-stuttering on German tech websites.
In general, we don't see micro-stuttering except at high resolutions in memory-intensive games that show average framerates that are already lowish... games, hardware, and drivers have gotten a bit better on that front when it comes to two GPUs, to the point where we don't notice it as a problem when we do our hands-on testing with two GPUs.
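For anyone who wants to check their own setup, one rough way to put a number on it is to look at how uneven consecutive frame times are. A sketch, assuming nothing more than a plain text log with one frame time in milliseconds per line:

    # Micro-stutter indicator: average relative difference between consecutive
    # frame times. Near 0 means evenly paced frames; larger means visible stutter.
    # Assumes "frametimes.txt" holds one frame time in milliseconds per line.
    def stutter_index(times_ms):
        pairs = list(zip(times_ms, times_ms[1:]))
        ratios = [abs(a - b) / ((a + b) / 2.0) for a, b in pairs]
        return sum(ratios) / len(ratios)

    with open("frametimes.txt") as f:
        times = [float(line) for line in f if line.strip()]

    print("average fps:   %.1f" % (1000.0 / (sum(times) / len(times))))
    print("stutter index: %.3f" % stutter_index(times))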
Nice job on the review, Derek - certainly a big step up from recent reviews of the last 4-5 months. A few comments though:
1) Would be nice to take a step back and see what happens with CPU scaling with 1, 2, and 3 GPUs. Obviously you'd have to cut down the number of GPUs tested, but perhaps 1 from each generation as a good starting point for this analysis.
2) Some games where there's clearly artificial frame caps or limits, why wouldn't you remove them in your testing first? For example, Fallout 3 allows you to remove the frame cap/smoothing limit, which would certainly be more useful info than a bunch of SLI configs hitting 60FPS cap.
3) COD5 is interesting though, did you contact Treyarch about the apparent 60FPS limit for single-GPU solutions? I don't recall any such cap with COD4.
4) Is the 4850X2 still dependent on custom drivers from Sapphire? I've read some horror stories about official releases not being compatible with the 4850X2, which would certainly put owners behind the 8-ball as a custom driver would certainly have the highest risk of being dropped when it comes to support.
5) Would've been nice to have seen an overclocked i7 used, since it's clearly obvious CPU bottlenecks are going to come into play even more once you go to 3 and 4 GPU solutions, while reducing the gain and scaling for the faster individual solutions.
Lastly, do you plan on discussing or investigating the impact of multi-threaded optimizations from drivers in Vista/Win7? You mentioned it in your DX11 article, but both Nvidia and ATI have already made improvements in their drivers, which seem to be directly credited for some of the recent driver gains. Particularly, I'd like to see if it's a WDDM 1.0-1.1 benefit from a multi-threaded driver that extends to the DX9, 10, and 11 paths, or if it's limited strictly to WDDM 1.0-1.1 and DX10+ paths.
" 2) Some games where there's clearly artificial frame caps or limits, why wouldn't you remove them in your testing first? For example, Fallout 3 allows you to remove the frame cap/smoothing limit, which would certainly be more useful info than a bunch of SLI configs hitting 60FPS cap.
3) COD5 is interesting though, did you contact Treyarch about the apparent 60FPS limit for single-GPU solutions? I don't recall any such cap with COD4.
4) Is the 4850X2 still dependent on custom drivers from Sapphire? I've read some horror stories about official releases not being compatible with the 4850X2, which would certainly put owners behind the 8-ball as a custom driver would certainly have the highest risk of being dropped when it comes to support.
"
#2 - a red rooster booster that limits nvidia winning by a large margin - unfairly.
#3 - Ditto.
#4 - Ditto, the opposite = this one boosts the red card unfairly.
Yes, when I said "red fan boy" is all over Derek's articles, I meant it.
Thanks for the feedback... we'll consider some of this going forward.
We did what we could to remove artificial frame caps. In Fallout 3 we set iPresentInterval to 0 in both .ini files, and framerate does go above 60 - it just doesn't average above 60, so it looks like a vsync issue when it's an LOD issue.
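For reference, the line in question looks like this in both files (the section name and file locations are from memory, so double-check your own install - typically Fallout.ini and FalloutPrefs.ini under My Documents\My Games\Fallout3):

    [Display]
    iPresentInterval=0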
We didn't contact anyone about COD5, though there's a console variable that's supposed to help but didn't (except for the multi-GPU solutions).
We're looking at doing overclocking tests as a follow-up. Not 100% on that, but we do see the value.
As for the Sapphire 4850 X2, part of the reason we didn't review it initially was because we couldn't get drivers. Ever since 8.12 we've had full driver support for the X2 from AMD. We didn't use any specialized drivers for that card at all.
We can look at the impact of multithreaded optimizations, but this will likely not come until DX11, as most of the stuff we talked about requires DX11 to work. I'll talk to NVIDIA and AMD about current multithreaded optimizations, and if they say there is anything useful to see in current drivers we'll check it out.
"In general, more than one GPU isn't that necessary for 1920x1200 with the highest quality settings,..."
I see many computer setups with 22" LCDs and lower that have high-end graphics cards. It just doesn't make sense to have a high-end card when you're not utilizing its whole potential. Might as well save some money up front, and if you do need more power for higher resolutions later, you can always purchase an upgrade at a lower cost. Heck, most of the time there will be new models out :)
Then again, I have a quad-core CPU that I don't fully utilize either, but... :D
Everyone's situation is unique. In my case I just built a nice C2D system (OC'd to 3.8GHz with a lot of breathing room up top). I have a 4870 512MB that is definitely overkill with my massive 19" LCD (1280x1024). But within the year I plan on giving my dad or wife my 19" and going to a 22-24". Using your logic I should have purchased a 4850 (or even 4830) since I don't NEED the power. But I did plan ahead to future-proof my system for when I can benefit from the 4870.
I think many people also don't upgrade their systems nearly as frequently as some of the enthusiasts do. So we spend a bit more than we would need to at that particular time to future-proof a year or two ahead.
The other side of the coin is that most likely for similar money, you could have bought something now that more closely matches your needs, and a 4870 in a year once it has been replaced by a new card if it still meets your needs.
Of course. Or I could spend $60 now, another $60 in 3 months, and you see the point. It's all dependent on your actual need, your perceived need, and your desire to not have to upgrade frequently.
I think the 4870 is one of those cards like the ATI 9800pro that has a perfect combination of price and performance to be a very good performer for the long haul (similarly to how the 8800GTS was probably the best part from a price/performance/longevity standpoint if you were to buy it the day it first came out).
Also important is looking at both companies and seeing what they are releasing in the next 3-6 months for your/my particular price range. Everything coming out seems to be focused either on the super high end or the low end. I don't see any significant mid-range pieces coming out in the next 3-6 months that would have made me regret my purchase. If it were late summer or fall and I knew the next round of cards was coming out, I *might* have opted for a 9600GT or other lower-midrange card to hold me over until the next big thing, but as it stands I'll get easily a year out of my card before I even feel the need to upgrade.
Frankly the difference between 70fps and 100fps at the resolutions I would be playing (my upgrade would be either to a 22 or 24") is pretty moot.
quote: This unique card really shined and held it's own all the way up to 2560x1600.
Fourth paragraph, closing comments:
quote: But when were talking multiple graphics cards for rendering it's really not worth it without the highest resolution around.
Please remove the apostrophe from the first sentence (where it should read its) and instead move it to the second (which should be we're).
Otherwise excellent article. This is the kind of work I remember from years past that originally brought me to the site.
One thing - would it be too difficult to create a performance/watt chart based on a composite performance score for each single/pair of cards?
I do think you really pushed the 4850X2 a bit too much. The 9800GTX+ provides about the same level of performance (better in some cases, worse in others) and the SLI version manages to kick the crap out of the GTX 280/285 nearly across the board (with the exception of a couple of 2560x1600 memory-constricted cases) at a lower price point. That's actually in my mind one of the best performance values available today.
Forget about Derek removing the apostrophe; how about removing the raging red fanboy ATI drooling?
When the GTX260 SLI passes 20 of the 21 game runs, and the 4850 DOESN'T, Derek is sure not to mention the GTX260, and on the very same page blabs that the 4850 Sapphire "ran every test"...
This is just another red raging fanboy blab - so screw the apostrophe!
Nvidia DISSED 'em because they can see the articles Derek posts here bleeding red all over the place.
DUH.
I really appreciate the article and all the research and work that went into it. Kudos to you for it.
A small suggestion would be to take into account a minimal playable frame rate in the value and performance per dollar data, where a ZERO would be substituted for the frame rate in instances where a card failed to reach a playable rate for a given game/resolution. I feel this would more accurately reflect the value of the card(s) as 15 FPS in most games presents no value.
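Concretely, the change to the value charts would look something like this (a sketch with placeholder numbers, not the article's data; the 15 fps floor is the assumption being proposed):

    # Performance per dollar with a playability floor: anything under the
    # floor is treated as zero value. All numbers are placeholders.
    PLAYABLE_FLOOR = 15.0   # proposed minimum playable average fps

    results = [
        ("Card A", 13.5, 150.0),   # (name, avg fps, price $) - unplayable
        ("Card B", 31.0, 300.0),
    ]

    for name, fps, price in results:
        value = fps / price if fps >= PLAYABLE_FLOOR else 0.0
        print("%s: %.3f fps per dollar" % (name, value))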
Minimum framerates should be even more important than average ones...
Interesting article though! I didn't know the 4850X2 was so competitive...
Only in Crysis does it do worse than the 285, which I had in mind for my new PC...
That does make me wonder if the 285 might be a more future-proof investment...
Yes, but I meant "minimum" in the sense of what the game needs to be playable, even if you measure "average" - I just don't think it's fair to say that the 9800 GTX shows the highest performance per dollar on the Crysis Warhead 2560x1600 chart when it turns in frame rates of 13.5. To me that is ZERO value for the money, because it's not playable. Someone wanting to play at those settings would be wasting EVERY dollar spent on the 9800 GTX.
Completely agree. Statistics mean nothing when not taken in a proper context. Zero, NA, or just leaving it blank would be much better. Someone looking to use that card would then click on a lower resolution and see if it is a viable choice. It would reduce the amount of data that needs to be compared by the reader of the article, and make caveats like the ones in the explanation section about comparing between cards/resolutions etc. almost moot.
The framerate charts are all but worthless if you're focusing on how performance scales. Why not some line graphs with all three resolutions shown and card models along the x-axis? Then the reader could see how performance is affected by the lower memory of some models and how it scales linearly with the higher-end cards.
I would have to agree with this. I always enjoyed the broken line graphs that show multiple resolutions and framerates in the same graph. It makes comparisons very easy and more importantly allows EVERYONE to see their particular resolution without having to click on a link for a separate graph.
It's fine to keep your specialized performance/$ and %-increase-from-a-single-card graphs the way you have them, as I understand what you mean about not comparing between resolutions, but for the general frame rate comparisons I preferred the old way.
One thing I failed to see which I have seen in previous reviews with X-fire/SLI (or was it with tri/quad setups?) is the stuttering that can be present. I thought it was an Anand article but could have been from another site.
The charts are designed to auto-pop to 2560x - and we all know the red ragers have screamed the ATI cards are great there.
EVERY CHART pops to the favored ATI $2,000.00-monitor resolution.
It's not an accident; Derek made them.
Derek, if you want to impress, and this article does with its research, please invest in some writing manuals and learn some grammar.
Take this sentence:
"This unique card really shined and held it's own all the way up to 2560x1600."
Your use of "IT'S" in this instance is incorrect. IT'S is a contraction for IT IS, not a possessive word, which is ITS.
Or take this passage, "It's very surprising to us that AMD hasn't pushed this configuration and that Sapphire are the only manufacturer to have put one of these out there."
Sapphire is a company or organization, I realize that. But in this instance, you're referring to the group in its singular fashion, or as a single unit. That context is signaled by "the only manufacturer" in the sentence.
That sentence should have read: "It's very surprising to us that AMD hasn't pushed this configuration and that Sapphire IS the only manufacturer to have put one of these out there."
Here's the rule for that (taken from both the MLA and APA handbooks): if the action of the verb is on the group as a whole, treat the noun as a singular noun. If the action of the verb is on members of the group as individuals, treat the noun as a plural noun.
This means a company such as Microsoft, IBM, Sapphire, Ford, etc., when referred to as a whole collective, single entity, has to take a singular verb.
But if you are referring to pieces of the whole, treat it as plural, as in "the engineers of Ford are..." or "the programmers at Microsoft are..."
Please invest in some proper English grammar texts and take time to read and learn from them. Your error laden grammar you write and use is quite distracting and detracts from what is supposed to be a professionally run hardware site.
Hire a proofreader or good copy editor if learning proper grammar is too difficult.
I don't really mind Anandtech articles as much in terms of presentation, spelling, and graphics. Other sites such as Ars Technica, x-bit labs, and so forth are much worse. The first is the worst since they've started writing articles concerning everything, it seems.
If I did mind, I'd say stick to the general guidelines for writing manuals, procedures, pamphlets, technical docs, etc. But being online, this isn't the case and won't ever be. Again, I don't mind as much because I do the same thing myself, where I hardly pay attention to spelling or grammar when writing online. It's only when I write short stories or for work that I pay attention. Strange, but comfort sure does make one do these things :)
"Your error laden grammar you write and use is quite distracting and detracts from what is supposed to be a professionally run hardware site."
That should read "The error laden grammar you use is quite distracting..." or just "Your error laden grammar is quite distracting..."
"Your error laden grammar you write and use..." is redundant.
Perhaps you should learn some grammar yourself before criticizing others about theirs.
So we have to be perfect in every way to point out errors? NBA players shouldn't listen to their coaches because their coaches can't play as well as they do? Game reviewers shouldn't trash a game because they couldn't make a better one?
When it comes to grammatical errors as insignificant as the ones pointed out, yes.
If you're going to be that critical, then you best check your own grammar.
I am curious to see how a pair of Radeon 4830s would perform in this lineup. A single one is quite weak at those resolutions, but I am willing to bet a pair of those would hold its own against a single GTX280.
Oh, and it would be much cheaper, too ($180 including the bridge).
You are right that a single 4830 won't be enough to perform on par with these guys... but I don't think two of them would really be worth it against the GTX 280, except maybe at lower resolutions. The 1GB 4830 will run you at least $145, so you're looking at $290 for two of them, and the 4850 X2 2GB is the same price. The 512MB 4830 will be limited by memory usage at higher resolutions, just like the 4850 512MB.
We might look at the 4830 in CrossFire internally and see if it warrants an update, but so far it isn't in the roadmap for the rest of the series.
I was thinking 512MB 4830s, which are in the $90~$110 price range. That price range is the only reason I mention them, because it puts the price tag of a pair of those in the exact same range as a single higher-end card, or even a GTX260.
You said that a 4850 1GB doesn't make sense, and that's even more obvious for a 4830.
Most of the issues with the game are gone. There are currently no other MMOs out there that have the graphics or combat system to tax a GPU like this game. Your comment on testing a game that people play is very subjective. There are many MMOs out there that I would not touch... WoW, cough, cough... but that doesn't mean other people don't enjoy them. I think having this game as one that is regularly benchmarked adds a great deal of value to the article.
It really is a great looking game for an MMO. It's not the most played MMO around, but it is definitely the easiest to test. There is an area near the beginning where the player is alone in the environment and it's always the same time of day and all that stuff ... It takes out some of the factors that make getting consistent data out of other MMOs incredibly difficult.
I've never had any real "issues" with it or with the results either. It's been very consistent as well. It does add value, and it's clear that games can be coded in a way that looks really good and performs like this one, so we feel it's important for getting a better feel for what's out there and what's possible.
Not really a big deal, but could you cut out the offhand game review comments when introducing benchmarks? E.g.: "Crysis Warhead, while not the best game around..." It feels out of place in a hardware analysis.
Oh yes, and below, don't forget the Age of Conan that favors the ATI card - Derek can't stop drooling all over the place.
Then come to COD, where nvidia once again slaughters - red blood everywhere - and Derek says "do we really need another war game?" or the like.
Derek is red fan central and cannot stop himself.
This game is poorly programmed in the first place; does it deserve to even be included in the benchmark tests? Yes, it has the programming necessary for the test, but it's poorly programmed.
The fact that CryEngine 2 is taxing on today's hardware (and that Crytek will no doubt use derivatives of it in future games) makes it very useful in benchmarks. I hope reviewers keep using it. But by all means, feel free to disassemble Crytek's binaries and point out their code's weaknesses.
What do you mean they shouldn't include Crysis Warhead??? It's the seminal game for seeing where graphics performance is, to get an idea of how a particular video card will perform in the future. CryEngine 2 is the most advanced graphics engine on the market. If a video card can provide 30 fps on CryEngine at your resolution, then it's good to last you for at least 2 years.
I totally disagree with it being the most advanced. It is a decent game engine especially for benchmarking, but....
In all reality, the revamped X-Ray engine in STALKER: Clear Sky is far and away more advanced and superior in almost every way. It is about the same or better in regards to taxing the system (low frame rates do not necessarily mean the game is taxing the system). Being that these engines are also used in similar FPS titles, they would make an interesting comparison.
I would really like to see Anand include or swap in a Clear Sky bench (there is a premade one available) for Crysis or Crysis Warhead. Either way, no big deal - many other sites post results with a CS bench that I view all the time.
I would love to see more info about the 4850 X2 1GB version. At over $50 cheaper, is 1GB of memory enough to compete? Is it worth paying 24% more for the 2GB version that you reviewed here?
I disagree. I see people finally moving away from their older 17-19" flat panels directly into 22" widescreens. 24" and 1920x1200 resolutions are nowhere near the norm.
Correct, but he said sweet spot because his/her wallet is just getting bulgy enough to contemplate a movement in that direction... so - even he/she is sadly stuck at "the end user resolution"...
lol
Yes, oh well. I'm sure everyone is driving a Maserati until you open their garage door... or golly, that "EVO" just disappeared... must have been stolen.
The 1GB version should perform very similarly to the two 4850 cards in CrossFire.
The short answer is that the 1GB version won't have what it takes for 2560x1600 but it might work out well for lower resolutions.
We don't have a 1GB version, so we can't get more specific than that, though this is enough data to make a purchasing decision -- just look at the 4850 CrossFire option and take into consideration the cheaper price on the 1GB X2.
I realize this is an older article, however I always find it interesting to read when upgrading cards.
While I find it admirable that Derek has compared the 'older' GTX 280 SLI scaling, it is unfortunate that he hasn't pointed out that it should perform identically to the GTX 285s if the clocks were the same.
This was also passed over in the "worthy successor" article, where it does not compare clock for clock numbers - an obvious test, if we want to discover the full value of the die shrink.
I recently 'upgraded' to 3 GTX 285s from 3 GTX 280s through a warranty program with the manufacturer, and there is little to no difference in performance between the 2 setups. While cabling is more convenient (no 6-to-8-pin adapters), the 285s won't clock any better than my 280s would, Vantage scores are within a couple hundred points of each other at the same clocks (the 280s actually leading), and the temperature and fan speed of the new cards haven't improved.
I think this is a valuable point in an article that compares performance per dollar, and while slightly outside the scope of the article, I think it's a probative observation to make.
I'm currently using an ATI 4850 512MB at 1920x1080 on an X58 chipset system. I just ordered another 4850 for CrossFire usage due to this article.
Thanks for the tip. Keep up the good work.
I second the motion to include CoH.
Here's a clearly stated opinion: ATI KEEPS ******* IT UP, while their competition does a good job.
Actually, you can't get CRTs that go that high AFAIK - the highest-res CRTs I've seen go to 2048x1536... 30" LCD monitors support this resolution, such as the Dell I use. Apple among others also make 30" LCDs with 2560x1600 resolution.
The barrier to entry with 30" LCD monitors that do 2560x1600 is about $1000...
Finally - Tuesday, February 24, 2009 - link
As usual, this isn't an article, meant for the proles, but for the aforementioned 1%.I wouldn't read it if they used 8 GPUs and quadrupled that resolution...
Really, what do you wanna prove, Anandtech?
That you can test things that don't matter to anyone?
That you can test things just because you can test things?
Come on!
Get something with a BIT more information value for the general crowd out here.
For example:
A big lot of us still have old GPUs in use. But if any piece of hardware is older than 2 months, you erase it from the benchmarking process, effectively annihilating any comparisons that could have been made.
SiliconDoc - Wednesday, March 18, 2009 - link
So 66% are at 1650 and below, and we know people like to exaggerate on the internet, so the numbers are actually HIGHER at the lower end. So let's round up to 75% at 1650, 1280, and 1024.
They didn't even offer the 1440x900 monitor rez found all over Wal-Mart - not to mention 1280x800 laptops (whose owners await a $ tran$fer to a game rig).
Yes, perhaps not 1%, but 3% isn't much different - this site would COLLAPSE INSTANTLY without the 75% - not true the other way around.
Frankly, I'd like to see the list of cards from both companies that do SLI or xfire - I want to see just how big that list is - and I want everyone else to see it.
DerekWilson - Tuesday, February 24, 2009 - link
Actually, from a recent poll we did, 2560x1600 usage is around 3% among AnandTech readers: http://anandtech.com/weblog/showpost.aspx?i=547
1920x1200 and 1680x1050 are definitely in more use, but this article is also useful for those users.
This article demonstrates the lack of necessity and value in multiGPU solutions at resolutions below 2560x1600 in most cases. This is important information for gamers to consider.
Jamer - Monday, February 23, 2009 - link
What a great article! Absolutely brilliant. This helps so much, bringing simplicity to all those possible GPU choices. Thank you!
SirKronan - Monday, February 23, 2009 - link
Any chance we could get some figures on average performance per dollar from the whole suite you threw at it? And performance per power consumption figures would be awesome, too. Just some suggestions that would benefit your readers.
I have a hard time not seeing the value in the 295. It's much closer performance-wise to 2x280/285 than one 280/285, yet costs much less, doesn't require an SLI motherboard, and consumes much less power at load and at idle.
It seems that you get a considerably larger performance boost for your money with the 295 than is traditional with the fastest graphics card available. Remember the 8800 Ultra? How much faster was it than the GTX, and how much price difference was there? The 9800GX2 was much worse: $600 for the same performance as two $200 8800GTS cards, and not much better power consumption numbers either - a very bad buy.
And even though most games out now run fine on a 260 at 1920x1200 and don't need any more power, some of the value in buying the higher end is longevity. One of my friends actually bought an Ultra nearly two years ago. He's still using it and hasn't needed to upgrade nearly as often as I have, as I usually go the midrange route. I'm always more tempted to upgrade as new things come out because of how much better they usually are than my older midrange hardware.
croc - Monday, February 23, 2009 - link
Overall, an excellent article. But I found it a bit 'cluttered' with all of the bar graphs in three different formats. Perhaps a line graph with all formats might be just as cluttered... Hmm. Maybe one button to change the default resolution for the whole article, instead of one selection per graph, might help? And possibly another button to toggle between bar and line graphs? Food for thought... my thought. Could I get a CSV file of the raw data?
mhouck - Monday, February 23, 2009 - link
Great article Derek. I've been waiting for an update on the state of multi-GPU tech. Thank you for taking the time to include the 3 different resolutions and range of cards. Can't wait to see the tri and quad GPU setups. Please keep the 1920x1200 resolution in your upcoming articles!
sabrewolfy - Monday, February 23, 2009 - link
Great read. The single GPU/multiple GPU option is always a tough decision. On paper the 4850X2 2GB is awesome. Amazon was selling these for $260 AR a while back. Although I was in the market for a new GPU, I didn't buy one. If you read the Newegg reviews, 20% of buyers give it 1 or 2 stars. Issues include heat, noise, and poor driver support. The card is also 11.5" long; I'd have to mangle my hard drive cage to make it fit. At the end of the day, I'd rather spend another $50 (GTX 280) and get a card that runs quiet, cool, and just works without headaches.
dubyadubya - Monday, February 23, 2009 - link
Nvidia cards do not perform AA correctly, or at all. This has been a problem since the 1xx.xx version drivers were released, right up through the latest 182.06 drivers. 9x.xx drivers and prior do not have this problem. This can easily be reproduced by using a 6, 7, or early 8 series card and swapping between a 9x.xx driver and any 1xx.xx driver. This test can't be done on newer cards because 9x.xx drivers do not support the hardware. Best case, AA only acts on objects close to your in-game viewpoint; anything farther away gets no AA at all. Worst case, AA does not function at all. This happens whether you use the AA settings in game or through the driver itself. I find this problem most noticeable in racing games, as there are lots of straight objects at a distance. ATI cards do not have this problem in my testing.
The Nvidia forums have had several threads over the years about this problem. Here is a 40-plus-page thread about the problem. This thread was closed because someone said a bad word: "ATI".
http://forums.nvidia.com/index.php?showtopic=58863...
Nvidia has known about this problem near forever. I would guess it's by design: doing full-screen AA takes horsepower, so if they limit or eliminate AA, their cards will bench faster.
What really sucks is that review sites seem not to care about image quality, only FPS. While I'm on the subject, what about 2D image quality and performance? Some of the newer cards just plain suck as far as 2D performance goes.
Now you may think I'm anti-Nvidia. Well, I'm not; I'm running an 8800 GT in the box I'm typing this from. I tend to buy whatever gives me the most bang for the buck, though the next card I buy will have working AA, if you get the idea.
So Anandtech, please start comparing 3D image quality in all reviews. While you're at it, test basic 2D image quality and 2D performance. Performance Test would be a good measure of 2D performance, BTW.
nubie - Sunday, March 1, 2009 - link
Have you ever used a tool or edited the game profile yourself? I had an 8800GTS 320MB that I used with AA extensively (also with stereoscopic 3D), and I was told on a forum to use nHancer to modify the profile into a specific mode of anti-aliasing; I am pretty sure it worked. It was the beta 162.50 Quadro drivers, I believe - you can just put your card's ID into the .inf and they install and work great.
It is possible the drivers work great and the control panel/GUI is piss-poor (a theory that may hold water).
I wish that nVidia would open up the drivers a little so that control freaks like myself could really tweak the settings to where I want them.
Razorbladehaze - Monday, February 23, 2009 - link
Yeah, in my main rig right now I have an i7 920 with two 1GB 4850s. I recently bought a third 4850 and installed it. There was some funky flickering, which I think was driver related, in BF2 and HoI2 in 3-way mode, but most games seemed okay. Funny thing is... the same thing happened when I tried a 3870X2 & 3870 in 3-way on my older X38 Core 2. I am really hoping these next articles will come with some additional commentary on image quality.
To the person who stated that the 9800GTX+ was comparable to the 4850X2: what R U thinking???
I have never really had a problem with any CrossFire setups before except with 3-way, and I wonder if it is the odd GPU count and if 4 would eliminate some issues. Looking forward to the upcoming articles; this is mostly a teaser with information many already knew.
I agree that the new format for graphs looks good - line graphs are crap visually - but I think the default should be the 1920x1080/1200 that most people are interested in, based on your survey data : )
See I pay attention.
SiliconDoc - Wednesday, March 18, 2009 - link
THANK YOU!
"Yeah, in my main rig right now I have an i7 920 with two 1GB 4850s. I recently bought a third 4850 and installed it. There was some funky flickering, which I think was driver related, in BF2 and HoI2 in 3-way mode, but most games seemed okay. Funny thing is... the same thing happened when I tried a 3870X2 & 3870 in 3-way on my older X38 Core 2. I am really hoping these next articles will come with some additional commentary on image quality."
________
Another PERFECT REASON to not mention "image quality" - the red fan boy wins again - assist +7 by Derek !
Amazing.
Thank you.
MagicPants - Monday, February 23, 2009 - link
Have you tried forcing on transparency supersampling? If you don't, edges defined by transparency in the texture won't get AA. By default, Nvidia (ATI?) only applies AA to edges defined by depth differences.
SiliconDoc - Wednesday, March 18, 2009 - link
I've seen one review on that, with the blown-up edge images, and the ATI cards don't smooth and blur as well - they have more jaggies - so they HAVE to leave that out here - cause you know Derek loves that red 4850 and all the red cards -
Elfear - Monday, February 23, 2009 - link
Derek (or anyone else for that matter), can you comment on why the 4870 512MB CrossFire solution generally performed better than the 4870X2?
SiliconDoc - Wednesday, March 18, 2009 - link
Or WHY the GTX260 isn't praised to the stars for running 20 of 21 tests successfully - taking THE WIN! I guess it doesn't matter when a gamer spends hundreds and hundreds on their dual-GPU setup and then it epic-fails at games... gosh, that wouldn't be irritating, would it?
Amazing red bias... chizow pointed out the Sapphire 4850 / other 4850 driver issues, thankfully, while Derek has a special place in his heart for the blue-backed red card, and says so in the article - then translates that to ALL 4850s.
DREAM ON if you think that would happen with ANY GREEN card Derek has ever tested!
MagicPants - Monday, February 23, 2009 - link
I'd like to see an article that rates overall systems on price to performance - try to get as high an fps as possible for the least amount of money spent. As one reader mentioned, a frame rate below 15 fps doesn't count because it's unplayable, so just pick a number between 10 and 15 and subtract it from the fps. Maybe vary it by game. Frame rates over 60 fps shouldn't count either, because most monitors can't even show that.
This would be interesting because even small tweaks would make a difference, e.g. adding a $60 sound card might get you 4 or 5 fps in a few games and might pay for itself.
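A rough sketch in Python of the metric being suggested here, assuming a hypothetical 15 fps playability floor and a 60 fps ceiling; the function names, card prices, and framerates below are made-up placeholders, not figures from the article:

    # Sketch of the suggested value metric: fps below a playability floor
    # counts for nothing, and fps above a refresh-rate ceiling is ignored.
    FLOOR = 15    # assumed unplayable threshold
    CEILING = 60  # assumed monitor refresh ceiling

    def useful_fps(avg_fps):
        return max(min(avg_fps, CEILING) - FLOOR, 0)

    def fps_per_dollar(avg_fps, card_price):
        return useful_fps(avg_fps) / card_price

    # Hypothetical cards, not numbers from the review
    print(fps_per_dollar(30, 200))  # 0.075 useful fps per dollar
    print(fps_per_dollar(75, 500))  # 0.09 useful fps per dollar; the cap erases the advantage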
marsbound2024 - Monday, February 23, 2009 - link
It doesn't look like the GTX 260 Core 216 provides much, if any, tangible benefit over the GTX 260 according to these tests. Sure, it had some wins, but they weren't very big ones, and it also had some losses - albeit not very big ones either. One would be tempted to just get a GTX 260 or 4850 and wait to upgrade until the next generation of cards comes out this summer. The time is getting close anyway.
SiliconDoc - Wednesday, March 18, 2009 - link
Good call. Even the 4830 or the 9800GT twice, or the 9800GTX, GTS 250, 9600GT, or 9600GSO twice each - or the ATI... the ATI... uhh... uh... do the reds have their "midrange" filled up? Uh... the 4670?
LOL
Yeah, nvidia needs more midrange - right ?
LOL
THE RED LIARS ARE SOMETHING ELSE!
SiliconDoc - Wednesday, March 18, 2009 - link
Oh, I'm sorry - go to that new technology red series, the 3000 series, and get that 3870... that covers that GAP the reds have - the one they constantly WHINE nvidia has but DOES NOT.
GOOD GOLLY - how big have the reddies been lying? !?
HUGE!
MagicPants - Monday, February 23, 2009 - link
I know you're trying to isolate and rate only the video cards, but the fact of the matter is, if you spend $200 on a video card that bottlenecks a $3000 system, you have made a poor choice. By your metrics, integrated video would "win" a number of tests because it is more or less free. You should add a chart where you include the cost of the system. Also, feel free to use something besides the i7 965; spending $1000 on a CPU in an article about GPU thrift seems wrong.
oldscotch - Monday, February 23, 2009 - link
I'm not sure that the article was trying to demonstrate how these cards compare in a $3000.00 system, as much as it was trying to eliminate any possibility of a CPU bottleneck.
MagicPants - Monday, February 23, 2009 - link
Sure, if the article were about pure performance this would make sense, but in performance per dollar it's out of place. If you build a system for $3000 and stick a $200 GTX 260 in it and get 30 fps, you've just paid $106 ($3200/30 fps) per fps.
Take that same $3000 system, stick a $500 GTX 295 in it, and assume you get 50 fps in the same game. Now you've paid just $70 ($3500/50 fps) per fps.
In the context of that $3000 system the gtx 295 is the better buy because the system is "bottlenecking" the price.
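A minimal sketch of that arithmetic, reusing the assumed $3000 system and the hypothetical framerates from the example above; cost_per_fps is an illustrative helper, not anything from the article:

    # Dollars per frame per second once the whole system cost is included.
    def cost_per_fps(system_price, card_price, avg_fps):
        return (system_price + card_price) / avg_fps

    print(round(cost_per_fps(3000, 200, 30), 2))  # ~106.67 $/fps with the $200 card
    print(round(cost_per_fps(3000, 500, 50), 2))  # 70.0 $/fps with the $500 card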
OSJF - Monday, February 23, 2009 - link
What about micro-stuttering in multi-GPU configurations? I just bought an HD4870 1GB today; the only reason I didn't choose a multi-GPU configuration was all the talk about micro-stuttering on German tech websites.
DerekWilson - Monday, February 23, 2009 - link
In general, we don't see micro-stuttering except at high resolutions in memory-intensive games that show average framerates that are already lowish... games, hardware, and drivers have gotten a bit better on that front when it comes to two GPUs, to the point where we don't notice it as a problem when we do our hands-on testing with two GPUs.
chizow - Monday, February 23, 2009 - link
Nice job on the review Derek, certainly a big step up from recent reviews of the last 4-5 months. A few comments though:
1) Would be nice to see what happens if you take a step back: CPU scaling with 1 GPU, 2 GPUs, and 3 GPUs. Obviously you'd have to cut down the number of GPUs tested, but perhaps 1 from each generation would be a good starting point for this analysis.
2) Some games where there's clearly artificial frame caps or limits, why wouldn't you remove them in your testing first? For example, Fallout 3 allows you to remove the frame cap/smoothing limit, which would certainly be more useful info than a bunch of SLI configs hitting 60FPS cap.
3) COD5 is interesting though, did you contact Treyarch about the apparent 60FPS limit for single-GPU solutions? I don't recall any such cap with COD4.
4) Is the 4850X2 still dependent on custom drivers from Sapphire? I've read some horror stories about official releases not being compatible with the 4850X2, which would certainly put owners behind the 8-ball as a custom driver would certainly have the highest risk of being dropped when it comes to support.
5) Would've been nice to have seen an overclocked i7 used, since it's clearly obvious CPU bottlenecks are going to come into play even more once you go to 3 and 4 GPU solutions, while reducing the gain and scaling for the faster individual solutions.
Lastly, do you plan on discussing or investigating the impact of multi-threaded optimizations from drivers in Vista/Win7? You mentioned it in your DX11 article, but both Nvidia and ATI have already made improvements in their drivers, which seem to be directly credited for some of the recent driver gains. Particularly, I'd like to see if it's a WDDM 1.0-1.1 benefit from a multi-threaded driver that extends to DX9, 10, and 11 paths, or if it's limited strictly to WDDM 1.0-1.1 and DX10+ paths.
Thanks, look forward to the rest.
SiliconDoc - Wednesday, March 18, 2009 - link
Thank you very much." 2) Some games where there's clearly artificial frame caps or limits, why wouldn't you remove them in your testing first? For example, Fallout 3 allows you to remove the frame cap/smoothing limit, which would certainly be more useful info than a bunch of SLI configs hitting 60FPS cap.
3) COD5 is interesting though, did you contact Treyarch about the apparent 60FPS limit for single-GPU solutions? I don't recall any such cap with COD4.
4) Is the 4850X2 still dependent on custom drivers from Sapphire? I've read some horror stories about official releases not being compatible with the 4850X2, which would certainly put owners behind the 8-ball as a custom driver would certainly have the highest risk of being dropped when it comes to support.
"
#2 - a red rooster booster that limits nvidia winning by a large margin - unfairly.
#3 - Ditto
#4 - Ditto de opposite = this one boosts the red card unfairly
Yes, when I said "red fan boy" is all over Derek's articles, I meant it.
DerekWilson - Monday, February 23, 2009 - link
Thanks for the feedback... we'll consider some of this going forward. We did what we could to remove artificial frame caps. In Fallout 3 we set iPresentInterval to 0 in both .ini files, and framerate does go above 60 -- it just doesn't average above 60, so it looks like a vsync issue when it's actually an LOD issue.
We didn't contact anyone about COD5, though there's a console variable that's supposed to help but didn't (except for the multi-GPU solutions).
We're looking at doing overclocking tests as a follow-up. Not 100% on that, but we do see the value.
As for the Sapphire 4850 X2, part of the reason we didn't review it initially was because we couldn't get drivers. Ever since 8.12 we've had full driver support for the X2 from AMD. We didn't use any specialized drivers for that card at all.
We can look at the impact of multithreaded optimizations, but this will likely not come until DX11, as most of the stuff we talked about requires DX11 to work. I'll talk to NVIDIA and AMD about current multithreaded optimizations, and if they say there is anything useful to see in current drivers, we'll check it out.
Thanks again for the feedback.
chizow - Monday, February 23, 2009 - link
Oh, and specifically, much better layout/graph display with the resolution selections! :)
Hauk - Monday, February 23, 2009 - link
To you grammer police... get a life will ya?!?
Who gives a rats ass! It's the data!
Your smug comments are of ZERO value here. You want to critique, go to a scholarly forum and do so.
Your whining is more of a distraction! How's that for gramaticly correct?
Slappi - Tuesday, February 24, 2009 - link
It should be grammar not grammer.
SiliconDoc - Wednesday, March 18, 2009 - link
Grammatically was also spelled incorrectly.
lol
The0ne - Monday, February 23, 2009 - link
"In general, more than one GPU isn't that necessary for 1920x1200 with the highest quality settings,..."I see many computer setups with 22" LCDs and lower that have high end graphic cards. It just doesn't make sense to have a high end card when you're not utilizing the whole potential. Might as well save some money up front and if you do need more power, for higher resolutions later, you can always purchase an upgrade at a lower cost. Heck, most of the time there will be new models out :)
Then again, I have a qaud-core CPU that I don't utilize too but... :D
7Enigma - Monday, February 23, 2009 - link
Everyone's situation is unique. In my case I just built a nice C2D system (OC'd to 3.8GHz with a lot of breathing room up top). I have a 4870 512MB that is definitely overkill with my massive 19" LCD (1280x1024). But within the year I plan on giving my dad or wife my 19" and going to a 22-24". Using your logic I should have purchased a 4850 (or even 4830) since I don't NEED the power. But I did plan ahead to future-proof my system for when I can benefit from the 4870. I think many people also don't upgrade their systems nearly as frequently as some of the enthusiasts do, so we spend a bit more than we would need to at that particular time to future-proof a year or two ahead.
Different strokes and all that...
strikeback03 - Monday, February 23, 2009 - link
The other side of the coin is that, most likely for similar money, you could have bought something now that more closely matches your needs, and then a 4870 in a year, once it has been replaced by a newer card, if it still meets your needs.
7Enigma - Tuesday, February 24, 2009 - link
Of course. Or I could spend $60 now, another $60 in 3 months, and you see the point. It's all dependent on your actual need, your perceived need, and your desire not to have to upgrade frequently. I think the 4870 is one of those cards, like the ATI 9800 Pro, that has a perfect combination of price and performance to be a very good performer for the long haul (similar to how the 8800GTS was probably the best part from a price/performance/longevity standpoint if you bought it the day it first came out).
Also important is looking at both companies and seeing what they are releasing in the next 3-6 months for your/my particular price range. Everything coming out seems to be focused either on the super high end or the low end. I don't see any significant mid-range pieces coming out in the next 3-6 months that would have made me regret my purchase. If it were late summer or fall and I knew the next round of cards was coming out, I *may* have opted for a 9600GT or other lower-midrange card to hold me over until the next big thing, but as it stands I'll easily get a year out of my card before I even feel the need to upgrade.
Frankly the difference between 70fps and 100fps at the resolutions I would be playing (my upgrade would be either to a 22 or 24") is pretty moot.
armandbr - Monday, February 23, 2009 - link
http://www.xbitlabs.com/articles/video/display/rad...
Here you go.
Denithor - Monday, February 23, 2009 - link
Second paragraph, closing comments:
Fourth paragraph, closing comments:
Please remove the apostrophe from the first sentence (where it should read its) and instead move it to the second (which should be we're).
Otherwise excellent article. This is the kind of work I remember from years past that originally brought me to the site.
One thing - would it be too difficult to create a performance/watt chart based on a composite performance score for each single/pair of cards?
I do think you really pushed the 4850X2 a bit too much. The 9800GTX+ provides about the same level of performance (better in some cases, worse in others) and the SLI version manages to kick the crap out of the GTX 280/285 nearly across the board (with the exception of a couple of 2560x1600 memory-constricted cases) at a lower price point. That's actually in my mind one of the best performance values available today.
SiliconDoc - Wednesday, March 18, 2009 - link
Forget about Derek removing the apostrophe, how about removing the raging red fanboy ATI drooling? When the GTX260 SLI scores in 20 of the 21 game runs, and the 4850 DOESN'T, Derek is sure not to mention the GTX260, and on the very same page blabs that the Sapphire 4850 "ran every test"...
This is just another red raging fanboy blab - so screw the apostrophe!
Nvidia DISSED 'em because they can see the articles Derek posts here bleeding red all over the place.
DUH.
makdaddy626 - Monday, February 23, 2009 - link
I really appreciate the article and all the research and work that went into it. Kudos to you for it. A small suggestion would be to take into account a minimal playable frame rate in the value and performance per dollar data, where a ZERO would be substituted for the frame rate in instances where a card failed to reach a playable rate for a given game/resolution. I feel this would more accurately reflect the value of the card(s), as 15 FPS in most games presents no value.
Mastakilla - Monday, February 23, 2009 - link
I agree. Minimum framerates should be even more important than average ones...
Interesting article though! I didn't know the 4850x2 was so competitive...
Only in Crysis does it do worse than the 285, which I had in mind for my new PC...
That does make me wonder if the 285 might be a more future proof investment...
makdaddy626 - Monday, February 23, 2009 - link
Yes, but I meant "minimum" in the sense of what the game needs to be played at, even if you measure "average" - I just don't think it's fair to say that the 9800 GTX shows the highest performance per dollar on the Crysis Warhead 2560x1600 chart when it turns in frame rates of 13.5. To me that is ZERO value for the money because it's not playable. Someone wanting to play at those settings would be wasting EVERY dollar spent on the 9800 GTX.
7Enigma - Tuesday, February 24, 2009 - link
Completely agree. Statistics mean nothing when not taken in a proper context. Zero, NA, or just leaving it blank would be much better. Someone looking to use that card would then click on a lower resolution and see if it is a viable choice. It would reduce the amount of data the reader of the article needs to compare, and make caveats like the ones in the explanation section about comparing between cards/resolutions almost moot.
Spivonious - Monday, February 23, 2009 - link
The framerate charts are all but worthless if you're focusing on how performance scales. Why not some line graphs with all three resolutions shown and card models along the x-axis? Then the reader could see how performance is affected by the lower memory of some models and how it scales linearly with the higher-end cards.
7Enigma - Monday, February 23, 2009 - link
I would have to agree with this. I always enjoyed the broken line graphs that show multiple resolutions and framerates in the same graph. It makes comparisons very easy and, more importantly, allows EVERYONE to see their particular resolution without having to click on a link for a separate graph. It's fine to keep your specialized performance/$ and % increase from a single card the way you have it, as I understand what you mean about not comparing between resolutions, but for the general frame rate comparisons I preferred the old way.
One thing I failed to see, which I have seen in previous reviews with X-fire/SLI (or was it with tri/quad setups?), is the stuttering that can be present. I thought it was an Anand article, but it could have been from another site.
SiliconDoc - Wednesday, March 18, 2009 - link
The charts are designed to autopop to 2650x - and we all know the red ragers have screamed the ati cards are great there.
EVERY CHART pops to the favored ati $2,000.00 monitor resolution.
It's not an accident, Derek made them.
C'DaleRider - Monday, February 23, 2009 - link
Derek, if you want to impress, and this article does with its research, please invest in some writing manuals and learn some grammar.
Take this sentence:
"This unique card really shined and held it's own all the way up to 2560x1600."
Your use of "IT'S" in this instance is incorrect. IT'S is a contraction for IT IS, not a possessive word, which is ITS.
Or take this passage, "It's very surprising to us that AMD hasn't pushed this configuration and that Sapphire are the only manufacturer to have put one of these out there."
Sapphire is a company or organization, I realize that. But in this instance, you're referring to the group in its singular fashion, as a single unit. That context is signaled by "the only manufacturer" in the sentence.
That sentence should have read: "It's very surprising to us that AMD hasn't pushed this configuration and that Sapphire IS the only manufacturer to have put one of these out there."
Here's the rule for that (taken from both the MLA and APA handbooks): If the action of the verb is on the group as a whole, treat the noun as a singular noun. If the action of the verb is on members of the group as individuals, treat the noun as a plural noun.
This means that companies such as Microsoft, IBM, Sapphire, Ford, etc., when referred to as a whole, collective, single entity, take a singular verb.
But if you are referring to pieces of the whole, such as "the engineers of Ford are....." or "the programmers at Microsoft are.....", the verb is plural.
Please invest in some proper English grammar texts and take time to read and learn from them. Your error laden grammar you write and use is quite distracting and detracts from what is supposed to be a professionally run hardware site.
Hire a proofreader or good copy editor if learning proper grammar is too difficult.
Otherwise, this was a great article!
The0ne - Monday, February 23, 2009 - link
I don't really mind Anandtech articles as much in terms of presentation, spelling, and graphics. Other sites such as Ars Technica, x-bit labs, and so forth are much worse. The first is the worst, since they've started writing articles concerning everything, it seems. If I did mind, I'd say stick to the general guidelines for writing manuals, procedures, pamphlets, technical docs, etc. But being online, this isn't the case and won't ever be. Again, I don't mind as much because I do the same thing myself, where I hardly pay attention to spelling or grammar when writing online. It's only when I write short stories or for work that I pay attention. Strange, but comfort sure does make one do these things :)
And yes, I do write all sorts of articles daily.
oldscotch - Monday, February 23, 2009 - link
"Your error laden grammar you write and use is quite distracting and detracts from what is supposed to be a professionally run hardware site."That should read "The error laden grammar you use is quite distracting..." or just "Your error laden grammer is quite distracting..."
"Your error laden grammam you write and use..." is redundant.
Perhaps you should learn some grammar yourself before criticizing others about theirs.
MamiyaOtaru - Tuesday, February 24, 2009 - link
So we have to be perfect in every way to point out errors? NBA players shouldn't listen to their coaches because their coaches can't play as well as they do? Game reviewers shouldn't trash a game because they couldn't make a better one?
ggathagan - Tuesday, February 24, 2009 - link
When it comes to grammatical errors as insignificant as the ones pointed out, yes.
If you're going to be that critical, then you best check your own grammar.
cptnjarhead - Wednesday, February 25, 2009 - link
Grammar shmammar, you guys need to move out of your mom’s basement and get laid. :)
bigboxes - Wednesday, February 25, 2009 - link
+1
stym - Monday, February 23, 2009 - link
I am curious to see how a pair of Radeon 4830s would perform in this lineup. A single one is quite weak at those resolutions, but I am willing to bet a pair of them would hold its own against a single GTX280.
Could you possibly include that setup next?
DerekWilson - Monday, February 23, 2009 - link
You are right that a single 4830 won't be enough to perform on par with these guys ... but I don't think two of them would really be worth it against the GTX 280, except maybe at lower resolutions. The 1GB 4830 will run you at least $145, so you're looking at $290 for two of them, and the 4850 X2 2GB is the same price. The 512MB 4830 will be limited by memory usage at higher resolutions, just like the 4850 512MB.
We might look at the 4830 in CrossFire internally and see if it warrants an update, but so far it isn't in the roadmap for the rest of the series.
stym - Monday, February 23, 2009 - link
I was thinking of 512MB 4830s, which are in the $90~$110 price range. That price range is the only reason I mention them, because it puts the price tag of a pair of those in the exact same range as a Radeon 4830 512MB or even a GTX260.
You said that a 4850 1GB doesn't make sense, and that's even more obvious for a 4830.
pmonti80 - Monday, February 23, 2009 - link
I too find that this would be an interesting match at the $200+ price tag.
wilkinb - Monday, February 23, 2009 - link
Why not just drop AoC? It was bad when it came out, has always had issues and odd results, and no one I know played it for more than 2 months... If you want to have an MMO, why not use one that people play, and maybe one that's more mature in development...
I know you will say it adds value, but you don't know if it's bad code or just showing a different view.
ajoyner - Tuesday, February 24, 2009 - link
Most of the issues with the game are gone. There are currently no other MMOs out there that have the graphics or combat system to tax a GPU like this game. Your comment on testing a game that people play is very subjective. There are many MMOs out there that I would not touch... WOW, cough, cough... but that doesn't mean other people don't enjoy them. I think having this game as one that is regularly benchmarked adds a great deal of value to the article.
DerekWilson - Monday, February 23, 2009 - link
It really is a great looking game for an MMO. It's not the most played MMO around, but it is definitely the easiest to test. There is an area near the beginning where the player is alone in the environment and it's always the same time of day and all that stuff ... It takes out some of the factors that make getting consistent data out of other MMOs incredibly difficult.
I've never had any real "issues" with it or with the results either. It's been very consistent as well. It does add value, and it's clear that games can be coded in a way that looks really good and performs like this one, so we feel it's important for getting a better feel for what's out there and what's possible.
IKeelU - Monday, February 23, 2009 - link
Not really a big deal, but could you cut out the offhand game review comments when introducing benchmarks? E.g.: "Crysis Warhead, while not the best game around..." It feels out of place in a hardware analysis.
SiliconDoc - Wednesday, March 18, 2009 - link
And Derek disses Far Cry 2 and Oblivion, where nvidia slaughters ati - then Derek praises Bioshock, where ati has an edge.
Derek CAN'T HELP HIMSELF.
SiliconDoc - Wednesday, March 18, 2009 - link
Oh yes, and below, don't forget the Age of Conan that favors the ati card - Derek can't stop drooling all over the place.
Then come to COD, where nvidia once again slaughters - red blood everywhere - and Derek says "do we really need another war game~" or the like.
Derek is red fan central and cannot stop himself.
The0ne - Monday, February 23, 2009 - link
This game is poorly programmed in the first place; does it deserve to even be included in the benchmark tests? Yes, it has the programming necessary for the test, but it's poorly programmed.
IKeelU - Monday, February 23, 2009 - link
The fact that CryEngine 2 is taxing on today's hardware (and that Crytek will no doubt use derivatives of it in future games) makes it very useful in benchmarks. I hope reviewers keep using it. But by all means, feel free to disassemble Crytek's binaries and point out their code's weaknesses.
Yeah, I thought not.
poohbear - Wednesday, February 25, 2009 - link
What do you mean they shouldn't include Crysis Warhead??? It's the seminal game for seeing what graphics performance is like, to get an idea of how a particular video card will perform in the future. CryEngine 2 is the most advanced graphics engine on the market. If a video card can provide 30 fps on CryEngine at your resolution, then it's good to last you for at least 2 years.
Razorbladehaze - Wednesday, February 25, 2009 - link
Yeah.... NO.
I totally disagree with it being the most advanced. It is a decent game engine, especially for benchmarking, but....
In all reality, the STALKER Clear Sky revamped X-Ray engine is far and away more advanced and superior in almost every way. It is about the same or better in regards to taxing the system (low frame rates do not necessarily mean the game is taxing the system). Being that these are also used in similar FPS titles, they would make an interesting comparison.
I would really like to see Anand include or swap in a Clear Sky bench (there is a premade one available) for Crysis or Crysis Warhead. Either way, no big deal - many other sites post results with a CS bench that I view all the time.
DerekWilson - Monday, February 23, 2009 - link
I'll take care of it.
Stillglade - Monday, February 23, 2009 - link
I would love to see more info about the 4850 X2 1GB version. For over $50 cheaper, is the 1GB memory enough to compete? Is it worth paying 24% more for the 2GB version that you reviewed here?
kmmatney - Monday, February 23, 2009 - link
Especially at the 1920x1200 resolution - that resolution is becoming a sweet spot nowadays.
just4U - Monday, February 23, 2009 - link
I disagree. I see people finally moving away from their older 17-19" flat panels directly into 22" widescreens. 24" and 1920x1200 resolutions are nowhere near the norm.
SiliconDoc - Wednesday, March 18, 2009 - link
Correct, but he said sweet spot because his/her wallet is just getting bulgy enough to contemplate a movement in that direction... so even he/she is sadly stuck at "the end user resolution"... lol
Yes, oh well. I'm sure everyone is driving a Maserati until you open their garage door... or golly, that "EVO" just disappeared... must have been stolen.
DerekWilson - Monday, February 23, 2009 - link
The 1GB version should perform very similarly to the two 4850 cards in CrossFire.
The short answer is that the 1GB version won't have what it takes for 2560x1600, but it might work out well for lower resolutions.
We don't have a 1GB version, so we can't get more specific than that, though this is enough data to make a purchasing decision -- just look at the 4850 CrossFire option and take into consideration the cheaper price on the 1GB X2.
politbureau - Tuesday, June 1, 2010 - link
I realize this is an older article; however, I always find it interesting to read when upgrading cards.
While I find it admirable that Derek has compared the 'older' GTX 280 SLI scaling, it is unfortunate that he hasn't pointed out that it should perform identically to the GTX 285s if the clocks were the same.
This was also passed over in the "worthy successor" article, where it does not compare clock for clock numbers - an obvious test, if we want to discover the full value of the die shrink.
I recently 'upgraded' to 3 GTX 285s from 3 GTX 280s through a warranty program with the manufacturer, and there is little to no difference in performance between the 2 setups. While cabling is more convenient (no 6-to-8-pin adapters), the 285s won't clock any better than my 280s would, Vantage scores are within a couple hundred points of each other at the same clocks (the 280s actually leading), and the temperature and fan speed of the new cards haven't improved.
I think this is a valuable point in an article that compares performance per dollar, and while it's slightly outside the scope of the article, I think it's a probative observation to make.