44 Comments
Hattiwatti - Thursday, June 11, 2009 - link
How about overclocking the processor to something like 4.0 GHz so it wouldn't be such a bottleneck for the Quad-SLI and Quad-CrossFireX configurations? I have tested this myself (with an i7 920 and two GTX 295s), and it really pays off. Performance increases a lot when the CPU clock is raised from 2.66 GHz to 3.8 GHz. It definitely makes a difference. (Note: 3.6 GHz is still a bottleneck, and maybe 3.8 GHz is too; I couldn't overclock further to test, since the memory couldn't go any higher.)
marraco - Thursday, March 5, 2009 - link
The price/performance charts favour the cheapest cards but give little useful information. What really shows price/performance is an XY chart of price vs. performance.
With it, it is easy to see the best performer at a given price and the cheapest option at a given performance level. It also shows closely related price/performance options if the best performer isn't available to you.
And with XY charts it is easy to see the best bang for the buck, because it is commonly found at the sharp bend of the lower price/performance envelope.
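A chart like the one described above is easy to mock up; here is a minimal sketch, with invented card names, prices, and FPS values (placeholders, not data from the article). The "bang for the buck" knee appears where the dashed envelope line bends sharply.

```python
# Sketch of the XY price-vs-performance chart described above.
# Card names, prices, and FPS values are invented placeholders.
import matplotlib.pyplot as plt

cards = {  # name: (price in USD, average FPS)
    "Card A": (150, 38), "Card B": (230, 55), "Card C": (300, 62),
    "Card D": (450, 70), "Card E": (900, 78), "Card F": (500, 60),
}

def pareto_front(points):
    """Cards for which no other card is both cheaper and faster."""
    vals = list(points.values())
    return sorted(pt for pt in vals
                  if not any(p <= pt[0] and f >= pt[1] and (p, f) != pt
                             for p, f in vals))

prices, fps = zip(*cards.values())
plt.scatter(prices, fps)
for name, (p, f) in cards.items():
    plt.annotate(name, (p, f))

front = pareto_front(cards)  # Card F is dominated by Card D and drops out
plt.plot([p for p, _ in front], [f for _, f in front], "--",
         label="price/performance envelope")
plt.xlabel("Price (USD)")
plt.ylabel("Performance (average FPS)")
plt.legend()
plt.show()
```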
SiliconDoc - Wednesday, March 18, 2009 - link
Let's hope the reviewers here take your information to heart and put it to use. I suspect, though, the FUD and bias will win out.
Dazzz - Tuesday, March 3, 2009 - link
Although your article is really interesting, I would rather see some benchmarks that are relevant to people other than 30" display owners. Right now I'm thinking about purchasing a TripleHead2Go, now that they've updated the firmware to support 3x1680x1050.
Unfortunately, even the widescreen gaming forums can't provide FPS benchmarks for the 5040x1050 resolution.
I'm thinking about going multi-GPU, but there is no NVIDIA vs. ATI comparison at this resolution.
This article could have been the platform to support surround gaming and show whether 2- or 4-way GPU setups make sense in this context.
I've been looking for such a comparison for two weeks now and couldn't find anything. And I'm still stuck on whether a single GTX 295 could deliver playable performance (disregarding the quality settings for the time being), or whether I have to look at other solutions like 2-way or 4-way GTX 285.
Any suggestions?
VooDooAddict - Tuesday, March 3, 2009 - link
I very much like the resolution switching for the tables. This has confirmed what I'd been leaning towards for my next build (a Shuttle X58 SFF). I'll be getting one of the following dual-GPU cards to run my 1920x1200 gaming:
GTX295
4870 X2
4850 X2
(I was running two 4850s in an X38 Shuttle SFF for a while before the frequent overheating caused me to switch to a single 4870.)
Antman56 - Sunday, March 1, 2009 - link
I think these Quad 4850 framerates need a special label. Using 512MB Radeon 4850s in QuadFire is not a good idea at 2560x1600. 1GB 4850s would have shown the 4850's high-resolution muscle far better (as the 1GB 4870s did vs. the 512MB 4870s), and scaling would not be so poor.
Otherwise, nice compilation of information. :p
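For a rough sense of why 512MB gets tight at 2560x1600, here is a back-of-the-envelope sketch; it counts only the obvious buffers, and real memory use depends heavily on the game and driver.

```python
# Back-of-the-envelope framebuffer math for 2560x1600 with 4xAA.
# Only the obvious buffers are counted; real usage varies by game/driver.
w, h = 2560, 1600
bpp = 4                      # bytes per pixel (32-bit color)
msaa = 4                     # 4x multisampling

color = w * h * bpp * msaa   # multisampled color buffer
depth = w * h * bpp * msaa   # 32-bit depth/stencil, also multisampled
resolve = w * h * bpp        # resolved back buffer
front = w * h * bpp          # front buffer

mb = (color + depth + resolve + front) / 2**20
print(f"~{mb:.0f} MB before a single texture is loaded")
# ~156 MB, leaving well under 400MB of a 512MB card for textures,
# render targets, and geometry, which games at this resolution can exceed.
```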
TonkaTuff - Sunday, March 1, 2009 - link
Best graph layout I've seen on any site so far; it's so much easier to pick your desired resolution and have it in front of you instead of picking through a mess of resolutions. Great article, by the way. I still consider single-card setups the best bang for the buck, with fewer headaches. So now that the multi-GPU questions are out of the way, how about something on what's around the corner? The 8-series, 9-series, and GTX 200 are all realistically the same architecture scaled up and shrunk down. Any whispers on new GPU architectures? After the rush of technological progress over the last few years, especially since the release of the 8000-series cards (a long time ago now!), things really seem to have stagnated in the last few months. Cheers for a great read, Jarred.
DerekWilson - Monday, March 2, 2009 - link
Thank you ;-)
Slappi - Sunday, March 1, 2009 - link
I wouldn't touch their cards with a ten-foot pole. They are about to collapse under their debt.
SiliconDoc - Wednesday, March 18, 2009 - link
LOL - it's so much fun when a non-red rooster speculates like the raging reds do all over the place. Thank you.
Yes, ATI has bled BILLIONS the last couple of years, with barely over that in sales per year.
It appears they're spending twice as much as they're selling, and that is probably not a recoverable situation - unless the new lib god Obama and the Dem congress have a billion or two or more "in the package" for them.
lk7200 - Wednesday, March 11, 2009 - link
Die painfully okay? Preferably by getting crushed to death in a garbage compactor, by getting your face cut to ribbons with a pocketknife, your head cracked open with a baseball bat, your stomach sliced open and your entrails spilled out, and your eyeballs ripped out of their sockets. Fucking bitch
I really hope that you get curb-stomped. It'd be hilarious to see you begging for help, and then someone stomps on the back of your head, leaving you to die in horrible, agonizing pain. *beep*
Shut the *beep* up f aggot, before you get your face bashed in and cut to ribbons, and your throat slit.
http://www.youtube.com/watch?v=Po0j4ONZRGY
I wish you a truly painful, bloody, gory, and agonizing death, *beep*
vailr - Sunday, March 1, 2009 - link
Any testing of 8x GPUs?
4x Radeon 4870 X2 cards?
or:
4x NVIDIA GTX 295 (dual-GPU) cards?
Combined with an Intel Skulltrail board using a pair of quad-core CPUs.
LinkedKube - Wednesday, March 4, 2009 - link
I'm running tri-SLI GTX 295s. My energy bill has gone up $110 a month since December. With that to think about, why would someone test four GTX 295s? Totally inefficient. This article, IMO, was about price/performance between competitors, giving us a new way to look at FPS with the FPS-per-$100 chart.
Jorgisven - Sunday, March 1, 2009 - link
That technology does not yet exist. The Skulltrail board supports Quad SLI, meaning four total GPUs (the X2 boards count as two each). Nothing supports eight GPUs; that would create ridiculous overhead, as you can probably tell from the scaling going from two to four GPUs.
Hrel - Sunday, March 1, 2009 - link
This was a GREAT series of articles, and I'm so glad you guys decided to make them. I'm pretty sure I've never heard anyone on a hardware review site actually admit it's a wash between AMD/ATI and NVIDIA and that it all comes down to brand preference, so props for coming out and saying the truth.
One thing I've said many times before in these comments that I'm still not seeing: "I would really love to see 3DMark scores for all these cards included with each GPU article." You show the subjective tests of the hardware (the games); please also show the objective test for the hardware, 3DMark.
So yeah, amazing articles; thank you for writing them. My only, very minor, complaints are that you didn't include hardware down to the 9600GT level (at least) or lower, and you didn't include 3DMark scores.
Yes, I know it's supposed to be a multi-GPU review, but you included enough other single GPUs that I would really have liked to see how the other cards stacked up; kind of a "whole market" GPU comparison.
P.S. Sorry, a third complaint I remembered after mentioning the lower-end hardware: had you included those cards, it would have been nice to see tests at 1440x900 and maybe 1366x768 too, seeing as that's becoming a standard. And yes, I understand the amount of work that goes into testing that many configurations, and the time required to test at so many resolutions. And... I truly appreciate all the work put into articles like this; I swear, I recruit more people to come visit this site than a TV ad could.
On an article design note: I really like the comparison for value, based on performance per dollar, or per $100 in this case; very good idea. I also REALLY like that I could switch between resolutions just by clicking a link; I've liked bar graphs WAY more than line graphs ever since first grade. Later guys, great work!
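The per-$100 value metric praised above is simple to reproduce on your own numbers; a quick sketch (the prices and FPS below are placeholders, not figures from the article):

```python
# FPS per $100, the value metric discussed above, on placeholder numbers.
cards = {"Card A": (150, 38), "Card B": (300, 62), "Card C": (900, 78)}

ranked = sorted(cards.items(), key=lambda kv: kv[1][1] / kv[1][0], reverse=True)
for name, (price, fps) in ranked:
    print(f"{name}: {fps / price * 100:.1f} FPS per $100")
```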
LinkedKube - Monday, March 2, 2009 - link
I agree with the FPS-per-$100 section; very cool. Something new to look at and think about.
7Enigma - Sunday, March 1, 2009 - link
I have to agree with you on the 3DMark scores (and any of the other major ones; AquaMark or something?). I think anyone crazy enough to purchase four cards, or two duals, is probably doing it more for competitive benchmarking than actual gaming, or at minimum considers the two of equal importance. So if the quad AMD/NVIDIA decision is a wash based on game performance, maybe the synthetic benchmarks would sway the decision.
SiliconDoc - Wednesday, March 18, 2009 - link
Well, you shouldn't. Software, especially benchware, favors this or that method or type of hardware, and given the differences pointed out between the GPU styles of NVIDIA and ATI, no test is going to eliminate bias in its gauging - as should be absolutely obvious to you after seeing the massive variance in game scores here for the same two opposing GPUs, and realizing, if you had a scientific mind, that 3DMark also uses a GAME it "created" that will favor one architecture or another, definitively.
So, you may "have to agree" - but you may also "change your mind" about that.
Razorbladehaze - Sunday, March 1, 2009 - link
Actually, there are no "subjective" tests in this article. Subjective means non-empirical (not data-based) testing. Another aspect of subjective testing is when the reported outcome is not supported by the data because of other mitigating factors (i.e. the best card is not X, because of graphical glitches, despite having the best FPS). So FPS benchmarking, as all the tests here demonstrate, is in fact objective testing.
Furthermore, 3DMark scores are really redundant and impractical. I for one am really glad that Anand has left them out; they are a waste of testing time in most cases. I used to really like 3DMark scores for benchmarking my own hardware and used to look forward to them in articles. Over time, though, I have noticed that although they do provide a comparison between cards, they do not translate into much in terms of real-world performance. The comparison between cards is still easily made using a common benchmark from a game, and it allows more differentiation and demonstrates more "across the board" performance when testing multiple games and, as mentioned in the first line of this paragraph, provides practical results.
Hrel - Thursday, March 5, 2009 - link
Yeah... no. You're wrong. Tests based on games are subjective because the results you get are specific to that game. Each game is programmed differently and utilizes the GPU hardware differently. You can test three cards and have one card be the fastest by a large margin in one test and the slowest in another.
(Subjective: Characteristic of or belonging to reality as PERCEIVED rather than as independent of mind.) The results show up as PERCEIVED by the game, rather than as independent results.
(Subjective: Peculiar to a particular individual.) That individual is the game. -These were taken from the Merriam-Webster dictionary online.-
(Objective: Expressing or dealing with facts or conditions as perceived without distortion by personal feelings, prejudices, or interpretations.) Testing using only games causes distortion. That distortion comes from "feelings", "prejudices", and "interpretations": the feelings of the programmers who wrote the game, some of whom like to program for NVIDIA hardware and some of whom prefer AMD. Also, some game studios are paid or given preferential treatment to favor one company's hardware over another's, which speaks to prejudice too. As for interpretations, NVIDIA and AMD hardware is designed differently; a blatant example is that AMD uses 800 SPs where NVIDIA uses 128 SPs, yet they have similar performance. The code of the game, generally DirectX 9, interprets each set of hardware differently, so we get a non-objective interpretation of each GPU's performance capability.
Games are meant to be played and to perform the way each individual game studio wants them to; there are so many variables across companies, employees, and the games themselves that you can't possibly use a small subset of video games to determine the performance differences between a set of GPUs. At least not reliably.
(Objective: Limited to choices of fixed alternatives and reducing subjective factors to a minimum.) Every scientific experiment strives to remove variables from the testing process; video games simply don't do that.
3DMark and the newer 3DMark Vantage are as objective as software testing hardware can be: one test, programmed one way, programmed to run only one way no matter what GPU it's on. Also, 3DMark is designed to stress the GPU hardware as much as possible, no matter what card it is, which means it will take full advantage of every card you test with it.
No, 3DMark doesn't equate to real-world results in any way. But that doesn't matter; it's the most scientific, least variable test anyone can perform on multiple GPUs to determine the performance differences between them. And isn't that all AnandTech is trying to do with this whole series of articles? Yes, yes it is. Of course it is always good to look at the games, to see that subjective measurement and to determine which card works best with the games YOU play. But it is imperative to look at 3DMark as well to get a complete idea of the DIFFERENCE IN PERFORMANCE between the cards. To see the whole big picture.
To make it simple for you: if one card outperforms another by 15% or more in 3DMark, it's a good bet that card will outperform the other in the majority of games on the market, regardless of programming inconsistencies.
On another note, most people will never go beyond 1920x1080, so I'd really like to see more testing at resolutions lower than that, and the inclusion of lower-end cards to see if they can play the latest games... even if I do have to lower the resolution a little.
Razorbladehaze - Thursday, March 5, 2009 - link
No, I am definitely not wrong. You have obviously never taken any courses in research statistics or methods. No matter what dictionary you pull this from, experimental procedures remain the same, and these, for FPS, are OBJECTIVE.
Your logic is OK regarding programming for specific configurations (not really accurate, but that is beside the overall point here); what matters is that your reasoning is completely wrong. You apply it to one set of programs but not to another, and further deny that the logic you posited even exists within one program set. WTH.
Yeah, come back after completing your courses in research methods and experimental procedures.
Furthermore, you may want to brush up on your courses in logic and reasoning.
To quote the Soup Nazi (from Seinfeld):
NO SOUP FOR YOU!!!!
Razorbladehaze - Thursday, March 5, 2009 - link
OK, after thinking briefly I decided to post again.
Basically, what you need to understand about the whole testing aspect is that anything that can be operationally defined can be tested objectively.
Furthermore, here are my thoughts on 3DMark and game benchmarks.
3DMark has what many would call good reliability in testing terms, such as test-retest reliability. But so would any good game benchmark, especially built-in game benchmarks.
What is most important is validity: that the outcome actually measures what it is supposed to measure.
3DMark scores are a conglomerate score and actually relate to nothing other than that final score. These measures are taken from data such as a CPU score, FPS scores, and other data, correct?
To quote you "Every scientific experiment strives to remove variables from the testing process..."
Umm, hello, how does 3DMark determine its outcome score again? Yeah, multiple variables. That's right, MULTIPLE VARIABLES.
OK, so going back to what is important, validity: that what you intend to measure is actually what is being measured.
A final outcome score in 3DMark is valid only for comparison with other 3DMark scores. NO IFS, ANDS, OR BUTS!!!!
FPS in each game is a valid, OBJECTIVE outcome score, as it is the operational definition of GPU rendering used here. Running multiple benchmark runs in each game provides reliability. Measuring FPS across multiple games with a similar game engine lends concurrent validity to the experiments. This type of testing also allows convergent and divergent validity to be addressed when using multiple game titles with different engines or similar offshoots.
So, in layman's terms for the world of experimental procedures: 3DMark can suck my balls.
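The reliability and validity ideas in the comment above can be made concrete. The sketch below uses invented FPS numbers (and needs Python 3.10+ for statistics.correlation): test-retest reliability is just agreement between repeated runs, and convergent validity shows up as correlated results across games on similar engines.

```python
# Reliability/validity checks on invented FPS data (Python 3.10+).
from statistics import correlation

# Test-retest reliability: the same benchmark run twice should agree.
run1 = [42.1, 55.3, 61.0, 78.2]   # four cards, first run
run2 = [41.8, 55.9, 60.4, 78.6]   # same four cards, second run
print(f"test-retest r = {correlation(run1, run2):.3f}")   # near 1.0 = reliable

# Convergent validity: two games on similar engines should rank cards alike.
game_a = [40, 54, 60, 77]
game_b = [35, 50, 57, 70]
print(f"cross-game r = {correlation(game_a, game_b):.3f}")
```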
Hrel - Saturday, March 21, 2009 - link
Yes, 3DMark can only be compared to 3DMark; I already said that. Thanks for repeating what a more intelligent person already said, as that seems to be what you do. Classes? So you DO let other people tell you how and what to think. Anyway, 3DMark rates the graphics card, and scores can be compared across a wide range of cards without worrying about variables like with games, where one game runs better than another, but only on some GPUs. 3DMark actually has a score for JUST the GPU, but I guess you've never used it.
I already said the only way to get an accurate representation of GPU performance is to test multiple games plus 3DMark. It would be best if you could just test every game out there, but no one has that kind of time.
So yeah, game testing, even when using a game's built-in benchmark, still only measures performance in THAT game, so it means nothing to people who don't play that game, as it can't be used to determine performance in any other game; whereas 3DMark can. One game test could say the HD 4870 is faster, and subsequently a more powerful card, than the GTX 285, even though we all know it's not, just because that game runs better on AMD hardware. That's all I was saying and all I was trying to get across; I really wasn't looking for a pissing contest with an anonymous internet user.
A test in one game can't give you an accurate idea of how a GPU will perform in other games; 3DMark, when compared to the scores of other cards and used in conjunction with FPS scores, CAN!!!!
Razorbladehaze - Thursday, March 5, 2009 - link
Oh yeah, in reviewing what I wrote I glossed over subjective data, so I will further punish 3DMark.
The most common subjective data sets are: ranking data, rating data, interview data, and categorical data.
The first two, ranking and rating, are really what the outcome score of 3DMark is.
It is a rating on 3DMark's own scale, and it allows systems to be ranked by that score.
So, with this final bit of info, answer whether 3DMark outcomes are really subjective or objective (what does it measure besides its own ranking and rating scales?).
npp - Sunday, March 1, 2009 - link
It's just common, healthy sense that tells me something is terribly wrong here. I can't imagine anyone in his right mind spending at least $880 for a (non-professional) graphics solution that draws more than 700W under load and delivers almost no additional performance compared to far cheaper alternatives! (If you look closely at the charts you'll see that some games are perfectly playable even on single-GPU solutions.) OK, there are some gains, but they are so minimal that the performance/price and performance/power ratios are simply disturbingly low. I once compared the situation to a quad-core CPU that delivers at most, say, a 2.2x speedup over a single core, with dual cores delivering 1.9x. Who would buy such a CPU?
Of course, I'm not trying to say people should buy such things; it's the vendors who need to do some hard work on that poor scaling. I would never consider buying 4 GPUs if they deliver anything under a 3.5x speedup. It sounds crazy right now, but who knows, there may be interesting things to come.
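npp's analogy can be put into numbers directly: scaling efficiency is measured speedup divided by unit count. A quick sketch using the hypothetical figures from the comment above:

```python
# Scaling efficiency = measured speedup / number of units,
# using the hypothetical figures from the comment above.
def efficiency(speedup, units):
    return speedup / units

print(f"dual-core at 1.9x: {efficiency(1.9, 2):.0%}")   # 95%
print(f"quad-core at 2.2x: {efficiency(2.2, 4):.0%}")   # 55%
print(f"quad-GPU at 3.5x:  {efficiency(3.5, 4):.0%}")   # the 3.5x threshold, ~88%
```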
SiliconDoc - Wednesday, March 18, 2009 - link
" (If you look closely at the charts you'll see that some games are perfectly playable on even single-GPU solutions). "CORRECTION : All games are perfectly playable on plenty more than just a few single gpu solutions.
YES - that's the real truth.
mastrdrver - Sunday, March 1, 2009 - link
It would have been nice to see some AA scaling. It was made apparent on the first page that scaling was going to underperform at 4 GPUs versus either 2 or 3. I think just about everyone saw that coming, especially with a max of 4x AA.
With this much power, it would have been interesting to see what happens to scaling when AA gets turned up. Run the normal 4xAA for all the games, then start cranking up the AA and see if that makes the move from 2 to 4 GPUs look better.
I know there are those who say, and I'll agree to an extent, that there isn't a visible difference from 4x to something like 12x or 24x edge-detect (and whatever NVIDIA uses). Even the move from 4x to 8x is a rather small image improvement. Still, if you're going to drop that kind of money into a set of cards, why not put it to use? While I'm not sure about the 12x to 24x move on edge-detect, I do seem to notice things here and there moving from 4x to 8x, and even a few things at 12x AAED. I've seen a handful of sites look at the higher AA levels, but they always do it with one, or at most two, GPUs when the game is already stressing them at 4x. Why not take a look at these higher levels of AA when there is an abundance of GPU power that seems to be idle?
Overall, it would be an interesting look into something I'm not sure I've seen any site examine in any depth at the power level of 4, or even 3, GPUs.
So are the games not scaling with more GPU power because games/drivers are not optimized for 3 or 4 GPUs, or are they not scaling because AA levels are left at "lower" levels? (Outside, of course, of being CPU-limited because of the abundance of GPU power.)
xxtypersxx - Sunday, March 1, 2009 - link
I have noticed a recent trend where people post comments accusing AnandTech of favoring Intel/NVIDIA. Now, I absolutely can't stand bias and stopped reading sites like Tom's long ago because of it, but AnandTech has never seemed even remotely unfair. I think the issue is that people have noticed a lot of praise for Intel and NVIDIA over the past couple of years, but the reason is they have had consistently better performance. When the 4xxx series launched, AnandTech had nothing but praise for it and its superior value, and even criticized NVIDIA's stagnant technology and high prices in many articles. So basically what I'm trying to point out is that it isn't AnandTech that has been biased against AMD/ATI; it's the real-world performance that has been skewed, and they are simply reporting on it.
SiliconDoc - Wednesday, March 18, 2009 - link
Oh good golly. Reviewers are expected to be unbiased, but they almost never are. Unbiased is a cold, boring read for most fans.
Anand used to have a very severe bias in favor of Intel many years ago.
Now you mentioned Tom's was biased and you left Tom's and are now here, showing of course YOUR BIAS, because if you dare to notice bias here, then you've got to leave HERE, and go somewhere else.
Believe me, if you love ATI, you'll stay for the card reviews - they have a huge bias for ATI in every worded paragraph.
I am curious, though, how you perceived Tom's bias. I bet he didn't favor the red cards? Was that it?
I'm very interested in that.
Would you be so kind as to say what bias you found there?
deputc26 - Sunday, March 1, 2009 - link
better change that to "CrossFireX"
JarredWalton - Sunday, March 1, 2009 - link
Fixed, thanks. Note that it's easier to fix issues if you can mention a page, just FYI. :)
askeptic - Sunday, March 1, 2009 - link
This is my observation based on their reviews over the last couple of years.
ssj4Gogeta - Sunday, March 1, 2009 - link
It's called being fair and not being biased. They did give due credit and praise to AMD for RV770 and Phenom II. You probably haven't been reading the articles.
SiliconDoc - Wednesday, March 18, 2009 - link
He's a red-fan freak-a-doo, with his tenth-plus name, so anything he sees is biased against ATI.
Believe me, that one is totally a goner; see the same freak under the krxxxx names.
He must have gotten spanked in an FPS by an NVIDIA card user so badly that he went insane.
Captain828 - Sunday, March 1, 2009 - link
In the last couple of years, NVIDIA and Intel have had better-performing hardware than the competition. So I don't see any bias, and the charts don't show any either.
lk7200 - Wednesday, March 11, 2009 - link
Shut the *beep* up f aggot, before you get your face bashed in and cut to ribbons, and your throat slit.
SiliconDoc - Wednesday, March 18, 2009 - link
Another name so soon, raging red fanboy freak? Going to fantasize about murdering someone again, sooner rather than later?
If ATI didn't suck so badly and weren't billion-dollar losers, you wouldn't be seeing red, huh, loser.
JonnyDough - Tuesday, March 3, 2009 - link
Hmm... X1900 series ring a bell? Methinks you've been drinking...
Razorbladehaze - Sunday, March 1, 2009 - link
Wow, what I was really looking forward to here disappeared entirely. I was expecting to see more commentary on the subjective image quality of the benchmarks, and there was even less discussion of that than in the past two articles; kind of a bummer.
On a side note, what was shown was what I expected from piecing together a number of other reviews. Nice to see it combined, though.
The only nugget of information I found disturbing is the impression that CUDA is better than what ATI has promoted. This is in light of my understanding that NVIDIA just hired a head tech officer from the university where stream computing (what ATI uses) took root. Albeit CUDA is just an offshoot of this, that hiring would lead me to believe that NVIDIA will be migrating towards Stream rather than the opposite, especially if GPGPU computing is to become commonplace.
I think it would be in NVIDIA's best interest to do this, as I am afraid Intel is right and NVIDIA's future may be bleak if GPGPU computing does not take hold; migrating towards their rival AMD's GPGPU approach is one strategy to reduce the resources spent exploring this tech.
Well yeah... I think I went way, way off on a tangent on this one, so... yeah, I'm done.
DerekWilson - Monday, March 2, 2009 - link
Sorry about the lack of image quality discussion. It's our observation that image quality is not significantly impacted by multi-GPU. There are some instances of stuttering here and there, but mostly in places where performance is already bad or borderline; otherwise, we did note where there were issues.
As far as GPGPU / GPU computing goes, CUDA is a more robust and more widely adopted solution than ATI Stream. CUDA has made more inroads in the consumer space, and especially in the HPC space, than Stream has. There aren't that many differences in the programming model, but CUDA for C does have some advantages over Brook+. I prefer the fact that ATI opens up its ISA down to the metal (alongside a virtual ISA), while NVIDIA only offers a virtual ISA.
Honestly, though, the key is adoption: the value of the technology only exists insofar as the end user has a use for it. CUDA leads here. OpenCL, in our eyes, will close the gap between NVIDIA and ATI and should put them both on the same playing field.
JonnyDough - Tuesday, March 3, 2009 - link
Could you please take the time to ask NVIDIA why the heck nTune crashes systems so easily? I can't even boot into Windows right now because I set up my profile to run my 8800 GTS 640's fan at 100%. I can't hear it, and I prefer my card to run cool, especially with NVIDIA's known heat issues. It might not be their fault, though... Foxconn hasn't updated my BIOS in about three years, even though Newegg sold me my motherboard about a year and a half ago.
JonnyDough - Tuesday, March 3, 2009 - link
I should point out that was coming from the standpoint of you mentioning NVIDIA driver issues. I've had more trouble running older games and such. They sacrifice stability and game compatibility for high frame rates in newer games. It's pathetic. I don't buy a card to play all the latest games; I buy it to play my slightly older games at what are now good frame rates with all the goodies. Most people DO NOT STAY ON THE BLEEDING EDGE, NVIDIA. GET A CLUE!
SiliconDoc - Wednesday, March 18, 2009 - link
Your first mistake was thinking you know better than the top GPU maker and their manufacturing partners, and throwing your hissy fit to crank your fan up for lame, years-old games - because you're a worrywart and think NVIDIA would send out overheating cards, even for your far-less-than-cutting-edge, slow and cool games.
Then we have the obvious - the very high-temp red cards, which COOK HUMAN SKIN when you touch the heatsink and often run at 70C plus - and yet in your whine you found a way to claim NV has "known heat issues".
LOL
It is just amazing.
Razorbladehaze - Monday, March 2, 2009 - link
Hey, thanks for the extra input on image quality and for addressing this post. Furthermore, I hear you regarding adoption of the tech being the most important part of GPGPU computing. I just hope both ATI and NVIDIA can come together on this one for the consumer.
I enjoy reading your articles and think Anand puts out some quality stuff. I also appreciate you addressing some of the comments posted. Keep up the excellent work.