MultiGPU Update: Does 3-way Make Sense?

by Derek Wilson on 2/25/2009 2:45 PM EST

  • magnusr - Saturday, February 28, 2009 - link

    Got my second 4850. Huge FPS upgrade at 1920x1080, but noise during gaming has gone up. Idle only a bit.

    It would have been nice if you had added noise numbers, since some would probably prefer a fast single GPU like the NVIDIA GTX 285 instead of 2x 4850/4870 due to possibly lower noise levels.
  • taltamir - Saturday, February 28, 2009 - link

    The following statement is made:
    It's very difficult to really collect high quality quantitative data that shows microstutter, as the only way to really get a good idea of what's going on is to analyze raw frame data on a per frame basis (which you can't get with FRAPS)


    This is incorrect: FRAPS can capture those. It's in the settings; it logs the exact time of each frame in milliseconds, in order, to a CSV file. You can take the difference between each consecutive pair of timestamps to get the milliseconds per frame and analyze that data yourself (something like the quick sketch below).

    Please check this thread on the AnandTech forums:
    http://forums.anandtech.com/messageview.aspx?catid...
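
    A rough sketch of that math (assuming the frametimes CSV has a frame-number column and a cumulative time-in-ms column; the exact header may vary by FRAPS version):

        import csv

        def frame_deltas(path):
            """Per-frame times (ms) from a FRAPS frametimes CSV."""
            times = []
            with open(path, newline="") as f:
                reader = csv.reader(f)
                next(reader)                          # skip the header row
                for row in reader:
                    if len(row) >= 2:
                        times.append(float(row[1]))   # cumulative time in ms
            # time spent on each frame = difference between consecutive timestamps
            return [b - a for a, b in zip(times, times[1:])]

        deltas = frame_deltas("frametimes.csv")
        print("avg ms/frame: %.2f   worst: %.2f" % (sum(deltas) / len(deltas), max(deltas)))

    Plotting those deltas frame by frame makes any alternating short/long pattern (microstutter) obvious even when the average FPS looks fine.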
  • kpgoebel - Friday, February 27, 2009 - link

    I just finished reading both pieces - great stuff. The analysis was dense but your pace kept it moving.

    I use three 20" Samsungs for my home workstation, which doubles as a flight sim trainer since I took up flying. I added two 8800 GTs to a Q6600 quad-core system in hopes that it could handle triple 1600x1200 screens for accurate flight control response. It's only proven reliable with the textures turned down to medium or low, which is a bummer because I've invested in some top-notch sceneries.

    Any chance you guys will offer some analysis of multi-monitor gaming setups?
  • orionmgomg - Thursday, February 26, 2009 - link

    Thank you for this great article!

    So much information, and very helpful for getting an idea of where to spend money on a build and where the price/value margins lie.

    I can't wait for the quad shootout!

    PS: Does anyone know if the ASUS Rampage II Extreme is the only X58 mobo available right now that fully supports 3x PCIe 2.0 x16?

    The other ASUS boards say they support PCIe 2.0 x16/PCIe 2.0 x16/ or PCIe 2.0 x16/PCIe 2.0 x8/PCIe 2.0 x8...

    Contemplating 3 Way SLI...
  • Zorro3740 - Thursday, February 26, 2009 - link

    For triple and quad CrossFire to really be worthwhile, a high-refresh-rate LCD can be a great way to benefit from the huge performance headroom multi-GPU setups provide. I have several PCs with triple-CrossFire Radeon 4850s on 60Hz displays. A game like Devil May Cry 4 comes to mind: if you can sustain the 120-frame limit of these new displays, the value of multi-GPU suddenly becomes that much more apparent. I sure could, easily, with my Phenom II 920 or i7 920; with max settings at 1080p you get as much as 200 frames per second or more. It just moves so smoothly at frame rates higher than 60. I imagine it would be similar to the 99Hz refresh rate of my 24" CRT at 1080p. I stopped using that display because of its dimness, but it gave me a damn good idea of what a really fast 120Hz LCD might do. How about 120 FPS with 24-sample FSAA in Devil May Cry 4? I'm sure there are many instances where 120Hz will make a huge difference for multi-GPU enthusiasts.

    I highly recommend that somebody test their multi-GPU rig with a 120Hz LCD HDTV and tell us how smooth the action is and how clear it is with higher levels of FSAA. It is a shame that 120Hz is not here yet in the PC monitor world. I have the HP LP3065, which does 2560x1600 @ 60Hz. I would trade it in for a 120Hz 1080p or 1200p LCD in a flash if it had a high-quality backlight. I HATE that the world is becoming a TN world when it comes to monitors. We need 120Hz 8-bit panels, and we need to just get the hell away from shoddy 6-bit monitors with light bleed, among other unacceptable faults.

    Anyhow, I really enjoyed the article, and it confirmed to me that for the games I play the 4850 is a good multi-GPU choice. I just would have liked to see more of an emphasis on future developments like 120Hz LCDs, and higher levels of FSAA in particular, at "lower" resolutions like 1920x1080 and 1920x1200 with 512MB and 1GB setups. Multi-GPU 4850s are way more affordable than 30" monitors, and some people will definitely "settle" for nice 24" 8-bit panels like the Westinghouse L2410NM that rock at 1920x1200.
  • far327 - Thursday, February 26, 2009 - link

    I would like to hear others' comments, including the authors', on the differences between some of the numbers in the charts on AnandTech and Guru3D: http://www.guru3d.com/article/core-i7-multigpu-sli...

    I think an overclocked CPU would have been a much more well-rounded approach to this article. Even though the performance gains aren't huge, let's face it, most people hardcore enough to game with three GTX 280/285s are probably overclocking their CPUs as well.
  • chizow - Thursday, February 26, 2009 - link

    Haven't really dug into the review yet Derek, but I meant to suggest this in the last article. Is it possible to do something about the text in the bar graphs? It's just overly cumbersome trying to make sense of them with all the extraneous text in there. Some suggestions:

    1) Perhaps you're required to put Vendor + Brand for each config, but can we get an icon for Nvidia or ATI instead of ATI Radeon or Nvidia GeForce for every single line item?

    2) Color code the relevant data for each line item to differentiate. For example:

    Nvidia GeForce [green]GTX 285 3-way[/green] SLI
    ATI Radeon [red]HD 4870 3-way[/red] CrossFire

    That would go a loooooong way toward making those graphs easier to parse and digest (a quick sketch of the idea follows). Back to reading, thanks.
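
    For example, coloring the bars by vendor; a rough sketch with matplotlib (the card names and FPS numbers are made up, just to show the idea):

        import matplotlib.pyplot as plt

        # made-up configs and FPS numbers, purely to show the color-coding idea
        configs = ["NVIDIA GeForce GTX 285 3-way SLI",
                   "ATI Radeon HD 4870 3-way CrossFire",
                   "NVIDIA GeForce GTX 285 2-way SLI",
                   "ATI Radeon HD 4870 2-way CrossFire"]
        fps = [95.0, 90.0, 80.0, 72.0]
        colors = ["green" if "GeForce" in c else "red" for c in configs]

        plt.barh(range(len(configs)), fps, color=colors)   # one bar per config
        plt.yticks(range(len(configs)), configs)
        plt.xlabel("Average FPS")
        plt.tight_layout()
        plt.show()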
  • OblivionLord - Thursday, February 26, 2009 - link

    What about the thought of 3-way 100% scaling with Lucid Hydra?

    Why not mention that if that becomes a reality, then you will in fact get what you pay for?
  • DerekWilson - Thursday, February 26, 2009 - link

    I love the idea of Lucid and their Hydra technology ... but it isn't here yet. I enjoy writing articles about what their technology could mean and how beneficial and amazing and awesome it would be ... but it's not here yet, and we haven't had a hands-on chance to see for ourselves that it works as advertised.

    Until it's out, SLI and CrossFire are the best we've got and the focus of our articles.

    Maybe we'll run into Lucid again at GDC and write up more cool stuff about Hydra though ...
  • Denithor - Thursday, February 26, 2009 - link

    Guru3D did an article at the i7 launch showing how well i7 scales with multi-GPU setups compared to C2D & C2Q. Night-and-day difference, to tell the truth; i7 just pushes multi-GPU rigs much better than the older architecture can.

    My question is this: how does PhII compare to i7 on the multi-GPU front? I'd love to see some benching done at high resolution with 2/3/4-card setups on the two CPU architectures to see how they look.
  • 7Enigma - Thursday, February 26, 2009 - link

    http://www.anandtech.com/mb/showdoc.aspx?i=3506&am...

    Just a quick review (I would have preferred to see the upper resolutions tested... wait, did I just write that?), but it shows that with CrossFire you have a pretty competitive war. I would guess that going tri-GPU (if the game supports it well) the i7 would pull ahead by quite a bit, especially when overclocked, but that remains to be seen.
  • Mr Roboto - Thursday, February 26, 2009 - link

    BORING! Not to sound like a total dick, but seriously, do we need two articles (with probably a third to come discussing quad GPU) to find out what a total waste multi-GPU is? I mean, until NV and ATI invest some serious research into hardware-based SLI/CrossFire, I'd never even consider buying two GPUs. The current software-based multi-GPU setups don't offer enough of a reward. Having to use a software-based profile means you'll always be waiting for them to create one for the latest games, while some simply won't get one at all.

    Software in general is lagging so far behind current hardware that there is practically ZERO use for two GPUs, let alone 3-4. Someone please name 10 games that can successfully use more than 2 cores at a time. OK, now name 10 software applications that do the same (not counting video encoding apps). It's pathetic how bad it is. Look at all the Xbox 360 ports built on 5-year-old hardware that are coming out in rapid succession and you have a pretty grim picture.

    Still waiting for some Lucid Hydra benchmarks showing 100% linear scaling. If it actually materializes on the desktop, then I would consider multi-GPU.
  • 7Enigma - Thursday, February 26, 2009 - link

    I find these articles well done and beneficial. The stereotype that tri/quad (or sometimes even dual) GPU is a waste is just that, a stereotype. For games that are coded well for multi-GPU, and when you are on a large display, I'm actually a bit impressed with the multi-GPU stats. A year ago you would likely not have seen anywhere near this improvement over a single card. It's a testament to the GPU manufacturers' drivers and, just as importantly, the game programmers.

    What this also shows is which games are poorly suited for multi-GPU (either due to poor coding/drivers or a bottleneck at another point). Sure, most of us don't have the 30" display that warrants the tri and upcoming quad articles, but for those that do it is not a waste, and it lets the rest of us look further down the rabbit hole to see where we will be in another 2 years or so with a single-GPU card.

    We have already pretty much reached a saturation point for mainstream graphics gaming (<$300) under 24" (1920x1200). That has to be dangerous for ATI/NVIDIA, because we are now becoming limited by our display size, as opposed to the historical trend of ever-more-demanding games keeping us at pretty much the same resolution year after year (with slow progress from 800x600 to 1024x768 to the now most common 1280x1024, to the quickly-emerging 22" gaming standard, and finally, within a year or two, to 24" gaming goodness).

    What I imagine you will see in the next year or two is heavy reliance on physics calculations on the GPU to burden it greatly, bringing us back to the near-always-present GPU-limited status. If game makers fail to implement some serious GPU demands above and beyond what is being used today, pretty soon we won't need to upgrade video cards nearly as frequently. And for the consumer, that is actually a really good thing...
  • Pakman333 - Thursday, February 26, 2009 - link

    "BORING! Not to sound like a total dick"

    You do, and you probably are.

    If you don't care, then why read it? No one is forcing you to.
  • poohbear - Thursday, February 26, 2009 - link

    Props for posting such a quick follow-up! Sometimes hardware sites promise follow-ups & never deliver. ;)
  • Jynx980 - Thursday, February 26, 2009 - link

    Nice job on the resolution options for the graphs! This should alleviate frustration on both ends. I hope it's the standard from now on.
  • rcr - Thursday, February 26, 2009 - link

    I read some time ago that the GTX 285 draws less power than a GTX 280, but the power consumption graph here was also made with 3x GTX 280 OC.
  • ilkhan - Thursday, February 26, 2009 - link

    As a reader I hate going back and forth between pages. I'd love to see the price for each option on the right-hand side of the performance graphs. It would make it super easy to compare the value of each solution, rather than using the separate graphs you have here.

    And on the plus side for 4-way scaling, at least there are only four cards to bench (9800 GX2, 4850 X2, 4870 X2, and the GTX 295). Maybe a couple more for memory options, but a lot fewer than the 3-way here.
  • LiLgerman - Wednesday, February 25, 2009 - link

    I currently have a 4870 512MB TOP from ASUS and I was wondering if I can pair it with a 1GB model. Will I have to downclock my 4870 TOP to the speed of the 1GB model if I do CrossFire?
    Thanks
  • DerekWilson - Thursday, February 26, 2009 - link

    AMD is good with asynchronous clocking (one can be faster than the other) ... but I believe memory sizes must match -- I know you could still put them in CrossFire, but I /think/ that 512MB of the memory on the 1GB card will be disabled to make them work together.

    This is just what I remember off the top of my head though.
  • MagicPants - Wednesday, February 25, 2009 - link

    I've been playing a bit of GTA IV recently; it runs well on my dual GTX 285 system, but I've heard there is no SLI support. It might be nice to include a few of these types of games in the mix.

    Honestly, the only game I've played where SLI matters (at 1920x1200) is Crysis.
  • MagicPants - Wednesday, February 25, 2009 - link

    Having the cutoff at 25 FPS really affected the value ratings of the cards. It was interesting to see the values at different resolutions as well.

    Now I just want to see an interactive graph where I can enter a game and a resolution and it will tell me what video card is the best value. That's not asking too much, is it? :)

    ... or enter a game and resolution and the thing tells me what to put in my system (CPU, memory, motherboard, video card).
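
    Even something dead simple would do; a rough sketch with made-up prices and FPS numbers (not the article's data):

        # hypothetical benchmark table: (game, resolution) -> {card: average FPS}
        PRICES = {"HD 4850": 130, "HD 4870 512MB": 190, "GTX 285": 360}
        FPS = {
            ("Crysis", "1920x1200"): {"HD 4850": 22, "HD 4870 512MB": 31, "GTX 285": 38},
        }

        def best_value(game, resolution, min_fps=25):
            results = FPS[(game, resolution)]
            # only consider cards that clear the playability cutoff
            playable = {card: f for card, f in results.items() if f >= min_fps}
            # best value = most FPS per dollar among the playable cards
            return max(playable, key=lambda card: playable[card] / PRICES[card])

        print(best_value("Crysis", "1920x1200"))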
  • plonk420 - Wednesday, February 25, 2009 - link

    I'm not even a proponent of SLI/dual-GPU until 100% of games work with the technology (and see a worthwhile increase in performance).
  • mastrdrver - Wednesday, February 25, 2009 - link

    I think it would have been interesting to see 2-way and 3-way setups of the 4830 added to all this. Sure, it may be on the low end of things, but it could offer great value at 1920 and 1680 compared to the more expensive counterparts.
  • stym - Thursday, February 26, 2009 - link

    I would like to see that too. I am going to buy a new system next month and I am torn between a single 4870 and two 4830s. Same price tag, but what about performance? The problem is, it should have been considered in the previous article. Although I am convinced a two-way 4830 CrossFire configuration may provide great performance at a budget price, I doubt a 3-way 4830 setup makes a lot of sense. You would have to buy a motherboard with three x16 PCI Express slots, and I would not pair that with lower-end cards.
  • mastrdrver - Friday, February 27, 2009 - link

    It could be a cheap way to get to an i7 platform with power. Spend all the money on the board/memory/CPU and spend ~$300 USD on three cards that have a lot of power. If the 4830s scale as well as either the 4850 or 4870, you could have a very powerful but cheap card setup. Not even $300 will buy you a 4870 X2. Sure, three 4830s won't beat it, but they will land between a 4870 and the X2. For $300, that sounds like a great deal.
  • Razorbladehaze - Wednesday, February 25, 2009 - link

    "Pairing a single card dual GPU AMD card with a single card single GPU option to get 3-way CrossFireX also seems to have a positive impact on microstutter. "

    I am a little unclear on this statement. I read it as: pairing this combo eliminates the microstutter. But I am concerned that a "positive impact" could also just mean that the FPS increases in spite of microstutter.

    This really was the article of most interest to me, as opposed to the 2-way or 4-way configurations. I find the graphs to be clear and concise with the information they convey.

    I find it surprising that there is less discussion of image quality or distortions during benches (yes, I know it is difficult to qualitatively judge this). I find it hard to believe that these configurations run these games without many flaws, glitches, tearing, or flickering in image quality, as that has been my experience. I suppose, though, that if all these issues result from driver optimizations as I suspect, then these commonly benchmarked, newer games get those driver tweaks.

    Anyway, the only comment that may be helpful to the actual presentation of the material is that I agree with the other fellow that the zero point is not contiguous within the graphs. The more accurate the information the better, as opposed to creating a null value; most people who read these sites understand what is "playable" for their tastes in different genres. Further, I know that my next suggestion is not as mathematically clean as what you have done, but it would produce more useful results (based upon card prices/selling points): instead of FPS per $100 spent, change to FPS per $20 or $50 ($50 would be my choice), along the lines of the quick sketch below.
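
    Roughly what I mean, as a quick sketch (the price and FPS are placeholders):

        def value(fps, price, per_dollars=50.0, min_fps=25):
            # keep the article's playability cutoff, just change the denomination
            if fps < min_fps:
                return 0.0
            return fps / (price / per_dollars)

        print(value(72.0, 380.0))   # FPS per $50 spent for a hypothetical config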

  • Antman56 - Wednesday, February 25, 2009 - link

    I wrote an article about it weeks ago. It's a 4850 X2 2GB in CrossFire with a 4850 1GB. It's good.

    http://forums.anandtech.com/messageview.aspx?catid...
  • Denithor - Wednesday, February 25, 2009 - link

    to the third card option - when the addition of that extra card results in decreased performance? Shouldn't those get "0" value ratings?
  • DerekWilson - Wednesday, February 25, 2009 - link

    Good point ... we'll try to refine it a little more.
  • Snarks - Wednesday, February 25, 2009 - link

    Hmm, I find myself questioning these articles more and more...

    but anyway, carry on.
  • DerekWilson - Wednesday, February 25, 2009 - link

    What's the question ... seriously, any criticism is helpful. This is the first time we've really done a series like this, and it's a complicated situation with lots of data and lots of analysis ... there's no one way to look at it, and all the feedback I get will help me down the road.

    I don't see the need for this type of article or series very frequently, but we'll have to do it every once in a while just in case something changes. Knowing what you guys think is important and what you guys want to read about is key to us getting things done right.
  • Flyboy27 - Wednesday, February 25, 2009 - link

    Sell you an extra card that you don't really need.
  • Flyboy27 - Wednesday, February 25, 2009 - link

    oh yeah... and a more expensive motherboard, power supply, and case.
  • Burrich - Wednesday, February 25, 2009 - link

    Would the recently released Catalyst 9.2 drivers improve any compatibility or FPS issues? Their release date was 2/20.
  • 7Enigma - Thursday, February 26, 2009 - link

    Check out xbitlabs' review of the 9.2 drivers. If you have a 4870 X2, then yes, it appears to be a nice upgrade for several games, with minimal losses in the games it doesn't benefit. But if you are sporting a single 4870 1GB, it actually degrades performance more than it improves!

    On the flip side, they claim stability is better with the 9.2s, so it depends on what you want/need. If you are comfortable with the frame rates in the games you currently play, then jump on the 9.2s for the stability. If you are on the edge of playable performance, I would stick with the previous drivers...

    http://www.xbitlabs.com/articles/video/display/cat...
  • DerekWilson - Thursday, February 26, 2009 - link

    That article compares 9.2 to 9.1 ... the 8.12 hotfix would show similar performance improvements over the 9.1 drivers. 9.2 does benefit more games, but these are games that have been more recently released than the ones we tested.

    If they compared the 8.12 hotfix to 9.2, we would expect to see more parity, especially with the games we tested in this article.
  • DerekWilson - Wednesday, February 25, 2009 - link

    The recently released 9.2 catalyst drivers are basically the 8.12 hotfix drivers with some additions to support performance and scaling in recently released titles. So not really.
  • smartalco - Wednesday, February 25, 2009 - link

    I don't like that you use 0 for configurations that score under 25 FPS, specifically because that is under 25 at the resolution/settings you use. If a card scores 24 FPS at 1680x1050 with maxed settings, what that really tells you is that if you were to drop to half the AA, or turn down some other setting, you could still have a perfectly playable game. It seems to me that giving them a value rating of 0 acts like everyone has to play on max settings, and if a card doesn't meet that standard, it's useless.

    IDK, just me talking, I'm going to be happy with my 4850 for quite some time.
    Still an excellent article.
  • DerekWilson - Wednesday, February 25, 2009 - link

    I've actually got the graphs without the 0 scores in the article front to back -- they're just commented out at the moment ... I wasn't sure which one to go with until the last minute, and I thought about putting both in (but that would be really redundant for games that no card had trouble with).

    I could do some more complex web programming, but I'm not a web developer and I hate JavaScript ...

    Thanks for the feedback. I'll be taking it into account in the final article on 4-way.

    Also, if you want to see the value numbers for the single and dual cards that scored less than 25 FPS, you can still look at the first article and see them.
  • 7Enigma - Thursday, February 26, 2009 - link

    Derek,

    Please keep the new way. I was one of the proponents of the zero rating after the last article, and it shows nicely which setups can and can't handle a given resolution. Sure, you can turn down the eye candy, but that's not the point of the article... especially when you are talking about 3-way.

    We could just turn all the settings down to medium and a single 4850 would be the value king! (/sarcasm)
  • Antman56 - Wednesday, February 25, 2009 - link

    I think demonstrating CrossFire using 4850s with 1GB of RAM each would have produced a "sweet" price/performance result.

    These 4850s show us what happens when there isn't enough memory to complement a video card's processing potential.
  • Spivonious - Wednesday, February 25, 2009 - link

    I find it interesting that the 4870 512MB leads the value charts at "normal" resolutions (still way higher than I run).

    I think as a follow-up to this series you should look at playability statistics and see if getting anything more than the 4870 512MB is worth it at 1920x1200 and below.
  • fixxxer0 - Wednesday, February 25, 2009 - link

    only if both girls are hot
  • StraightPipe - Wednesday, February 25, 2009 - link

    only one of the girls needs to be hot, the other will be so busy licking your balls it wont make a difference.
  • Jansen - Wednesday, February 25, 2009 - link

    And if it is not a "Devil's 3-way"...

    That's with two guys.
