74 Comments
Cyspeth - Thursday, February 21, 2008 - link
At last, a return to multi-processor graphics acceleration. More power for the simple expedient of more GPUs. I look forward to getting two of these babies and setting them up in Crossfire mode. Quad GPU FTW.
This card seems to check out pretty nicely, and I haven't heard of many problems as yet. I've heard of only one really bad X2 horror story, which was a DOA, and ATI happily replaced the defective card. Anyone who would rather have an nvidia 8800 is out of their mind. Not only is it less powerful and worse value for money, but nvidia is really stingy about replacing defects, not to mention they don't have one reliable third-party manufacturer, in the face of ATI's two really good ones (Sapphire and HIS).
I've never had a problem with any ATI card I've owned, but every single nvidia one has had some horrible design flaw leading to its horrible, crippling failure, usually through overheating.
It's good to see an honest company making some headway in the market. I would hope that ATI can keep up their lead over nvidia with this (or another) series of cards, but that doesn't seem likely given their comparative lack of market share.
MadBoris - Saturday, February 2, 2008 - link
While it's good to see AMD finally do something decent that performs well, I'm not so sure it's enough at this stage. AMD needs all the good press they can get, but $450 seems a bit much compared to the 8800 GT @ $250 without a large, towering lead. Furthermore, 8800 GT SLI gives it a pretty good stomping @ $500. Hopefully the multi-GPU solution is more painless and the driver bugs get worked out soon.
Also, it would be a big disappointment and misstep for Nvidia not to produce a single GPU real soon that ties or supersedes 8800 Ultra SLI performance. It's been 15 months since the 8800 Ultra came out; that's the normal time for a generation leap (6800, 7800, 8800). So unless Nvidia has been resting on its laurels, it should produce a killer GPU, and if they then make it a multi-GPU solution it should by all means smoke this card. Here's hoping Nvidia hasn't been sleeping; if they just make another minor 8800 revision and call it next gen, then by all means ATI will be a competitor.
MadBoris - Saturday, February 2, 2008 - link
Also, from a price/performance standpoint it would be nice to see how it performs against 8800 GT 256MB SLI in current games. 8800 GT 256MB SLI can be had for $430, and I would guess it would perform better at low-to-medium resolutions than this X2.
Giacomo - Saturday, February 2, 2008 - link
I wouldn't spend four hundred bucks for gaming at 1280x*; a single 512MB GT or GTS is more than enough for that.
And as you scale up to 1680x1050, I personally believe that buying 2x256MB is silly. Therefore, I believe that the 512MB one is the only reasonable 8800 GT.
Your plan sounds like low-res overdriving and doesn't seem wise to me.
Giacomo
MadBoris - Saturday, February 2, 2008 - link
It's not my plan; I just said it would be good for a price/performance comparison, for something like a future review.
Although I no longer game at 1280, that doesn't mean others don't, and two 8800 GTs with more performance at a lower price may be beneficial for them. Personally I think SLI is a waste anyway, but if you are considering an X2, 8800 SLI is an affordable comparison, whether at 256 or 512.
Aver - Friday, February 1, 2008 - link
I have searched through anand's archive but I couldn't find the command line parameters used for testing Unreal Tournament 3 in these reviews. Can anybody help?
tshen83 - Thursday, January 31, 2008 - link
That PLX bridge chip is PCI Express 1.1, not 2.0. Using a PCI Express 1.1 splitter to split x16 of 1.1 bandwidth into two x8 1.1 links to feed two PCI Express 2.0-capable Radeon HD3870 chips shows AMD's lack of engineering effort. They basically glued two HD3870 chips together. As for the HD3870 itself, it is basically an HD2900XT on a 55nm process with a cut-down 256-bit memory bus. Not exactly good engineering if you ask me (props to TSMC, definitely not to ATI).
Scaling is OK, but not great. The review didn't test dual HD3870 X2s in CrossFire, which would basically show four Radeon HD3870s in CrossFire mode. I have seen reports saying that 4-way scaling is even worse than 2-way; it scales roughly like log(n). Adding two more Radeon 3870s is only about 40% faster than two-way HD3870.
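For reference, a back-of-the-envelope sketch of the bandwidth figures under discussion, taking the comment's x8-per-GPU claim at face value and assuming the nominal per-lane rates of PCIe 1.1 and 2.0 (not figures from the review):

    # Nominal per-lane, per-direction PCIe rates; real throughput is lower after overhead.
    PCIE11_PER_LANE = 250   # MB/s (2.5 GT/s, 8b/10b encoding)
    PCIE20_PER_LANE = 500   # MB/s (5.0 GT/s, 8b/10b encoding)

    uplink_gen1_x16 = 16 * PCIE11_PER_LANE   # 4000 MB/s to the chipset, shared by both GPUs
    per_gpu_gen1_x8 = 8 * PCIE11_PER_LANE    # 2000 MB/s per GPU, per the comment's claim
    gen2_x16_slot   = 16 * PCIE20_PER_LANE   # 8000 MB/s a Gen 2 x16 link would offer

    # The scaling complaint: if four GPUs are only ~40% faster than two,
    # that is 1.4 / 2.0 = 70% of an ideal doubling.
    print(uplink_gen1_x16, per_gpu_gen1_x8, gen2_x16_slot)
    print(f"4-way efficiency vs. ideal: {1.4 / 2.0:.0%}")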
Giacomo - Thursday, January 31, 2008 - link
It's always curious to see users talking about "lack of engineering effort" and so on, as if they could teach Intel/AMD/nVIDIA how to implement their technologies. PCIe 2.0 is now out, and suddenly people look at the previous version as if it should definitely be dismissed. Folks, maybe we should at least consider that the 3870 design simply doesn't generate enough bandwidth to justify a PCIe 2.0 bridge, shouldn't we?
Sorry, but I kinda laugh when I read "engineering" suggestions from users, supposedly like me, regardless of how good their knowledge is. We're not talking about a do-it-yourself piece of silicon, we are talking about the latest AMD/ATi card. And the same, of course, would apply to an nVIDIA one, to Intel stuff, and so on. The right approach, I think, would be something like this: "Oh, see, they put a PCIe 1.1 bridge in there, and look at the performance: when it's not cut down by bad driver support, the scaling is amazing... so it's true that PCIe 2.0 is still overkill for today's cards."
That one sounds rational to me.
About Crossfire: CrossfireX is needed for two 3870 X2s to work together, and I've read in more than one place on the web that drivers for that aren't ready yet. Apart from that, before spitting on the scaling and so on, don't forget that all these reviews were done with very early beta drivers...
Giacomo
Axbattler - Tuesday, January 29, 2008 - link
I seem to remember at least one review pointing out some fairly dire minimum frame rates. Basically, high and impressive maximum frame rates allowed the card to post a very good average despite a few instances of very dire minimums (if I remember correctly, it sometimes dips below that of a single-core card).
Did Anand notice anything like this during the test?
Giacomo - Wednesday, January 30, 2008 - link
That's not exactly right. I mean, the average is not pulled up by any incredible maximum; it works a different way: the maximums are nothing incredibly higher than the normal average, it's simply that those minimums are rare, caused by particular circumstances in which the application (the game) loads up data on the card while it's rendering, which it shouldn't do. This is the gist of ATi's answer posted in the DriverHeaven review; they told ATi about these instantaneous drops and that was the answer.
Of course, given that other cards didn't show similar problems, I believe the problem is more driver-related, and ATi should really work fast to fix issues like this one.
However, DriverHeaven reported that this issue showed up mostly during scene loading, and therefore wasn't too annoying.
At least, that's how I've understood it.
My two cents,
Giacomo
Blacklash - Wednesday, January 30, 2008 - link
I'd like to see a roundup of ATi's and nVidia's current mid-to-high range. Test them in games that allow 8xAA. I am hearing in quite a few places that ATi cards catch up to or pass nVidia when you move from 4x to 8xAA. I'd love to see Anand prove this or debunk it.
Giacomo - Tuesday, January 29, 2008 - link
I have to admit, even being a fan of nVIDIA, that I fell in love with this card. I like it in almost every aspect; I'm astonished by its combination of complexity and smooth operation. Well, almost smooth, due to the beta drivers. And that makes it even more interesting: there's an even bigger gap to open up over its rivals once mature drivers are released.
That said... I'm finalizing the last details of my new machine. I was ready to go with an 8800GTS 512MB, but suddenly stopped to look at this one... and I think I'll go with it.
But... on which motherboard? These are the thoughts I'd like to share with you. Did you notice that AnandTech's numbers are among the best results in the various X2 reviews online? And did you notice they're obtained on a 780i platform?
Initially, the most obvious idea was to buy this card together with an Asus Rampage Formula, so nicely reviewed on these pages. But now, looking at the system used for this review, the question is: wouldn't my dear, long-desired eVGA 780i SLI be an even better board for the new system, for this 3870 X2?
I don't need Crossfire or SLI, really.
The reviewer's opinion would be much appreciated too, of course, being the one who planned this "curious" match-up.
Giacomo
XrayDoc - Tuesday, January 29, 2008 - link
Does anyone know if the Intel X48 chipset will allow two of these cards to run in CrossfireX mode, or will you have to purchase an AMD chipset motherboard?
karthikrg - Monday, January 28, 2008 - link
This seems to be a stopgap solution, though it's heartening to see AMD/ATI with the fastest single-card solution now (albeit multi-GPU). With the Nvidia 9800GX2 due soon, the race is sure to get interesting. Hope AMD has something up its sleeve with the R700.
FullHiSpeed - Monday, January 28, 2008 - link
This has nothing to do with graphics, but that PLX PEX8547 in the picture is a bridge, not a switch. Unfortunately, PLX doesn't make a Gen 2 PCIe bridge or switch. One would imagine that two GPUs on one board would benefit from the increased bandwidth of Gen 2. The PLX bridge/switch web page says the switches (and not the bridges) give "Low Latency Transfers" and "True Peer-to-Peer Data Transfers", both of which are probably important to this parallel processing application. This could explain why "the HD 3870 X2 is systematically 5% faster on average than the CrossFire solution" (per Tom's Hardware) in spite of the 16-lane Gen 1 bottleneck to the north bridge.
Slaimus - Monday, January 28, 2008 - link
I am still confused by the way AMD/ATI is allocating RV670 chips. If you check on Newegg, there are well over 100 Sapphire 3870 X2s in stock, but the Sapphire 3870 is still out of stock.
The same thing was happening during the initial 3870/3850 launch, where way too many 3850s were produced compared to the 3870. Also, none of the 3850s were of the 512MB type.
AMD should be filling the 3870 demand with the RV670 chips first, since it is probably the most profitable and most reasonable product.
phaxmohdem - Monday, January 28, 2008 - link
All I can think of is 2 GPUs 1 Cup :S
flat6 - Monday, January 28, 2008 - link
The article says: "the performance is indeed higher than any single-card NVIDIA solution out today", but they didn't test an 8800 Ultra. In some cases the card was barely beating the 8800 GTX. How would it stack up against an Ultra? I understand that maybe they didn't compare against an 8800 Ultra because of the price difference, but the reviewer was all excited about AMD/ATI's return to the high-end; at the high-end, cost is no object. And why two 8800 GTs? If this is the high-end, why choose mid-price cards for SLI? Why not use two Ultras?
I'm not saying a single Ultra would have trounced this card, only that I would have liked to see the comparison. I do think, however, that if they'd gone truly high-end on the SLI side, two Ultras might have killed it.
strikeback03 - Monday, January 28, 2008 - link
http://www.anandtech.com/video/showdoc.aspx?i=3175...
Their previous testing showed the Ultra as being similar in performance to the 8800 GTS 512 in most tests. So an Ultra (or two of them in SLI) might have done better in the AA tests, but not much else, it would seem.
flat6 - Monday, January 28, 2008 - link
Thanks, I just read it. Good grief. I didn't realize the GTS 512 was that fast. Point taken.
poohbear - Monday, January 28, 2008 - link
Well, it's about time. Good job AMD, let's see you maintain the performance lead, damn it!
boe - Monday, January 28, 2008 - link
Howdy! I appreciate any benchmarks we can get, but if you do a follow-up on this card with newer drivers, I hope you will consider the following:
1. A comparison with a couple of older cards, e.g. the X1900 and 7900
2. A sound measurement of the cards, e.g. dB at full utilization from 2 feet
3. Crossfire performance if this card supports it.
4. Benchmarking on FEAR - all bells and whistles turned on
5. DX10 vs. DX9 performance.
Thanks again for creating this article - I'm considering this card.
perzy - Monday, January 28, 2008 - link
Am I the only one tired of all these multi-cores? I guess programming gets even more complex now. I guess in the future all games will have development cycles like Duke Nukem Forever: 10+ years?
Are GPUs hitting the heat wall too now?
Soon I'll stop reading these hardware sites. The only reports in the near future will be 'yet another core added.' Yippee.
wien - Monday, January 28, 2008 - link
Coding for a multi-GPU setup is not really any different than coding for a single-GPU one. All the complexity is handled by the driver, unlike with multi-core CPUs.
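To illustrate what "handled by the driver" means in practice, here is a purely hypothetical toy sketch (plain Python, no real GPU API): the application submits frames exactly as it would with one GPU, and an AFR-style layer decides which GPU renders each frame.

    # Toy model of alternate-frame rendering (AFR): the application below has no
    # multi-GPU awareness; frame distribution lives entirely in the "driver" layer.
    def afr_driver(num_gpus):
        counter = 0
        def submit_frame(frame_id):
            nonlocal counter
            gpu = counter % num_gpus      # round-robin successive frames across GPUs
            counter += 1
            return f"frame {frame_id} rendered on GPU {gpu}"
        return submit_frame

    submit = afr_driver(num_gpus=2)       # the same application code works with num_gpus=1
    for frame in range(4):
        print(submit(frame))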
FXi - Monday, January 28, 2008 - link
Have to say they did a good job; not great, but very good. We do need to see the R700 though, as this won't hold them for long.
The other thing both camps need to address is dual monitors with SLI/CF. It's been forever since this tech came out and it hasn't been fixed. Dual screens are commonplace and people like them. Could be one large and one smaller, or dual midrange, but people want the FPS without losing their second screen.
I'm sure there will be a rash of promises to fix this that won't materialize for years :) (as before)
ChronoReverse - Tuesday, January 29, 2008 - link
Actually, that was one of the things that was fixed by ATI. Dual screens will work even if it's in a window and _spanning_ the monitors. I'll see if I can find the review that showed that.
murphyslabrat - Monday, January 28, 2008 - link
Come on AMD, don't die until we get the Radeon 4870 X2!
Retratserif - Monday, January 28, 2008 - link
Third to last paragraph. The fact "hat" both?
Overall a good article. Too bad we didn't get to see temps or overclocks.
PeteRoy - Monday, January 28, 2008 - link
Will Nvidia and ATI just put out more of the same instead of innovating new technologies?
Wasn't that what killed 3dfx? Nvidia should know.
kilkennycat - Monday, January 28, 2008 - link
The next-gen GPU family at nVidia is in full development. Hold on to your wallet till the middle of this year (2008). You may be in for a very pleasant surprise.
DigitalFreak - Monday, January 28, 2008 - link
Does Crossfire still not allow you to create your own profiles, like SLI does? I'm still not convinced that ATI has gotten their head around the whole dual-GPU driver thing yet.
Chaitanya - Monday, January 28, 2008 - link
More excited about the Nvidia GeForce 9800 GX2 than the Radeon 3870 X2. I hear from my source at Nvidia that it's got a completely different approach than the Radeons and the GeForce 7950 GX2.
Ecmaster76 - Monday, January 28, 2008 - link
*only* 54fps at 2560! That's horrible! /sarcasm
I think Quake Wars fans will be just fine.
legoman666 - Monday, January 28, 2008 - link
I was going to post the same thing. Only 54 @ 2560x1600? What more do you really want?
wien - Monday, January 28, 2008 - link
A 54 FPS average is not nearly enough to ensure 100% fluid gameplay. If that were the minimum FPS it would be close enough, but above 60 FPS minimum is preferable. This is a high-end card, so expecting extreme performance is hardly unreasonable.
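A quick frame-time calculation behind the average-versus-minimum argument above (just arithmetic, using the 54 FPS figure from the thread):

    # Convert FPS to frame time to see why minimums matter more than averages.
    def frame_time_ms(fps):
        return 1000.0 / fps

    print(frame_time_ms(60))   # ~16.7 ms: the usual fluidity target
    print(frame_time_ms(54))   # ~18.5 ms: the reported 2560x1600 average
    print(frame_time_ms(30))   # ~33.3 ms: a dip to 30 FPS reads as a visible hitch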
NullSubroutine - Monday, January 28, 2008 - link
Why not test DX9 Crysis benchmarks? Even in Vista you can do that and see a large performance jump in Crysis.
I still fail to see why Vista-only scores are being posted on AnandTech when Vista represents such a small percentage of users.
damncrackmonkey - Monday, January 28, 2008 - link
You fail to see why DX10 performance is relevant to a DX10 card?
Personally, I like to use the command line. Why don't these reviews address command-line text rendering performance?
duploxxx - Monday, January 28, 2008 - link
Nice comment at the end: "if AMD can do it, NVIDIA can as well. And we all know how the 3870 vs. 8800 GT matchup turned out."
Please, Anand, give an update on that. As far as I have read, no real review was done by Anand, and the ones floating around on the web all concluded the same thing: 3870 CrossFire scales better than SLI. Knowing the prices of those two, I don't see why 8800 GT SLI would be faster than the 3870 X2 with its higher R680 clocks. Let's see how much Nvidia will have to downclock their dual-PCB card to get it out.
And knowing that 2x 3850 = 8800 GTX, with a lower price tag.
navygaser - Monday, January 28, 2008 - link
When you go over to Tom's Hardware and look at their benchmarks for this card, you can see a big difference in the FPS numbers at the same settings.
Looks like this card does much better on the AnandTech system. They use 4GB RAM vs. Tom's 2GB, and the drivers look to be newer as well.
Could the RAM and the drivers account for this big difference?
GenoR32 - Tuesday, February 5, 2008 - link
I've seen that review and it's true... if you compare it to this one, you see some differences.
IMO, Tom's Hardware is Intel-biased (look at the Phenom 9600 Black Edition review: he practically destroys AMD and kisses Intel) and Nvidia-biased... he can't admit the fact that this card is awesome...
I hate those biased reviews... Anand FTW
footballrunner800 - Monday, January 28, 2008 - link
It's probably the drivers, since AnandTech is still not using x64, and with a 1GB card that gives Windows less than 3GB of usable memory. The review says that AMD came up with improvements at the last minute, so imagine when they perfect them.
Sunrise089 - Monday, January 28, 2008 - link
The drivers sure could.bill3 - Monday, January 28, 2008 - link
I won't link the review (doubt it's even possible), but over at the [H] Brent's review shows the 3870 X2 in a much worse light. They show it outright losing to a single 8800 GTX in COD4, Crysis, and UT3, while squeaking out a win in HL2.
In the forum review thread, when the differences between Brent's review and Anand's were brought up, it was basically claimed by Kyle that Anand's review is illegitimate because he only benchmarks "canned demos" (if you're familiar with [H], such spiel is nothing new from them). Further, Kyle goes so far as to claim: "AMD experienced a 60% fps increase in the Crysis canned GPU benchmark, we saw a couple frames a second in real gameplay."
Kyle also says your COD4 bench, one of the two you guys did that wasn't "canned" (and therefore, by his logic, invalid), is also invalid because you only benched a cutscene. He hasn't said, but I'm assuming the only bench you guys did that meets the Kyle standard would be BioShock, since it is real gameplay timed by FRAPS.
There are a few platform differences between the reviews, but Kyle has pooh-poohed these as not making any major difference.
Thought you guys might be interested; the thread is the 3870 X2 review thread on the [H] forums.
Parhel - Monday, January 28, 2008 - link
HardOCP is just plain disreputable in every way. Their methodology is nonsense, their reviews are completely inadequate, and they continue to exist only because they drum up fake controversies and attempt to assassinate someone's character every few months.
I'm not exaggerating when I say that I take what The Inquirer has to say more seriously.
Frumious1 - Monday, January 28, 2008 - link
I'm just shocked by how many people seem to place any relevance on the HOCP garbage. "It's because we're REAL WORLD and everyone else is lies and fake stuff!" What a laugh. They play ONE resolution on TWO cards and pretend that's testing. Oh, and they don't use the same settings on both cards, they don't run at the same resolutions as previous reviews, they don't use the most comparable card (8800 GTS 512, anyone?), their testing isn't remotely similar to anything at any other site (hence they can just make claims about how they're doing it right and everyone else is wrong)... I could go on.
Anyway, Anand and crew are best served, in my opinion, by completely ignoring this childish stuff from Kyle and his cohorts. You can choose: either all of the enthusiast sites except HOCP are wrong, or Kyle is wrong. Occam's Razor suggests that the latter is far more likely.
AnnonymousCoward - Tuesday, January 29, 2008 - link
Yeah, one resolution and one competitor card doesn't say much.
JWalk - Monday, January 28, 2008 - link
Yeah, Kyle is currently yelling at people in the forum thread. He has been asked multiple times why Anand, and most other sites, have a completely different view in their benchmarks. (Keep in mind that HardOCP only benchmarked 4 freaking games on 2 cards. According to Kyle, it would have been too much work to benchmark more games or more cards. Awww... not hard work... anything but that. LOL)
Then, he either chants "canned" benchmarks over and over, or he tells the person asking the question to get lost and never come back to HardOCP.
It has even been pointed out more than once that the review sample he received might be the problem. Maybe he should try another card. But he is in full-blown arrogant a$$ mode right now. ;)
Devo2007 - Monday, January 28, 2008 - link
Not surprising coming from Kyle -- he comes across that way quite often.
I don't like his reviews much -- not because of the canned benchmarks vs. real-world gameplay they preach about, but rather because they typically only compare one or two other cards to the one being tested. I have a GTS 320MB, and it would be nice to see whether the GTS 512MB would be worth it. Sadly, no direct comparison was done, because the GTS 320MB doesn't compete with said card.
I generally try to read several reviews to get an idea of a product - one site is never enough for these things. :)
boe - Monday, January 28, 2008 - link
Usually the stock coolers are pretty dang loud. I'm wondering if any cooling solutions will be available from Zalman, Arctic, or the other standard brands?
It would have been sweet if there was a dB measurement chart.
Gholam - Monday, January 28, 2008 - link
GeCube has a version that looks like it has 2x Zalman VF900-Cu on it. Custom PCB with 4 DVI outputs too.
boe - Monday, January 28, 2008 - link
I really appreciate this article.
The thing I'd really like to see in the next one is FEAR benchmarks.
I'd also appreciate a couple of older cards added for comparison like the 7900 or the x1900.
Butterbean - Monday, January 28, 2008 - link
"And we all know how the 3870 vs. 8800 GT matchup turned out"Yeah it was pretty close except for Crysis - where Nvidia got busted not drawing scenes out so as to cheat out a fps gain.
Stas - Monday, January 28, 2008 - link
Conveniently, the tests showed how *2* GTs are faster in most cases than the X2. The power consumption test only shows a *single* GT on the same chart with the X2.
geogaddi - Wednesday, January 30, 2008 - link
Conveniently, most of us can multiply by 2.
ryedizzel - Monday, January 28, 2008 - link
In the 2nd paragraph under 'Final Words' you put:
Even more appealing is the fast that the 3870 X2 will work in all motherboards:
But I think you meant to say:
Even more appealing is the fact that the 3870 X2 will work in all motherboards:
You are welcome to delete this comment when fixed.
abhaxus - Monday, January 28, 2008 - link
I would really, really like to see a Crysis benchmark that actually uses the last level of the game rather than the built-in GPU bench. My system (Q6600 @ 2.86GHz, 2x 8800GTS 320MB @ 620/930) gets around 40fps with 'high' defaults (actually some very high settings turned on per TweakGuides) on both of the default Crysis benchies, but only got around 10fps on the last map. Even on all medium, the game only got about 15-20 fps on the last level. Performance was even lower with the release version; the patch improved performance by about 10-15%.
Of course, I'd be interested to see how two of these cards do in Crysis :)
Samus - Monday, January 28, 2008 - link
Farcry is still unplayable at 1900x1200 (<30FPS), but it's really close. My 8800GT only manages 18FPS on my PC using the same ISLAND_DEM benchmark AT did, so scaling that, the 3870 X2 would do about 27FPS for me. Maybe with some overclocking it could hit 30FPS. $450 to find out is a bit hard to swallow though :\
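The 27 FPS figure above is simple proportional scaling; a minimal sketch of that arithmetic, using only the numbers given in the comment (the implied 1.5x ratio is the commenter's estimate, not a measured result):

    # Proportional-scaling estimate from the comment above.
    my_8800gt_fps = 18.0              # commenter's 8800 GT result in the same ISLAND_DEM run
    projected_x2  = 27.0              # his projected 3870 X2 figure
    print(f"Implied X2 speedup over the GT: {projected_x2 / my_8800gt_fps:.2f}x")
    print(f"Further gain needed for 30 FPS: {(30.0 / projected_x2 - 1) * 100:.0f}%")  # ~11%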
customcoms - Monday, January 28, 2008 - link
Well, if you're still getting <30 fps on Far Cry, I think your PC is a bit too outdated to benefit from an upgrade to an HD3870 X2.
I assume you meant Crysis. That game is honestly poorly coded, with graphical glitches in the final scenes. With my 8800GT (and a 2.6GHz Opteron 165, 2GB RAM), I pull 50 FPS in the Island_Dem benchmark at 1680x1050 with a mixture of medium-high settings, 15 fps more than the 8800GTS 320MB it replaced. However, when you get to some of the final levels, those frames drop to more like 30 or less, and I am forced to drop to medium or a combination of medium-low settings at that point (perhaps my 2.6GHz dual-core CPU isn't up to 3GHz C2D snuff).
Clearly, the game was rushed to market (not unlike Far Cry). Yes, the visuals are stunning and the storyline is decent, but I much prefer Call of Duty 4, where the visuals are SLIGHTLY less appealing, but the storyline is better, the game is more realistic, the controls are better, and I can crank everything to max without worrying about slowdowns in later levels. It's the only game I have ever immediately started replaying on Veteran.
The point is, no card(s) appear to be able to really play Crysis with everything maxed at 1900x1200 or higher, and in my findings, the built-in time demos do not realistically simulate the game's full demands in later levels.
swaaye - Wednesday, January 30, 2008 - link
Yeah, armchair game developer in action!
In what way is CoD4 realistic, exactly? I suppose it does portray actual real military assets. It doesn't portray them in a realistic way, however.
Did you notice that Crysis models bullet velocity? It's not just graphical glitz.
Griswold - Monday, January 28, 2008 - link
Ahh, gotta love the armchair developers who can see the game code just by looking at the DVD.
HilbertSpace - Monday, January 28, 2008 - link
When giving the power consumption numbers, what is included with that? I.e., how many fans, DVD drives, HDDs, etc.?
m0mentary - Monday, January 28, 2008 - link
I didn't see an actual noise chart in that review, but from what I understood, the 3870 X2 is louder than an 8800 SLI setup? I wonder if anyone will step in with a decent aftermarket cooler solution. Personally I don't enjoy playing with headphones, so GPU fan noise concerns me.
cmdrdredd - Monday, January 28, 2008 - link
then turn up your speakersdrebo - Monday, January 28, 2008 - link
I don't know. It would have been nice to see power consumption for the 8800 GT SLI setup, as well as noise for all of them.
I don't know that I buy that power consumption would scale linearly, so it'd be interesting to see the difference between the 3870 X2 and the 8800 GT SLI setup.
Comdrpopnfresh - Monday, January 28, 2008 - link
I'm impressed. Looking at the power consumption figures, and the gains compared to a single 3870, this is pretty good. They got some big performance gains without breaking the bank on power. How would one of these cards overclock, though?
yehuda - Monday, January 28, 2008 - link
No, I'm not impressed. You guys should check the isolated power consumption of a single-core 3870 card:
http://www.xbitlabs.com/articles/video/display/rad...
At idle, a single-core card draws just 18.7W (or 23W if you look at it through an 82% efficient power supply). How is it that adding a second core increases idle power draw by 41W?
It would seem as if PowerPlay is broken.
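A rough sketch of the arithmetic behind that question, assuming the X-bit labs number is measured at the card while the review's 41W delta is measured at the wall, and reusing the 82% PSU efficiency quoted above:

    # Idle-power arithmetic (assumptions noted above; not figures from the review itself).
    single_card_idle = 18.7                      # W at the card, per X-bit labs
    psu_efficiency   = 0.82
    at_the_wall      = single_card_idle / psu_efficiency   # ~22.8 W, the "23W" figure
    x2_extra_at_wall = 41.0                                 # reported idle increase for the X2
    x2_extra_at_card = x2_extra_at_wall * psu_efficiency    # ~33.6 W drawn by the card itself

    print(f"One RV670 at the wall: {at_the_wall:.1f} W")
    print(f"X2's extra draw at the card: {x2_extra_at_card:.1f} W vs. ~{single_card_idle} W for a second RV670 alone")
    # The ~15 W gap is the puzzle; a reply below attributes 10-12 W of it to the PLX bridge chip.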
erikejw - Tuesday, January 29, 2008 - link
ATI smokes Nvidia when it comes to idle power draw.
Spoelie - Monday, January 28, 2008 - link
GDDR4 consumes less power than GDDR3, given that the speed difference is not that great.
FITCamaro - Monday, January 28, 2008 - link
Also, figure in the extra hardware on the card itself that links the two GPUs.
yehuda - Tuesday, January 29, 2008 - link
Yes, it could be that. Tech Report said the bridge chip eats 10-12 watts.
Zoomer - Saturday, February 2, 2008 - link
Don't forget the higher core clocks? Although you could turn them down when not needed.
Howard - Monday, January 28, 2008 - link
The third graph on the HL2 bench page doesn't have a highlighted FPS bar.
Howard - Monday, January 28, 2008 - link
None of the Oblivion graphs do, either.
houe - Monday, January 28, 2008 - link
take it!