48 Comments
paydirt - Friday, February 15, 2008 - link
Physics belong on the GPU; Crysis puts them on the CPU (search: AGEIA Crysis). This is partly why framerates stink in Crysis: it is bogging down a processor that isn't designed to properly handle physics.
LtUh8meDoncha - Monday, January 7, 2008 - link
So yeah. On the first page of these comments 0roo0roo hit the nail on the head. If you bought 2 Ultras, buying a third one (even at the end of its lifecycle) isn't going to bother you. It's like upgrading the twin turbos on a Ferrari. No, you don't need to, but it would be cool if you did! There will always be Honda drivers that look at you like you're crazy, but you're not buying it for them (although if you did, I bet their opinion on 3-way SLI would change).

The article sounds like it was written by someone who knew they would have to return the product and go back to their 22" widescreen and single 8800GT setup. I love how he/she just brushed off the Bioshock results because they didn't support the argument, and then made some half-baked excuse about how CPU speed had something to do with it and removed it from the "how many games benefit from 3x SLI" test as off-topic. Stick to your benches. That's all you have. If you say one part of your test is faulty, why should I believe any of the others are working?
Keep it simple. Just the facts. I bought 3-way because, simply put, it IS faster (ugh... I already had 2x SLI GTX and I got the third on eBay for like $380, if that makes anyone on a budget feel any better). If you have to justify the cost, you have no business even buying 1 Ultra, much less 2, or even thinking (or talking) about the next-gen top end, because you're not going to buy that either. What you're going to do is try to make excuses why no one should buy the card you can't afford, until a year later when they come out with something that's in your price range and is almost as fast (ahem... 8800GT). You'd do better saving your money for some off-brand 17" rims or really nice spinner hubcaps.
borisof007 - Thursday, January 3, 2008 - link
No Xbox or other console game will do well on the PC platform (assuming it was made for the console first), so shut up about it.

Now, regarding the video cards, Tri-SLI is a waste of money, end of discussion. We've beaten this horse for 5 pages now; we can all agree on this.
Moving on, differentiating between Nvidia and ATI is actually very easy.
If you want high-end performance, no matter the cost, go Nvidia dual SLI. If you want high-end performance with cost in mind, but still want solid bang for your buck, go with ATI's 3850/3870 lineup in CrossFire. The 790FX chipset is very nice, and the 3850s offer dominating performance in their category and for their cost.
Done.
LaZr - Thursday, December 27, 2007 - link
Why buy an Nvidia when it doesn't run 3DMark 2008?
http://r800.blogspot.com/2007/12/3dmark-vantage-br...
Lack of DX 10.1.
Digg that, fanboys!!!!!
falc0ne - Monday, December 24, 2007 - link
The graphics are probably the best around these days, but that SIMPLY WON'T JUSTIFY THE AMOUNT OF HARDWARE CONSUMED!
C'mon guys... get real!
In my view this path of multiple video cards... is a one-way street in the wrong direction... Multiple GPUs on a single board, YES! That would be another story.
Why didn't Doom 3 or H2 require SLI or CF to work when they appeared?!
So, CRYTEK, thanks but... no thanks! It's not reasonable at all to pay double (to get an SLI config) to play a SINGLE GAME - which in my view is a better-looking version of Far Cry - poor story/scenario too - poor idea... You are the one man, one hero, left at the North Pole with a toothbrush, in your underwear, to survive, after which you are transferred to an island to fight Rambo-style - me vs. ALL - "bring it on, you maggots, I'm gonna teach you all...!"
Well, this is the funny side of it: if you try to entertain yourself (yes, games are supposed to be entertaining, just not anymore) you won't be able to... because you'll be preoccupied with the surrounding enemies, your suit's battery, and ammo depletion. Weapons and ammo are scarce, and enemies die rather like in Hitman (very hard); you have to empty 3 clips to take down 3 guys... wow, so much fun.
Sorry for going somewhat off topic...
Pneumothorax - Thursday, December 20, 2007 - link
In the closing comments the author is basically complaining about the stagnation of the GPU market. Nvidia, with its $1+ billion in cash, should develop multi-core GPU dies instead of the same tried-and-true $$$ approach of releasing year after year of >$500 video cards. Also notice that since ATI is playing distant second fiddle at the high end, Nvidia has REALLY slowed down on its improvements. We're looking at a long dark age in PC gaming until we get a viable competitor to Nvidia. Intel's delay of the 45nm mainstream chip release, thanks to the Phenom failure, is another sign we're heading back to $900+ mainstream chips (remember those dreary P3/early P4 days, until Athlons started cleaning Intel's clock) with stagnation on the CPU end as well.
ViperV990 - Wednesday, December 19, 2007 - link
I'm curious whether it is possible to run three 8800GTs, each hooked up to its own monitor (say a 20" UXGA LCD), for a nice triple-monitor setup. No SLI whatsoever. If this works as well on the software side as the TripleHead2Go from Matrox, I'd very much be interested in getting it.
araczynski - Wednesday, December 19, 2007 - link
sadly, i think these kinds of things are what's rapidly getting rid of the 'fun' in staying in the pc gaming scene. i've been playing pc games since about '86 or so (so much longer than many these days, and yet not as long as many others), but only in the last few years have i been getting 'tired' of all the 'improvements' that hardware companies seem to come up with on a monthly basis. not to mention the developers who keep giving them reasons to want to come up with new junk.

i finally jumped into the console gaming world, have all 3 consoles, and quite frankly it feels much more relaxing these days to play a console game and know that it'll just 'work'.
there seems to be less and less incentive to waste time with pc gaming every day. as soon as they get real mmo's going on the consoles the pc gaming scene will just fade away finally i think. and i'll be the first to say 'good riddance'.
anyway, just venting. ignore me.
BigMoosey74 - Tuesday, December 18, 2007 - link
Thanks for calling it how it is. The final comments are so true it isn't even funny. All of the fanboys need to come to their senses... this is a really inefficient technology, both CrossFire and SLI. The theoretical gains vs. the actual gains highlight a serious problem with this design.

Think of the physics behind it. No matter what process you are embarking upon, whenever you split something into more pieces you lose efficiency. Yeah, so having two cards boosts performance a little... but nothing groundbreaking. Having two GPUs running should give 2x performance gains hands down, with no exceptions. The 3rd card is dead weight for some games... how the heck could a company stand behind that as a successful solution? Don't p*ss down my back and tell me it is raining.

I 100% agree with the author... we need something new that is actually worth it. This "let's add more cards" solution is a junk marketing scheme. ATI/nVidia need to work on something huge rather than waste time with CrossFire and SLI. GPU technology needs a change like what CPUs saw with the quad core... more performance, higher efficiency... not a slap in the face with 30% more performance, 3x power consumption, and 4x the $$$.
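To put that efficiency argument in concrete terms, here is a minimal sketch (Python, with made-up frame rates purely for illustration; none of these numbers come from the article) of how multi-GPU scaling efficiency is usually computed:

```python
# Hypothetical frame rates, for illustration only; not measured results from the article.
single_gpu_fps = 40.0
dual_gpu_fps = 62.0
triple_gpu_fps = 66.0

def scaling_efficiency(multi_fps, single_fps, num_gpus):
    """Fraction of the ideal (linear) multi-GPU speedup actually achieved."""
    ideal_fps = single_fps * num_gpus
    return multi_fps / ideal_fps

print(f"2-way efficiency: {scaling_efficiency(dual_gpu_fps, single_gpu_fps, 2):.0%}")
print(f"3-way efficiency: {scaling_efficiency(triple_gpu_fps, single_gpu_fps, 3):.0%}")
# With these example numbers: roughly 78% of ideal for 2 cards, 55% for 3.
```

Anything well below 100% is exactly the overhead the comment is complaining about, and the gap typically widens with each extra card.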
solgae1784 - Tuesday, December 18, 2007 - link
HotHardware is saying Crysis has a bug with multi-GPU SLI, so they're expecting a patch to enable the multi-SLI feature. That could be the explanation for why there's no gain in going 3-way SLI.
Zak - Tuesday, December 18, 2007 - link
No way I'm spending >$1,500 on three video cards just to play a game. All I want is one good $500-600 card that can play Crysis and other newer games at 1920x1200.
A.
Zak - Tuesday, December 18, 2007 - link
And I feel like Pirks above. I already have a Mac for everyday use; I switched from Windows after trying out the Vista trainwreck (I used to be a Mac user years ago). I built a $1,500 PC two months ago just to play games, and I find that despite having the second-fastest video card (8800GTX) I can't play a lot of the latest games well at 1920x1200. And what's Nvidia's solution? $1,500+ 3-way SLI. You know what they can do with that?! I'm seriously considering dumping the PC altogether and getting an Xbox too. PC gaming is going downhill quickly; it's getting way too expensive and too frustrating. At this rate it's just a matter of time before all the good games come out for consoles only. I walked into GameStop the other day: "I'm sorry, but we don't carry PC games any more at this location." Huh? It's like being a Mac gamer all over again!
I feel you on the PC gaming thing. I myself have an Xbox 360 and play it much more than I do my PC.
I find it pathetic that the best this super-expensive top-end system can do with Crysis maxed out is 43fps.
Zefram0911 - Tuesday, December 18, 2007 - link
Is my RAID array from my EVGA 680i going to be messed up if I upgrade to the EVGA 780i board?
I'm doing the "step up" that EVGA offered for the 680i's.
madgonad - Tuesday, December 18, 2007 - link
Just leaving that thing plugged in will cost over $400 in electricity alone. And that is only factoring in the computer at idle.
Pirks - Tuesday, December 18, 2007 - link
Okay, so you guys have probably read already that in North America Crysis only sold about 33,000 copies, which is a total sales flop. Does it feel like high-end video cards are finally moving into a very narrow niche, with people moving to the Xbox 360? First Orange Box from Valve, then Gears of War from Epic, then Crysis, and The Darkness (no PC version and not even talk of making one), and Lost Planet, etc. etc.... I had a very bad feeling about Crysis; too bad that feeling was not unfounded.

And, well, right now you can get an Xbox 360 for $250 (yeah, with coupons and if you're lucky, but... still...), so I don't know guys, I see mass consumers just shying away from high-end 3D cards more and more ($250 for an Xbox 360 and a cheap 720p HDTV, or $500 for a high-end nVidia card? hmmm... now even _I_ start to think about it). I've heard numerous complaints from my gaming buddies that a lot of PC ports of console games are not... er... very high quality (for instance, in Gears of War the Hammer of Dawn is a joke compared to the Xbox version, no Collector's Edition for PC, etc. etc. - many things in console ports look like shit on PC, and don't even get me started on the Halo 2 PC port, puukeee :bleeeeaaaah: [vomiting violently]).

I don't know about you guys, but I see Mac and Xbox 360 coming, marching forward; they are just simpler and more for the dumb people, so we enthusiasts are going extinct slowly but surely. Newegg now sells electronics, cameras, kitchen stuff and bread machines (I'm gonna buy one for my wife there BTW, and maybe an air conditioner too). I'm not even sure about the Mac anymore - maybe I'll get myself a Mini and leave the PC for occasional gaming, IF a decent PC game comes out. Time to try that Mac/Xbox 360 combo my buddies keep drooling about.
No, I'm not trolling, just talking about my own personal observations. You're more than welcome to criticize and downmod me, guys; I'd love to be wrong on that, actually.
andrew007 - Tuesday, December 18, 2007 - link
You're not the only one. I'm also thinking of getting that rumored ultraportable Mac early next year, and I already have the consoles - and I am very happy with console FPSes on the Xbox 360. When it comes to games, recently the only good exclusive games for the PC are RTSes (and MMORPGs), which aren't really my cup of tea. So no, you are not the only one, and I for one am glad about the low sales of Crysis. Personally, to me it was the most disappointing game of the year. Anyway, while I'm sad the era of plentiful, deep games is gone (where are the space sims? deep RPGs? great adventures like Gabriel Knight? ANY game with good length AND depth?) - it is what it is. It's probably no coincidence that there were many independent studios at that time, now all gobbled up by large conglomerates. When I turn on a console, I am certainly not missing game freezes and crashes such as the ones due to factory-overclocked (!) memory I'm getting on my 8800GT (I've had similar issues with 3 consecutive video cards in 2 years now), copy protection schemes (I had to change my DVD drive for one game!), occasional mandatory beta video drivers, and just general fuss and instability. Sure, my overclocked Q6600 is insanely fast if I run photo editing or video encoding or even web browsing, and the games do look great at full resolution - but that's only when everything fully works, which is not as often as I'd like. I'm getting too old for this. The only bad thing is that console makers are now adopting PC ways - patches, game freezes, controller dropouts, overheating, noise...
Pirks - Tuesday, December 18, 2007 - link
Yeah, I think I'm getting older too - I just want the damn computer to work, and it looks like I'm gonna get a Mac, 'cause I know a self-built PC is waaay cheaper, but somehow I'd pay for peace of mind, service, etc. - just buy a Mac with a 3-year warranty; you pay a lot, but I heard they generally work okay, so... and I'm still gonna use the PC as a second machine - it would be fun to compare them and see what each is good for. And as for games you're right, big EA-like publishers are killing the inventive original games of the past (American McGee's Alice, Medal of Honor, MDK, Dune 2, UFO, Descent, and many many others), but if publishers are going to pour money into the console market - I'd better get a console. Games are not going to be great, I agree, but at least they will be CHEAP, compared with the nvidia $500-a-year "tax". Anyway, I'll keep the PC around and maybe even upgrade it if a decent PC game comes up. Actually I'm waiting for Fallout 3; may be a good reason to upgrade, who knows.
kilkennycat - Tuesday, December 18, 2007 - link
...it's far more likely to be used by an (nV) video card functioning as a GPGPU for either gaming --- or in the short term --- professional desktop applications. nV is making great strides in the professional scientific number-crunching and signal-processing communities with their CUDA toolset running on their current GPU offerings. They currently own ~86% of the "workstation graphics" market, but in a rapidly increasing number of cases, graphics is not the sole function of the current nV workstation hardware. Wait for nVidia's next-generation silicon and driver software, which will be far more focused on seamlessly merging GPU and GPGPU functionality. Also, wait for their true next-gen motherboard chipset, not the cobbled-together "780i", which will implement symmetrical PCIe 2.0 on all 3 PCIe x16 slots. Arriving about the same time as their next-gen GPU family. Mid-2008 would be my guess.
aguilpa1 - Tuesday, December 18, 2007 - link
Funny how your review doesn't address this blatant issue. Yes, it will run Tri-SLI, but don't expect it to do so with the same Yorkfield used on the test board they used. Engineering samples of the QX9650 ran fine on the 680i SLI boards, but that was changed with the retail versions. Whether it was Intel's pissy way of getting back at Nvidia for not licensing SLI to them, or Nvidia's way of making a buck off of selling an almost-already-obsolete board (Nehalem's coming next year)... at this stage, who cares.
ilovemaja - Tuesday, December 18, 2007 - link
That quote - His response? "JESUS". "No", I said, "not even Jesus needs this much power". - is one of the funniest things I've heard in my life.
Thanks for another good article; you are the best.
acejj26 - Tuesday, December 18, 2007 - link
In Crysis, you say that the third card offers a 7% performance boost over the 2-card configuration; however, it is only offering 1 fps more, which is just about 2%. Those numbers should be changed.
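For what it's worth, the arithmetic here is just (new - old) / old; with assumed round numbers (the review's exact figures aren't reproduced here), a 1 fps gain on a roughly 50 fps baseline works out to about 2%:

```python
# Assumed frame rates for illustration; the review's exact figures aren't quoted here.
two_card_fps = 49.0
three_card_fps = 50.0   # one frame per second more than the 2-card setup

gain = (three_card_fps - two_card_fps) / two_card_fps
print(f"Relative gain from the third card: {gain:.1%}")  # about 2%
```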
Sunrise089 - Tuesday, December 18, 2007 - link
Not complaining, but I've noticed the last several GPU articles have been written by Anand, which isn't his normal gig. On top of that we get a reference to another GPU editor from back in the day. What's up?
compy386 - Tuesday, December 18, 2007 - link
It'd be interesting to do a comparison between SLI and CrossFire once AMD gets some drivers out that actually support quad SLI. I saw a board on Newegg that looks like it'd fit 3 3870s as well.
AcydRaine - Tuesday, December 18, 2007 - link
AMD doesn't support "Quad-SLI" at all. There are a few boards on Newegg that will fit 4x 3870s, not just 3.
compy386 - Tuesday, December 18, 2007 - link
The 3870s take up 2 slots, so I only see boards that fit 3. Most of the boards will take 4 3850s though. Again, I'd like to see the performance number comparisons for scaling purposes.
SoBizarre - Tuesday, December 18, 2007 - link
Well, I'm glad to see this evaluation of 3-way SLI. It just gave me an idea for overcoming performance issues in games like Crysis. There is no need to build ridiculously expensive machines that draw insane amounts of power. I have a better solution (although it won't work for all of you): I'm just not going to buy a game which I can't play in its full glory on a decent system, at a mainstream resolution (1680x1050).

I don't expect the latest and greatest, "show off" kind of game to be playable at 2560x1600 with the highest settings, full AA and AF. Not on a system with a Q6600 and a single 8800GT. But if you can't do it on a system like the one used by Anand here? Well, then it's becoming ridiculous.

I'm trying to imagine a proud owner of a machine with a QX9650 @ 3.33GHz, 3 (that's THREE) 8800 Ultras and a shiny 30-inch monitor, not being able to play a game he just bought. What would his thoughts be about the developer of that game? Not pretty ones, I guess...
IKeelU - Tuesday, December 18, 2007 - link
When will this nonsense stop? It is perfectly reasonable for a game company to "permit" users to increase the detail if they so choose. On "high" the game looks and runs great on a sub-$400 video card. In fact, on "high" it looks better than anything out there, on any platform. At least with a "very high" setting available, the game will continue to look good a year from now when other games have caught up.
andrew007 - Tuesday, December 18, 2007 - link
Uuuh... no, Crysis is not playable at "high" at any decent resolution on my 8800GT and 3.4GHz overclocked quad-core Q6600. Decent being 1280 x whatever. And when you drop to medium, the game looks nothing special. Sure, there are a few areas that look great (the forest level, for example), but overall I was certainly not blown away. Unlike replaying Bioshock at 1920x1200, which this setup is capable of running very smoothly and which looks amazing in DX10. Quite simply, Crysis is one of the worst optimized games ever. At least it doesn't crash; that's something, I guess. Looking forward to replaying it in 2 years. Come to think of it, it was the same with Far Cry: it took 2 years to be able to play that game with decent frame rates.
JarredWalton - Tuesday, December 18, 2007 - link
There's nothing that says Crytek can't make a game where maximum detail settings exceed the capacity of every PC currently available. We've seen this in the past (Oblivion for one), and then a year later suddenly the game is more than playable at max settings on less expensive hardware. It doesn't appear that Tri-SLI is fully implemented for many titles, and considering the age of Crysis I'd expect more performance over time. Just like many games don't fully support SLI (or support it at all in some cases) at launch, only to end up greatly benefiting once drivers are optimized.

FWIW, I'm playing Crysis on a single 8800 GTX at High detail settings and 1920x1200 with a Core 2 Duo 3.0GHz 2MB (OC'ed E4400). It might be too sluggish for online play where ping and frame rates matter more, but for single player I'm having no issues with that res/settings. It's a matter of what you feel is necessary. I'm willing to shut off AA to get the performance I want.
tshen83 - Tuesday, December 18, 2007 - link
First of all, people who will fork over 1500 dollars' worth of GPUs will want to play all games at the highest settings. That means the highest AA and AF settings. I don't think you used AA and AF in your testing. It is almost pointless to play without AA on such a nice setup at 120fps (Bioshock), where you are becoming CPU-bound rather than GPU-bound.

Secondly, your Crysis test used 1920x1200. Why not 2560x1600? Why not 2560x1600 at 4xAA and 16xAF? Crysis at 1920x1200 without AA and AF is severely CPU-bound in your case, as you have witnessed that a faster CPU gave you linear scaling.

Third, there is actually no point in testing triple SLI at any resolution other than 2560x1600. The target audience triple SLI is aimed at is those with 30-inch Cinema Displays.

I think, to be fair, you should rerun the benchmarks in a non-CPU-bound situation with AA+AF on; you will see the proper scaling then.
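A rough way to sanity-check whether a run is CPU-bound (a sketch with hypothetical numbers, not figures from the article): raise only the resolution and see how little the frame rate moves.

```python
# Hypothetical benchmark numbers, purely for illustration.
fps_at_1920x1200 = 85.0
fps_at_2560x1600 = 82.0

# 2560x1600 pushes roughly 78% more pixels than 1920x1200.
drop = (fps_at_1920x1200 - fps_at_2560x1600) / fps_at_1920x1200
print(f"Frame rate drop at the higher resolution: {drop:.1%}")

# If a much heavier pixel load costs only a few percent of frame rate,
# something other than the GPU (usually the CPU) is the limiter.
print("Likely CPU-bound" if drop < 0.10 else "Likely GPU-bound")
```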
Thanks,
eternalkp - Tuesday, December 25, 2007 - link
Very good point, Tshen.
I have a 30-inch monitor.
The 7900GTX was killing my frame rate.
I was getting an average of 25fps @ 2560x1600, medium, 2X AA, 16X aniso... in FEAR Perseus Mandate.
Just bought an MSI OC 8800GTS G92 and am very happy with it.
Now I can crank up maximum graphics settings, 4X AA, 16X aniso @ an average of 40fps... very nice. :D
Crysis is a demanding engine; I only get 30fps @ medium, AA off.
YES. What is the point of 3 GPUs with your AA/aniso off?
The game will look like crap.
Crysis recommends 4GB of RAM.
kmmatney - Tuesday, December 18, 2007 - link
"Third, there is actually no point in testing triple SLI at any resolution other than 2560x1600"

The point was testing at settings that are "playable". Who cares if the framerate goes from 8 to 12 @ 2560x1600? It's unplayable.
I don't see how even an "enthusiast" wouldn't see triple SLI as a waste of money, though.
cmdrdredd - Tuesday, December 18, 2007 - link
The point is that running 1600x1200 is really not anything you shouldn't be able to do with one card. Even 1920x1080 in many games is perfect. Showing off 10000000fps means jack; turn the res and AA/AF up and show us what it can push out.
defter - Tuesday, December 18, 2007 - link
The author missed one advantage of 3-way SLI:

Of course it doesn't make any sense to spend >$1500 on three 8800GTX/Ultras today, but what about those folks that already have two 8800GTXs/Ultras in SLI?
For them, adding a third card could be a reasonable upgrade option compared to replacing both cards with new G92-based cards.
3-way SLI isn't for everyone, but it has its advantages.
praeses - Tuesday, December 18, 2007 - link
I was under the impression that Bioshock did not support AA in DX10. If that is indeed the case, that's hardly the fault of the benchmarker/reviewer.

Also, I see much merit in benchmarking at 1920x1200; it's a much more common and desktop-friendly resolution given the physical footprint of monitors. Let's be honest, many gamers aren't sitting 4ft from their displays. At 2-3ft, a 24" display, which most likely has 1920x1200, is much more comfortable for longer action-based viewing. Ideally, though, they would have a lower dot pitch or simply a higher resolution on the smaller screen.
tshen83 - Tuesday, December 18, 2007 - link
One more thing: you are using Vista Ultimate 32-bit with 4GB of memory. Since on 32-bit you have three 768MB Ultras (2.4GB reserved just for the video cards), the system will only see about 1.5GB of memory. That is not sufficient system memory for high-resolution benchmarks, especially Crysis.
chizow - Tuesday, December 18, 2007 - link
Derek Wilson in the 8800GT review:

"For this test, we are using a high end CPU configured with 4GB of DDR2 in an NVIDIA 680i motherboard. While we are unable to make full use of the 4GB of RAM due to the fact that we're running 32-bit Vista, we will be switching to 64-bit within the next few months for graphics. Before we do so we'll have a final article on how performance stacks up between the 32-bit and 64-bit versions of Vista, as well as a final look at Windows XP performance."

Completely valid point about using 32-bit vs. 64-bit, and somewhat of a hot topic over in the video forums. Honestly, you have $5000+ worth of hardware in front of you, yet getting a 64-bit version of Vista running benchmarks at resolutions/settings where 64-bit and 2GB+ would help the most is too difficult? C'mon guys, seriously, this is the 2nd sub-par review in a row (the 512 GTS review was poor too).
Also, could you clarify the bit about 680i boards being able to accomplish the same thing? Exactly what spurred this change in Tri-SLI support? Driver support? It seems Anand used 169.08, but I thought 169.25 was the first to officially support Tri-SLI, according to the patch notes. Or has it always been supported, and is the 780i just hyping up a selling point that has been around for months? Also, the 780i article hinted there would be OC'ing tests with the chipset and I don't see any here. Going to come in a different article? Thanks.
blppt - Tuesday, December 18, 2007 - link
Yeah, seriously. Especially since the 64-bit Crysis executable does away with the texture streaming engine entirely... how can you make a serious "super high end ultimate system" benchmark without utilizing the most optimized, publicly available version of the game? Is it that the 64-bit Vista drivers don't support 3-way SLI yet?

Otherwise, putting together a monster rig with three $500 video cards and then testing it with 32-bit Vista seems rather silly....
Ryan Smith - Tuesday, December 18, 2007 - link
Address space consumption isn't 1:1 with video memory; it's only correlated, and even less so in SLI configurations where some data is replicated between the cards. I'm not sure what exact value Anand had, but I'm confident Anand had more than 2GB of free address space.
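That distinction can be shown with a back-of-the-envelope comparison; the per-card mapping size below is an assumption made up for illustration, since the real figure depends on the driver and OS rather than on the 768MB of VRAM itself.

```python
# Back-of-the-envelope only. The per-card mapping size is an assumption made up
# for this illustration; the real amount depends on the driver and the OS.
addressable_space_gb = 4.0       # what a 32-bit system can address
num_cards = 3
vram_per_card_gb = 0.75          # 8800 Ultra: 768MB each

naive_left = addressable_space_gb - num_cards * vram_per_card_gb
print(f"Naive 1:1 view: {naive_left:.2f} GB left for system RAM")        # 1.75 GB

assumed_mapping_per_card_gb = 0.25   # hypothetical mapped window per card
actual_left = addressable_space_gb - num_cards * assumed_mapping_per_card_gb
print(f"With partial mapping: {actual_left:.2f} GB left for system RAM") # 3.25 GB
```

The point is only that the loss is proportional to whatever is actually mapped, not to the total VRAM across all three cards.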
JarredWalton - Tuesday, December 18, 2007 - link
Testing at high resolutions with ultra-insane graphics settings serves one purpose: it makes hardware like Quad-SLI and Tri-SLI appear to be much better than it really is. NVIDIA recommended 8xAA for quad-SLI back in the day just to make sure the difference was large. It did make QSLI look a lot better, but when you stopped to examine the sometimes sub-20 FPS results it was far less compelling.

Run at 8xAA on a 30" LCD at native resolution, and it's more than just a little difficult to see the image quality difference over 4xAA, sometimes at half the frame rate of 4xAA. A far better solution than maxing out every setting possible is to increase quality where it's useful. 4xAA is even debatable at 2560x1600 - certainly not required - and it's the first thing I turn off when my system is too slow for a game. Before bothering with 8xAA, try transparency supersampling AA. It usually addresses the same issue with much less impact on performance.
At the end of the day, it comes down to performance. If you can't enable 8xAA while keeping frame rates above ~40 FPS (and minimums above 30 FPS), I wouldn't touch it. I play many games with 0xAA and rarely notice aliasing on a 30" LCD. Individual pixels are smaller than on 24", 20", 19", etc. LCDs, so it doesn't matter as much, and the high resolution compensates in other areas. Crysis at 2560x1600 with Very High settings? The game is already a slide show, so why bother?
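The pixel-size point checks out with simple geometry (a quick sketch; the panel sizes are nominal diagonals of typical monitors of the era, not specific review hardware):

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_in):
    """Pixel density from native resolution and nominal diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

print(f'24" 1920x1200: {pixels_per_inch(1920, 1200, 24):.0f} PPI')  # ~94 PPI
print(f'30" 2560x1600: {pixels_per_inch(2560, 1600, 30):.0f} PPI')  # ~101 PPI
# Higher PPI on the 30" panel means smaller individual pixels, so aliasing is less visible.
```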
0roo0roo - Tuesday, December 18, 2007 - link
Faster is faster; the best is expensive and sometimes frivolous. At that price point you aren't thinking like a budget buyer anymore. Like exotic cars, you can't be that rational about it. It's simply power... power NOW.
crimson117 - Tuesday, December 18, 2007 - link
If it's true that it's all about power, then just find the most expensive cards you can buy, install them, and don't bother playing anything. Also, tip your salesman a few hundred bucks to make the purchase that much more expensive.
0roo0roo - Tuesday, December 18, 2007 - link
Look, it's not like you don't get any advantage from it. It's not across the board at this point, but it's still a nice boost for any 30" gamer.
Seriously, there are handbags that cost more than this SLI stuff.
JarredWalton - Tuesday, December 18, 2007 - link
Next up from AnandTech: Overclocked Handbags!
Stay tuned - we're still working on the details...