54 Comments
DDH III - Thursday, October 23, 2008 - link
So you're saying that when my first 9800 GX2 gets here in the mail next week, it will actually help my landlord with his heating bill? That's good, because he pays for the electric too, and I feel kind of bad already. Heating bills in Interior Alaska are nasty, but the electric is just as bad... anyway.
Great review. Though I won't be able to run these in quad on my current mobo. But then, I never liked FPS games much since Quake, which brings me to my point:
These days I play insane amounts of Supreme Commander. It is CPU intensive. If you are looking for another benchmark when you start cranking up the GHz... well, just give it a look.
But to say a bit: an 8-player game is CPU pain. I only have dual cores, and my first dual cores were AMD (my games never needed the most rocking FPS), but I had to go C2D just for game speed... forget the eye candy.
In short, I read this article and went... huh... me too.
Anyway, I am one of those Dell 3007 owners, and I learned a lot. TY. And I don't live in North Pole, AK; I only work there.
luther01 - Friday, May 9, 2008 - link
Hi guys. I have a quad 9800 GX2 system as follows:
Q6600 processor @ 2.8GHz
4GB PC2-6400 RAM
780i motherboard
2x 9800 GX2
I must say that so far I haven't noticed any performance gains over one 9800 GX2, and in some games, like The Witcher, I have actually had a performance drop of 10-15 fps. In Crysis, the frame rate at Very High settings is disappointingly low, sometimes dropping to 15 fps in large environments. I think maybe there is another bottleneck in my system.
Das Capitolin - Sunday, April 27, 2008 - link
It appears that NVIDIA is aware of the problem, yet there's no word on what or when it might be fixed.
http://benchmarkreviews.com/index.php?option=com_c...
Mr Roboto - Sunday, March 30, 2008 - link
Over at TweakTown they're coming up with the same results as Anand did, even with the 9800 GTX.
Is it possible a lot of the problems are because the G92/G94 has a smaller 256-bit memory interface? Could it cause that much of a difference, especially combined with less VRAM? Just a thought.
http://www.tweaktown.com/articles/1363/1/page_1_in...
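For a rough sense of the gap being asked about: peak memory bandwidth is just bus width times effective memory clock. A back-of-the-envelope sketch (the clock figures below are assumptions for illustration, not exact shipping specs):

    # Rough peak-memory-bandwidth comparison (illustrative clocks, not exact specs).
    # bandwidth (GB/s) = (bus_width_bits / 8) * effective_memory_clock (GT/s)
    def peak_bandwidth_gbps(bus_width_bits, effective_clock_gtps):
        return bus_width_bits / 8 * effective_clock_gtps

    cards = {
        "9800 GX2, per GPU (256-bit)": (256, 2.0),   # assumed ~2.0 GT/s GDDR3
        "8800 GTX (384-bit)":          (384, 1.8),   # assumed ~1.8 GT/s
        "8800 Ultra (384-bit)":        (384, 2.16),  # assumed ~2.16 GT/s
    }
    for name, (width, clock) in cards.items():
        print(f"{name}: ~{peak_bandwidth_gbps(width, clock):.0f} GB/s")

Even at similar memory clocks, the narrower 256-bit bus leaves each G92 with noticeably less bandwidth than the 384-bit G80 boards, and that shortfall would show up most at high resolution and with AA.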
ianken - Thursday, March 27, 2008 - link
...logging CPU load shows it evenly loaded across four cores at about 50%. But then, I'm running a lowly 8800 GTS (640MB) SLI setup.
The numbers in this review just seem kinda off.
DerekWilson - Friday, March 28, 2008 - link
What settings were you testing on?
We see that High and Very High quality settings have an increasing impact on the CPU... the higher the visual quality, the higher the CPU load. It's kind of counterintuitive, but it's a fact.
You actually might not be able to hit the CPU limit at Very High settings, because the GPU limit might be below the CPU limit... which seems to be somewhere between 40 and 50 fps (depending on the situation) with a 3.2GHz Intel processor.
seamusmc - Wednesday, March 26, 2008 - link
I like HardOCP's review because, while the Quad SLI solution does put up higher numbers than an 8800 Ultra SLI solution, they pointed out some serious problems in every game besides Crysis.
It appears that memory is a bottleneck, and many games have severe momentary drops in FPS at high resolutions and/or with AA, making the gaming experience worse than with an 8800 Ultra SLI solution. I strongly recommend folks take a look at HardOCP's review.
AnandTech's review only covers average FPS, which does not reveal the kinds of issues the Quad SLI solution is having.
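That point about averages is worth spelling out: the average can look healthy while a handful of long frames ruins the experience. A minimal sketch of the idea, assuming a hypothetical per-frame time log (for example, a FRAPS frametimes dump):

    # Average FPS vs. the worst 1% of frames, from a per-frame time log (in ms).
    frame_times_ms = [16.7] * 990 + [120.0] * 10   # mostly 60 fps, with ten 120 ms hitches

    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

    worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
    one_pct_low_fps = 1000 / (sum(worst) / len(worst))

    print(f"average fps: {avg_fps:.1f}")          # ~56 fps, looks fine on a bar chart
    print(f"1% low fps:  {one_pct_low_fps:.1f}")  # ~8 fps, the stutter the average hides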
B3an - Wednesday, March 26, 2008 - link
Thanks for the mention of the HardOCP review. A lot better than Anand's.
Very disappointed with Anand on this article. I posted a comment asking why I was getting such bad frame rates at high res with AA, and Anand did not even address it, when they must have run into it at 2560x1600, and just about all the games they tested at 2560x1600 get killed with AA because of the memory bottleneck. I'm talking from trying this myself. If I had known about it I wouldn't have got a GX2, as it's pretty pointless with a 30" monitor.
So are they getting paid by NV or what?
Very disappointed.
AssBall - Wednesday, March 26, 2008 - link
"Very disappointed with Anand on this article. I posted a comment asking why i was getting such bad frame rates at high res with AA, and Anand did not even address this."They are trying to answer their own questions and solve their own problems right now. This comment section is for comments on the article, not your personal technical support line to Anand.
B3an - Thursday, March 27, 2008 - link
When I said "I posted a comment asking why I was getting such bad frame rates at high res with AA, and Anand did not even address this,"
I meant: why has this website not addressed the issue in any of their GX2 articles, as any decent website would, especially since it's not hard to run into? A high-end card like this is for people with 24" and 30" monitors like myself; for anything lower than 1920x1200 a GTX would be more than enough. Yet the card has massive memory bandwidth problems and/or not enough usable VRAM, so frame rates get completely crippled once a certain amount of AA is used in games.
Making the card pretty pointless.
strikeback03 - Wednesday, March 26, 2008 - link
You posted your comment 20 hrs ago. I don't see any comments posted by any AnandTech staff in this review since that time, so there's no guarantee they have actually read your comment.
gudodayn - Tuesday, March 25, 2008 - link
< Layzer253 ~ No, they shouldn't. It works just fine >
Given that the 9800 GX2 is a more powerful card than the 3870 X2, the 9800 GX2 will run into CPU limitations..........
Then in theory, when using the same platform (same CPU, RAM, etc.), shouldn't the 9800 GX2 score at least within the ballpark of the 3870 X2??
It would just mean that with a faster CPU the 9800 GX2 would have lots of room for improvement, whereas the 3870 X2 wouldn't!!
But that's not what's happening here, is it??
For some reason, the 3870 X2 in CrossFire is scaling a lot better than the 9800 GX2 in SLI ~ in a lot of the tests!!
Either the SLI drivers are messed up or the 9800 GX2 can't run quad GPUs effectively.........
LemonJoose - Tuesday, March 25, 2008 - link
I honestly don't know why the hardware vendors keep trying to push these high-end solutions that aren't stable and don't work the way they are supposed to. Exactly who is the market for $500 motherboards that require expensive RAM designed for servers, $1000 CPUs that will be outperformed by mainstream CPUs in a year or less, and $600 video cards that have stability issues and driver problems even when not run in SLI mode?
I would really love it if AnandTech and other hardware sites would come out and give these products the 1-out-of-10 or 2-out-of-10 review scores they deserve, and tell the hardware companies to spend more time developing solutions for real enthusiasts with mainstream pocketbooks, instead of wasting their engineering resources on high-end solutions that nobody wants.
strikeback03 - Wednesday, March 26, 2008 - link
Dunno if anyone has actually bought Skulltrail, but obviously people do buy $1000 CPUs and $500 video cards, as some GX2 owners have already posted in this thread, and some people previously bought the Ultra even when it was well over $500. Not something I would do, but there are obviously those with the time and money to play with this kind of thing.
B3an - Tuesday, March 25, 2008 - link
To the writer of this article, or anyone else... I have this strange problem with the GX2.
In the article you're getting 60+ FPS @ 2560x1200 with 4xAA. If I do this I get nowhere near that frame rate. Without AA it's perfect, but with AA it completely kills performance at that res.
At first I was thinking that the 512MB of usable VRAM was not enough for 2560x1600 + AA, so I get a slideshow with 4xAA at that res. But from these tests, you're not getting that.
What backed up my theory is that in games with higher-res/bigger textures, even 2xAA would kill frame rates. I'd go from 60+ FPS to literally single-digit FPS just by turning on 2xAA @ 2560x1200. But with older games with lower-res textures I can turn the AA up a lot higher before this happens.
Does anyone know what the problem could be? Because I'd really like to run games at 2560x1600 with AA, but cannot at the moment.
I'm on Vista SP1, with 4GB RAM and a quad core @ 3.5GHz.
I've also tried 3 different sets of drivers.
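A back-of-the-envelope check of the VRAM theory (a rough sketch only; real usage also includes textures, render targets, shadow maps and driver overhead):

    # Approximate framebuffer footprint at 2560x1600 with 4x MSAA (illustrative).
    width, height = 2560, 1600
    bytes_color, bytes_depth = 4, 4        # 32-bit color, 24/8 depth-stencil
    msaa = 4

    color   = width * height * bytes_color * msaa   # multisampled color buffer
    depth   = width * height * bytes_depth * msaa   # multisampled depth buffer
    resolve = width * height * bytes_color          # resolved back buffer

    print(f"~{(color + depth + resolve) / 2**20:.0f} MB for the MSAA framebuffer alone")

Roughly 140MB goes to the framebuffer before a single texture is loaded, so a 512MB card at that resolution with AA has very little headroom left, which fits the single-digit drops described above.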
nubie - Tuesday, March 25, 2008 - link
OK, it is about time socketable GPUs were on the market. How about a mATX board with an edge-connected PCIe x32 slot? Make the video board ATX compliant (i.e. video board + mATX board = ATX compliant).
Then we could finally cool these damn video cards, and maybe find a way of getting them power that doesn't get in the way.
You purchase the proper daughterboard for your class (mainstream, enthusiast, overclocker/benchmarker), and it comes with the RAM slots (or onboard RAM) and proper voltage circuitry. Then you can change just the GPU when you need to upgrade.
I know it would be hard to implement and confusing, but it would be less confusing than the current situation once we got used to it, and it would be a hell of a lot more sane.
You could use a PCIe extender to connect it to existing PCIe mATX or full ATX motherboards.
It is either this, or put the GPU socket on the damn motherboard already; it is 2008, it needs to happen. (If the video card were connected over HyperTransport 3 you wouldn't have any bandwidth problem.) The next step really needs to be a general-purpose processor built into the video card, so you aren't CPU/system bound like the current ridiculous setup. (The 790i chipset seems to help, with the northbridge able to batch commands across the GPUs, but at minimum we need to see some physics moving off the CPU; general physics haven't changed since the universe started, so why waste a general-purpose CPU on them?)
nubie - Tuesday, March 25, 2008 - link
Just to reiterate: why am I buying a new card every time when they just seem to be recycling the reference design? All GPU systems need the same things as a motherboard: clean voltage to the main processor and its memory, a bus for the RAM, and some I/O. The 7900, 8800 and 9600 are practically the same boards with the same memory bus, so can't we have it socketed?
tkrushing - Tuesday, March 25, 2008 - link
I am all for SLI/CrossFire or whatever you can afford to do, but why are we starting to lose focus on single-card users? I'm sure if I really wanted to sacrifice I could save up for a multi-GPU system, but a single G92 is less than half the price for a relatively small performance hit in Crysis. And yes, I love the game, but I'm saying it: I just think Crysis is a poorly optimized game to some degree. Give us new, not reused, single-card solutions! (9 series)
tviceman - Tuesday, March 25, 2008 - link
I have been thinking the same thing lately, but last week, after reading about Intel's plans to use one of the cores in its CPUs as a graphics unit, I started thinking about the ramifications of this. I am willing to bet that NVIDIA is trying to develop a hybrid CPU/GPU to compete on the same platform that Intel and AMD will eventually have. If this is true, it's probably a reasonable explanation for the severe lack of all-new GPUs since the launch of the 8xxx series a year ago.
dlasher - Tuesday, March 25, 2008 - link
page 1, paragraph 2, line 1:
Is:
"...it’s not for the feint of heart.."
Should be:
"...it’s not for the faint of heart.."
iceveiled - Tuesday, March 25, 2008 - link
I understand Crysis is a good game to test the muscle power of video cards, but if anybody out there hasn't played the game yet and wants the best setup for it, please don't spend $1200 on video cards. I've played through Half-Life 2 numerous times, Call of Duty 4 numerous times, and Crysis only once. Once you get over the wow factor of the graphics, it's not that amazing of an experience....
mark3450 - Tuesday, March 25, 2008 - link
In this and the last article on the 9800GX2, the following benchmarking data on Crysis @ 2560x1600 has shown up in a chart:
9800GX2 - 8.9FPS
8800Ultra - 16.3FPS
8800GT - 12.3FPS
Now I look at this and say the FPS scaling you get by adding a second card is generally around 50% to 60% in the best case. If we assume that, then 2x 8800 Ultra would be getting around 25 FPS, which is getting into playable range, especially with the motion blur that Crysis uses. Obviously this assumes decent scaling, but this data just screams give it a try.
On a slightly related note, I also see that the same Crysis chart shows two cards scaling roughly linearly with resolution up to 2560x1600 (8800GT and 8800 Ultra) while the others show a sharp drop at 2560x1600 (9800GX2, all AMD cards). This makes me ask what's different about these two groups of cards. One common feature I note is that the cards that scale linearly are all using PCIe 2.0, while the ones that drop off sharply @ 2560x1600 are using PCIe 1.x (the 9800GX2 is externally PCIe 2.0, but internally the two cards are connected via PCIe 1.x). Maybe it has nothing to do with the type of PCIe connection, but it certainly correlates.
Basically, all this makes me think that for gaming at 2560x1600 I'm likely better off with two 8800 Ultras (or even 8800 GTXs) than with one or even two 9800GX2s (and since I, and a lot of people interested in gaming on high-end rigs at 2560x1600, likely have an 8800 GTX/Ultra already, it would be far cheaper as well). This is of course all speculation, since there are no reported benchmarks for 8800 GTX/Ultra in SLI mode in these comparisons, which is why I'd like to request them. :)
-Mark
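The arithmetic behind that ~25 FPS estimate, as a quick sketch (the 50-60% scaling factor is the assumption carried over from the comment above, not a measured figure):

    # Projecting dual-card FPS from a single-card result under assumed SLI scaling.
    single_8800_ultra_fps = 16.3          # from the Crysis chart quoted above

    for scaling in (0.5, 0.6):            # assumed benefit from the second card
        print(f"+{scaling:.0%} scaling -> ~{single_8800_ultra_fps * (1 + scaling):.1f} fps")
    # prints ~24.5 and ~26.1 fps, i.e. the "around 25 FPS" ballpark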
mark3450 - Tuesday, March 25, 2008 - link
Turns out HardOCP has a review up at
http://enthusiast.hardocp.com/article.html?art=MTQ...
(can't insert a proper link for some reason)
that compares 2x 9800GX2 with 2x 8800GTX. The short summary is that 2x 8800GTX is better than 2x 9800GX2 at high-res gaming. The 9800GX2s often have higher average frame rates than the 8800GTXs, but the 8800GTXs have much more consistent frame rates (the 9800GX2s often had their frame rates crash to unacceptable levels for short periods, whereas the 8800GTXs were playable throughout).
Essentially, it looks like I am better off getting a second 8800GTX rather than one or two 9800GX2s for gaming at 2560x1600, and it's way cheaper to boot.
I will still wait till next week to see how the 9800GTX performs, but given the leaked info on it and NVIDIA's recent history of anemic releases, I'm not holding out much hope for the 9800GTX.
-Mark
zshift - Tuesday, March 25, 2008 - link
In the second paragraph you noted Skulltrail as having 2x LGA775 sockets, but I'm pretty sure it has LGA771 sockets only. If I'm mistaken, I apologize; if I'm right, please correct the error so other less knowledgeable readers don't receive false information.
Tilmitt - Tuesday, March 25, 2008 - link
You guys shouldn't be using Skulltrail to benchmark games. It's not a gaming platform. Most games run slower on it than on a single-socket quad-core system due to the FB-DIMMs. It provides a sub-optimal environment for both SLI and CrossFire, which negates any value it might have for levelling the playing field there. I think the author is letting his personal desire to use the Skulltrail system get in the way of doing a proper review. The fact of the matter is that Skulltrail is slow for games and doesn't reflect how the vast majority of people would run their SLI and CrossFire setups.
As for the multi-GPU-ness, I think you'd have to be mad to buy them given the price and horrendous scaling. As always, the next generation of cards will mostly outperform a multi-GPU system at less cost, with less power consumption and more consistent performance across all games.
tynopik - Tuesday, March 25, 2008 - link
faint of heart, not feint ;)
cactusjack - Tuesday, March 25, 2008 - link
This is my point. The testers here had "some problems," and these guys are very experienced and technically savvy. They also have access to a lot of PSUs, RAM, etc. to try if things don't work right. If it were a car or a television it would be sent back for what it is: a failure, and a lemon. Why do we accept it with PC parts?
Inkjammer - Tuesday, March 25, 2008 - link
Case in point, I have the 9800 GX2:
* I cannot run multiple monitors with SLI enabled, so I have to swap between my 24" monitor and my Wacom Cintiq 21". When I change over, the drivers won't auto-detect the resolution, and they use resolutions and refresh rates the Wacom doesn't support, so I get an "out of signal" error. I have to disable SLI to use my $2,500 art tablet as a secondary monitor.
* I'm a graphic designer, and I can't take screenshots anymore without them coming out garbled like this:
http://www.inkjammer.com/broken_screencaps.jpg
I could find workarounds (get a screen-capture program, or just disable SLI), but this is all basic functionality gone bad.
There are a LOT of little problems that could impede testing without being visible. The fact that SLI breaks basic functions like multi-monitor setups and screen capture in Vista is puzzling. These drivers feel like betas lacking basic functionality. If I even try Crysis with 8x FSAA my entire system crashes.
dare2savefreedom - Tuesday, March 25, 2008 - link
Did you report this to NVIDIA?
http://www.nvidia.com/object/vistaqualityassurance...
Inkjammer - Tuesday, March 25, 2008 - link
I upgraded from an 8800 GTX to a 9800 GX2, and I'm really frustrated with the drivers; there are a lot of issues with running it in "SLI". The card has a lot of raw performance, but seeing it doubled up with two cards...
Costs aside, it really seems like anything beyond 2 GPUs at this point in time is rather useless. The technology is there, but the drivers are still too immature, and the rest of the tech required to make it useful hasn't caught up.
Lorne - Tuesday, March 25, 2008 - link
I disagree. It's in every developer's best interest to flex their e-muscles when they can. It keeps the competition between them going, and it also keeps prices down and the next technological advances coming to us.
What I do like in a lot of articles like this one, and a few others I've read, is the idea of the three hardware giants almost putting their heads together to solve a common problem area.
I wanted to put a quote here about a mention of Crysis being a single-threaded program but couldn't find it again. Did I read this wrong, or is it true that 7 cores generally sat idle? That would be bad programming, not a hardware limitation. A spec of CPU utilisation would also be good in the testing of these game demos.
The other comment I wanted to bring up is that FB-DIMM memory is slower than unbuffered DDR. This could be the limiting factor and, along with the aforementioned, why the swap to the 780i setup did better.
DerekWilson - Thursday, March 27, 2008 - link
"I wanted to put a quote here about a mention of Crysis being a single thred program but couldnt find it again, Did I read this wrong or is it true that 7 cores generaly sat idle, That would be bad programming not a harware limitation, A spec of how CPU utilisasion would also be good in the testing of these game demo's "it's not bad programming really... there are some things you just CAN'T split up to run in multiple threads without adding more sync overhead than performance from parallelism.
In any case, I did a quick test here ---
Crysis seems to have 3 main gameplay threads that do most of the heavy lifting. One bounces around at some pretty high utilization.
The other 5 threads are sitting at between 10 and 20% utilization.
Overall during gameplay on skulltrail we see total cpu utilization (average of all cores) at between 20% and 30%.
Moving beyond 4 cores should (and does) have zero impact on crysis with this information.
Two cores would likely even provide enough power to get by as two of the 3 cores that were more than 20% active sat betwenen 30% and 50% utilization each. Taking these two threads, if they were to run on one core, you'd never see more than 80% utilization.
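For anyone who wants to repeat that kind of quick check, here is a minimal sketch of one way to sample per-thread CPU time of a running game from outside it, using psutil (the process name and sampling interval are assumptions; this reports CPU time per thread, not which core it ran on):

    # Sample per-thread CPU usage of a running game process over a short window.
    import time
    import psutil

    # Assumed executable name; adjust to whatever the game's process is called.
    proc = next(p for p in psutil.process_iter(["name"])
                if (p.info["name"] or "").lower() == "crysis.exe")

    interval = 5.0
    before = {t.id: t.user_time + t.system_time for t in proc.threads()}
    time.sleep(interval)
    after = {t.id: t.user_time + t.system_time for t in proc.threads()}

    for tid, end in sorted(after.items(), key=lambda kv: kv[1], reverse=True):
        busy = end - before.get(tid, 0.0)
        print(f"thread {tid}: ~{100 * busy / interval:.0f}% of one core")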
tviceman - Tuesday, March 25, 2008 - link
How many people own Skulltrail platforms and have dual 9800GX2s? Ten. There are ten people with this setup. For everyone else, it's a pipe dream so far-fetched that I'd have a better chance of winning the local lottery than of owning this kind of system.
Seriously though, there are significantly more cons than pros to using Skulltrail to benchmark video card performance. The raw power of 8 CPU cores is great in theory, but it's not translating into real-world gaming performance (in some cases it's hurting).
Video card reviews would be better served by the fastest quad-core CPU available, paired with the highest-performing motherboard out there and an excellent CPU cooler to allow for a maximum overclock.
charlie brown - Wednesday, April 2, 2008 - link
lol, I agree. No one will be able to afford this type of setup, and if they did it would be a waste of money.
I agree that AnandTech should test realistic equipment aimed at the enthusiast crowd rather than the rich kid with Skulltrail. Try a QX9650 and an E8500 and see what happens with the benches.
Graphics drivers are not mature enough for multi-GPU SLI, and games are not mature enough for 4 cores; this review makes spending all that money look like nothing but a waste of hard-earned cash!!!
SniperWulf - Tuesday, March 25, 2008 - link
I wholeheartedly agree. Not only is it too expensive, but it's not practical. What enthusiast do you know who would actually buy a setup like that? None I know, and probably not any on the forums either.
Sure, you want to test apples to apples... but the true apples-to-apples test is the hardware people can actually get off Newegg or ZZF: 780i and X38 boards with cheap but good DDR2 or DDR3 (well, skip the cheap on DDR3, lol) and a nice Penryn-core CPU.
legoman666 - Tuesday, March 25, 2008 - link
The reason they use Skulltrail in all of the recent graphics card benchmarks is that it's the only chipset that supports both SLI and CrossFire. It's the only way you'll see an apples-to-apples comparison. So stop your complaining.
Inkjammer - Tuesday, March 25, 2008 - link
I agree on that. My E6600 with a 9800 GX2 doesn't get near the performance AnandTech got in their review. In fact, the performance was still great, but really disappointing by comparison. Then I realized the benchmark was done with 6 more cores than I have.
The huge CPU power slightly skews realistic performance expectations for an otherwise high-end PC. Great for showing a card's potential, not great for the performance you can realistically expect.
DerekWilson - Tuesday, March 25, 2008 - link
My numbers do not change if I pull one processor.
I tested that -- the number of cores does not matter, only the speed of the cores.
tviceman - Tuesday, March 25, 2008 - link
Which was along the lines of the primary point I was making. Why not just use the highest-performance motherboard available and a single quad-core processor overclocked like crazy? At least then you're still using the best processor and best mobo out there, both of which can be had in a custom system for what the (almost) general masses can afford.
I think there is a time and place for extreme high-end reviews. But when extremely high-end hardware is used in EVERY review, performance expectations that apply to the masses don't exist. I like your reviews; you're thorough and you write well. It's just that reading these types of reviews consistently is more like listening to an extremely wealthy individual brag about all his toys. And by no means am I calling you a snob; hardware reviews are part of your job as well as a privilege. I will never, in the next few years, meet anyone with a system as expensive as what hardware reviewers regularly test with.
tviceman - Tuesday, March 25, 2008 - link
Sorry I used the word review in every single sentence. I was typing in a hurry and didn't proofread.
And to make it clear once again, you do a great job reviewing hardware and I enjoy all the articles put out by everyone on AnandTech. I just question the use of extreme high-end hardware in EVERY review (like the 9600GT vs. 3870; is Skulltrail necessary there?).
DerekWilson - Wednesday, March 26, 2008 - link
We've looked at using the HP Blackbird...
The major reason I want to use Skulltrail is to compare CrossFire to SLI.
There are plenty of reasons I'd rather use another platform, but I'd love it if either AMD or NVIDIA would take a step outside the box and either enable their platform to support other multi-GPU configurations, or enable their drivers and cards to run in multi-GPU configurations on other platforms.
7Enigma - Tuesday, March 25, 2008 - link
It's a slippery slope. We want to be able to say NVIDIA is better than ATI (or vice versa), or that future scaling due to architecture will make A better than B. The truth, as you pointed out, is that it doesn't happen this way in all cases. You can have an A64 3200+ (like I currently do) and throw a GX2 at it, and it probably won't perform better than a 9600GT. But that doesn't mean it's an equal card, just that in a particular situation it performed the same. It's up to the buyer to do their homework and figure out whether it's worth it to drop $650 when their system, in current dollars, might not be worth the price of the card....
HardOCP tried this tactic with the launch of the C2Ds and got a lot of heat (I agreed with the anger). They were trying to show that most games are GPU limited and so the new CPUs showed no benefit. Of course, it was only a small selection of games, they didn't take into consideration the two most CPU-straining genres (RTS and flight sim), and they were running at very high resolutions (which obviously would make things GPU bound).
One of GameSpot's good features is their game coverage (pretty much all I use them for). They put out hardware guides that do exactly what you want: they take a specific game and throw a battery of systems at it to see what makes a difference. Those guides show just how much improvement you can expect when going from a certain graphics card to a better one on a specific CPU platform. You'll be able to see whether your CPU/mobo combination is at the end of its usable lifespan (i.e. upgrading other components such as the GPU or RAM no longer yields a large improvement because of a CPU or other bottleneck).
I would say that, while like you I read these reviews knowing I'll likely never own one of these cards/CPUs, they do give a good picture of who is at the top, and therefore who has the potential to outlast the other in a system before an upgrade is needed.
With all that said, there is clearly a major problem going on, so all the data generated in this review, and possibly the previous single-GX2 review also benchmarked on the Skulltrail platform, needs to be taken with a grain of salt, realizing that the numbers could very likely be invalid.
DerekWilson - Tuesday, March 25, 2008 - link
I doubt there is a major problem that would invalidate the data... and I'm not just saying that because I spent weeks testing and troubleshooting :-)
Certainly something is going on, but in spite of the fact that 780i performs a bit higher, the performance characteristics are exactly the same -- if there is an issue with my setup, it is not platform specific.
I'm still tracking issues though ...
chizow - Tuesday, March 25, 2008 - link
Nice job on the review, Derek. It looks like you ran into some problems, but I'd guess testing these new pieces of hardware makes it worthwhile.
It really looks like Quad SLI scaling is poor right now. Do you think it's a case of drivers needing to mature, a CPU bottleneck, or frame buffer limitations? I know Vista should be maxed at 4 frame buffers, but there seems to be very little scaling beyond a single GX2 in everything except Crysis (and COD4). In some games, performance actually decreases with the 2nd GX2.
Also, seeing the massive performance difference between Skulltrail and 780i, is it even worthwhile to continue using Skulltrail as a test platform? I understand it makes it more convenient for you guys to test between different GPU vendors, but a 25% difference in Crysis between an NV SLI solution and Intel's SLI solution is rather drastic, and that's *after* you factor in the 2nd CPU for Skulltrail. Does ATI suffer a similar performance hit when compared against its best-performing chipset platform?
I would've liked to have seen Tri-SLI compared in there. Personally, I think Tri-SLI with the 8800 GTX/Ultra and, soon, the 9800 GTX will outperform Quad SLI, as the drivers seem a bit more mature for Tri-SLI and scaling was better as well. SLI performance with those parts is already slightly better than the GX2, and adding a third card should give Tri-SLI the lead over Quad SLI.
Lastly, how was your actual gameplay experience with these high-end parts? Micro-stutter is a buzzword that has been gaining steam lately with multi-GPU solutions; did you notice any in your testing? It looks like frame buffer size really kills all of these 512MB parts at 2560, so would you consider games at that resolution unplayable? It seems many who considered two GX2s or two X2s would have done so to play at 2560. If that resolution is unplayable, you're looking at an even smaller window of consumers who would actually buy and benefit from such an expensive setup.
seamusmc - Wednesday, March 26, 2008 - link
chizow, check out HardOCP's review with regard to micro-stutter. They do a great job of presenting the issue and how it affects gameplay.
They feel the problem is due to the smaller amount of memory/memory bandwidth on the GX2 as opposed to an 8800 GTX/Ultra.
DerekWilson - Wednesday, March 26, 2008 - link
In my gameplay experience, I had no difficulty with micro-stutter and the 9800 GX2 in Quad SLI.
I will say that I have run into the problem with CrossFireX in Oblivion with 4-way at very high res. It wasn't that pronounced or detrimental to the overall experience for me, but I'll make sure to mention it when I run into this problem in the future.
cactusjack - Tuesday, March 25, 2008 - link
NVIDIA should go back to making good, stable video cards with good IQ instead of flexing their e-muscles with crap like this that no one will ever want or need. NVIDIA had power issues and Vista driver issues on the 8-series (G92) cards that they should be working on.
raymondse - Tuesday, March 25, 2008 - link
Crysis this, Crysis that. SLI this, CrossFire that...
After reading almost a dozen reviews of SLI, Tri-SLI, Quad SLI, and CrossFire running Crysis and a handful of other games, it seems that there is something terribly wrong with all the benchmarks. Test results show that raw multi-GPU horsepower, even when coupled with multiple CPUs, just isn't delivering the kinds of numbers most of us were expecting. The potential computing power this kind of hardware can deliver just doesn't show up in the numbers. Something is really, really wrong with one of these components, and it's disrupting the whole point of going for more than one CPU/GPU.
What I'd like to see is some definitive study showing where the problem(s) lie and who is to blame. Is it the CPU? GPU? Memory? System bus? PCI-E? Drivers? DirectX? Windows? Or the game/application itself?
After all these tests and benchmarks run by really, really smart people, someone out there ought to be able to deduce who messed up in all this business.
Das Capitolin - Wednesday, April 2, 2008 - link
What I dislike about many reviews is that they test Crysis on "High" settings. There's a major difference between "High," which doesn't use AA, and High with 16xQ AA.
Here's an example of the difference it makes.
http://benchmarkreviews.com/index.php?option=com_c...
piroroadkill - Tuesday, March 25, 2008 - link
I agree with this. There's a doubling of raw power with two cards (and a doubling of the price), and until I see double the performance in scaling, SLI and CF can go to hell.
Layzer253 - Tuesday, March 25, 2008 - link
No, they shouldn't. It works just fine.
Thatguy97 - Sunday, June 28, 2015 - link
still a fast setup today...
Nfarce - Thursday, April 7, 2022 - link
On what planet even seven years ago? My 970s in SLI back then wrecked this for less than half the price.