The only thing I'm still wondering about is what the performance difference between the X1800 XT and the 7800 GTX 512 would be if a dual-core processor was used.
Since Nvidia's drivers make use of dual core, the gap might be even wider... but who knows?
I doubt Nvidia cared about its price/performance one bit. This card is just a spin-off to show what 512MB can do... (nothing). You give a GTX a 120MHz core speed bump and a 200MHz RAM bump, and of course it's going to perform better than the GTX 256. Pure marketing. 700 dollars? What target market is this? And even better that they decided to keep the name to make it more confusing. Let's see how much money we can squeeze out of you by putting in more RAM, which vendors already ordered in bulk 8 months ago.
Awesome card. I'm a bit confused, though, as to what Nvidia was trying to release it as.
The boosted clock speeds are typical of the mid-quarter 'refresher' products (same core, with boosted stats), but it is also posing as the same product in a 512MB variation.
I think it should have a new model name (*agree* with Derek!)
LOL! That's what I thought. I wonder if he read the same review as we did? 2x GTs SLI'd are either within a couple of fps, or between 10-20 fps higher. Considering the GT is just over 200 here, I can get two and add an extra 50 for an SLI board (over a non-SLI) and still save 50 notes.
As for soundcards, well, it's not a soundcard review, so I don't want sound complicating anything. If I was really pedantic, I'd say AT's review of the motherboard showed the A8N32 to be several FPS above all the other current mobos, which would skew the data higher than normal boards, but that's not the point; the point is about comparability on the same platform.
Finally, another thanks goes to Derek for the higher-resolution coverage. Although some here are mystified why resolutions above 1600x1200 are mentioned, some of us have TFTs with native resolutions of 1920x1200, and anything lower has to be either stretched or windowed. That's not what I paid all that money to do; I want fullscreen loveliness at full res, and right now, thanks to Derek's patience, I can see that the minimum setup to enjoy all current games with FSAA at 1920x1200 is 2x GTs. Even the GTX 512 isn't quite man enough.
RE: Newegg has it!!!! by viciousvee on: Nov 14, 2005 10:53 PM
Might want to read the review again (just a thought), but the GT SLI pretty much beats the GTX 512MB ("non-SLI") by a few FPS... So I stand by what I said: get TWO 7800 GTs and call it a day...
I have been an open critic of some of your recent work, but this review was fairly solid. Don't sweat someone saying a sentence was a run-on. The review read as an unbiased work and was inquisitive where it needed to be.
Thanks for including the disclaimer about multiple clocks when lowering the core clock. The 256MB vs. 512MB benchmarks were a welcome addition.
Eh. I don't see the point of playing at 1600 yet, so I wouldn't be TOO concerned with needing to upgrade. Obviously, this is an amazing card though, so that can't be said enough.
Anything over 300 dollars for a non-All-in-Wonder vid card is a ripoff though. The X800XL is still more than adequate. Don't listen to the hype.
Um, if you read the review, they said that they had to run two different circuits in their testing room due to the power draw, on two different PSUs. I guess that you could put two measuring devices up, but IDK how accurate that's going to be...
Nice card, BTW. Makes my 9800 Pro look like Intel's integrated graphics...
Yeah, I saw that part. But they don't have to do it on just the SLI'd 512. In fact, they could do it with any card: subtract out two cards' worth of power to find the base consumption of the system, and then subtract that from the single-512 system to get just the 512's consumption.
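Something like this, in Python - the wattages are made-up placeholders just to show the subtraction, not AT's measured numbers:

```python
# Back out one card's draw from wall-socket readings.
# Example figures only - substitute the review's actual measurements.
single_card_system = 280.0  # watts at the wall with one GTX 512 installed
sli_system = 390.0          # watts at the wall with two GTX 512s in SLI

one_card = sli_system - single_card_system    # what adding the second card cost
base_system = single_card_system - one_card   # everything except the card
print(f"one card: ~{one_card:.0f}W, rest of system: ~{base_system:.0f}W")
```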
Sounds to me like the AT lab in question really needs to be looked over by an electrician if the mains circuit can't supply enough power to run a few computers. Here in the UK, even domestic households have 13-amp sockets (which with our 230V mains voltage is equivalent to 3kW), and a typical ring main circuit will be rated for some 30 amps (or 7kW). You can run a helluva lot of computers with up to 7kW of juice available on each ring, and this is just an average home. Of course the electric bill will be a bit scary if you do use that much :)
I know things aren't so good in the US as you are only on a 110V mains supply, so the wires need to be a lot thicker to carry the higher current (nearly 30 amps for 3kW, and over 60 amps for 7kW), but you still shouldn't have a problem drawing say 2kW or so. Or is it normal for US mains wiring to be rubbish?
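The arithmetic, for anyone who wants to check my figures - it's just I = P / V:

```python
# Current required for a given load at UK (230V) and US (110V) mains voltages.
def amps(watts: float, volts: float) -> float:
    return watts / volts

for watts in (3000, 7000):
    print(f"{watts}W: {amps(watts, 230):.1f}A at 230V (UK), "
          f"{amps(watts, 110):.1f}A at 110V (US)")
# 3000W: 13.0A at 230V (UK), 27.3A at 110V (US)
# 7000W: 30.4A at 230V (UK), 63.6A at 110V (US)
```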
Considering this cooler uses heat pipes, how does the orientation of the card affect performance? It looks to me like if it's positioned with the cooler on the downside, as in a normal ATX tower, the heat is traveling downwards instead of up as it should. This should in theory affect performance negatively. Granted, the angle and distance are not that great, but it would be nice to know.
The reason I ask is that many of the tests done on review sites are done on an open test bed with a desktop-style orientation of the motherboard. How AnandTech tests I don't know, but if orientation affects cooling, the reviews seen today might be off in temperatures, sound and I guess overclocking.
Other comments:
As always on AT, I miss minimum fps and/or some timescale showing how much of the time the card drops under, say, 30 fps or whatever is "unplayable" in different titles.
CPU scaling would be nice too, but I guess that is for another article.
I also really think you should consider building up some testing methodology for sound. It does not have to be exact. Just use quiet watercooling like the Reserator from Zalman and a passive PSU, soundproof the test area a bit, and buy a good sound meter, and you should be set to go. Considering the budget of AT and the considerable benefit for the readers, I can't see how that would be a high cost. Noise matters!
Just thought I'd drop by and tell you all that Black & White 2 runs just fine on an R9600XT/A64 3000+ (S754) machine at 1280x1024, so it's not as demanding as you made it out to be, if you only turn off some GPU-punishing quality settings. Thumbs up to Lionhead for that.
Nice card, nVidia! Now that you've crushed ATi AGAIN, how 'bout you get to work on the 7600GT, eh???
Yeah, ATi have become the Intel of GPUs (high clocks, not-so-good performance, failed launches), and nVidia looks even better than AMD (relatively low clocks, high performance, great launches). But let's not forget ATi has the Xbox 360 just around the corner while nVidia still has some time to play until the PS3 comes out, so I'd guess they had more spare time on their hands or something.
As for my pseudo X850 XT looking like a dog in these graphs, let's not forget we're talking 1600x1200 @ HIGHEST QUALITY!!! here. It's still a fine card for the money (born from an X800GTO2 ;-) though CoD2 is a real performance hog and I don't really know why. Probably some stupid quality setting we'd be better off without.
2005 was nVidia's year, but I'm willing to bet in 2006 we'll see ATi coming back strong. R580 should be the breakthrough, I think.
quote: Just thought I'd drop by and tell you all that Black & White 2 runs just fine on an R9600XT/A64 3000+ (S754) machine
I've got Black and White 2 and a Radeon 9550, and it works generally fine... problem is, I can't change detail levels of features such as vegetation or water detail to at least a decent state. It's unavailable to me, and I've sent an email to Lionhead and they haven't replied yet. Do you need some high-end card to get it to work or something? Lionhead obviously didn't realize that not everyone in the world is going to buy a new computer every day... which really does annoy me.
At least one picture I've seen (Ars) shows vents in the second slot backplane. The air seems to blow in both directions. Which is better than nothing, but I might still have to make some kind of ducting to stop the fans at the front of my case blowing into the open end of the shroud. This is one reason I prefer water cooling, but I'm too wary of cooking the RAM if it's not fully cooled.
Good review, good to see some high(er) resolutions being benchmarked. Thanks for the efforts, people.
Just wondering, have any of the card's other specs changed? Is it still one dual-link and one single-link DVI (the latter run from the chip, the former from external SiI parts)? (Since the 512MB 6800 Ultra was dual link and the Quadro FX4500 is dual dual-link, I thought I'd check.) I'm still hoping someone will get around to testing the G70's DVI quality on the single-link output for me, after the issue with the 6800.
I don't suppose nVidia took the opportunity to stick some of SiI's HDCP-capable TMDS transmitters on it, did they? They're playing catch-up with the X1800, and it would be a good time for them to spend the extra few dollars on fixing it.
I'd be quite interested in some audio measurements of the fan, too.
Speaking of which, is the airflow actually useful with the Quadro fan? I've got a lot of air blowing from the front of my case to the back, and I've suspected that the overheating issues I've seen with my 6800 are because the card's fan is fighting the case airflow (for some reason nVidia's fans seem to blow the wrong way round).
For the price, this is really for people with an "I don't care what it costs" (big) budget! Get two GTs (7800 ones) and call it a day. Anyways, good article, but I would like to see more benches with WoW (World of Warcraft, even though it doesn't support SLI setups) and with two setups rather than one: one with the AMD 3500+ and one with the FX-57!
You mean, get two GTs (which would cost about the same as one of these, while offering far less performance)? No thanks. If I were to spend $6-700, I'd go for the faster solution. Which means this card.
As for the rest, well, why is it relevant? AT is a hardware site, reviewing hardware. They're not benchmarking games to find "the best WoW card", they're benchmarking to find the best card overall. As for the CPUs, what would it add to a review of a card like this? Again, the purpose isn't to tell you "how many fps would you gain if you upgraded your CPU to an FX-57?". It's to test this card versus the competition.
Might want to read the review again (just a thought), but the GT SLI pretty much beats the GTX 512MB ("non-SLI") by a few FPS... So I stand by what I said: get TWO 7800 GTs and call it a day...
[Q=Spoonbender]As for the rest, well, why is it relevant? AT is a hardware site, reviewing hardware. They're not benchmarking games to find "the best WoW card", they're benchmarking to find the best card overall. As for the CPUs, what would it add to a review of a card like this? Again, the purpose isn't to tell you "how many fps would you gain if you upgraded your CPU to an FX-57?". It's to test this card versus the competition.
As for the rest....
You might not understand what I said... Unlike for myself and the millions that play WoW, the game can give your GPU a run for its money, especially when you are in an instance or even in IF, so it would be nice to see how the nVidia/ATi GPUs perform. Not everyone has an FX-57, so I was just thinking that it would be cool to see how "well" the GPUs perform with a high-end CPU vs. a mid-range CPU is all... Don't like it... well, then keep it moving.
I'm honestly sick of ATi and Nvidia producing equipment like this, at very low volume and very high prices. It is not the fault of AnandTech, but this review is like a review of two 50-million-dollar cars, finding out which one goes faster or is quicker.
I honestly don't care if people are willing to pay $700 for a video card; these products have become something for only the rich and glamorous of the computing world. I don't see a point in producing and selling these products except to simply show "who is better".
I had been more impressed with Nvidia because of their ability to sell their GTX in volume from the day it was released, and always gave ATi crap because they have had problems with this as of late. However, both card companies seem to have monopolized the very super uber top end of video cards, and use each other's product pricing/superiority to justify their pricing, and it's ridiculous.
Next generation, there will be a new super king, worth 800 dollars, with only 15 produced, that trumps anything out there. The generation after that, 900, and so on and so on.
Your entire argument is wrong. If people are willing to pay for that kind of performance, then why shouldn't there be a product for them? Yes, the pricing is very high, but so are some people's incomes (or parents'). It is only for the rich and glamorous of the computing world; so what? So are the two 50-million-dollar cars you describe.
I'm sick of people getting all hyped up about high-end hardware; chances are you're just jealous that you don't or can't have it. Get over it.
quote: It is only for the rich and glamorous of the computing world,
You don't have to be rich to afford a $700 video card. You just have to have $700. :) I've spent FAR more than that on things around the house that I consider to be of less value to me than a sparkly new kick ass video card.
Why should the products be only for the rich and glamorous of the computing world? Do these cards cost any more to produce than other cards such as the GTX or the GT? Even if they justify a small % increase in cost, does this equal a 40 to 50 percent increase in price?
If you don't have a problem with such monopolization of a market, then I'm sure you don't have a problem unnecessarily paying extra for things you don't need? Your movie tickets are now not $10, they are now $15. Why? Because they can and people will buy it. Your gas prices are now $3.50 a gallon. Why? Because people will pay it. The price of fast food is now not $5 per meal, but $7.50. Why? Not because there is any reason or increase in cost, but simply because people will pay it.
They have no right to complain; they have a choice, and the industry has no responsibility to give fairness to the consumer, because this is America: we can exploit whomever we want.
I think the essence of what you're trying to say is not that you're comparing the few % difference between $500+ cards, but rather the amount of media exposure they get relative to the amount of sales they generate. That is, if nvidia sells 10,000 of these cards they would be laughing, and there is a whole boatload of reviews praising nvidia for their prowess. Meanwhile, the larger population buying 6800 GS cards and below, which may generate more than 100,000 sales, may have a less informed choice.
This is actually a strategy on nvidia's and ati's part that dates way back and also extends into all sectors of the economy. The thinking behind it is that if you have a class-leading product, then the few buyers of that product will praise it so much that people who can't afford the uber-expensive product will still buy a cheaper product from the same brand. Can't afford a 7800gtx? The 6600gt will be better than the x800xl though, because the 7800gtx is best, right? - that's what they want you to think.
/end repeating drivel ;)
What are you talking about? It's not like these are the only options. You can still buy cheap cards if you want to, so your examples are completely ridiculous. An accurate example would be cars - do you think it costs millions of dollars to make top of the line sports cars? No, but they cost that much because people are willing to pay for them. Still, not everyone can afford that so there are plenty of 10-20K cars for the masses.
And yes, these cards do cost significantly more to produce, because there will be a large number of defective parts when manufacturing a top of the line product. The cost of a card isn't how much silicon is in it, it's how much silicon was used to produce it.
Looks good, but why test only 4xAA? I'd expect people who are interested in those cards will want to set all graphics quality options as high as possible - at least 8xAA and 16xAF or something.
Pity that wasn't tested, but other than that a good review.
AFAIK 4xAA is the highest AA level that's consistent between ATI and NV. The X850 tops out at 6xAA (which NV doesn't have), then there's 8xS, and the list goes on...
That's a beast, no less. The only thing ATI can do now is kick off that mysterious R580, and it had better have a few more pipes than the 520 at the same or even higher clock speeds - and no paper launch this time. Or just give up and get the launch right for the next generation...
Is there any particular reason for only showing nvidia SLI results and no Crossfire numbers at all?
This is something we discussed when working on this article, and there's really no purpose in testing a Crossfire setup at this point. The X1800 Crossfire master cards are not available yet to test an X1800 setup, and as we noted in our X850 Crossfire review, an X850 setup isn't really viable (not to mention it tops out at 1600x1200, and we test two higher resolutions).
Very interesting to see that 512MB has little to no impact on performance - it is instead almost entirely the clock speed of the GPU and the RAM that makes the difference.
Also, I think this is the first time in PC gaming history where I've seen testing done where video cards more than ~9 months old are all essentially 'obsolete' as far as performance goes. Even the 7800 GT, which only came out maybe six months ago, is already near the bottom of the stack in these 1600x1200 tests, and considering that's what anyone with a 19" or greater LCD ideally wants to play at, that's a bit scary. Then you realize that the 7800 GT is around $330 for that bottom-end performance and it just goes up from there. It's really $450-550 for solid performance at that resolution these days. That's disappointing.
no one with a 19" desktop LCD is playing a game at any higher than 1280x1024, in which case this card is basically a waste of money. i have a 20" widescreen lcd and i find myself playing in 1280x1024 a lot because the games often don't expand the field of view, rather they just narrow the screen vertically.
We have seen this in every successive generation of video cards. Unless you're running AA at high res (i.e. over 1280x1024), RAM size has little impact on performance. Heck, 64MB is probably enough for the textures in most games.
You really should have included CoD2 in the tests. I remember seeing a test on another site that showed CoD2 benefited GREATLY from 512MB vs. 256MB of RAM.
Actually, we were hoping to bring you CoD2 benchmarks for this review, but it didn't pan out. We do not equip our video testbeds with sound cards so that we can more accurately compare cards; the problem with this is that we could not get CoD2 to run without sound, and we ran out of time without finding a solution. It's still something we'd like to benchmark in the future if we get the chance, though.
Rename FEAR.EXE to anything else (PHEAR.EXE, TEST.EXE, whatever) when benchmarking ATI cards if you're running any of the latest ATI driver sets, since they have yet to fix a faulty "if" check left over from the FEAR demo that is hindering performance in the full version of the game. (The fix did not make the latest driver release earlier this week.) It has been shown to improve performance by as much as 15 fps.
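If you script your benchmark runs, the rename is a one-liner. A sketch in Python - the replacement name is an arbitrary example; anything other than FEAR.EXE works:

```python
import os

# Renaming the executable stops the driver's FEAR-demo application
# profile from kicking in; "PHEAR.EXE" is just an arbitrary new name.
if os.path.exists("FEAR.EXE"):
    os.rename("FEAR.EXE", "PHEAR.EXE")
```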
I don't know about that FEAR 'fix', though. I mean, how many card owners/PC users will actually know to do that? I think it's more legit to leave the bug in the testing - it is a legitimate bug after all - and wait for the new Catalyst release where it will be 'fixed' and show the increased performance. Or if that's too strong against ATI, publish an article with benchmarks in FEAR highlighting that bug. But for standard comparison benchmarks, I think it's best if they're done in as much of an 'out-of-the-box', load-it-and-play situation as possible.
I disagree with the 'out-of-box' notion. A product can't ship as a turd, but this is an enthusiast site. Enthusiasts should have the knowledge to use the proper drivers (not always the latest, which is why I say proper).
Well, but has this site even published anything on that fix? Not to my knowledge. I only know about it because I'm on the B3D forums where it originated. I imagine that whoever knows it here knows about it from the AT forums. But the fact is that if you're going to include the 'fix' in benchmarks, you might as well have an article preceding it announcing that this fix even exists, don't you think? Not everyone's a forum-goer; I know there was a time once not too long ago where I just went to tech sites and read the articles, not the forums.
First the article describing this fix to the masses - *then* the benchmarks incorporating it. Don't you think that makes sense?
I wish these posts could be edited after the fact, but alas they cannot. Anyway, sorry for the bad spelling above.
Basically though, if we're talking about 'enthusiast' sites, the sites should be publishing 'enthusiast' news like the FEAR.EXE fix, right? Then, after that article, I could agree with its inclusion in benchmarks, because a precedent has been established.
Or they could just write a blurb in the article, when they do the FEAR benches, that you can rename FEAR to anything else and fix the problem. And then bench it both ways.
Seriously though, it deserves its own article. If it doesn't deserve that, it doesn't deserve benches mixed in with a 'general' comparison. The vast majority of people don't even read the text associated with benchmarks anyway, so it would probably go unnoticed by quite a few if it just had a short explanation on the FEAR page of a benchmark round-up.
I see his point and agree with it (but I don't agree with his style).
Sound off is just a way to inflate the numbers and make the cards look good. As a true gamer, what I want to know is the in-game LOWEST fps, with all the options on, like I would play with during the actual game.
I don't buy the idea that sound gets in the way of the results due to inconsistencies among sound cards. When it's ATI vs. nVidia, they use the same freakin' sound for both tests.
Actually, disabling sound makes more sense if you're wanting to isolate the differences between platforms. For example, when comparing chipsets or processor manufacturers, you would want the sound disabled.
I think, though, with a video card, I need to see what it's going to do in gaming conditions, and I want to see the lowest level of performance as well as the average so that I know what to expect.
Now, do you think an IHV wants to support a website throwing out their lowest numbers when the competition is showing their highest numbers on another site? Anandtech gets privileged information, but with that comes some expectation of promotion. Basically, my idea of benchmarking prolly ain't gonna happen (unless I get rich and fund it myself).
quote: Sound off is just a way to inflate the numbers, and make the cards look good.
You, sir, get a double BZZZTTTT!!!! They're NOT inflating numbers; they're isolating the GPUs as much as possible, to test the GPUs without external interference from other components. This is NOT a system test; this is a GPU test. This is not that difficult to grasp.
We're comparing GPUs here. The bottleneck is the CPU and so it makes perfect sense to disable any utilization caused by audio in order to show what these GPUs are capable of - not the system as a whole.
How ridiculous is it that you would defend the review of a GPU that doesn't even show how it would perform in the real world playing the games?
These stupid dick-measuring tests are so utterly pointless when it doesn't help the readership of the site - consumers - to determine how this card will perform in the real world, especially when we know that sound enabled can DRASTICALLY change the CPU usage during gameplay.
It's not THAT hard to have them review with a popular onboard sound solution (ALC850 seems to be on almost every NF4Ultra solution right now) and a popular peripheral sound solution (X-Fi for example) to give the readership a USEFUL review of how the card performs in the real world.
Feel free to keep doing all the current bragging rights tests, but maybe just once have a useful review based in reality.
Dude, calm down. There is no way to do system-level tests, as all systems are different. Do you want them to test every Dell, HP, Gateway, IBM, etc. configuration possible, as well as every DIY configuration possible? Things like motherboards, PSUs, memory types (as well as memory timings), sound cards, etc. all affect performance, and barely anyone on these forums and elsewhere has the exact same rig. Also, adding sound will decrease performance by 5-10 fps. So just interpolate the results.
This article is testing the graphics cards only on a "reference" test system. The title implies as much and no false pretenses were brought forth. If you want to see fps during gameplay, go to hardocp.com. They have a pretty good method similar to what you are asking for.
If Anandtech gave you what you wanted, you would then just complain that they didn't use the right memory timings, the right hard drive, the right motherboard, etc. (and by right I mean what you have in your computer). :)
Sorry, but that's a bogus excuse. They take the time to test all these systems with DOOM 3 even though Quake 4 is out, when they could simply test just Quake 4 instead of both. They take the time to do other extra tests. Simply testing with a popular onboard sound solution and a popular peripheral sound solution is NOT asking a lot, and it's most certainly NOT asking for omg memory timings and omg Dell/HP. Stop exaggerating a simple request into the realm of the unreasonable.
Your "simple request" would triple the number of benchmarks.
>> One set with on-board sound.
>> One set with add-in sound.
>> One set with no sound.
And what will it achieve? It'll tell you that on-board sound stinks and add-in sound doesn't. It muddies the scores of the individual components which is what we're comparing here ... one GPU vs the rest.
For the effects of sound on system performance, read a sound card review.
For the effects of memory on system performance, read a memory review.
For the effects of GPU on system performance, read a GPU review.
For the effects of CPU on system performance, read a CPU review.
For the effects of HDD on system performance, read a HDD review.
For the effects of [component] on system performance, read a [component] review.
When I'm selecting components to put in my system I want to know their individual, isolated score. That's more valuable to me than the mish-mash scores.
Well, I just don't see what enabling sound would show other than translating all the scores 5-10 fps lower. The same sound solution should hurt performance the same regardless of what video card you use, so why don't you just extrapolate the results by taking into account a small performance hit with sound.
At any rate, you would still be making the same buying decision (the reason for reading a hardware review in the first place) based on the relative performance of each video card tested with respect to each other.
All enabling sound with onboard audio solutions would do is shift any bottleneck further towards the CPU. The whole point of graphics-card reviews is to show how the card performs, not whether the rest of the system is holding it back. That's why resolutions up to 2048x1536 are tested.
If you want to minimise audio CPU usage, you don't have to spend much money. An Audigy 2 will off-load almost all sound overhead from the CPU. An X-Fi will do the same but also supports future EAX models, if they make any difference. If your CPU is holding back your gaming performance and you're using onboard audio, the easiest way to better performance is a proper soundcard.
quote: The whole point of graphics-card reviews is to show how the card performs, not whether the rest of the system is holding it back. That's why resolutions up to 2048x1536 are tested.
Dude, how the card actually performs in gaming is how it performs with sound enabled. All we're REALLY seeing in this review is a GPU isolation test. Might as well leave it to the manufacturer to do such things, since it's just dick waving. I'd wager consumers would find it much more useful to see how the card ACTUALLY performs.
quote: All we're REALLY seeing in this review is a GPU isolation test.
EXACTLY!!!! Which IS the point of these tests! They are intentionally isolating the GPUs because... that's what they're testing! LOL! Anand's been testing the latest and greatest for years now. This is NOT something new here.
Holy crap, nearly 300 watts of power just for the GPU! This could be the first card that really puts a gaming system into the realm of NEEDING a 500-watt high-efficiency PSU.
Read the article; it's system power, meaning at the outlet. Actual power drawn from the PSU by the system components, assuming ~75% efficiency, would be around 210W, which isn't all that much if you think about it. A 400W PSU is plenty for this system.
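Rough math, taking ~280W at the wall as an example figure:

```python
# DC load on the PSU from measured wall draw, assuming ~75% efficiency.
wall_watts = 280.0   # example reading at the outlet
efficiency = 0.75    # assumed PSU efficiency
print(f"~{wall_watts * efficiency:.0f}W drawn from the PSU")  # ~210W
```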
The mainstream people are still looking for the best value for $200. I hope ati doesn't overreact and start releasing a bunch of vid cards to gain the title back. Wait 4-6 months for the next iteration. The mainstream wants 60 fps @ 1024. Offer the best bang for the dollar and we'll rave about it.
Don't waste money on tons of iterations. Just lower the cost of the current generation to compete. Anand will do an FPS-vs-$$ comparison soon enough, 'cause that's the real measure of value.
For $700 buy an xbox 360 or take a Winter vacation.
Well, my 6800GT can't give me playable framerates at 1280x1024 in CoD2, so that's what we've come to. I would have thought that a card like this wouldn't be needed until sometime next year, but already the level of hardware required by games has taken a significant jump since Doom 3.
Some users prefer to run their 17" or 19" LCD at the native resolution (1280x1024). This means they want good performance at that resolution. As for those that have bigger screens, they want even better performance.
Even so, there are lots of good games that run OK on old video cards (even budget old video cards). But if someone chooses a certain level of quality they want (antialiasing, resolution, HDR, ...), it's great to have a site that presents the different options (cards).
Isn't the PS3 supposed to be using a 24-pipe nVidia core running at 550MHz as well? If so, that would almost certainly mean that this card is faster as I bet they are using very similar cores, but the 7800GTX512 has much faster memory than the PS3.
And of course there's always SLI if you want even more performance...
heh ... sli ... let's see, $1400 on two video cards or on 3 or 4 next gen consoles ... or on lots of other cool hardware/software/tvs/movies/games ... whatever
I guess if you buy this card you're doing so partly because you're interested in running games at the highest quality settings. But AFAIK it can't do OpenEXR HDR and AA at the same time, as in Far Cry, so I think this card is somewhat of a contradiction. Surely it depends on how the application uses HDR, as Valve showed with HDR and AA for everyone in Lost Coast. But I would say it's not a very futureproof card then, as everyone predicts HDR will be big in games, and I guess a lot of them will use OpenEXR. Still, it will top the charts, for what that's worth.
And about the extra memory, how about taking the card for a spin with Call of Duty 2? Seems that game takes advantage of 512 MiB.
The advantage ATI offers is MSAA with floating-point HDR. We've already seen a game (Black and White 2) that employs AA and HDR by using supersample FSAA, and as you pointed out, Valve's Source engine avoids full-float render targets and still gets good results.
The performance hit is larger with SSAA, but it is certainly possible to have HDR and AA without the ability to do MSAA on floating-point/multiple render targets. And the sheer brute strength the 7800 GTX 512 has can easily be spent on SSAA, as shown (again) by Black and White 2.
I'm an ATi fan, but this is ridiculous. ATi just gets crushed and crushed. Even the regular 7800 GTX gets crushed. But I knew something like this would happen if the 7800 was cranked up to a clock speed close to the X1800 XT's. Those extra 8 pipes and the extra memory bandwidth just lead to the same thing: crushing all opponents, lol. Man. Is ATi the new Intel? I hope not :(. But that's how it's looking currently :'(.
Ha ha, I guess my X800 XT AIW isn't looking so hot right now :-D.
w00t! I traded my matching-numbers first-run GeForce 3 (before they were Ti'd) in for a 5900. I'm not upgrading till Socket M2 comes a-rolling into the bargain bin.
quote: That is some pretty amazing performance. It makes my ATi X800Xl look rather pathetic...sighs
It's called "marketing". Don't succumb to it.
Is it a fast card? Heck yeah. Is it necessary? Far from it. I have an ATI X800XL as well, and I don't plan on switching until I have to. Game developers will continue to make games compatible with our cards for some time to come, and the only thing we'll be missing is Shader Model 3.0. So far, what I have seen of it hasn't been a big enough improvement to encourage me to go out and plunk cash down on a new card. And seeing as my gaming is now measured in hours per week (as opposed to hours per day, like when I worked in a computer store), I couldn't justify spending that kind of bread on something that isn't constantly in use.
I think the 7800 GTX 512 is a neat-looking toy. But that's just it: it's a toy. I'd rather cover two car payments or two-thirds of a mortgage payment - things I NEED to spend money on.
quote: At $700 we are a little weary of recommending this part to anyone but the professional gamers and incredibly wealthy. The extra performance just isn't necessary in most cases.
I agree. I also think the $600 price tag on the X1800 XT is a bit much as well.
I've been asking them to hire an editor for a few years now, but I'm pretty sure they haven't taken my advice yet. Every once in a while they post an article that is just unreadable due to the run-on and compound sentences.
It's worse when you have a journalism degree. It drives you up a wall to read so many grammatical and spelling errors. Even so, I'd rather they put the money into doing more 'real-world' style tests instead of just these top-of-the-charts / GPU-with-the-biggest-dick contests.
It could be worse, at least they updated the article to change this comical spelling mistake. Puts them in line with the rest of the computer industry where the testing phase of the development cycle is outsourced to the consumer ;-)
This thing just destroys every other single card, and every other SLI configuration in almost every test! Yikes. I guess it had better do so at $700 apiece, though.
Maybe this will push the price of the 7800GT and GTX models down in a couple of weeks? Cost-conscious buyers like myself can only hope so.
Regs - Friday, November 18, 2005
In other words, you are paying $300 more for a GTX with a better HSF unit.
Eidolon - Monday, November 14, 2005
I am guessing you cannot SLI this card with an existing 7800 GTX 256MB. Is this correct?

Fluppeteer - Tuesday, November 15, 2005
ISTR Tom's tried it, and no, you can't. (Presumably the core is tweaked just enough that it doesn't work.) Maybe nVidia will fix that in a new driver?
stephenbrooks - Monday, November 14, 2005
Shh, don't say that or they'll deliberately start making games with detail that can only be seen at 1600x1200 to drive hardware sales... :)

ElFenix - Monday, November 14, 2005
By testing power consumption with and without SLI/Crossfire, you can figure out the consumption of a single card with pretty decent accuracy.
bob661 - Tuesday, November 15, 2005
Old homes usually have crappy wiring over here. I ran about 15 computers off of a 15A breaker before it tripped at a LAN party.

bob661 - Tuesday, November 15, 2005
I wish for an edit button. That said, I ran those 15 computers in a new house (newer than 5 years).
nourdmrolNMT1 - Monday, November 14, 2005
I still need to figure out what to get for my computer so I can run CSS at native res (1680x1050). It's hard having to always scale the games.

ElFenix - Monday, November 14, 2005
I would assume so, seeing as how it would be a very good use of the second slot. But two-slot designs don't always do that.
Sunbird - Monday, November 14, 2005
So how bad does this spank my 5900XT? :P

bob661 - Tuesday, November 15, 2005
I had one of those. You might be able to dig up an early benchmark on the 6600GT that will show how it compares to the 5900XT.

bob661 - Tuesday, November 15, 2005
Here you go: http://tinyurl.com/77v66 :)

Griswold - Monday, November 14, 2005
OK, if this peanut represents the 5900XT, the GTX 512 would be the size of a melon. ;)
nullpointerus - Monday, November 14, 2005
Getting offended at others for simply being different is pointless and counterproductive.
Griswold - Monday, November 14, 2005
Ah well, woulda thought AT has a few master cards in their closet. Guess not. :)

Kyanzes - Monday, November 14, 2005
ONE WORD: DOMINATION
Very interesting to see that 512MB has little to no impact on the performance - it is instead almost entirely the clock speed of the GPU and the RAM that makes the difference.Also, I think this is the first time in PC gaming history where I've seen testing done where video cards more than ~9 months old are all essentially 'obsolete' as far as performance. Even the 7800 GT which only even came out maybe six months ago is already near the bottom of the stack at these 1600x1200 tests, and considering that's what anyone with a 19" or greater LCD wants to ideally play at, that's a bit scary. Then you realize that the 7800GT is around $330 for that bottom-end performance and it just goes up from there. It's really $450-550 for solid performance at that resolution these days. That's disappointing.
ElFenix - Monday, November 14, 2005 - link
no one with a 19" desktop LCD is playing games any higher than 1280x1024, in which case this card is basically a waste of money. i have a 20" widescreen LCD and i find myself playing at 1280x1024 a lot because games often don't expand the field of view; they just narrow the screen vertically.
tfranzese - Monday, November 14, 2005 - link
SLI/Crossfire skews the graphs. You need to take that into account when looking at the results.
Cygni - Monday, November 14, 2005 - link
We have seen this in every successive generation of video cards. Unless you're running AA at high res (i.e. over 1280x1024), RAM size has little impact on performance. Heck, 64MB is probably enough for the textures in most games.
cw42 - Monday, November 14, 2005 - link
You really should have included CoD2 in the tests. I remember seeing a test on another site that showed CoD2 benefited GREATLY from 512MB vs. 256MB of RAM.
Ryan Smith - Monday, November 14, 2005 - link
Actually, we were hoping to bring you CoD2 benchmarks for this review, but it didn't pan out. We don't equip our video testbeds with sound cards so that we can compare cards more accurately; the problem is that we could not get CoD2 to run without sound, and we ran out of time before finding a solution. It's still something we'd like to benchmark in the future if we get the chance.
ElFenix - Monday, November 14, 2005 - link
then benchmark it with sound and disclose that fact...
yacoub - Monday, November 14, 2005 - link
Ditch DoD:Source for CoD2. Ditch DOOM3 for Quake4.
Rename FEAR.EXE to anything else (PHEAR.EXE, TEST.EXE, whatever) when benchmarking ATI cards if you're running any of the latest ATI driver sets, since they have yet to fix a faulty "IF" check left over from the FEAR demo that is hindering performance in the full game. (The fix did not make the latest driver release earlier this week.) It has been shown to improve performance by as much as 15fps.
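If you'd rather script the rename than do it by hand, here's a minimal sketch - note the install path and the new name are just hypothetical placeholders, adjust for your own system:

```python
import os

# Hypothetical install path and new name -- adjust for your own system.
fear_dir = r"C:\Program Files\Sierra\FEAR"
old_exe = os.path.join(fear_dir, "FEAR.exe")
new_exe = os.path.join(fear_dir, "PHEAR.exe")

if os.path.exists(old_exe):
    # The driver keys its (demo-tuned) profile off the exe name,
    # so renaming the file sidesteps the faulty check.
    os.rename(old_exe, new_exe)
    print("Renamed -- benchmark with PHEAR.exe")
else:
    print("FEAR.exe not found -- check the install path")
```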
xbdestroya - Monday, November 14, 2005 - link
I don't know about that FEAR 'fix' though. I mean, how many card owners/PC users will actually know to do that? I think it's more legit to leave the bug in the testing - it is a legitimate bug after all - and wait for the new Catalyst release where it will be fixed, then show the increased performance. Or if that's too strong against ATI, publish an article with FEAR benchmarks highlighting the bug. But for standard comparison benchmarks, I think it's best if they're done in as much of an 'out-of-the-box', load-it-and-play situation as possible.
tfranzese - Monday, November 14, 2005 - link
I disagree with the 'out-of-the-box' notion. A product can't ship as a turd, but this is an enthusiast site. Enthusiasts should have the knowledge to use the proper drivers (not always the latest, which is why I say proper).
xbdestroya - Monday, November 14, 2005 - link
Well, but has this site even published anything on that fix? Not to my knowledge. I only know about it because I'm on the B3D forums where it originated; I imagine whoever knows about it here knows from the AT forums. But if you're going to include the 'fix' in benchmarks, you might as well have an article preceding them announcing that the fix even exists, don't you think? Not everyone's a forum-goer; I know there was a time not too long ago where I just went to tech sites and read the articles, not the forums.

First the article describing the fix to the masses - *then* the benchmarks incorporating it. Don't you think that makes sense?
xbdestroya - Monday, November 14, 2005 - link
I wish these posts could be edited after the fact, but alas they cannot. Anyway, sorry for the bad spelling above.

Basically, if we're talking about 'enthusiast' sites, the sites should be publishing 'enthusiast' news like the fear.exe fix, right? Then, after that article, I could agree with its inclusion in benchmarks, because a precedent has been established.
ElFenix - Monday, November 14, 2005 - link
or they could just write a blurb in the article, when they do the FEAR benches, that you can rename fear.exe to anything else and fix the problem. and then bench it both ways.
xbdestroya - Monday, November 14, 2005 - link
Seriously though, it deserves its own article. If it doesn't deserve that, it doesn't deserve benches mixed into a 'general' comparison. The vast majority of people don't even read the text associated with benchmarks anyway, so it would probably go unnoticed by quite a few readers if it only got a short explanation on the FEAR page of a benchmark round-up.
yacoub - Monday, November 14, 2005 - link
LAAAAAAAAAAAAAAME. Start doing REAL tests. Okay fine, this is your last PEAK FPS test, right? Right?
From now on show us average fps, sound on, etc. What we'll ACTUALLY GET using the card to PLAY the game, not dick-measure it.
Scarceas - Monday, November 14, 2005 - link
I see his point and agree with it (but I don't agree with his style). Sound off is just a way to inflate the numbers and make the cards look good. As a true gamer, I want to know the in-game LOWEST fps, with all the options on, like I would actually play.
I don't buy the idea that sound gets in the way of the results due to inconsistencies among sound cards. When it's ATI vs nVidia, they use the same freakin' sound for both tests.
Actually, disabling sound makes more sense if you're wanting to isolate the differences between platforms. For example, when comparing chipsets or processor manufacturers, you would want the sound disabled.
I think, though, with a video card, I need to see what it's going to do in gaming conditions, and I want to see the lowest level of performance as well as the average so that I know what to expect.
Now, do you think an IHV wants to support a website throwing out their lowest numbers when the competition is showing their highest numbers on another site? Anandtech gets privileged information, but with that comes some expectation of promotion. Basically, my idea of benchmarking prolly ain't gonna happen (unless I get rich and fund it myself).
bob661 - Tuesday, November 15, 2005 - link
You, sir, get a double BZZZTTTT!!!! They're NOT inflating numbers, they're isolating the GPUs as much as possible to test them without external interference from other components. This is NOT a system test, this is a GPU test. This is not that difficult to grasp.
tfranzese - Monday, November 14, 2005 - link
Do you use that thing in your head much? We're comparing GPUs here. With audio enabled the bottleneck shifts toward the CPU, so it makes perfect sense to disable any CPU utilization caused by audio in order to show what these GPUs are capable of - not the system as a whole.
yacoub - Monday, November 14, 2005 - link
How ridiculous is it that you would defend a GPU review that doesn't even show how the card would perform playing games in the real world? These dick-measuring tests are utterly pointless when they don't help the site's readership - consumers - determine how the card will perform in the real world, especially when we know that enabling sound can DRASTICALLY change CPU usage during gameplay.
It's not THAT hard to review with a popular onboard sound solution (the ALC850 seems to be on almost every NF4 Ultra board right now) and a popular add-in sound solution (the X-Fi, for example) to give the readership a USEFUL review of how the card performs in the real world.
Feel free to keep doing all the current bragging rights tests, but maybe just once have a useful review based in reality.
mlittl3 - Monday, November 14, 2005 - link
Dude, calm down. There is no way to do system-level tests, as all systems are different. Do you want them to test every Dell, HP, Gateway, IBM, etc. configuration possible, as well as every DIY configuration possible? Things like motherboards, PSUs, memory types (and timings), sound cards, etc. all affect performance, and barely anyone on these forums or elsewhere has the exact same rig. Also, adding sound will decrease performance by 5-10 fps, so just extrapolate the results.

This article is testing the graphics cards only, on a "reference" test system. The title implies as much, and no false pretenses were brought forth. If you want to see fps during gameplay, go to hardocp.com. They have a pretty good method similar to what you are asking for.
If Anandtech gave you want you wanted, you would then just complain that they didn't use the right memory timings, the right hard drive, the right motherboard, etc. (and by right I mean what you have in your computer). :)
yacoub - Monday, November 14, 2005 - link
Sorry, but that's a bogus excuse. They take the time to test all these systems with DOOM3 even though Quake4 is out, when they could simply test Quake4 instead of both. They take the time to do other extra tests. Simply testing with a popular onboard sound solution and a popular add-in sound solution is NOT asking a lot, and it's most certainly NOT asking for omg memory timings and omg Dell/HP configs. Stop exaggerating a simple request into the realm of the unreasonable.
Houdani - Tuesday, November 15, 2005 - link
Your "simple request" would triple the number of benchmarks.>> One set with on-board sound.
>> One set with add-in sound.
>> One set with no sound.
And what will it achieve? It'll tell you that on-board sound stinks and add-in sound doesn't. It muddies the scores of the individual components which is what we're comparing here ... one GPU vs the rest.
For the effects of sound on system performance, read a sound card review.
For the effects of memory on system performance, read a memory review.
For the effects of GPU on system performance, read a GPU review.
For the effects of CPU on system performance, read a CPU review.
For the effects of HDD on system performance, read a HDD review.
For the effects of [component] on system performance, read a [component] review.
When I'm selecting components to put in my system I want to know their individual, isolated score. That's more valuable to me than the mish-mash scores.
mlittl3 - Monday, November 14, 2005 - link
Well, I just don't see what enabling sound would show other than shifting all the scores 5-10 fps lower. The same sound solution should hurt performance the same regardless of which video card you use, so why not just extrapolate the results by factoring in a small performance hit for sound?

At any rate, you would still be making the same buying decision (the reason for reading a hardware review in the first place) based on the relative performance of the video cards tested with respect to each other.
Isn't that true?
PrinceGaz - Monday, November 14, 2005 - link
All enabling sound with onboard audio would do is shift any bottleneck further towards the CPU. The whole point of graphics card reviews is to show how the card performs, not whether the rest of the system is holding it back. That's why resolutions up to 2048x1536 are tested.

If you want to minimise audio CPU usage, you don't have to spend much money. An Audigy 2 will offload almost all sound overhead from the CPU. An X-Fi will do the same but also supports future EAX models, if they make any difference. If your CPU is holding back your gaming performance and you're using onboard audio, the easiest route to better performance is a proper soundcard.
yacoub - Monday, November 14, 2005 - link
lol. Dude, how the card actually performs in gaming is how it performs with sound enabled. All we're REALLY seeing in this review is a GPU isolation test. Might as well leave that to the manufacturer, since it's just dick-waving. I'd wager consumers would find it much more useful to see how the card ACTUALLY performs.
bob661 - Tuesday, November 15, 2005 - link
EXACTLY!!!! Which IS the point of these tests! They are intentionally isolating the GPUs because... that's what they're testing! LOL! Anand's been testing the latest and greatest for years now. This is NOT something new.
Brunnis - Monday, November 14, 2005 - link
Huh? Peak FPS? They're testing average FPS. Do you think they would be stupid enough to measure peak FPS? That would make very little sense...
Cygni - Monday, November 14, 2005 - link
No way. Different sound solutions have different overheads, and different overheads have different effects on the cards.

Sound off. It's the only way to get an accurate comparison between the cards.
yacoub - Monday, November 14, 2005 - link
Holy crap, nearly 300 watts of power just for the GPU! This could be the first card that really puts a gaming system into the realm of NEEDING a 500-watt high-efficiency PSU.
jkostans - Monday, November 14, 2005 - link
Read the article - it's system power, meaning at the outlet. Actual power drawn from the PSU by the system components, assuming ~75% efficiency, would be around 210W, which isn't all that much if you think about it. A 400W PSU is plenty for this system.
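For anyone who wants to sanity-check that math, here's a quick sketch - the ~280W wall figure is my rough assumption, and the 75% efficiency is the same estimate as above:

```python
# Back-of-the-envelope check: the review's wattage is measured at the
# wall outlet, so the DC load on the PSU is smaller by the efficiency factor.
wall_draw_w = 280        # assumed wall reading in watts (rough figure)
psu_efficiency = 0.75    # assumed PSU efficiency

dc_load_w = wall_draw_w * psu_efficiency
print(f"Estimated DC load on the PSU: {dc_load_w:.0f} W")  # ~210 W
```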
bloc - Monday, November 14, 2005 - link
Getting silly. The mainstream is still looking for the best value at $200. I hope ATI doesn't overreact and start releasing a bunch of video cards to take the title back - wait 4-6 months for the next iteration. The mainstream wants 60 fps @ 1024. Offer the best bang for the dollar and we'll rave about it.
Don't waste money on tons of iterations. Just lower the cost of the current generation to compete. Anand will do an FPS-vs-$$ comparison soon enough, because that's the real measure of value.
For $700, buy an Xbox 360 or take a winter vacation.
xbdestroya - Monday, November 14, 2005 - link
Well, my 6800GT can't give me playable framerates at 1280x1024 in CoD2, so that's what we've come to. I would have thought a card like this wouldn't be needed until sometime next year, but the level of hardware required by games has already taken a significant jump since Doom 3.
Calin - Monday, November 14, 2005 - link
Some users prefer to run their 17" or 19" LCD at its native resolution (1280x1024). This means they want good performance at that resolution. As for those with bigger screens, they want even better performance.

Even so, there are lots of good games that run fine on old video cards (even budget old video cards). But if someone chooses a certain level of quality (antialiasing, resolution, HDR, ...), it's great to have a site that presents the different options (cards).
PrinceGaz - Monday, November 14, 2005 - link
Isn't the PS3 supposed to be using a 24-pipe nVidia core running at 550MHz as well? If so, this card is almost certainly faster, as I bet they're using very similar cores, but the 7800GTX 512 has much faster memory than the PS3.

And of course there's always SLI if you want even more performance...
DerekWilson - Monday, November 14, 2005 - link
heh ... SLI ... let's see, $1400 on two video cards, or on 3 or 4 next-gen consoles ... or on lots of other cool hardware/software/TVs/movies/games ... whatever.

it's a fast beast, but it's just too pricey :-)
steelmartin - Monday, November 14, 2005 - link
I guess if you buy this card you're doing so partly because you're interested in running games at the highest quality settings. But AFAIK it can't do OpenEXR HDR and AA together, like in Far Cry, so I think this card is somewhat of a contradiction. Surely it depends on how the application uses HDR, as Valve showed with HDR and AA for everyone in Lost Coast. But I would say it's not a very futureproof card then, as everyone predicts HDR will be big in games, and I guess a lot of them will use OpenEXR. Still, it will top the charts, for what that's worth.

And about the extra memory, how about taking the card for a spin with Call of Duty 2? It seems that game takes advantage of 512 MiB.
/m
DerekWilson - Monday, November 14, 2005 - link
The advantage ATI offers is MSAA with floating-point HDR. We've already seen a game (Black and White 2) that employs AA and HDR by using supersample FSAA, and as you pointed out, Valve's Source engine avoids full-float render targets and still gets good results.

The performance hit is larger with SSAA, but it is certainly possible to have HDR and AA without the ability to do MSAA on floating-point/multiple render targets. And the sheer brute strength the 7800 GTX 512 has can easily be spent on SSAA, as shown (again) by Black and White 2.
quasarsky - Monday, November 14, 2005 - link
i'm an ati fan, but this is ridiculous. ati just gets crushed and crushed. even the regular 7800 gtx gets crushed. but i knew something like this would happen if the 7800 was cranked up to a clock speed close to the x1800xt's: those extra 8 pipes and the extra memory bandwidth just lead to the same thing - crushing all opponents lol. man, is ati the new intel? i hope not :(. but that's how it's looking currently :'(.

ha ha, i guess my x800xt aiw isn't looking so hot right now :-D
George Powell - Monday, November 14, 2005 - link
But quite useless for most people, who don't run games at stratospheric resolutions.

I would really like to see this running at 2560x1600 on the Apple 30".
Ozenmacher - Monday, November 14, 2005 - link
That is some pretty amazing performance. It makes my ATI X800XL look rather pathetic... *sighs*
KaPolski - Monday, November 14, 2005 - link
GoGo GeForce 3 Ti500, woohoo!!!!! trust me, it spanks the 7800 GTX 512 down to a carefully squeezed lemon :D
Xenoterranos - Monday, November 14, 2005 - link
w00t! I traded my matching-numbers first-run GeForce 3 (before they were Ti'd) in for a 5900. I'm not upgrading till Socket M2 comes a-rolling into the bargain bin.
LoneWolf15 - Monday, November 14, 2005 - link
Is it a fast card? Heck yeah. Is it necessary? Far from it. I have an ATI X800XL as well, and I don't plan on switching until I have to. Game developers will continue to make games compatible with our cards for some time to come, and the only thing we'll be missing is Shader Model 3.0. So far, what I've seen of it hasn't been a big enough improvement to make me go out and plunk down cash on a new card. And seeing as my gaming is now measured in hours per week (as opposed to hours per day, like when I worked in a computer store), I couldn't justify spending that kind of bread on something that isn't constantly in use.
I think the 7800GTX 512 is a neat-looking toy. But that's just it: it's a toy. I'd rather cover two car payments or two-thirds of a mortgage payment - things I NEED to spend money on. It's called "marketing". Don't succumb to it.
Pythias - Monday, November 14, 2005 - link
I agree. I also think the $600 pricetag on the X1800XT is a bit much as well.
phusg - Monday, November 14, 2005 - link
LOL. Weary != wary, and in fact it reads as the opposite of what I think you mean in this sentence!
ElFenix - Monday, November 14, 2005 - link
i've been asking them to hire an editor for a few years now, but i'm pretty sure they haven't taken my advice yet. every once in a while they post an article that is just unreadable due to the run-on and compound sentences.
yacoub - Monday, November 14, 2005 - link
It's worse when you have a journalism degree. It drives you up a wall to read so many grammatical and spelling errors. Even so, I'd rather they put the money into doing more 'real world' style tests instead of just these top-of-the-charts / GPU-with-the-biggest-dick contests.
phusg - Thursday, November 17, 2005 - link
It could be worse; at least they updated the article to fix that comical spelling mistake. Puts them in line with the rest of the computer industry, where the testing phase of the development cycle is outsourced to the consumer ;-)
Methusela - Monday, November 14, 2005 - link
This thing just destroys every other single card, and every other SLI configuration, in almost every test! Yikes. I guess it had better, at $700 apiece.

Maybe this will push down the price of the 7800 GT and GTX models in a couple of weeks? Cost-conscious buyers like myself can only hope so.
route66 - Monday, November 14, 2005 - link
Sick!
Googer - Monday, November 14, 2005 - link
Holy Handgrenades, 54.4 GB/s of memory bandwidth!
pol II - Monday, November 14, 2005 - link
Nice card