All the second-to-last section describes is the image quality. There was no explanation of power consumption at all. Was this an accidental omission or something else??
A few things I would like to see done: put a low-end PCI GFX card in the comp, boot it and record power consumption, leave that card in and then do your normal tests with a single X1900 and then dual, so we get a real data point on how much power they consume...
Also, please clarify exactly what PSU was used and how the consumption was measured, so we can figure out more accurately how much power the card really draws (when counting in the (in)efficiency of the PSU, that is)...
That's a good idea on isolating the power of the video card.
From the other reviews I've read, the X1900 cards are seriously power hungry. In the neighborhood of 40-50W more than the X1800XT cards. The GTX 512 (and GTX of course) are lower than the X1800XT, let alone the X1900 cards.
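For what it's worth, the isolation method suggested above boils down to a simple delta calculation. A minimal sketch in Python (the wall readings, baseline figure, and PSU efficiency are illustrative assumptions, not measurements from the review, and the small draw of the baseline card is ignored):

```python
# Rough sketch of the card-isolation method proposed above.
# All numbers below are hypothetical placeholders, not measured values.

def card_power_estimate(wall_with_card_w, wall_baseline_w, psu_efficiency):
    """Estimate DC power drawn by the graphics card alone.

    wall_with_card_w: AC draw at the wall with the card under test installed
    wall_baseline_w:  AC draw at the wall with a known low-power card installed
    psu_efficiency:   PSU efficiency at this load (0-1); converts the AC delta to DC watts
    """
    ac_delta = wall_with_card_w - wall_baseline_w
    return ac_delta * psu_efficiency

# Hypothetical example: 400 W under load vs. a 280 W baseline system, 80% efficient PSU.
print(round(card_power_estimate(400, 280, 0.80)), "W attributable to the card")
```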
Battlefield 2 @ 2048x1536 Max Detail
7800GTX 512: 33 FPS
ATI 1900XTX: 32.9 FPS
ATI 1900XTX Crossfire: 29 FPS
-------------------------------------
Day of Defeat
7800GTX 512: 18.93 FPS
ATI 1900XTX: 35.5 FPS
ATI 1900XTX Crossfire: 35 FPS
-------------------------------------
F.E.A.R.
7800GTX 512: 20 FPS
ATI 1900XTX: 36 FPS
ATI 1900XTX Crossfire: 49 FPS
-------------------------------------
Quake 4
7800GTX 512: 43.3 FPS
ATI 1900XTX: 42 FPS
ATI 1900XTX Crossfire: 73.3 FPS
be careful here ... these max detail settings enabled superaa modes which really killed performance ... especially with all the options flipped to quality.
we're working on getting some screens up to show the IQ difference. but suffice it to say that the max detail settings are very apples to oranges.
we would have seen performance improvements if we had simply kept using 6xAA ...
to further clarify, fear didn't play well when we set AA outside the game, so its max quality ended up using the in-game 4xAA setting. thus we see a performance improvement.
for day of defeat, forcing aa/af through the control panel works well so we were able to crank up the quality.
I'll try to go back and clarify this in the article.
I'm not sure how that justifies what happens. Your argument is that these are the VERY highest settings, so it's OK for the 'dual' 1900xtx to have lower performance than a single-card alternative? That doesn't seem to make sense and speaks poorly for the ATI implementation.
I didn't see any comparisons between X1900 XT CrossFire and X1900 XTX CrossFire, except for the comments at the end of the article saying diminishing returns result in an even smaller gap in CrossFire than the XTX had over the XT to begin with.
With the exception of one B&W2 test (which I suspect is a typo), the graphs all show the X1800 XT vs. the X1900 XTX. Those are two different generations, not just different clock speeds.
Nice. Maybe we'll start seeing some real developments in GPUs again. Right now it's more of a 'do what we've been doing but faster'; maybe we'll start seeing some new innovations in video tech in the coming year (adding physics processing, wider encoding capabilities, etc.).
The X800 line was more of the same, yeah, but the X1000 line is architecturally a pretty large step forward, finally on par with nvidia where it really needed to be and a few steps ahead in other areas. If only they gave it more ROPs/texture engines.
Nice scores for ATI, but I still have my 6800GT; I'll wait until the next 2 generations. I see we can now play at a respectable 40+ frames @ 1920x1440 - nice to know for when 1080p TVs are out.
Don't know... either that or get a PS3, we'll see =) It's getting way too expensive.
Digitally Unique has the X1900XT for $525 and Actbuy had them for $504. Based on performance, these cards offer a great bang for your buck. And this is coming from a GTX 512 owner.
bob4432 - Thursday, January 26, 2006 - link
Good for ATI; after some issues in the not so distant past it looks like the pendulum has swung back in their direction.

I really like this, it should drop 7800GT prices down to maybe ~$200-$220 (hoping, as nvidia wants to keep its hold on the market...), which would actually give me a reason to switch to some flavor of PCI-E based m/b. But being display limited @ 1280x1024 with an LCD, my x800xtpe is still chugging along nicely :)
Spoelie - Thursday, January 26, 2006 - link
It won't; they're in a different price range altogether. Prices on those cards will not drop before ATI brings out a capable competitor to it.

neweggster - Thursday, January 26, 2006 - link

How hard would it be for this new series of cards by ATI to be optimized for all benchmarking software? Well, ask yourself that. I just got done talking to a buddy of mine who's working out at MSI. I swear I freaked out when he said that ATI is using an advantage they found by optimizing the new R580s to work better with the newest benchmarking programs like 3DMark06 and such. I argued with him that's impossible, or is it? Please let me know, did ATI possibly use optimizations built into the new R580 cards to gain this advantage?

Spoelie - Thursday, January 26, 2006 - link

How would dedicating die space on a GPU to cheats make any sense? If there is any cheat, it's in the drivers. And no, the only thing is that 3DMark06 needs 24-bit DSTs for its shadowing and that wasn't supported in the x1800xt (it uses some hack instead) and it is supported now. Is that cheating? The x1600 and x1300 have support for this as well btw, and they came out at the same time as the x1800.

Architecturally optimizing for one kind of rendering being called a cheat would make nvidia a really bad company for what they did with the 6x00/Doom3 engine. But no one is complaining about higher framerates in those situations now, are they?
Regs - Thursday, January 26, 2006 - link
...Where in this article do you see a 3DMark score?

mi1stormilst - Thursday, January 26, 2006 - link

It is not impossible, but unless your friend works in some high-level capacity I would say his comments are questionable at best. I don't think working in shipping will qualify him as an expert on the subject?

coldpower27 - Wednesday, January 25, 2006 - link

http://www.anandtech.com/video/showdoc.aspx?i=2679...

"Notoriously demanding on GPUs, F.E.A.R. has the ability to put a very high strain on graphics hardware, and is therefore another great benchmark for these ultra high-end cards. The graphical quality of this game is high, and it's highly enjoyable to watch these cards tackle the F.E.A.R demo."

Wasn't use of this considered a bad idea, since Nvidia cards take a huge performance penalty in it and the final build was supposed to be much better???
photoguy99 - Wednesday, January 25, 2006 - link
I noticed 1920x1440 is commonly benchmarked.

Wouldn't the majority of people with displays in this range have 1920x1200, since that's what all the new LCDs are using? And it's the HD standard.

Aren't LCDs getting to be pretty capable game displays? My 24" Acer has a 6 ms (claimed) gray-to-gray response time, and can at least hold its own.
Resolution for this monitor and almost all others this large: 1920x1200 - not 1920x1440.
Per Hansson - Wednesday, January 25, 2006 - link
Doing the math: Crossfire system = 459 W, single 1900XTX system = 341 W, difference = 118 W. Efficiency of the PSU used is 78% at 400 W, so 118 x 0.78 = 92.04 W.
Per Hansson - Friday, January 27, 2006 - link
No replies, huh? Because I've read on other sites that the card draws up to 175 W... Seems like quite a stretch, so that was why I did the math to start with...

Midreian - Wednesday, January 25, 2006 - link

This test kind of seems biased to me. The ATI cards were tried in CrossFire, but when it came to the Nvidia 7800 GTXs, 512 MB and 256 MB, neither was tested in SLI and compared to CrossFire.

Anyone have a comparison of SLI vs. CrossFire for the same tests?
DerekWilson - Wednesday, January 25, 2006 - link
For all the games but Battlefield 2 we ran both CrossFire and SLI numbers.

We only ran SLI for the GTX 512 because we only looked at the highest-end multi-GPU solution for each series (7800, 1800, 1900).
We would have included SLI in the BF2 portion, but our benchmark doesn't correctly represent gameplay for SLI. We are working on this.
Thanks,
Derek Wilson
Zebo - Tuesday, January 24, 2006 - link
Seems weird not to have SLI GTs in there. I still think that's the best deal in the high end - around the $550 price point. It should clean up on both the 1900XT and 1900XTX pretty handily for the same price or less. Is AT still in the business of recommending "bang for the buck", or moving away from that? Because only 0.05% of your readers are going to go up into the realm of $1000 video cards (GTXs and XTXs in dual config).

danidentity - Tuesday, January 24, 2006 - link

Are the figures in the "Load Power" chart the power consumption of just the video card, or the entire system? If those numbers are just the video card, that's flat out insane.

Josh Venning - Wednesday, January 25, 2006 - link

The numbers in the Load Power chart represent the power draw for the entire system under stress testing. Even so, the 7800 GTX 512 SLI and X1900 XTX Xfire setups are ridiculously power-hungry.

flexy - Tuesday, January 24, 2006 - link

It's nice to see ATI come up with something GOOD after so many disappointments, paper launches, etc.

$500 is an "attractive" price (relatively speaking), looking at the card's specs... I still have an X850XT and (sadly???) don't really have an "urge" to get this card since I MAINLY play HL2 (full details, even 6xAA) and it's fast and great even on my old X850XT. Almost makes me wish I had more game engines which demand/justify upgrading to this card.

As said... very happy for ATI, and this card is all the way UP on my wishlist (since I am a graphics-card ho ;)..... but then I also know G71 will come and that card will be a killer card too (theoretically speaking). If I had a very slow system and could barely play any games I PROBABLY would get the R580 now... ;)
Fenixgoon - Tuesday, January 24, 2006 - link
Great job by ATI for bringing out some killer cards. Note that a CrossFire x1900 system is CHEAPER than 7800 512s. But hey, regardless of who's on top, we win :)

As far as the parts being expensive - of course they will be; they're top of the line and released today.

I bought a Radeon x800pro for $170 and run COD2 at 1280x1024 maxed out (no FSAA/AF) with very few frame drops (the worst is scoping in on smoke from grenades). I also have stuttering issues with HDR. Minus HDR, I run HL2 @ 1280x1024 with 6xFSAA and 16xAF. This is coming from a budget system! Putting all my components together, my setup costs about $700.
Xenoterranos - Wednesday, January 25, 2006 - link
Wow, for that kind of money, you could have almost bought an Xbox 360 bundle... or half a PS3 (har har har).
Did anyone else notice how the x1800xt trounced the 7800gtx in almost all tests? A look at the 7800gtx 512 release benchmarks shows the exact opposite. Perhaps the quality settings were on for the 7800gtx while the x1800xt had performance settings. Even the 7800gtx 512, which cannot possibly have a larger than 40% lead over the 7800gtx, has a 100% lead in some cases.

ocyl - Tuesday, January 24, 2006 - link

It's been mentioned above but I will say it again. While it's okay to say that R580 has 48 pixel shaders, it only really has 16 pixel pipelines.

Harkonnen - Tuesday, January 24, 2006 - link

Almost $900 CDN for the XTX and it only has a 1-year warranty?

The main reason I would never buy an expensive ATi card is right there.
smitty3268 - Tuesday, January 24, 2006 - link
The people who buy a card this expensive the first day it comes out won't keep it for a whole year, so the warranty doesn't matter. In 6 months another card will be out that makes this one look slow and they'll be spending even more money.

DerekWilson - Tuesday, January 24, 2006 - link

Due to popular demand, we have added more percent-increase performance comparison graphs to the performance breakdown that show the performance relationships at lower resolutions.

Let us know if there is anything else you'd like to see. Thanks!
Live - Tuesday, January 24, 2006 - link
The performance breakdown looks very good now! I would go so far as to say that this should be standard in future reviews.

piroroadkill - Tuesday, January 24, 2006 - link

Using a lossy image format (JPEG) for image quality comparison screenshots seems kind of... pointless.

But I guess you have to worry about bandwidth.
Josh Venning - Tuesday, January 24, 2006 - link
Thanks for the input, all. Just to let you know, we are dealing with some problems regarding our power numbers, but they should be up shortly. Thanks for being patient.

Josh Venning - Tuesday, January 24, 2006 - link

One more thing... We also caught a mistype on the graphs that we are in the process of correcting. The two CrossFire systems we tested are the X1900 XTX CrossFire and the X1800 XT CrossFire. (We mislabeled the latter "X1900 XT Crossfire.") Sorry for any confusion this may have caused.

smitty3268 - Tuesday, January 24, 2006 - link

Ah... That makes much more sense now. I was wondering why the XTX CrossFire was doing so much better than the XT CrossFire when the specs were so similar.

SpaceRanger - Tuesday, January 24, 2006 - link
Problems with the publishing of them, or problems in the sense that it requires a direct link into a nuclear reactor to power properly??
DerekWilson - Tuesday, January 24, 2006 - link
Our local nuclear plant ran us an extension cord just for this event :-)

Josh Venning - Tuesday, January 24, 2006 - link
:-)

GTMan - Tuesday, January 24, 2006 - link

How long until we see lower-end parts?

My guesses:
X1900XL replaces X1800XL
X1700 replaces X1600
Sledgehamer70 - Tuesday, January 24, 2006 - link
Has anyone else noticed that the X1900XTX only outperforms overall at the 1920x1440 settings? It is a 50/50 split for the most part at 1280x960 and 1600x1200. So once again everyone and their mom won't be seeing the same numbers as these guys! So misleading!

Yeah, I know it's looking ahead to the future, but looking at the market only 2%-5% of gamers run games at these specs... I will give ATI credit that they made a good card "finally," but let's compare apples to apples: they should take the combined scores and average them out and see what the overall outcome is... I'm sure ATI will still lead, but by not as much as it portrays!
DerekWilson - Tuesday, January 24, 2006 - link
Again, if you want a card to run at low resolutions, the 6800 GS or X800 GTO are probably much better and more cost-effective ways to go.

Why does everyone want to swat a fly with a baseball bat?
Wellsoul2 - Tuesday, January 24, 2006 - link
Hmm... 1280x1024 would seem to be a useful resolution, since many use 19 in and 17 in LCDs.

Seriously, no way can I afford this card, but I would like to see it compared to the 1800XL card at this resolution. (I'm hoping the 1800XL price drops so I can pick up an ATI Shader 3 card for less than $250, which is my price point.)

Yay for ATI... but ATI still has no mid-priced card with Shader 3.0 :-(

beggerking - Tuesday, January 24, 2006 - link
beggerking - Tuesday, January 24, 2006 - link
Well, the definition of the "resolution most people run" changes constantly. I understand you are using 4xAA, but 8xAA is right around the corner, as well as higher resolutions.

I don't see this kind of performance advantage for the x1900xtx on any setting other than the one you used for the performance comparison; therefore that graphic is perhaps a personal/biased view that will not stand against time.
vladik007 - Tuesday, January 24, 2006 - link
Are they really out of their minds? I've never bought a console before, but these PC component prices are gonna drive me out of the market.

And I thought my 6800GT for $400 was an absurd price to pay.... wow
nullpointerus - Tuesday, January 24, 2006 - link
I have an idea. Maybe you could buy something less expensive. For example, a $200 card should be able to play modern games, albeit with lower image quality than the more expensive cards. But don't tell anyone! I want to keep this a secret.

poohbear - Wednesday, January 25, 2006 - link

rofl nullpointerus u crack me up. So true man, LETS keep it a secret. ;)

ChronoReverse - Tuesday, January 24, 2006 - link

Indeed. My 6800LE cost $129 and, unlocked, gets pretty close to 6800GT speeds.

Obviously not as good, but still pretty damn good, and I paid a lot less too.
It's been like this since the TNT2 M64 came out.
mi1stormilst - Tuesday, January 24, 2006 - link
I am taking issue with ATI and Anandtech on this one:

1.) The bloody X1800 series is pretty dang new; they are talking about phasing it out when I have not even had a chance to use the Avivo video tool yet. WTF!?

2.) The section of the article "Performance Breakdown" (http://anandtech.com/video/showdoc.aspx?i=2679&...) is very misleading, to say the least. I think you owe the readers an explanation of how you arrived at those numbers. Did you test at one resolution? Only testing at a resolution that gives the ATI card the advantage is hardly fair. I know a lot of gamers, including myself, that still generally game at 1024x768; how do the cards fare at that resolution? What are the real differences overall? I think you should either pull this out of the article altogether or test at least 3 common resolutions (1024x768), (1280x1024) & (1600x1200).
Just my two cents.
GTMan - Tuesday, January 24, 2006 - link
Yes, your card is now out of date. You should stop using it. I'll give you $5 for it if that would make you feel better.

mi1stormilst - Wednesday, January 25, 2006 - link

HAHA! My 8 year old kid will inherit the X1800XL in a few months after I order the X1900XT. I bet that bothers you ... no? (-;

beggerking - Tuesday, January 24, 2006 - link

I 2nd that!!

btw, this is a repost. lol.. see my comments above..
DerekWilson - Tuesday, January 24, 2006 - link
I appreciate your comments.

The point about the X1800 is well taken. It would be hard for us to expect them to push back their "refresh" part because they dropped the ball on the R520. And it also doesn't make economic sense to totally scrap the R520 before it gets out the door.

It was a tough situation. At least ATI didn't take as much of a bath on the R520 as NVIDIA did on NV30 ...
But again, I certainly understand your sentiment.
Second, I did explain where the numbers came from. 2048x1536 with 4xAA in each game. The graph mentions that we calculated percent increase to x1900 performance -- which means our equation looks like this --
((x1900 score) - (competing score)) / competing score * 100
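In other words, the breakdown chart is just that calculation applied to each game's score. A tiny illustrative Python helper (the sample frame rates are made up, not figures from the review):

```python
def percent_increase(x1900_score, competing_score):
    """Percent increase of the X1900 result over the competing card's result."""
    return (x1900_score - competing_score) / competing_score * 100

# Hypothetical example: 60 fps for the X1900 XTX vs. 48 fps for a competing card.
print(f"{percent_increase(60, 48):.1f}% faster")  # -> 25.0% faster
```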
if you game at 1024x768, you have absolutely no business buying a $600 video card.
again ... ^^
we did test 12x10 and 16x12 and people who want those results can easily see them on each game test page.
This is a high end card and it seems like the best fit to describe performance is a high end test. If we did a 1024x768 test it would just be an exercise in observing the cpu overhead of the driver and how well the fx57 was able to handle it.
Our intention is not to mislead. But people often want a quick overview, and detail and accuracy are fundamentally at odds with the idea of a quick and easy demonstration. Our understanding is that people interested in this card are interested in high-quality, high-res performance, so this cross section of performance seemed the most logical.
mi1stormilst - Wednesday, January 25, 2006 - link
Thanks for responding (-:

I do think benchmarking at 1024x768 is perfectly valid. I for one game at 1024x768 with all the candy turned on, so not only can I enjoy the looks of the game, but I can still render high frame rates while playing online. When I want to enjoy the single-player option I am willing to let the frames dive down to the bare minimums so I can enjoy all that the game has to offer without getting lag-killed. So I think that justifies my reasons to want a $600.00 video card (-;

Although I was slightly incorrect about how you benchmarked, I stand by my feeling that it is not a TRUE representation of the card's performance across the board. It is a snapshot which leaves a lot of holes unfilled. I would feel cheated as a customer to find that it did not perform as well as I had been led to believe if the way I wanted to use it was not optimized as well as another way. Understood?

Thanks for the time you have spent evaluating it... it does give us an overall feeling, and of course when I am looking at spending that kind of money you can bet I will be doing a lot more than reading one review (-:
Garrett
beggerking - Tuesday, January 24, 2006 - link
It is kind of misleading, since Nvidia leads in the max-quality test in a few games, but the advantage is still given to ATI.

The x1900xtx is a better-performing card overall, but it is not THAT much better. Quite an exaggeration.
blahoink01 - Tuesday, January 24, 2006 - link
It would be nice to see World of Warcraft included in the benchmark set. Considering it is probably the most popular game in the world, I'm sure many readers would find the benchmarks useful.

fishbits - Tuesday, January 24, 2006 - link

Why, was the WoW engine changed recently? It's easy to max out WoW display settings on far less capable cards, so what useful information would come from benchmarking it with bleeding-edge gear? Unless maybe you're running it on some massive $3000 monitor, in which case upgrading to a 300-500 dollar video card should be a no-brainer. The only useful benchmark would be "How would my older video card handle WoW?" and that's already been done. Must be missing something here.

blahoink01 - Wednesday, January 25, 2006 - link

Considering the average framerate on a 6800 Ultra at 1600x1200 is a little above 50 fps without AA, I'd say this is a perfectly relevant app to benchmark. I want to know what will run this game at 4x or 6x AA with 8x AF at 1600x1200 at 60+ fps. If you think WoW shouldn't be benchmarked, why use Far Cry, Quake 4 or Day of Defeat?

At the very least WoW has a much wider impact as far as customers go. I doubt the total sales for all three games listed above can equal the current number of WoW subscribers.
And your $3000 monitor comment is completely ridiculous. It isn't hard to get a 24 inch wide screen for 800 to 900 bucks. Also, finding a good CRT that can display greater than 1600x1200 isn't hard and that will run you $400 or so.
DerekWilson - Tuesday, January 24, 2006 - link
We have looked at World of Warcraft in the past, and it is possible we may explore it again in the future.

Phiro - Tuesday, January 24, 2006 - link

"The launch of the X1900 series no only puts ATI back on top, "

Should say:
"The launch of the X1900 series not only puts ATI back on top, "
GTMan - Tuesday, January 24, 2006 - link
That's how Scotty would say it. Beam me up...

DerekWilson - Tuesday, January 24, 2006 - link

thanks, fixed

DrDisconnect - Tuesday, January 24, 2006 - link
It's amusing how the years have changed everyone's perception as to how much is a reasonable price for a component. Hard drives, memory, monitors and even CPUs have become so cheap that many have lost perspective on what being on the leading edge costs. I paid 750$ for a 100 MB drive for my Amiga, 500$ for a 4x CD-ROM, and remember spending 500$ on a 720 X 400 Epson colour inkjet. (Yeah, I'm in my 50s.) As long as games continue to challenge the capabilities of video cards and the drive to increase performance continues, the top end will be expensive. Unlike other hardware (printers, memory, hard drives), there are still performance improvements to be made that the user will perceive. If someday a card can render so fast that all games play like reality, then video cards will become like hard drives are now.

finbarqs - Tuesday, January 24, 2006 - link

Everyone gets this wrong! It uses 16 PIXEL PIPELINES with 48 PIXEL SHADER PROCESSORS in it! The pipelines are STILL THE SAME as the X1800XT: 16!!!!!!!!!! Oh yeah, if you're wondering, in 3DMark 2005 it reached 11,100 on just a single X1900XTX...

DerekWilson - Tuesday, January 24, 2006 - link
Semantics -- we are saying the same things with different words.

Fill rate as the main focus of graphics performance is long dead. Doing as much as possible at a time to as many pixels as possible at a time is the most important thing moving forward. Sure, both the 1900xt and 1800xt will run glquake at the same speed, but the idea of the pixel (fragment) pipeline is tied more closely to lighting, texturing and coloring than to rasterization.

Actually this would all be less ambiguous if OpenGL were more popular and we had always called pixel shaders fragment shaders ... but that's a whole other issue.
DragonReborn - Tuesday, January 24, 2006 - link
I'd love to see how the noise output compares to the 7800 series...slatr - Tuesday, January 24, 2006 - link
How about some Lock On: Modern Air Combat tests?

I know not everyone plays it, but it would be nice to have you guys run your tests with it, especially when we are shopping for $500-plus video cards.
photoguy99 - Tuesday, January 24, 2006 - link
Why do the editors keep implying the power of cards is "getting ahead" of games when it's actually not even close?

- 1600x1200 monitors are pretty affordable
- 8xAA does look better than 4xAA
- It's nice to play games with a minimum frame rate of 50-60

Yes, these are high-end desires, but the X1900XT can't even meet these needs despite its great power.
Let's face it - the power of cards could double tomorrow and still be put to good use.
mi1stormilst - Tuesday, January 24, 2006 - link
Well said, well said, my friend...

We need to stop being so impressed by so very little. When games look like REAL LIFE does, with lots of colors, shading, no jagged edges (unless it's from the knife I just plunged into your eye) lol, you get the picture.
poohbear - Tuesday, January 24, 2006 - link
Technology moves forward at a slower pace than that, mates. U expect every vid card to be a 9700pro?! Right. There has to be a pace the developers can follow.
I think we are agreeing with you -

The article authors keep implying they have to struggle to push these cards to their limit because they are getting so powerful so fast.
To your point, I do agree it's moving forward slowly - relative to what people can make use of.
For example, 90% of Office users cannot make use of a faster CPU.
However 90% of gamers could make use of a faster GPU.
So even though GPU performance is doubling faster than CPU performance they should keep it up because we can and will use every ounce of it.
Powermoloch - Tuesday, January 24, 2006 - link
It is great to see that ATi is doing their part right ;)

photoguy99 - Tuesday, January 24, 2006 - link

When DX10 is released with Vista, it seems like this card would be like having SM2.0 - you're behind the curve again.

Yea, I know there is always something better around the corner - and I don't recommend waiting if you want a great card now.
But I'm sure some people would like to know.
Spoelie - Thursday, January 26, 2006 - link
Not at all. I do not see DX10 arriving before Vista near the end of this year. If it does arrive earlier, it will not make any splash whatsoever on game development before then. Even so, you cannot be 'behind' if your only competitor is still at SM3.0 as well. As far as I can tell, there will be no HARD architectural changes in G71/7900 - they might improve tidbits here and there, like support for AA while doing HDR rendering, but that will be about the full extent of the changes.

DigitalFreak - Tuesday, January 24, 2006 - link

True, but I'm betting it will be quite a while before we see any DX10 games. I would suspect that the R620/G80 will be DX10 parts.

timmiser - Tuesday, January 24, 2006 - link

I expect that Microsoft's Flight Simulator X will be the first DX10 game.

hwhacker - Tuesday, January 24, 2006 - link
Question for Derek (or whomever):

Perhaps I interpreted something wrong, but is it correct that you're saying X1900 is more of a 12x4 technology (because of fetch4) than the 16x3 we always thought? If so, that would make it A LOT more like Xenos, and perhaps R600, which makes sense, if I recall their ALU setup correctly (Xenos is 16x4, one for stall, so effectively 16x3). R520 was 16x1, so... I gotta ask... Does this mean a 16x4 is imminent, or am I just reading the information incorrectly?

If that's true, ATi really did mess with the definition of a pipeline.

I can hear the rumours now... R590 with 16 QUADS, 16 ROPs, 16 TMUs, and 64 pixel processors... Oh yeah, and GDDR4 (on an 80nm process). You heard it here first. ;)
DerekWilson - Tuesday, January 24, 2006 - link
this is where things get a little fuzzy ... when we used to refer to an architecture as being -- for instance -- 16x1 or 8x2, we referred to the pixel shader's ability to texture a pixel. Thus, when an application wanted to perform multitexturing, the hardware would perform about the same -- single-pass graphics cut the performance of the 8x2 architecture in half because half the texturing power was ... this was much more important for early DX, fixed-pipe, or OpenGL based games. DX9 threw all that out the window, as it is now common to see many instructions and cycles spent on any given pixel.

in a way, since there are only 16 texture units you might be able to say it's something like 48x0.333 ... it really isn't possible to texture all 48 pixels every clock cycle ad infinitum. in an 8x2 architecture you really could texture each of 8 pixels with 2 textures every clock cycle forever.

to put it more plainly, we are now doing much more actual work with the textures we load, so the focus has shifted from "texturing" a pixel to "shading" a pixel ... or fragment ... or whatever you wanna call it.

it's entirely different than Xenos, as Xenos uses a unified shader architecture.
interestingly though, R580 supports a render to vertex buffer feature that allows you to turn your pixel shaders into vertex processors and spit the output straight back into the incoming vertex data.
but i digress ....
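As a rough illustration of the texture-unit-to-shader ratio described above, here is a back-of-the-envelope sketch in Python (my own simplification, assuming every unit can be kept busy every clock):

```python
# Back-of-the-envelope ratio from the comment above: how many textures can be
# applied per shaded pixel per clock if every texture unit stays busy.

def textures_per_pixel_per_clock(texture_units, pixels_per_clock):
    return texture_units / pixels_per_clock

# R580-style setup: 16 texture units feeding 48 pixel shader processors.
print(round(textures_per_pixel_per_clock(16, 48), 3))  # ~0.333, i.e. the "48x0.333" figure

# Classic 8x2-style part: 16 texture units across 8 pixel pipelines.
print(round(textures_per_pixel_per_clock(16, 8), 3))   # 2.0, two textures per pixel every clock
```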
aschwabe - Tuesday, January 24, 2006 - link
I'm wondering how dual 7800GT/7800GTX setups stack up against this card.

i.e., is the brand-new system I bought literally 24 hours ago going to be able to compete?
Live - Tuesday, January 24, 2006 - link
SLI figures are all over the review. Go read and look at the graphs again.

aschwabe - Tuesday, January 24, 2006 - link

Ah, my bad, thanks.

DigitalFreak - Tuesday, January 24, 2006 - link

Go check out the review on hardocp.com. They have benchies for both the GTX 256 & GTX 512, SLI & non-SLI.

Live - Tuesday, January 24, 2006 - link

No, my bad. I'm a bit slow. Only the GTX 512 SLI numbers are in there. Sorry!

Viper4185 - Tuesday, January 24, 2006 - link

Just a few comments (some are being very picky, I know):

1) Why are you using the latest hardware with an old Seagate 7200.7 drive when the 7200.9 series is available? Also, no FX-60?
2) Disappointing to see no power consumption/noise levels in your testing...
3) You are like the first site to show CrossFire XTX benchmarks? I am very confused... I thought there was only an XT CrossFire card, so how do you get CrossFire XTX benchmarks?
Otherwise good job :)

DerekWilson - Tuesday, January 24, 2006 - link
DerekWilson - Tuesday, January 24, 2006 - link
crossfire xtx indicates that we ran a 1900 crossfire edition card in conjunction with a 1900 xtx ... this is as opposed to running the crossfire edition card in conjunction with a 1900 xt.

crossfire does not synchronize GPU speed, so performance will be (slightly) better when pairing the faster card with the crossfire edition card.
fx-60 is slower than fx-57 for single threaded apps
power consumption was supposed to be included, but we have had some power issues. We will be updating the article as soon as we can -- we didn't want to hold the entire piece in order to wait for power.
harddrive performance is not going to affect anything but load times in our benchmarks.
DigitalFreak - Tuesday, January 24, 2006 - link
See my comment above. They are probably running an XTX card with the CrossFire Edition master card.

OrSin - Tuesday, January 24, 2006 - link

Are gamers going insane? $500+ for a video card is not a good price. Maybe it's just me, but are bragging rights really worth that kind of money? Even if you played a game that needs it, you should be pissed at the game company that puts out a bloated mess that needs a $500 card.

poohbear - Tuesday, January 24, 2006 - link

$500 too much? There are cars for $300,000+, but u dont see the majority of ppl complaining, because they're NOT aimed at u and me, and Ferrari & Lamborghini could care less what we think cause we're not their target audience. Get over yourself; there ARE cards for you in the $100-$300 range, so what are u worried about?

timmiser - Tuesday, January 24, 2006 - link

While I agree with what you are saying, we are already on our 3rd generation of $500 high-end graphics cards. If memory serves, it was the Nvidia 6800 that broke the $500 barrier for a single-card solution.

I'm just happy it seems to have leveled off at $500.
Zebo - Tuesday, January 24, 2006 - link
Actually GPUs in general scale very well with price/performance, and this is no exception. Twice as fast as an X850 XT, which you can get for $275, should cost twice as much, or $550, which it does. If you want to complain about prices, look at CPUs, high-end memory and Raptors/SCSI, where the higher-line items offer small benefits for huge price premiums.

fishbits - Tuesday, January 24, 2006 - link

Geez, talk about missing the point. News flash: bleeding-edge computer gear costs a lot. $500 is an excellent price for the best card out. Would I rather have it for $12? Yes. Can I afford/justify a $500 gfx card? No, but more power to those who can, and give revenue to ATI/Nvidia so that they can continue to make better cards that relatively quickly fall within my reach. I can't afford a $400 9800 Pro either... whoops! They don't cost that much now, do they?

quote: "Even if you played a game that needs it, you should be pissed at the game company that puts out a bloated mess that needs a $500 card."

Short-sighted again. Look at the launch of Unreal games, for instance. Their code is always awesome on the performance side, but can take advantage of more power than most have available at release time. You can tell them their code is shoddy; good luck with that. In reality it's great code that works now, and your gaming enjoyment is extended as you upgrade over time and can access better graphics without having to buy a new game. Open up your mind, quit hating and realize that these companies are giving us value. You can't afford it now, neither can I, but quit your crying and applaud Nv/ATI for giving us constantly more powerful cards.
aschwabe - Tuesday, January 24, 2006 - link
Agreed. I'm not sure how anyone considers $500 for ONE component a good price. I'll pay no more than $300-350 for a video card.
bamacre - Tuesday, January 24, 2006 - link
Hear, hear!! A voice of reason!
rqle - Tuesday, January 24, 2006 - link
I like the new line graph colors and interface, but I like the bar graphs much more. I've never been a big fan of SLI or CrossFire on the graphs; it makes them distracting, especially since they only represent a small group. I wonder if CrossFire and SLI could have their own graphs, or maybe their own color. =)
DerekWilson - Tuesday, January 24, 2006 - link
It would be possible for us to look at multi-GPU solutions separately, but it is quite relevant to compare single-card performance to multi-GPU performance -- especially when trying to analyze performance.
Live - Tuesday, January 24, 2006 - link
Good reading! Good to see ATI getting back in the game. Now let's see some price competition for a change.
I don't understand what CrossFire XTX means. I thought there was no XTX CrossFire card? Since the CrossFire Edition and XT have the same clocks, it shouldn't matter if the other card is an XTX. By looking at the graphs it would seem I was wrong, but how can this be? This would indicate that the XTX has more going for it than just the clocks, but that is not so, right?
Bah, I'm confused :)
DigitalFreak - Tuesday, January 24, 2006 - link
My understanding is that CrossFire is asynchronous, so both cards run at their maximum speed. The XTX card runs at 650MHz core / 1.55GHz memory, while the CrossFire Edition card runs at 625MHz / 1.45GHz. You're right, there is no CrossFire Edition XTX card.
Live - Tuesday, January 24, 2006 - link
Thanks for the explanation! Derek, I think this merits a mention in the review.
NullSubroutine - Tuesday, January 24, 2006 - link
Perhaps a Flash system where you can pick the cards within the benchmark and it will show them on the line graph -- just a simple activate/deactivate feature.
bldckstark - Tuesday, January 24, 2006 - link
I have to agree that a group color for the multi-GPU setups would be helpful on the bar graphs. The outline you used to denote negative gains would work well for this. Then ATI and Nvidia bars would still have a different major color, but the multi-GPU setups could have a yellow outline. E.g. ATI = red, ATI CrossFire = red with yellow outline, Nvidia = blue, Nvidia SLI = blue with yellow outline.
Rock Hydra - Tuesday, January 24, 2006 - link
I don't know if you meant this or not, but on the page mentioning the new CrossFire board there is a URL. I don't know if it was intended to be active or plain text, but I thought I would bring that to your attention.
DerekWilson - Tuesday, January 24, 2006 - link
Thanks, fixed.
emilyek - Tuesday, January 24, 2006 - link
Good article. You have two typos in your article:
In the system specs you have "OCZ PowerStreams" instead of "...stream".
When you use the words 'eek out' as a verb that means 'squeeze out', it is spelled 'eke' -- 'eke out'.
DerekWilson - Tuesday, January 24, 2006 - link
I had no idea there was a correct spelling for eke ...thanks
beggerking - Tuesday, January 24, 2006 - link
Did anyone notice? The breakdown graphs don't quite reflect the actual data. The breakdown shows the 1900 XTX being much faster than the 7800 GTX 512, but in the actual performance graphs the 1900 XTX is sometimes outpaced by the 7800 GTX 512.
DerekWilson - Tuesday, January 24, 2006 - link
We didn't aggregate the performance of each card under each game. For the percent improvement breakdown we only looked at 2048x1536 with 4xAA, which clearly shows the X1900 XTX in the lead.
Our reasoning is that this is the most stressful stock test we throw at the cards -- it shows what the cards can handle under the highest stress.
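As a hypothetical illustration of why a breakdown built from a single setting can disagree with the individual game graphs, the short Python sketch below computes percent improvement at two made-up settings. The numbers are placeholders, not AnandTech results.

```python
# Hypothetical illustration (made-up numbers): the percent improvement of
# card A over card B depends on which test setting you pick, which is why a
# breakdown computed only at 2048x1536 4xAA can look different from the
# per-game graphs at other resolutions.

def pct_improvement(fps_a: float, fps_b: float) -> float:
    """Percent by which card A outruns card B."""
    return 100.0 * (fps_a - fps_b) / fps_b

# Made-up results for one game at two settings.
results = {
    "1280x1024 noAA": {"card_a": 120.0, "card_b": 125.0},  # card B slightly ahead
    "2048x1536 4xAA": {"card_a": 45.0,  "card_b": 38.0},   # card A clearly ahead
}

for setting, fps in results.items():
    gain = pct_improvement(fps["card_a"], fps["card_b"])
    print(f"{setting}: card A vs card B = {gain:+.1f}%")
# A breakdown built only from the high-stress setting would show card A winning
# by ~18%, even though card A trails at the lower-stress setting.
```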
beggerking - Tuesday, January 24, 2006 - link
Umm... what about 8xAA or higher? Or lower resolutions? With/without AA? If you don't aggregate performance, then won't the graph be misleading?
Isn't max quality the most stressful test?
tuteja1986 - Tuesday, January 24, 2006 - link
Wait for the FiringSquad review then :) if you want 8xAA.
SpaceRanger - Tuesday, January 24, 2006 - link
All the second-to-last section describes is the image quality. There was no explanation of power consumption at all. Was this an accidental omission or something else??
Per Hansson - Tuesday, January 24, 2006 - link
Yes, please show us the power consumption ;-)
A few things I would like to see done: put a low-end PCI graphics card in the computer, boot it and record the power consumption, then leave that card in and do your normal tests with a single X1900 and then dual, so we get a real data point on how much power they consume...
Also, please clarify exactly what PSU was used and how the consumption was measured, so we can figure out more accurately how much power the card really draws (when factoring in the (in)efficiency of the PSU, that is)...
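As a rough sketch of the calculation being requested here, assuming the chart numbers are wall-socket readings: subtract a baseline measurement taken with a low-end card, then scale by an assumed PSU efficiency to estimate the card's DC draw. Every figure in the Python snippet below is a hypothetical placeholder, and real PSU efficiency varies with load, so treat the result as an estimate only.

```python
# Hypothetical sketch of isolating a video card's power draw from wall-socket
# readings. All numbers are placeholders; the PSU efficiency in particular is
# an assumption and changes with load, so this only yields a rough estimate.

def card_power_estimate(load_watts_wall: float,
                        baseline_watts_wall: float,
                        psu_efficiency: float = 0.80) -> float:
    """Estimate the extra DC power drawn by the high-end card.

    load_watts_wall      -- wall reading under 3D load with the card under test
    baseline_watts_wall  -- wall reading under the same load with a low-end card
    psu_efficiency       -- assumed AC-to-DC conversion efficiency (0 to 1)
    """
    return (load_watts_wall - baseline_watts_wall) * psu_efficiency

# Example with made-up readings: 380W at the wall with the test card,
# 230W with a low-end PCI card installed, and an 80% efficient PSU.
print(f"Estimated card draw: {card_power_estimate(380, 230):.0f} W")  # ~120 W
```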
peldor - Tuesday, January 24, 2006 - link
That's a good idea on isolating the power of the video card. From the other reviews I've read, the X1900 cards are seriously power-hungry -- in the neighborhood of 40-50W more than the X1800 XT cards. The GTX 512 (and GTX, of course) are lower than the X1800 XT, let alone the X1900 cards.
vaystrem - Tuesday, January 24, 2006 - link
Anyone else find this interesting??
Battlefield 2 @ 2048x1536 Max Detail
7800 GTX 512: 33 FPS
ATI 1900 XTX: 32.9 FPS
ATI 1900 XTX CrossFire: 29 FPS
-------------------------------------
Day of Defeat
7800 GTX 512: 18.93 FPS
ATI 1900 XTX: 35.5 FPS
ATI 1900 XTX CrossFire: 35 FPS
-------------------------------------
F.E.A.R.
7800 GTX 512: 20 FPS
ATI 1900 XTX: 36 FPS
ATI 1900 XTX CrossFire: 49 FPS
-------------------------------------
Quake 4
7800 GTX 512: 43.3 FPS
ATI 1900 XTX: 42 FPS
ATI 1900 XTX CrossFire: 73.3 FPS
DerekWilson - Tuesday, January 24, 2006 - link
Be careful here ... these max detail settings enabled SuperAA modes, which really killed performance ... especially with all the quality options flipped on. We're working on getting some screens up to show the IQ difference, but suffice it to say that the max detail settings are very apples to oranges.
We would have seen performance improvements if we had simply kept using 6xAA ...
DerekWilson - Tuesday, January 24, 2006 - link
To further clarify, F.E.A.R. didn't play well when we set AA outside the game, so its max quality ended up using the in-game 4xAA setting; thus we see a performance improvement. For Day of Defeat, forcing AA/AF through the control panel works well, so we were able to crank up the quality.
I'll try to go back and clarify this in the article.
vaystrem - Wednesday, January 25, 2006 - link
I'm not sure how that justifies what happens. Your argument is that because these are the VERY highest settings, it's OK for the 'dual' 1900 XTX to have lower performance than a single-card alternative? That doesn't seem to make sense, and it speaks poorly for the ATI implementation.
Lonyo - Tuesday, January 24, 2006 - link
The XTX, especially in CrossFire, does seem to give a fair boost over the XT and XT CrossFire in a number of tests.
Orbs - Wednesday, January 25, 2006 - link
I didn't see any comparisons between X1900 XT CrossFire and X1900 XTX CrossFire, except for the comments at the end of the article saying that diminishing returns result in an even smaller gap in CrossFire than the XTX had over the XT to begin with. With the exception of one B&W2 test (which I suspect is a typo), the graphs all show the X1800 XT vs. the X1900 XTX. Those are two different generations, not just different clock speeds.
poohbear - Tuesday, January 24, 2006 - link
I LOVE competition. :)
poohbear - Tuesday, January 24, 2006 - link
especially GOOD competition. :0
Aquila76 - Tuesday, January 24, 2006 - link
Nice. Maybe we'll start seeing some real developments in GPUs again. Right now it's more of a 'do what we've been doing, but faster'; maybe we'll start seeing some new innovations in video tech in the coming year (adding physics processing, wider encoding capabilities, etc.).
Spoelie - Thursday, January 26, 2006 - link
The X800 line was more of the same, yeah, but the X1000 line is architecturally a pretty large step forward: finally on par with Nvidia where it really needed to be, and a few steps ahead in other areas. If only they had given it more ROPs/texture engines.
Phantronius - Tuesday, January 24, 2006 - link
And the shitty cycle of upgrading continues.
Capt Caveman - Tuesday, January 24, 2006 - link
And available at a good price. Way to go ATI.
gimpsoft - Tuesday, January 24, 2006 - link
Nice scores for ATI, but I still have my 6800 GT; I'll wait until the next two generations. I see we can now play at a respectable 40+ frames @ 1920x1440, which is nice to know for when 1080p TVs are out. Don't know, either that or get a PS3, we'll see =) It's getting way too expensive.
bamacre - Tuesday, January 24, 2006 - link
WTF is your idea of a "good price"? I see the X1900 XT starting at $550 to $605 per card.
Capt Caveman - Tuesday, January 24, 2006 - link
Digitally Unique has the X1900XT for $525 and Actbuy had them for $504. Based on performance, these cards offer a great bang for your buck. And this is coming from a GTX 512 owner.