ATI Radeon HD 2900 XT: Calling a Spade a Spade

by Derek Wilson on 5/14/2007 12:04 PM EST

  • wjmbsd - Monday, July 2, 2007 - link

    What is the latest on the so-called Dragonhead 2 project (aka the HD 2900 XTX)? I heard it was just for OEMs at first... anyone know if the project is still going and how the part is benchmarking with the newest drivers?
  • teainthesahara - Monday, May 21, 2007 - link

    After this failure of the R600 and the likely overrated (and probably late) Barcelona/Agena processors, I think that Intel will finally bury AMD. Paul Otellini is rubbing his hands with glee at the moment, and rightfully so. AMD now stands for mediocrity. Oh dear, what a fall from grace... To be honest, Nvidia don't have any real competition on the DX10 front at any price point. I cannot see AMD processors besting Intel's Core 2 Quad lineup in the future, especially when 45nm and 32nm become the norm, and they don't have a chance in hell of beating Nvidia. Intel and Nvidia are turning the screws on Hector Ruiz. Shame AMD brought down such a great company like ATI.
  • DerekWilson - Thursday, May 24, 2007 - link

    To be fair, we really don't have any clue how these cards compete on the DX10 front as there are no final, real DX10 games on the market to test.

    We will try really hard to get a good idea of what DX10 will look like on the HD 2000 series and the GeForce 8 Series using game demos, pre-release code, and SDK samples. It won't be a real reflection of what users will experience, but we certainly hope to get a glimpse of performance.

    It is fair to say that NVIDIA bests AMD in current game performance. But really there are so many possibilities with DX10 that we can't call it yet.
  • spinportal - Friday, May 18, 2007 - link

    From the last posting of results for the GTS 320MB round-up
    Prey @ AnandTech - 8800GTS320 (http://www.anandtech.com/video/showdoc.aspx?i=2953...)
    we see that the 2900XT review chart pushes the nVidia cards down about 15% across the board.
    Prey @ AnandTech - ATI2900XT (http://www.anandtech.com/video/showdoc.aspx?i=2988...)
    The only difference between the systems is the software drivers, as the CPU/mobo/memory are the same.

    Does this mean ATI should be getting a BIGGER THRASHING BEAT-DOWN than the reviewer is stating?
    $400 ATI 2900XT performing only as well as a $300 nVidia 8800 GTS 320MB?

    It's $100 short and 6 months late, along with 100W of extra fuel.

    This is not your uncle's 9700 Pro...
  • DerekWilson - Sunday, May 20, 2007 - link

    We switched Prey demos -- I updated our benchmark.

    Both numbers are accurate for the tests I ran at the time.

    Our current timedemo is more stressful and thus we see lower scores with this test.
  • Yawgm0th - Wednesday, May 16, 2007 - link

    The prices listed in this article are way off.

    Currently, 8800GTS 640MB retails for $350-380, $400+ for OC or special versions. 2900XT retails for $430+. In the article, both are listed as $400, and as such the card is given a decent review in the conclusion.

    Realistically, this card provides slightly inferior performance to the 8800GTS 640MB at a considerably higher price point -- $80-$100 more than the 8800GTS. I mean, it's not like the 8800Ultra, but for the most part this card holds little appeal outside of AMD and/or ATI fanboys. I'd love for this card to do better, as AMD needs to be competing with Nvidia and Intel right now, but I just can't see how this is even worth looking at, given current prices.
  • DerekWilson - Thursday, May 17, 2007 - link

    really, this article focuses on architecture more than product, and we went with MSRP prices...

    we will absolutely look more closely at price and price/performance when we review retail products.
  • quanta - Tuesday, May 15, 2007 - link

    As I recall, the Radeon HD 2900 only has DVI ports, and nowhere does the DVI documentation specify that they can carry audio signals. Unless the card comes with an adapter that accepts audio input, it seems the audio portion of R600 is rendered useless.
  • DerekWilson - Wednesday, May 16, 2007 - link

    the card does come with an adapter of sorts, but the audio input is from the dvi port.

    you can't use a standard DVI to HDMI converter for this task.

    when using AMD's HDMI converter the data sent out over the DVI port does not follow the DVI specification.

    the bottom line is that the DVI port is just a physical connector carrying data. i could take a DVI port and solder it to a stereo and use it to carry 5.1 audio if I wanted to ... wouldn't be very useful, but I could do it :-)

    While connected to a DVI device, the card operates the port according to the DVI specification. When connected to an HDMI device through the special converter (which is not technically "dvi to hdmi" -- it's AMD-proprietary to HDMI), the card sends out data that follows the HDMI spec.

    you can look at it another way -- when the HDMI converter is connected, just think of the dvi port as an internal connector between an I/O port and the TMDS + audio device.
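
    if it helps, here's a rough way to picture it in code -- purely illustrative pseudocode of the concept (the function and names are made up, this is not the actual driver logic):

    # illustrative only: the same physical connector gets driven differently
    # depending on what the card detects on the other end of the cable
    def transmit_frame(sink_type, video, audio=None):
        if sink_type == "amd_hdmi_dongle":
            # act as an HDMI source: audio packets ride alongside the video
            # over the same TMDS wires (not DVI-spec behavior)
            return {"protocol": "HDMI", "video": video, "audio": audio}
        # plain DVI monitor: follow the DVI spec -- video only, audio ignored
        return {"protocol": "DVI", "video": video}

    # same connector, two behaviors:
    print(transmit_frame("dvi_monitor", "frame0", "pcm_audio"))
    print(transmit_frame("amd_hdmi_dongle", "frame0", "pcm_audio"))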
  • ShaunO - Tuesday, May 15, 2007 - link

    I was at an AMD movie night last night where they discussed the technical details of the HD 2900 XT and also showed the Ruby Whiteout DX10 Demo rendered using the card. It looked amazing and I had high hopes until I checked out the benchmark scores. They're going to need more than free food and popcorn to convince me to buy an obsolete card.

    However there is room for improvement of course. Driver updates, DX10 and whatnot. The main thing for me personally will be driver updates, I will be interested to see how well the card improves over time while I save my pennies for my next new machine.

    Everyone keeps saying "DX10 performance will be better, yadda yadda," but I also want to be able to play the games I have now and older games without having to rely on DX10 titles to give me better performance. Nothing like totally underperforming in DX9 games and then only being equal to or slightly better than the competition in DX10 games. I would rather have a decent performer all-round. Even then, we don't know for sure whether DX10 games are even going to bring any performance increase over the competition; it's all speculation right now, and that's all we can do, speculate.

    Shaun.
  • Roy2001 - Tuesday, May 15, 2007 - link

    The reason is, you have to pay extra $ for a power supply. No, most probably your old PSU won't have enough milk for this baby. I will stick with nVidia in future. My 2 cents.
  • Chaser - Tuesday, May 15, 2007 - link

    quote:

    While AMD will tell us that R600 is not late and hasn't been delayed, this is simply because they never actually set a public date from which to be delayed. We all know that AMD would rather have seen their hardware hit the streets at or around the time Vista launched, or better yet, alongside G80.

    First, they refuse to call a spade a spade: this part was absolutely delayed, and it works better to admit this rather than making excuses.

    Such a revealing tech article. Thanks for other sources Tom.
  • archcommus - Tuesday, May 15, 2007 - link

    $300 is the exact price point I shoot for when buying a video card, so that pretty much eliminates AMD right off the bat for me right now. I want to spend more than $200 but $400 is too much. I'm sure they'll fill this void eventually, and how that card will stack up against an 8800 GTS 320 MB is what I'm interested in.
  • H4n53n - Tuesday, May 15, 2007 - link

    Interestingly enough, on some other websites it wins against the 8800 GTX in most games, especially the newer ones, and comparing the price I would say it's a good deal? I think it's just driver problems; ATI has been known for not having very good drivers compared to NVIDIA, but once they fix them it'll win.
  • dragonsqrrl - Thursday, August 25, 2011 - link

    lol...fail. In retrospect it's really easy to pick out the EPIC ATI fanboys now.
  • Affectionate-Bed-980 - Tuesday, May 15, 2007 - link

    I skimmed this article because I have a final. ATI can't hold a candle to NV at the moment, it seems. Now while the 2900XT might have good value, am I correct in saying that ATI has lost the performance crown by a buttload (not even like X1800 vs 7800), like they're totally slaughtered, right?

    Now I won't go and comment about how the 2900 stacks up against competition in the same price range, but it seems that GTSes can be acquired for cheap.

    Did ATI flop big here?
  • vailr - Monday, May 14, 2007 - link

    I'd rather use a mid-range older card that "only" uses ~100 Watts (or less) than pay ~$400 for a card that requires 300 Watts to run. Doesn't AMD care about "Global Warming"?
    Al Gore would be amazed, alarmed, and astounded !!
  • Deusfaux - Monday, May 14, 2007 - link

    No they don't, and that's why the 2600 and 2400 don't exist.
  • ochentay4 - Monday, May 14, 2007 - link

    Let me start with this: I have always had an NVIDIA card. ALWAYS.

    Faster is NOT ALWAYS better. For the most part this is true; for me, it was. One year ago I bought an MSI 7600GT. It seemed the best bang for the buck. Since I bought it, I have had problems with TV-out detection, TV-out wrong aspect ratios, broken LCD scaling, lots of game problems, nonexistent support (the NV forum is a joke) and the UNIFIED DRIVER ARCHITECTURE. What a terrible lie! The latest official driver is from 6 months ago!!!

    I'm really demanding, but I paid enough to demand a 100% working product. Now ATI's latest offering has: AVIVO, FULL VIDEO ACCELERATION, MONTHLY DRIVER UPDATES, ALL THE BUGS I NOTICED WITH MY NVIDIA CARD FIXED, HDMI, AND THE PRICE. I prefer that over a simpler product, especially for the money these cards cost!

    I will never buy an NVIDIA card again. I'm definitely looking forward to ATI's offerings (after the joke that is/was the 8600GT/GTS).

    Enough rant.
    Am I wrong?
  • Roy2001 - Tuesday, May 15, 2007 - link

    Yeah, you are wrong. Spend $400 on a 2900XT and then $150 on a PSU.
  • yyrkoon - Tuesday, May 15, 2007 - link

    See, the problem here is: guys like you are so bent on saving that little bit of money by buying a lesser brand name that you do not even take the time to research your hardware. Use Newegg and read the user reviews, and if that is not enough for you, go to the countless other resources all over the internet.
  • yyrkoon - Tuesday, May 15, 2007 - link

    Blame the crappy OEM you bought the card from, not nVidia. Get an EVGA card, and embrace a completely different take on video card life.

    MSI may make some decent motherboards, but their other components have serious issues.
  • LoneWolf15 - Thursday, May 17, 2007 - link

    Um, since 95% of nvidia-GPU cards on the market are the reference design, I'd say your argument here is shaky at best. EVGA and MSI both use the reference design, and it's even possible that cards with the same GPU came off the same production line at the same plant.
  • DerekWilson - Thursday, May 17, 2007 - link

    it is true that the majority of parts are based on reference designs, but that doesn't mean they all come from the same place. I'm sure some of them do, but to say that all of these guys just buy completed boards and put their name on them all the time is selling them a little short.

    at the same time, the whole argument of which manufacturer builds the better board on a board component level isn't something we can really answer.

    what we would suggest is that it's better to buy from OEMs who have good customer service and long, extensive warranties. this way, even if things do go wrong, there is some recourse for customers who get bad boards or have bad experiences with drivers and software.
  • cmdrdredd - Monday, May 14, 2007 - link

    you're wrong. 99% of people buying these high end cards are gaming. Those gamers demand and deserve the best possible performance. If a card uses MORE power and costs MORE (2900XT vs 8800GTS) and performs generally the same or slower, what is the point? Fact is... ATI's high end is in fact slower than mid-range offerings from Nvidia and consumes a lot more power. Regardless of what you think, people are buying these based on performance benchmarks in 99% of all cases.
  • AnnonymousCoward - Tuesday, May 15, 2007 - link

    No, you're wrong. Did you overlook the emphasis he put on "NOT ALWAYS"?

    You said 99% use for gaming--so there's 1%. Out of the gamers, many really want LCD scaling to work, so that games aren't stretched horribly on widescreen monitors. Some gamers would also like TVout to work.

    So he was right: faster is NOT ALWAYS better.
  • erwos - Monday, May 14, 2007 - link

    It'd be nice to get the scoop on the video decode acceleration present on these boards, and how it stacks up to the (excellent) PureVideo HD found in the 8600 series.
  • imaheadcase - Tuesday, May 15, 2007 - link

    I agree! They need to do a whole article on video acceleration across a range of cards and show the pros and cons of each card in its respective areas. A lot of people like myself like to watch videos and game on our cards, and like having the option to use the advanced video features.

  • Turnip - Monday, May 14, 2007 - link

    "We certainly hope we won't see a repeat of the R600 launch when Barcelona and Agena take on Core 2 Duo/Quad in a few months...."


    Why, that's exactly what I had been thinking :)

    Phew! I made it through the whole thing though, I even read all of those awfully big words and everything! :)

    Thanks guys, another top review :)
  • Kougar - Monday, May 14, 2007 - link

    First, great article! I will be going back to reread the very in-depth analysis of the hardware and features, something that keeps me an avid AnandTech reader. :)

    Since it was mentioned that overclocking will be included in a future article, I would like to suggest that, if possible, watercooling be factored into it. So far one review site has already done a watercooled test with a low-end watercooling setup, and without mods achieved 930MHz on the core, which indirectly means 930MHz shaders if I understand the hardware.

    I'm sure I am not the only reader extremely interested to see if all R600 needs is a ~900-950MHz overclock to offer some solid GTX level performance... or if it would even help at all. Again thanks for the consideration, and the great article! Now off to find some Folding@Home numbers...
  • mostlyprudent - Monday, May 14, 2007 - link

    Frankly, neither the NVIDIA nor the AMD part at this price point is all that impressive an upgrade from the prior generations. We keep hearing that we will have to wait for DX10 titles to know the real performance of these cards, but I suspect that by the time DX10 titles are on the shelves we will have at least product line refreshes by both companies. Does anyone else feel like the graphics card industry is jerking our chains?
  • johnsonx - Monday, May 14, 2007 - link

    It seems pretty obvious that AMD needs a Radeon HD 2900 Pro to fill in the gap between the 2900XT and 2600XT. Use R600 silicon, give it 256MB of RAM with a 256-bit memory bus. Lower the clocks 15% so that power consumption will be lower, and so that chips that don't bin at full XT speeds can be used. Price it at $250-$300. It would own the upper-midrange segment over the 8600GTS, and eat into the 8800GTS 320's lunch as well.
  • GlassHouse69 - Monday, May 14, 2007 - link

    If I know this, and YOU know this... wouldn't AnandTech? I see money under the table or utter stupidity at work at Anand. I mean, I know that the .01+ version does a lot better in benches, and that higher res with AA/AF on sometimes gets BETTER framerates than lower res with no AA/AF. This is a driver thing. If I know this, and you know this, Anand must. I would rather admit to being corrupt than to being that stupid.

  • GlassHouse69 - Monday, May 14, 2007 - link

    wrong section. dt is doing that today it seems to a few people
  • xfiver - Monday, May 14, 2007 - link

    Hi, thank you for a really in-depth review. While reading other 'earlier' reviews, I remember a site using Catalyst 8.38 and reporting performance improvements of up to 14% over 8.37. Looking forward to AnandTech's view on this.
  • xfiver - Monday, May 14, 2007 - link

    My apologies, it was VR-Zone, and 8.36 to 8.37 (not 8.38).
  • GlassHouse69 - Monday, May 14, 2007 - link

    If I know this, and YOU know this... wouldn't AnandTech? I see money under the table or utter stupidity at work at Anand. I mean, I know that the .01+ version does a lot better in benches, and that higher res with AA/AF on sometimes gets BETTER framerates than lower res with no AA/AF. This is a driver thing. If I know this, and you know this, Anand must. I would rather admit to being corrupt than to being that stupid.
  • Gary Key - Tuesday, May 15, 2007 - link

    quote:

    If I know this, and YOU know this... wouldn't AnandTech? I see money under the table or utter stupidity at work at Anand. I mean, I know that the .01+ version does a lot better in benches, and that higher res with AA/AF on sometimes gets BETTER framerates than lower res with no AA/AF. This is a driver thing. If I know this, and you know this, Anand must. I would rather admit to being corrupt than to being that stupid.


    I have worked extensively with four 8.37 releases and now the 8.38 release for the upcoming P35 release article. The 8.37.4.2 alpha driver had the top performance in SM3.0 heavy apps but was not very stable with numerous games, especially under Vista. The released 8.37.4.3 driver on AMD's website is the most stable driver to date and has decent performance but nothing near the alpha 8.37 or beta 8.38. The 8.38s offer great benchmark performance in the 3DMarks, several games, and a couple of DX10 benchmarks from AMD.

    However, the 8.38s more or less broke CrossFire, OpenGL, and video acceleration in Vista depending upon the app, and IQ is not always perfect. While there is a great deal of promise in their performance and we see the potential, they are still beta drivers that have a long way to go in certain areas before their final release date of 5/23 (internal target).

    That said, would you rather see impressive results in 3DMarks, or have someone tell you the truth about the development progress, or lack of it, with the drivers? As much as I would like to see this card's performance improve immediately, it is what it is at this time with the released drivers. AMD/ATI will improve the performance of the card with better drivers, but until they are released our only choice is to go with what they sent. We said the same thing about NVIDIA's early driver issues with the G80, so there are no fanboys or people taking money under the table around here. You can put all the lipstick on a pig you want, but in the end, you still have a pig. ;-)
  • Anand Lal Shimpi - Monday, May 14, 2007 - link

    There's nothing sinister going on, ATI gave us 8.37 to test with and told us to use it. We got 8.38 today and are currently testing it for a follow-up.

    Take care,
    Anand
  • GlassHouse69 - Monday, May 14, 2007 - link

    wow dood. you replied!

    Yes, I have been wondering about the ethics of your group here for about a year now. I felt there was this sorta slick leaning-and-masking thing going on. Nice to see there is not.

    Thanks for the 1000's of articles and tests!
    -Mr. Glass
  • johnsonx - Monday, May 14, 2007 - link

    And which are you going to admit to?

    What was that old saying about glass houses and throwing stones? Shouldn't throw them in one? Definitely shouldn't throw them if you ARE one!
  • Puddleglum - Monday, May 14, 2007 - link

    quote:

    ATI's latest and greatest doesn't exactly deliver the best performance per watt, so while it doesn't compete performance-wise with the GeForce 8800 GTX it requires more power.
    You mean, while it does compete performance-wise?
  • johnsonx - Monday, May 14, 2007 - link

    No, I'm pretty sure they mean DOESN'T. That is, the card can't compete with a GTX, yet still uses more power.
  • INTC - Monday, May 14, 2007 - link

    quote:

    We certainly hope we won't see a repeat of the R600 launch when Barcelona and Agena take on Core 2 Duo/Quad in a few months....
  • Chadder007 - Monday, May 14, 2007 - link

    When will we have the 2600s out in review? That's the card I'm waiting for.
  • TA152H - Monday, May 14, 2007 - link

    Derek,

    I like the fact that you weren't mincing your words, except for a little on the last page, but I'll give you a perspective on why it might be a little better than some people will think.

    There are some of us, and I am one, that will never buy NVIDIA. I bought one, had nothing but trouble with it, and have been buying ATI for 20 years. ATI has been around for so long, there is brand loyalty, and as long as they come out with something that is competent, we'll consider it against their other products without respect to NVIDIA. I'd rather give up the performance to work with something I'm a lot more comfortable with.

    The power, though, is damning; I agree with you 100% on this. Any idea if these beasts are being made by AMD now, or still by whoever ATI contracted out to? AMD is typically really poor in their first iteration of a product on a process technology, but they tend to improve quite a bit in succeeding ones. I wonder how much they'll push this product initially. It might be they just get it out to have it out, and the next one will be what is really a worthwhile product. That only makes sense, of course, if AMD is now manufacturing this product. I hope they are; they surely don't need to make any more of their processors that aren't selling well.

    One last thing I noticed is the 2400 Pro had no fan! It had a heatsink from Hell, but that will still make this a really attractive product for a growing market segment. Any chance of you guys doing a review on the best fanless cards?
  • DerekWilson - Wednesday, May 16, 2007 - link

    TSMC is manufacturing the R600 GPUs, not AMD.
  • AnnonymousCoward - Tuesday, May 15, 2007 - link

    "I bought one, had nothing but trouble with it, and have been buying ATI for 20 years."

    That made me laugh. If one bad experience was all it took to stop you from using a computer component, you'd be left with a PS/2 keyboard at best.

    "...to work with something I'm a lot more comfortable with."

    Are you more comfortable having 4:3 resolutions stretched on a widescreen? Maybe you're also more comfortable with having crappier performance than nvidia has offered for the last 6 months and counting? This kind of brand loyalty is silly.
  • MadBoris - Monday, May 14, 2007 - link

    As far as your brand loyalty goes, ATI doesn't exist anymore. Furthermore, AMD executives will gut the staff, so you can't call it the same company.
    Secondly, Nvidia has been a stellar company providing stellar products. Everyone has some ups and downs. Unfortunately, with this hardware and these drivers, this is one of ATI's (er, AMD's) downs.

    This card should do OK in comparison to the GTS, especially as drivers mature. Some reviews show it doing better than the GTS 640 in most tests, so I am not sure where or how the discrepancies are coming about. Maybe hardware compatibility, maybe settings.
  • rADo2 - Monday, May 14, 2007 - link

    Many NVIDIA 8600GT/GTS cards do not have a fan, are available on the market now, and are (probably; different league) much more powerful than the 2400 ;) But as you are a fanboy, you are not interested, right?
  • TA152H - Monday, May 14, 2007 - link

    Fanboy? What a dork.

    I've had success with ATI, not with NVIDIA, and I know ATI stuff a lot better so it's just easier for me to work with. It's not an irrational like or dislike. I bought one NVIDIA card and it was a nightmare. Plus, I'm not as sure they'll be around for very long as I am about ATI/AMD, although they had a good quarter, and AMD surely had a dreadful one.

    Selling discrete video cards alone might get a lot more difficult with the integration of CPUs and GPUs.
  • yyrkoon - Monday, May 14, 2007 - link

    You are a fanboy, face it. 'I tried a nVidia card once . . .' How long ago was that? Who made the card? Did you have it configured properly? Once?! Details like this are important, and seemingly/conveniently left out. Anyhow, anyone claiming that nVidia cards are 'junk' has definite issues with assembling/configuring hardware. I say this because my current system uses a nVidia based card, and is 100% rock solid. 'Person between the chair and keyboard' rings a bell.

    Ask any Linux user why they refuse to use ATI cards in their systems . . . You are also one of these people out there who claims ATI driver support is superior to nVidia's driver support, I suppose? If you have truly been using ATI products for 20 years, then you know ATI has one of the worst reputations on the planet for driver support (and while it may have improved, it is still not as good as nVidia's).

    Yeah, anyhow, ATI and nVidia can both have problems with their hardware; it is not based 100% on their architecture, as the OEMs releasing the products have a lot of effect here also. There are bad OEMs to buy from on both sides of the fence, and knowing who to stay away from is half the work when building a PC. That probably had a lot more to do with your alleged 'bad nVidia card', assuming you actually configured the card properly.

    I also had a problem with an nVidia card once: I bought a brand new GF3 card about 7 years ago, and a few of the older games I had would not display properly with it. What did I do? I waited about a month for a new driver, and the problem was solved. I have also had issues with ATI cards, one of which drew too much power from the AGP slot and would cause the given system to crash 1-2 times a day. This was a design issue/oversight on ATI's part (the card was made by Sapphire, who also makes ATI's cards). What did I do? I replaced the card with an nVidia card, and the system has been stable since.

    So you see, I too can skew things to make anyone look bad, and in the end it would only serve to make me look like the dork. But if you want to pay more for less, that is perfectly fine by me.
  • Pirks - Monday, May 14, 2007 - link

    I've had nothing but problems and crappy drivers (especially the Linux ones) from ATI, while nVidia software was always much better in my experience. power hungry noisy monsters made by whom? by ATI! as always :) same shit as with their miserable power-guzzling x1800/x1900 series

    discrete video cards are not going away any time soon. ever heard of integrated video used in games, besides ones from 2000, like old Quake 2? no? then please continue your lovefest with ATI, but for me - it looks like I'll pass on them this time again - since the Radeon 9800 Pro they went downhill and continue in that direction. they MAY make a decent integrated CPU/GPU budget-oriented vendor in the future, for all those office folks playing simple 2D office games, but real stuff? nope, ATI is still out of the game for me. let's see if they manage to come back with a reincarnation of the R300 in the future.

    ironically, AMD CPUs on the other hand have the best price/performance ratio, so Intel won't see me as their customer. I wish ATI 3D chips were as good as AMD CPUs in that regard (and overclockers please shut up, I'm not bothering to OC my rig because I don't enjoy benchmark numbers, I enjoy REAL stuff like games, and Intel is out of the game for me as well, at least until their budget single-core Conroes are out)
  • utube545 - Tuesday, May 22, 2007 - link

    Get a clue, you fucking cretin.
  • dragonsqrrl - Thursday, August 25, 2011 - link

    haha... lol, wow. facepalm.
  • dragonsqrrl - Thursday, August 25, 2011 - link

    Damn you're a fail noob of an ATI fanboy. Time has not been kind to the HD2900XT, and now you sound more ridiculous than ever... lol.
  • yzkbug - Monday, May 14, 2007 - link

    Not a word about the new AVIVO HD and digital sound features?
  • DerekWilson - Wednesday, May 16, 2007 - link

    we mentioned this ...

    on the r600 overview page ...
  • photoguy99 - Monday, May 14, 2007 - link

    First, to be clear, I do not condone the title of this article; there's no need to bring racism into this.

    But my point is NVidia can and will react by making the performance per dollar competitive for the R600 vs 8800GTS.

    Once the prices are comparable, why buy a more power hungry part (the ATI)?

    This is one disadvantage they can't correct until the next respin.

  • DrMrLordX - Monday, May 14, 2007 - link

    Based on the benchmark results, the only reasons I can see for getting 2900XTs are if a). you don't care about power consumption and b). you want to run a Crossfire rig at a lower cost of entry than dual 8800 GTXs or 8800 Ultras.

    As others have said, some more benchmarks in mature DX10 titles might show who the real winner here is performance-wise, and that holds true for multi-GPU scenarios as well.
  • dragonsqrrl - Thursday, August 25, 2011 - link

    You forgot c).

    -if you're an ATI fanboy
  • vijay333 - Monday, May 14, 2007 - link

    http://www.randomhouse.com/wotd/index.pperl?date=1...

    "the expression to call a spade a spade is thousands of years old and etymologically has nothing whatsoever to do with any racial sentiment."
  • strikeback03 - Wednesday, May 16, 2007 - link

    What about in Euchre, where a spade can be a club (and vice versa)?
  • johnsonx - Monday, May 14, 2007 - link

    Just wait until AT refers to AMD's marketing budget as 'niggardly'...
  • bldckstark - Monday, May 14, 2007 - link

    What do shovels have to do with race?
  • Stan11003 - Monday, May 14, 2007 - link

    My big hope out of all of this is that the ATI part forces the Nvidia parts lower so I can use my upgrade option from EVGA to get a nice 8800 GTX instead of my 8800 GTS ACS3 320. However, with a quad core and a decent 2GB I have no gaming issues at all. I play at 1600x1200 (when did that become a low rez?) and everything is butter smooth. Without newer titles all this hardware is a waste anyways.
  • Gul Westfale - Monday, May 14, 2007 - link

    the article says that the part is not a failure, but i disagree. i switched from a radeon 1950pro to an nvidia geforce 8800GTS 320MB about a month ago, and i paid only $350US for it. now i see that it still outperforms the new 2900...

    one of my friends wanted to wait to buy a new card, he said he hoped that the ATI part was going to be faster. now he says he will just buy the 8800GTS 320, since ATI have failed.

    if they can bring out a part that competes well with the 8800GTS and price it similarly or lower then it would be worth buying, but until then i will stick with nvidia. better performance, better price, and better drivers... why would anyone buy the ATI card now?
  • ncage - Monday, May 14, 2007 - link

    My conclusion is to wait. All of the recent GPUs do great with DX9... the question is how will they do with DX10? I think it's best to wait for DX10 titles to come out. I think Crysis would be a PERFECT test.
  • wingless - Monday, May 14, 2007 - link

    I agree with you. Crysis is going to be the benchmark for these DX10 cards. It's hard to tell both Nvidia's and AMD's DX10 performance with these current, first-generation DX10 titles (most of which have a DX9 version) because they don't fully take advantage of all the power of either the G80 or R600 yet. It's true that Crysis will have a DX9 version as well, but the developer stated there are some big differences in code. I'm an Nvidia fanboy, but I'm disappointed with the PureVideo and HDMI support on the 8800 series cards. ATI got this worked out with their great AVIVO and their nice HDMI implementation, but for now Nvidia is still the performance champ with "simpler" hardware. The G80 and R600 continue the traditions of their manufacturers. Nvidia has always been about raw power and all-out speed with few bells and whistles. ATI is all about refinement, bells and whistles, innovations, and unproven new methods which may make or break them.

    All I really want to wait for is to see how developers embrace CUDA or ATI's setup for PHYSICS PROCESSING! Both companies seem to have well thought out methods to do physics and I can't wait to see that showdown. AGEIA and HAVOK need to hop on board and get some software support for all this good hardware potential they have to play with. Physics is the next big gimmick, and you know how much we all love gimmicks (just like good ol' 3D acceleration 10 years ago).
  • poohbear - Monday, May 14, 2007 - link

    they don't make a profit from high end parts, so that's why they're not bothering with them? that's AMD's story? so why bother having an FX line with their CPUs?
  • GoatMonkey - Monday, May 14, 2007 - link

    That's obviously BS. This IS their high end part, it just doesn't perform as well as nVidia's high end part, so it is priced accordingly.
  • poohbear - Monday, May 14, 2007 - link

    sweet review though! thanks for including all the important and pertinent cards in your roundup (the 8800gts 320mb in particular). also love how neutral Anand is in their reviews, unlike some other sites. :P
  • Creig - Monday, May 14, 2007 - link

    The R600 is finally here. I'm sure the overall performance is not what AMD was hoping for. Nobody ever shoots to have their newest product be the 2nd best. But pricing it at $399 and including a very nice game bundle will make the HD 2900 XT a VERY worthwhile purchase. I also have the feeling that there is a significant amount of performance increase to be realized through future driver releases ala X1800XT.
  • shady28 - Tuesday, May 15, 2007 - link


    Nvidia has gone over the cliff on pricing.

    I know of no one personally who has an 88xx series card. I know one who recently picked up an 8600 of some kind, that's it. I have the best GPU of anyone I know.

    It's a real shame that there is so much focus on graphics cards that virtually no one buys. These are niche products folks - yet 'who is best' seems to be totally dependent on these niche products. That's patently ridiculous.

    It's like saying, since IBM makes the fastest computers in the world (they do), they're the best and you should be buying IBM (or now, lenovo) laptops and desktops.

    No one ever said that sort of thing because it's patently ridiculous. Why do people say it now for graphics cards? The fact that they do says a lot about the mentality of sites like AT.
  • DerekWilson - Tuesday, May 15, 2007 - link

    We don't say what you are implying, and we are also very upset with some of NVIDIA's pricing (specifically the 8800 ultra)

    the 8800 gts 320mb is one of the best values for your money anywhere and isn't crazy expensive -- it's actually the card I'd recommend to anyone who cares about graphics in games and wants good quality and performance at 1600x1200.

    I would never tell anyone to buy an 8600 gts because nvidia has the fastest high end card. In fact, in this article, I hope I made it clear that AMD has the opportunity to capitalize on the huge performance gap nvidia left between the 8600 and 8800 series ... If AMD builds a part that performs in this range and is priced competitively, they'll have our recommendation in a flash.

    Recommending parts based on value at each price or performance segment is something we take pride in and will always do, no matter who has the absolute fastest hardware out there.

    The reason our focus was on AMD's fastest part is because they haven't given us any other hardware to test. We will absolutely be talking a lot and in much depth about midrange and budget hardware when AMD makes these parts available to us.
  • yacoub - Monday, May 14, 2007 - link

    $400 is a lot of money. Not terribly long ago the highest end GPU available didn't cost more than $400. Now they hit $750 so you start to think $400 sounds cheap. It's really not. It's a heck of a lot of money for one piece of hardware. You can put together a 650i SLI rig with 2GB of DDR2 6400 and an E4400 for that much money. I know because I just did that. I kept my 7900GT from my old rig because I wanted to see how R600 did before purchasing an 8800GTS 640MB. Now that we've seen initial results I will wait to see how R600 does with more mature drivers and also wait to see the 640MB GTS price come down even more in the meantime.
  • vijay333 - Monday, May 14, 2007 - link

    http://www.randomhouse.com/wotd/index.pperl?date=1...

    "the expression to call a spade a spade is thousands of years old and etymologically has nothing whatsoever to do with any racial sentiment."

  • yacoub - Monday, May 14, 2007 - link

    Yes, a spade was a shovel long before Muslims enslaved Europeans to do hard labor in North Africa and Europeans enslaved Africans to do hard labor in the 'New World'.
  • vijay333 - Monday, May 14, 2007 - link

    whoops...replied to the wrong one.
  • rADo2 - Monday, May 14, 2007 - link

    It is not 2nd best (after 8800ULTRA), not 3rd best (after 8800GTX), not 4th best (after 8800GTS-640), but 5th best (after 8800GTS-320), or even worse ;)

    Bad performance with AA turned on (everybody turns on AA), huge power consumption, late to the market.

    A definitive failure.
  • imaheadcase - Tuesday, May 15, 2007 - link

    quote:

    Bad performance with AA turned on (everybody turns on AA), huge power consumption, late to the market.


    Says who? Most people I know don't care to turn on AA since they visually can't see a difference. Only people who are picky about everything they see turn it on; the majority of people don't notice "jaggies", since the brain fixes them for you when you play.
  • Roy2001 - Tuesday, May 15, 2007 - link

    Says who? Most people I know don't care to turn on AA since they visually can't see a difference.
    ------------------------------------------
    Wow, I never turn it off now that I am used to having AA. I cannot play games anymore without AA.
  • Amuro - Tuesday, May 15, 2007 - link

    quote:

    the majority of people don't notice "jaggies" since the brain fixes it for you when you play.

    Says who? No one who spent $400 on a video card would turn off AA.
  • SiliconDoc - Wednesday, July 8, 2009 - link

    Boy we'd sure love to hear those red fans claiming they turn off AA nowadays and it doesn't matter.
    LOL
    It's just amazing how thick it gets.
  • imaheadcase - Tuesday, May 15, 2007 - link

    quote:

    Says who? No one spent $400 on a video card would turn off AA.


    Sure they do, because it's a small "tweak" with a performance hit. I say, who spends $400 on a video card to remove "jaggies" when they are not noticeable in the first place to most people? Same reason most people don't go for SLI or CrossFire: in the end it really offers nothing substantial for most people who play games.

    Some might like it, but they would not miss it if they stopped using it for some time. It's not like it's a make-or-break feature of a video card.
  • motiv8 - Tuesday, May 15, 2007 - link

    Depends on the game or player tbh.

    I play in ladders without AA turned on, but for games like Oblivion I would use AA. Depends on your needs at the time.
