74 Comments

  • SixtyFo - Friday, September 15, 2006 - link

    So do they still use a dongle between the cards? If you had 2 CrossFire cards, then it wouldn't be connecting to a DVI port. Is there an adapter? I guess what I'm asking is: are you REALLY sure I can run 2 CrossFire Edition X1950s together? I'm about to drop a grand on video cards, so that piece of info may come in handy.
  • unclebud - Friday, September 1, 2006 - link

    "And 10Mhz beyond the X1600 XT is barely enough to warrant a different pair of letters following the model number, let alone a whole new series starting with the X1650 Pro."

    nvidia has been doing it for years with the 4mx/5200/6200/7300/whatever and nobody here said boo!
    hm.
  • SonicIce - Thursday, August 24, 2006 - link

    How can a whole X1900XTX system use only 267 watts? So a 300w power supply could handle the system?
  • DerekWilson - Saturday, August 26, 2006 - link

    generally you need something bigger than a 300w psu, because the main issue is that the current available on both 12v rails needs to be fairly high.
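
    As a rough back-of-the-envelope illustration of the 12V point (a minimal Python sketch; the 267W figure is the wall draw discussed above, while the 80% efficiency and the 12V share are assumptions, not measurements):

        # Rough 12V-rail current estimate (illustrative assumptions only)
        wall_draw_w = 267         # measured at the wall for the whole system
        psu_efficiency = 0.80     # assumed PSU efficiency
        twelve_volt_share = 0.75  # assume ~75% of the DC load sits on the 12V rails

        dc_load_w = wall_draw_w * psu_efficiency
        twelve_volt_amps = dc_load_w * twelve_volt_share / 12.0

        print(f"DC load: {dc_load_w:.0f} W, combined 12V current: {twelve_volt_amps:.1f} A")
        # roughly 13-14 A combined on 12V, more than many older 300W units can deliver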
  • Trisped - Thursday, August 24, 2006 - link

    The CrossFire card is not the same as the normal one; the normal card also has the extra video-out options. So there is a reason to buy one to team up with the other, but only if you need to output to composite, S-Video, or component.
  • JarredWalton - Thursday, August 24, 2006 - link

    See discussion above under the topic "well..."
  • bob4432 - Thursday, August 24, 2006 - link

    why is the x1800xt left out of just about every comparison i have read? for the price you really can't beat it....
  • araczynski - Thursday, August 24, 2006 - link

    ...I haven't read the article, but i did want to just make a comment...

    having just scored a brand new 7900gtx for $330 shipped, it feels good to be able to see the headlines for articles like this, ignore them, and think "...whew, i won't have to read anymore of these until the second generation of DX10's comes out..."

    I'm guessing nvidia will be skipping the 8000's, and 9000's, and go straight for the 10,000's, to signal the DX10 and 'uber' (in hype) improvements.

    either way, it's nice to get out of the rat race for a few years.
  • MrJim - Thursday, August 24, 2006 - link

    Why no anisotropic filtering tests? Or am I blind?
  • DerekWilson - Saturday, August 26, 2006 - link

    yes, all tests are performed with at least 8xAF. Under games that don't allow selection of a specific degree of AF, we choose the highest quality texture filtering option (as in BF2 for instance).

    AF comes at fairly little cost these days, and it just doesn't make sense not to turn on at least 8x. I wouldn't personally want to go any higher without angle-independent AF (like the high quality AF offered on ATI X1K cards).
  • JarredWalton - Thursday, August 24, 2006 - link

    Anisotropic filtering was enabled in all tests at 8xAF as far as I know. When we use antialiasing, we generally enable anisotropic filtering as well.
  • LoneWolf15 - Thursday, August 24, 2006 - link

    Looks like there's no HDCP support or HDMI connector added like I'd expect with a brand new top-end card. And they didn't add the new quieter cooler to the X1900 XT. Pity. I doubt it would cost ATI more, and it'd boost card sales, since people hate the noisy fan ATI is currently using.

    I'll pass. My older (by alpha-geek standards) X800XL does the job fine.

    P.S. -1 for not doing any bench tests with Elder Scrolls: Oblivion.
  • DerekWilson - Saturday, August 26, 2006 - link

    also, all of these cards have HDCP support -- which I believe I mentioned somewhere in there. HDMI is up to the vendor.
  • JarredWalton - Thursday, August 24, 2006 - link

    +2 You might want to read page 8.
  • LoneWolf15 - Thursday, August 24, 2006 - link

    I don't know what's going on, I must have been blind. My apologies there, Jarred.
  • Dfere - Thursday, August 24, 2006 - link

    You just can't always eat your cake and then have it left over.

    You should change your phrase from "Sometimes we can have our cake and eat it too"

    to "Sometimes we can eat our cake and have it too"
  • poohbear - Thursday, August 24, 2006 - link

    The established English expression is "you can't have your cake and eat it too", even if it doesn't make logical sense. There are many words and expressions that don't make sense in English (driveway, football, highway). I'm guessing you're not a native English speaker, but that's the way the language is. Now, please post about technology and not the logic of English expressions.
  • Griswold - Thursday, August 24, 2006 - link

    What's wrong with football? Or do you mean American "football"?
  • poohbear - Thursday, August 24, 2006 - link

    Can anyone confirm whether those power consumption tests are for the entire system or just the video cards? The highest figure was 267W; a high-end system that consumes 267W under load is sweet! Can you confirm that is indeed for the entire system (CPU, mobo, HDD, video card... everything)? Thanks.
  • JarredWalton - Thursday, August 24, 2006 - link

    I'm pretty sure that this is power use for the entire system, but Derek's results are quite a bit lower than what I got on the ABS system I tested last week for X1900 CrossFire. Of course, the water cooling and extra fans on the ABS system might add a decent amount of power draw, and I don't know how "loaded" the systems are in this test. I would guess that Derek ran Splinter Cell: Chaos Theory for load conditions.
  • DerekWilson - Saturday, August 26, 2006 - link

    yeah ... i didn't test power with crossfire -- which is a whole lot higher. also, i have a minimal set of components to make it work -- one hdd, one cdrom drive, and no addin cards other than graphics.

    we'll do multi-GPU power when we look at Quad SLI
  • ElFenix - Thursday, August 24, 2006 - link

    the review states that power consumption was measured at the wall with a Kill-A-Watt, during a 3DMark run.

    in addition to the water cooling, it could be he's running a more efficient PSU. A system drawing 220 watts from the power supply would draw about 275 watts from the wall with an 80% efficient PSU (like a good Seasonic) and about 314 watts with a 70% efficient PSU. that's a pretty decent difference right there.
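
    As a quick sketch of that wall-draw arithmetic (the 220W load and the 70%/80% efficiencies are just the figures from the comment above, not measured values):

        # Wall draw = DC load / PSU efficiency (illustrative figures only)
        dc_load_w = 220.0

        for efficiency in (0.80, 0.70):
            wall_draw_w = dc_load_w / efficiency
            print(f"{efficiency:.0%} efficient PSU -> {wall_draw_w:.0f} W at the wall")

        # 80% -> ~275 W, 70% -> ~314 W: roughly a 40 W gap for the same DC load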

    ... still waiting for nvidia's HQ driver run...
  • poohbear - Thursday, August 24, 2006 - link

    thanks
  • Rock Hydra - Wednesday, August 23, 2006 - link

    With those competitively priced parts, hopefully NVIDIA will respond with lower prices.
  • CreepieDeCrapper - Wednesday, August 23, 2006 - link

    I'm not familiar with 1920x1440; did you mean 1920x1200? At what resolution were these tests performed? Thank you!

  • JarredWalton - Wednesday, August 23, 2006 - link

    1920x1440 is a standard 4:3 resolution used on many CRTs. It is often included because its performance is reasonably close to 1920x1200 performance.
  • CreepieDeCrapper - Wednesday, August 23, 2006 - link

    Thanks, I've been using my LCD for so long I forgot about the vintage CRT res's out there ;) Plus I never ran that particular res on my CRT when I had one, so I just wasn't familiar.
  • cgaspar - Wednesday, August 23, 2006 - link

    While average frame rates are interesting, I _really_ care about minimum frame rates - 300fps average is useless if at a critical moment in a twitch game the frame rate drops to 10fps for 3 seconds - this is especially true in Oblivion. Of course it's possible that the minimums would be the same for all cards (if the game is CPU bound in some portion), but they might not be.
  • JarredWalton - Wednesday, August 23, 2006 - link

    A lot of games have instantaneous minimums that are very low due to HDD accesses and such. Oblivion is a good example. Benchmarking also emphasizes minimum frame rates, as in regular play they occur less frequently. Basically, you run around an area for a longer period of time in actual gaming, as opposed to a 30-90 second benchmark. If there's a couple of seconds at the start of the level where frame rates are low due to the engine caching textures, that doesn't mean as much as continuous low frame rates.

    More information is useful, of course, but it's important to keep things in perspective. :)
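
    To make the average-versus-minimum distinction concrete, here is a minimal sketch (the frame times are invented; a single loading hitch barely moves the average but defines the minimum):

        # Illustrative frame-time trace in milliseconds: steady 60 fps with one loading hitch
        frame_times_ms = [16.7] * 300 + [200.0, 180.0] + [16.7] * 300

        total_s = sum(frame_times_ms) / 1000.0
        avg_fps = len(frame_times_ms) / total_s
        min_fps = 1000.0 / max(frame_times_ms)

        print(f"average: {avg_fps:.1f} fps, minimum: {min_fps:.1f} fps")
        # the average stays near 58 fps while the instantaneous minimum drops to 5 fps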
  • kmmatney - Wednesday, August 23, 2006 - link

    The charts show that the 7900 GT gets a huge boost from being factory overclocked. It would be nice to see if the X1900 XT 256MB can also be overclocked at all, or if there is any headroom.
  • JarredWalton - Wednesday, August 23, 2006 - link

    We used factory overclocked 7900 GT cards that are widely available. These are basically guaranteed overclocks for about $20 more. There are no factory overclocked ATI cards around, but realistically don't expect overclocking to get more than 5% more performance on ATI hardware.

    The X1900 XTX is clocked at 650 MHz, which isn't much higher than the 625 MHz of the XT cards. Given that ATI just released a lower power card but kept the clock speed at 650 MHz, it's pretty clear that their GPUs are close to topped out. The RAM might have a bit more headroom, but memory bandwidth already appears to be less of a concern, as the X1950 isn't tremendously faster than the X1900.
  • yyrkoon - Wednesday, August 23, 2006 - link

    I think it's obvious why ATI is selling their cards for less now, and the reason is that a lot of 'tech savvy' users are waiting for Direct3D 10 to be released and want to buy a capable card. This is probably to try and entice some people into buying technology that will be 'obsolete' when Direct3D 10 is released.

    Supposedly Vista will ship with DirectX 9L and DirectX 10 (Direct3D 10), but I've also read to the contrary -- that Direct3D 10 won't be released until sometime after Vista ships. Personally, I couldn't think of a better time to buy hardware, but a lot of people think that waiting, and just paying through the nose for a video card later, is going to save them money. *shrug*
  • Broken - Wednesday, August 23, 2006 - link

    In this review, the test bed was an Intel D975XBX (LGA-775). I thought this was an ATI Crossfire only board and could not run two Nvidia cards in SLI. Are there hacked drivers that allow this, and if so, is there any penalty? Also, I see that this board is dual 8x pci-e and not dual 16x... at high resolutions, could this be a limiting factor, or is that not for another year?

  • DerekWilson - Wednesday, August 23, 2006 - link

    Sorry about the confusion there. We actually used an nForce4 Intel x16 board for the NVIDIA SLI tests. Unfortunately, it is still not possible to run SLI on an Intel motherboard. Our test section has been updated with the appropriate information.

    Thanks for pointing this out.

    Derek Wilson
  • ElFenix - Wednesday, August 23, 2006 - link

    as we all should know by now, Nvidia's default driver quality setting is lower than ATi's, and matching ATI's quality through the driver settings makes a significant difference in framerate. your "The Test" page does not indicate that you changed the driver quality settings to match.
  • DerekWilson - Wednesday, August 23, 2006 - link

    Drivers were run with default quality settings.

    Default driver settings between ATI and NVIDIA are generally comparable from an image quality stand point unless shimmering or banding is noticed due to trilinear/anisotropic optimizations. None of the games we tested displayed any such issues during our testing.

    At the same time, during our Quad SLI followup we would like to include a series of tests run at the highest possible quality settings for both ATI and NVIDIA -- which would put ATI ahead of NVIDIA in terms of Anisotropic filtering or in chuck patch cases and NVIDIA ahead of ATI in terms of adaptive/transparency AA (which is actually degraded by their gamma correction).


    If you have any suggestions on different settings to compare, we are more than willing to run some tests and see what happens.

    Thanks,
    Derek Wilson
  • ElFenix - Wednesday, August 23, 2006 - link

    could you run each card with the quality slider turned all the way up, please? i believe that's the default setting for ATi, and the 'High Quality' setting for nvidia. someone correct me if i'm wrong.

    thanks!

    michael
  • yyrkoon - Wednesday, August 23, 2006 - link

    I think as long as all settings from both offerings are as close as possible per benchmark, there is no real gripe.

    Although some people seem to think it necessary to run AA at high resolutions (1600x1200+), I'm not one of them. It's very hard for me to notice jaggies even at 1440x900, especially when concentrating on the game instead of standing still and looking for jaggies with a magnifying glass . . .
  • mostlyprudent - Wednesday, August 23, 2006 - link

    When are we going to see a good number of Core 2 Duo motherboards that support Crossfire? The fact that AT is using an Intel made board rather than a "true enthusiast" board says something about the current state of Core 2 Duo motherboards.
  • DerekWilson - Wednesday, August 23, 2006 - link

    Intel's boards are actually very good. The only reason we haven't been using them in our tests (aside from a lack of SLI support) is that we have not been recommending Intel processors for the past couple years. Core 2 Duo makes Intel CPUs worth having, and you definitely won't go wrong with a good Intel motherboard.
  • nextsmallthing - Wednesday, August 23, 2006 - link

    Did anyone else notice that the specs for some of the NVIDIA cards are wrong? For example, the core clock of the 7900GTX is supposed to be 650 MHz, not 700 MHz, and the core clock of the 7900GT should be 450 MHz, not 470 MHz. Also, the pipeline configuration for the 7300GT (according to Wikipedia anyway) should be 8 pixel & 4 vertex.

    This many mistakes really makes me question the accuracy of other specs I read on Anandtech.

    (And by the way, would somebody please inform the DailyTech writers that it's "Xbox 360", not "XBOX 360". And yes I'm aware of the conventions that punctuation goes inside quotes and you shouldn't start sentences with "and".)
  • Anand Lal Shimpi - Wednesday, August 23, 2006 - link

    The 7900GTX/GT clock speeds that were listed were actually vertex clock speeds, not general core clock speeds, so they were technically correct (parts of the GPU do run at those frequencies) just not comparable to the other numbers. They have been corrected.

    The 7300GT is indeed 8 pipes, that was a copy/paste error. Thanks for the heads up.

    Take care,
    Anand
  • nextsmallthing - Thursday, August 24, 2006 - link

    Wow--prompt correction and courteous reply. I'm impressed, and my faith in Anandtech is restored!
  • Josh7289 - Wednesday, August 23, 2006 - link

    From the looks of the pricing structure for ATI's cards on the first page, and especially after they simplify their lineup, it looks like ATI is giving up on midrange cards from $100 - $200. The 7600 GT and the upcoming 7900 GS are alone in that price range (about $150 and $200, respectively), with no competition from ATI, so it seems they really are giving that price range to NVIDIA.

    Am I right with this or am I seriously missing something?
  • yyrkoon - Wednesday, August 23, 2006 - link

    There is an X1800 GTO2; the price last I looked was around $230. Of course, they released it rather quietly. Still, that's about $90 higher than the 7600 GT (or in my case the eVGA 7600 GT KO).
  • OrSin - Wednesday, August 23, 2006 - link

    I'm wondering the same thing. Are they going to stop making the X1800s entirely? They should be dropping into this price range nicely. Not sure how competitive they are with the 7900s, and now that the 7900 GS is coming out, the X1800 might be just too outclassed. (You guys just missed a great deal on Woot: a 7900 GS for $145.)

    I hope the X1800 is still being made, and I hope it drops to the $150-180 range to fill that gap.
  • JarredWalton - Wednesday, August 23, 2006 - link

    I think they've already stopped making all of the X1800 series, but there are still cards floating around.
  • Josh7289 - Wednesday, August 23, 2006 - link

    The X1900 GT is a card meant to compete with the stock 7900 GT, and as such is somewhere around the $200 - $250 price range.

    As for the X1950 Pro and X1650 XT, what are these supposed to compete against, and at what prices? More importantly, when are they supposed to launch?
  • coldpower27 - Wednesday, August 23, 2006 - link

    The X1650 XT is also in the works.
  • coldpower27 - Wednesday, August 23, 2006 - link

    The X1950 Pro is upcoming, and they still have the X1900 GT as well.
  • Ecmaster76 - Wednesday, August 23, 2006 - link

    Is it a GDDR3 or a DDR2 product?

    If the former, any chance it will CrossFire with the X1600 XT? Officially, I mean (methinks a BIOS flash might work, though the X1650 is maybe an 80nm part).
  • coldpower27 - Wednesday, August 23, 2006 - link

    No, I don't think that would work.

    The X1650 Pro has 600/1400 speeds, so it is 100% sure to be GDDR3; DDR2 doesn't exist at such high clock speeds.

  • Genx87 - Wednesday, August 23, 2006 - link

    Some of the other reviews had this X1950 XT beating the GX2 almost every time, sometimes by a wide margin.

    I still can't get over the power/transistor/die-size to performance advantage NVIDIA has over ATI right now.

  • PrinceGaz - Wednesday, August 23, 2006 - link

    Interesting. The first review I read was at HardOCP, where the X1950 XTX beat or equalled the 7950 GX2 every time; here the reverse is true. I think I'll have to read more reviews to decide what is going on (it certainly isn't CPU limitations). Maybe HardOCP's focus on optimum quality settings rather than raw framerate is the reason they favoured ATI, and another is the clear fact that when it came to minimum framerates instead of average framerates (HardOCP posted both for all tests) the X1950 XTX was especially good.

    In other words the 7950 GX2 posted great average numbers, but the X1950 XTX was playable at higher quality settings because its minimum framerate didn't drop so low. Hopefully some other sites also include minimum framerates along with graphs to clearly show how the cards perform.

    I remember a few years ago when AT's graphics card articles included image-quality comparisons and all sorts of other reports about how the cards compared in real-world situations. Now it seems all we get is a report on average framerate with a short comment that basically says "higher is better". Derek, I strongly suggest you look at how HardOCP tests cards and at the informative and useful comments that accompany each graph. There may only have been three cards in their comparison, but it gave a much better idea of how the cards compare to each other.

    Anyway, I'll not be getting any of these cards. My 6800 GT has plenty of performance for now, so I'll wait until Vista SP1 and the second generation of DX10 cards, which hopefully won't require a 1kW PSU :)
  • PrinceGaz - Wednesday, August 23, 2006 - link

    It seems the comments system here uses the brackets in HardOCP's abbreviation as some sort of marker. Apologies for making the rest of the text invisible, please amend my comment appropriately. I was talking about HardOCP by the way, when I said they use minimum framerates and optimum quality settings for each card.
  • JarredWalton - Wednesday, August 23, 2006 - link

    Don't use {H} in the comments, please. Just like {B} turns on bold, {H} turns on highlighting (white text).
  • JarredWalton - Wednesday, August 23, 2006 - link

    Ah, seems you figured that out already. ;) I need to see if we can disable that feature....
  • haris - Wednesday, August 23, 2006 - link

    Actually, if you look at all of the reviews a bit more closely, the scores across the sites depend on which processor is being used for the test. It appears that NVIDIA cards tend to run better on Conroes (which probably just means the games are slightly less CPU-bottlenecked at the resolutions being tested) while ATI tends to run better on AMD systems (or when the CPU is slowing things down). Of course, that is IIRC from the 5 reviews I skimmed through today.
  • coldpower27 - Wednesday, August 23, 2006 - link

    No, just no. The X1950 XTX alone is not more powerful than the 7950 GX2. Only in ATI-favourable scenarios, or where SLI flat out doesn't work, will that occur.

  • UNESC0 - Wednesday, August 23, 2006 - link

    quote:

    You get the same performance, same features and better flexibility with the CrossFire card so why not?


    you might want to run dual monitors...
  • Vigile - Wednesday, August 23, 2006 - link

    My thought exactly on this one Anand...
  • Anand Lal Shimpi - Wednesday, August 23, 2006 - link

    You can run dual monitors with a CrossFire card as well, the CrossFire dongle that comes with the card has your 2nd DVI output on it :)

    Take care,
    Anand
  • kneecap - Wednesday, August 23, 2006 - link

    What about VIVO? The Crossfire Edition does not support that.
  • JarredWalton - Wednesday, August 23, 2006 - link

    For high-end video out, the DVI port is generally more useful anyway. It's also required if you want to hook up to a display using HDCP - I think that will work with a DVI-to-HDMI adapter, but maybe not? S-Video and composite out are becoming seldom-used items in my experience, though the loss of component out is a bit more of a concern.
  • JNo - Thursday, August 24, 2006 - link

    So if I use DVI out and attach a DVI to HDMI adaptor before attaching to a projector or HDTV, will I get a properly encrypted signal to fully display future blu-ray/hd-dvd encrypted content?

    The loss of component is a bit of a concern, as many HDTVs and projectors still produce amazing images over component, and in fact I gather that some very high resolution and refresh rate combinations are possible over component but not DVI due to certain bandwidth limitations with DVI. But please correct me if I am wrong. I take Anandtech's point on the CrossFire card offering more, but with a couple of admittedly small question marks, I see no reason not to get the standard card now and the CrossFire edition later as the second card if I decided to go that route...
  • JarredWalton - Thursday, August 24, 2006 - link

    I suppose theoretically component could run higher resolutions than DVI, with dual-link being required for 2048x1536 and higher. Not sure what displays support such resolutions with component inputs, though. Even 1080p can run off of single-link DVI.
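
    As a rough sketch of why dual-link comes into play (the 165 MHz single-link limit and the resolutions are from the discussion above; the ~20% blanking overhead is an assumption to keep the arithmetic simple):

        # Single-link DVI tops out at a 165 MHz pixel clock
        SINGLE_LINK_MHZ = 165.0

        def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.20):
            """Approximate pixel clock, assuming ~20% of the raster is blanking."""
            return width * height * refresh_hz * blanking_overhead / 1e6

        for mode in [(1920, 1080, 60), (2048, 1536, 60)]:
            clk = pixel_clock_mhz(*mode)
            link = "single-link OK" if clk <= SINGLE_LINK_MHZ else "needs dual-link"
            print(f"{mode[0]}x{mode[1]}@{mode[2]}: ~{clk:.0f} MHz -> {link}")

        # with these assumptions, 1080p60 lands near 149 MHz (fits single-link)
        # while 2048x1536@60 lands near 226 MHz (exceeds it, so dual-link is needed)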

    I think the idea with CF cards over standard is that they will have a higher resale value if you want to get rid of them in the future, and they are also more versatile -- TV out capability being the one exception. There are going to be a lot of people that get systems with a standard X1950 card, so if they want to upgrade to CrossFire in the future they will need to buy the CrossFire edition. We all know that at some point ATI is no longer going to make any of the R5xx cards, so if people wait to upgrade to CrossFire they might be forced to look for used cards in a year or two.

    Obviously, this whole scenario falls apart if street prices on CrossFire edition cards end up being higher than the regular cards. Given the supply/demand economics involved, that wouldn't be too surprising, but of course we won't know for another three or four weeks.
  • UNESC0 - Wednesday, August 23, 2006 - link

    thanks for clearing that up Anand, news to me!
  • TigerFlash - Wednesday, August 23, 2006 - link

    I was wondering if anyone thinks it's wise to get an Intel Core 2 Duo motherboard with CrossFire support now that AMD is buying out ATI. Do you think ATI would stop supporting Intel motherboards?
  • johnsonx - Wednesday, August 23, 2006 - link

    quote:

    Do you think ATI would stop supporting Intel motherboards?


    Of course not. AMD/ATI isn't stupid. Even if their cross-licensing agreement with Intel didn't prevent them from blocking Crossfire on Intel boards (which it almost surely does), cutting out that part of the market would be foolish.
  • dderidex - Wednesday, August 23, 2006 - link

    What's with the $99 -> $249 gap?

    Weren't we supposed to see an X1650XT, too? Based on RV570? ...or RV560? Something?
  • TigerFlash - Wednesday, August 23, 2006 - link

    I suppose I worded that the opposite way. Do you think Intel will stop supporting Crossfire cards?
  • michal1980 - Wednesday, August 23, 2006 - link

    Can we not even get any numbers for cards below the 7900 GTX?

    I understand you're limited, but how about some numbers from cards below that, to see what an upgrade would do?

    I know we can kind of take results from old reviews of the cards, but your test bed has changed since Core 2, so it's not a fair head-to-head comparison of old numbers against new.

    It would be nice to see if there's a point (wise or not) to upgrading from a 7800 GT or that generation of cards, or something slower, like a 7900 GT.

    But it seems like every 'new gen' card test just drops off the 'older' cards.
  • michal1980 - Wednesday, August 23, 2006 - link

    What I meant is that in the tables, where all the new cards are, it would be nice to have some numbers for the old cards.
  • Lifted - Wednesday, August 23, 2006 - link

    Agreed. I'm still running a 6800GT and have not seen much of a reason to upgrade with the current software I run. Perhaps if I saw that newer games are 3x faster I might consider an upgrade, so how about it?
