Zoomer - Thursday, October 19, 2006 - link
Is this an optical shrink to 80nm? Answering this question will put overclocking expectations in line. Generally, optically shrunk cores from TSMC overclock to about the same level as the original, or perhaps slightly worse.
coldpower27 - Friday, October 20, 2006 - link
Well, no, as this pipeline configuration didn't exist natively before on the 90nm node. It's a 3-quad part, so it's based on R580 but has one quad physically removed, as well as being shrunk to 80nm. Not to mention that native CrossFire support was added onto the die.
Spoelie - Friday, October 20, 2006 - link
Optical shrink; this is 80nm and the original was 90nm. You're normally correct, because the first optical shrink usually does not have the same technologies as the process higher up (low-k and SOI, for example; this was the case with 130nm -> 110nm), but I don't think that's the case for this generation. Regardless, I haven't seen any overclocking articles on it yet, so I'm quite curious.
Spoelie - Friday, October 20, 2006 - link
Oie, maybe I should add that it's reworked as well, so both, actually. Since this core didn't exist before (RV570 and that pipeline configuration), I don't think they just sliced a part off the core...
Zstream - Tuesday, October 17, 2006 - link
Beyond3D reported the spec change a month before anyone received the card. I think you need to do some fact checking on your opinions, mate. All in all, a decent review, but poor, unknowledgeable opinions…
DerekWilson - Wednesday, October 18, 2006 - link
Just because ATI made the spec change public does not mean it is alright to change the specs of a product that has been shipping for four months. The X1900 GT has been available since May 9 as a 575/1200 part.
The message we want to send isn't that ATI is trying to hide something; it's that they shouldn't do this in the first place.
No matter how many times a company says it changed the specs of a product, when people search for reviews they're going to see plenty that have been written since May talking about the original X1900 GT.
Naming is already ambiguous enough. I stand by my opinion that having multiple versions of a product with the exact same name is a bad thing.
I'm sorry if I wasn't clear on this in the article. Please let me know if there's anything I can reword to help get my point across.
Zoomer - Thursday, October 19, 2006 - link
This is very common. Many vendors in the past have passed off 8500s that run at 250/250 instead of the stock 275/275 and don't label them as such. There are some Asus SKUs with this same handicap, but I can't recall which models they were.
xsilver - Tuesday, October 17, 2006 - link
Any word on what the new price for the X1900 GTs will be now that the X1950 Pros are out? Or are they being phased out with no price drop being considered?
Wellsoul2 - Monday, November 6, 2006 - link
You guys are such cheerleaders. For a single-card buy, why would you get this? Why would you buy the 1900 GT even after the 1900 XT 256MB came out? I got my 1900 XT 256MB for $240 shipped. Except for power consumption, it's a much better card. You get to run Oblivion great with one card.
Two cards is such a scam: a more expensive motherboard, higher power consumption, etc. This is progress? CPUs have evolved. It's hard to even find a motherboard with 3 PCI slots. What a scam! Where's my ultra-fast HDTV board for PCI Express?
Seriously, why buy into SLI/CrossFire? Why not 2 GPUs on one card? Too late, you all bought into it. Sorry, I am just so sick of the praise for this money grab of SLI/CrossFire.
jcromano - Tuesday, October 17, 2006 - link
Are the power consumption numbers (98W idle, 181W load) for just the graphics card, or are they total system power?
Thanks in advance,
Jim
Spoelie - Tuesday, October 17, 2006 - link
They're total system power; we're not going to see 180W GPUs till November :)
Aikouka - Tuesday, October 17, 2006 - link
Those cards are going to be so hot.
Sorry, that was my one and only bad pun of the day :P.
Aikouka - Tuesday, October 17, 2006 - link
First off, it was an interesting article, and it's nice to see that ATi is making changes to "fix" the poor dual-card implementation :).
Now, akin to my subject line, I was wondering if you [AnandTech] would be keen on adding a "Performance / Price" sort of chart at the end. The idea would be to keep different quality modes separate (i.e. no-AA/no-AF and #x-AA/#x-AF would need separate charts, or one chart using a single set of data) while ordering cards by their average FPS or "average ranking". That is, if Video Card A is seen in spot #1 the most, it stays there in the chart regardless of the figure listed. I'd say this is more of a user-friendly idea than anything required to be practical. Here's what I think a typical bar would represent:
[ Video_Card_Name - (Average_FPS/Price) - Average_FPS ]
The idea is that some video cards may be return the same value (i.e. 120/$400 and 60/$200), but listing cards in their typical performance standing allows someone to say "well, hey... I like the performance ratio of this card, and I don't need 120 FPS!"
I think it may provide a way for people who read the benchmarks to get a real world idea of these cards rather than an "in box" idea, because as nice as it is to see a card produce 120 FPS... how much will we have to produce to purchase it ;).
Just something I thought up while looking at pages and pages of charts and not knowing how "worthwhile" the cards really were for the performance.
P.S. I'd love to hear comments!
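The chart proposed above is easy to mock up: order cards by average FPS, but print the FPS-per-dollar ratio alongside. This is a minimal illustrative sketch; the card names, prices, and FPS figures below are made up, not from the review's data.

```python
# Sketch of the proposed "Performance / Price" bar:
#   [ Video_Card_Name - (Average_FPS/Price) - Average_FPS ]
# All figures are hypothetical, for illustration only.
cards = [
    ("Card A", 400, 120.0),  # (name, price in $, average FPS)
    ("Card B", 200, 60.0),
    ("Card C", 250, 80.0),
]

# Order by average FPS (the card's "typical performance standing"),
# not by the ratio itself, so the list keeps its familiar ranking.
for name, price, fps in sorted(cards, key=lambda c: c[2], reverse=True):
    print(f"{name} - {fps / price:.2f} FPS/$ - {fps:.0f} FPS")
# First line printed: "Card A - 0.30 FPS/$ - 120 FPS"
```

Note that Card A and Card B both land at 0.30 FPS/$, which is exactly the situation the comment describes: the same ratio at two very different absolute performance levels, letting a reader pick the cheaper card if 60 FPS is enough.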
Spoelie - Tuesday, October 17, 2006 - link
I don't think it'll be very practical for most people. When a graphics card purchase is being considered, most people just have a hard cash target in mind (for example, $200) and simply look for the fastest card at that price point. The only exception is when those people notice that a small rise in their budget allows a much more powerful purchase, e.g. a 6600 GT for $100 but a 7600 GT for $130.
So as long as AnandTech keeps comparing at price points, and of course mentioning any caveats, feature differences, or possible better deals higher up, there's no real need for such a chart.
but that's just my opinion
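The budget-first rule described in this comment, take the fastest card at your price point, but watch for a big jump just above it, can be sketched as a tiny filter. The prices and FPS figures here are hypothetical placeholders, not benchmark results, and the 15% "stretch" threshold is an arbitrary assumption:

```python
# Budget-first card selection: fastest card at or below the budget,
# plus an optional flag for a faster card within ~15% over budget.
# All card figures below are made up for illustration.
cards = [("6600GT", 100, 40.0), ("7600GT", 130, 60.0), ("7900GS", 200, 85.0)]

def pick(cards, budget, stretch=1.15):
    affordable = [c for c in cards if c[1] <= budget]
    best = max(affordable, key=lambda c: c[2])  # fastest within budget
    # Anything pricier than `best` but within the stretched budget?
    reachable = [c for c in cards if best[1] < c[1] <= budget * stretch]
    upgrade = max(reachable, key=lambda c: c[2], default=None)
    return best, upgrade

best, upgrade = pick(cards, budget=115)
# With a $115 budget the 6600GT wins outright, but the 7600GT
# sits just within reach and gets flagged as the possible better deal.
```

This mirrors the comment's example: at a hard $115 target you'd buy the 6600GT, but a small budget rise surfaces the much faster 7600GT.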
Aikouka - Tuesday, October 17, 2006 - link
That's really the view of someone who's looking to spend X amount of dollars trying to find the best item for that price, which is one thing that's kept in focus. My proposed chart method really helps to show two things: what's the best "bang for your buck," and whether that performance pricing is at the level of performance that you want.
To go into a bit more detail, if the best performance ratio was 0.3 (120fps/$400 or 60fps/$200, etc.), someone might go, "Well, hey... I like the performance rating on these two cards, but I don't need something that fast." So, the user would shoot for the 60fps card.
But in your case, you most likely benefit from what's currently done. For me, I could really care less how much a new video card costs me. I got tired of spending smaller amounts on mediocre performance years ago and I've tried to keep my dollars spent in the high-end sector (the range of high-end cards, not necessarily referring to the ultra-high-end-uber cards only) as much as possible to avoid having to perform constant upgrades to keep that level of detail that I like (I can't stand "jaggies" ... I'm just too anal about things like that).
For example, my current 6800GT was the 2nd top performing graphics card when I bought it, but it simply doesn't suit my needs anymore as below 30FPS (in more render-heavy areas) in WoW at 1280x1024 with high graphics settings is just subpar. Now, my card is old and about to be replaced by another generation (the G80). Or maybe I just need one of them Killer NICs (just kidding :P).
I think it's worth mentioning that the bar system I described may be better suited to a roundup than to a single-card review. A lot of people come into a single-card review already knowing all the information about the other cards, so they're only performing minor mental comparisons. But in a roundup there's a lot more information to keep in check, so an overall comparison that takes a couple of a card's specifics into account (mainly average performance and price) could be beneficial.
Thanks for the comments!
Aikouka - Tuesday, October 17, 2006 - link
"The idea is that some video cards may be return the same value"
should be...
"The idea is that some video cards may return the same value"
Stupid me, deleting and rewriting so much that I left in a word from a prior revision XD. Oh yeah, and I had to make sure to post this fix before someone took it upon themselves to ignore the entire post and comment only on my little accident :P.
bupkus - Tuesday, October 17, 2006 - link
With other CrossFire configurations as a guide, we can easily expect X1950 Pro to nearly double its single card performance and put it on par with the 7950 GX2 and 7900 GT SLI configurations. As for single card performance, we see the trend of X1950 Pro domination continuing. Performance greater than that of the 7900 GS and GT for $200 is quite a plus.
takumsawsherman - Tuesday, October 17, 2006 - link
Now all ATI has to do is get rid of that Catalyst Control Center. I have been using a Radeon 9700 Pro for two years now, and I have been almost perfectly happy with it except for two problems. It sometimes loses TV mode with no explanation, and I have to remove and re-add the TV in CCC. The other is that CCC takes anywhere from 10-20 seconds to load on an Athlon XP 2500+. The old control panel opened within two or three seconds (the advanced screens). I've set up a quick profile, but even right-clicking the taskbar icon has a huge delay (often ten seconds or more) before the menu appears, and then sometimes activating the profile doesn't work and I have to set it manually (sometimes it does work). After initial launch it does open more quickly (cached, I assume). I assume this is because of the use of .NET.
Also, ZoneAlarm reports that CLI.EXE (listed as various ATI apps) is listening on TCP ports 1052, 1057, and 1058. There are currently three CLI.exe processes running on my system, taking 6,132K, 5,800K, and 3,572K of memory respectively. That is about 15MB of memory, and I don't have CCC open. My old Matrox G400 Max had tons of options in the advanced screens, had better TV output quality, and didn't require any of this garbage. The old ATI control panel even seemed easier to use.
I would love to get a nice new X1950, as Oblivion gets pretty choppy even at fairly low settings. But I really want to get away from these bulky system tray apps. And look at the update process here: https://support.ati.com/ics/support/default.asp?de...
That is ridiculous. Can they not make an installer that removes their own cruft before installing the new version? It's a chore to put a new driver in, and it shouldn't be.
Zoomer - Thursday, October 19, 2006 - link
I think this is partly Microsoft's fault; the .NET Framework is a POS. ATi should shoulder some of the blame too, for choosing a crappy base to program on. I would think even Java-based apps are not this bad.
mesyn191 - Friday, October 20, 2006 - link
I dunno... I personally don't like the whole CCC/CP approach to configuring the graphics card either, but nV's implementation ain't half bad.
Spoelie - Tuesday, October 17, 2006 - link
It might be a good idea to use Omega's drivers; they do not include Catalyst Control Center but instead use ATI Tray Tools OR the old control panel, slightly updated. The only downside is that Omega's are sometimes one or two releases behind the official ones.
If you're not comfortable with Omega's drivers (even though they're rock solid :)), you can always download just the driver from ATI and install ATI Tray Tools separately. It includes every option you need to change driver settings, but is a sleek, minimalist, fast 1MB tool :)
JarredWalton - Tuesday, October 17, 2006 - link
Unfortunately, CCC is required to enable CrossFire. I don't know if Omega gets around this requirement somehow, but the standard ATI control panel drivers do not have the CrossFire checkbox anywhere.
Aikouka - Tuesday, October 17, 2006 - link
The awkward drivers are actually the main reason I still steer clear of ATi. Also, I get a bit annoyed at the company, as they only seem to care about their graphics sector and ignore all of their other products. My ATi TV Wonder Pro Remote Control Edition had so many problems over the years that it was barely worth owning. The remote control software still crashes randomly.
Although, I have yet to try the newest version of the software, because I removed the card from my system and it won't let you install the main software without it.
So... with my experience, it leaves me a bit wary.
But I do also have to admit how much I also don't like the newer nVidia control panel, but at least I can go back to the original one with one mouse click :).
DerekWilson - Tuesday, October 17, 2006 - link
Right on.
Zaitsev - Tuesday, October 17, 2006 - link
Typo on page 2, third paragraph:
"It is hard enough for us to sort things out when parts hit the selves at different speeds..."
RamarC - Tuesday, October 17, 2006 - link
Suggestion: replace Q4 and B&W2 with Prey and Company of Heroes.
DerekWilson - Tuesday, October 17, 2006 - link
We are planning on doing exactly that starting in early November.
spe1491 - Tuesday, October 17, 2006 - link
Possible typo?-
Basilisk - Tuesday, October 17, 2006 - link
Further clue: try "heartily"; "hardily" means "ruggedly", etc.
Spoelie - Tuesday, October 17, 2006 - link
After browsing through some other reviews, all of which seem to use the Catalyst 6.9 drivers, it occurred to me that they all show significantly lower performance for the ATi camp than what AnandTech is reporting.
Most reviews place 7900 GS performance well above that of the X1950 Pro in Quake 4. Can anyone explain to me why that is, and why the supposed OpenGL/Doom 3 optimisations are only being seen by AT and not by sites such as bit-tech, HardOCP, The Tech Report, FiringSquad, etc.?
DerekWilson - Tuesday, October 17, 2006 - link
First of all, every site uses their own benchmarking techniques and sequences in the games. Numbers between review sites won't be comparable.
For Quake 4 we used Ultra Quality mode, and this seems to give ATI the advantage over NVIDIA. We don't have a problem with this because we would prefer to tip the scales in favor of the product that can deliver the best performance at the highest image quality.
munky - Tuesday, October 17, 2006 - link
Would you rather ATi continued to ship the X1900 GT with the original specs, and then a bunch of the cards would have to be RMA'd?
DerekWilson - Tuesday, October 17, 2006 - link
We would rather they just run out of X1900 GT cards. They're discontinuing the line anyway, so it seems a little strange to try to increase supply by underhanded means.
sri2000 - Tuesday, October 17, 2006 - link
They should ship it under a different model number. Call it the X1900 GTA or something like that (or some other alphabet-soup combo that's not already taken) so people can tell that a different model number means different performance.
Goty - Tuesday, October 17, 2006 - link
You guys are ragging on this CF implementation like it's some sub-par solution. The transfer speed may be lower than that used by NVIDIA's SLI bridge, but SLI is simplex while this implementation is full duplex. Being able to send data in both directions at the same time should provide a huge speed boost while using ATi's SuperAA modes.
JarredWalton - Tuesday, October 17, 2006 - link
Scalability is the key factor. In most benchmarks, SLI gets more of an improvement than CrossFire, indicating that the compositing engine is not an optimal multi-GPU solution. There's almost certainly a decent amount of overhead involved. We do like the new CF connector, but the proof is in the pudding. If the 7900 GS is clearly slower in single-card configs but often faster in dual-GPU configs, clearly SLI is scaling better than CF.
mesyn191 - Friday, October 20, 2006 - link
I don't think it's possible to comment on the new CF at all; they've clearly got screwed-up drivers for it at the moment, but then it's ATi, so what else is new... I hope AMD cleans up their driver team, because even after all these years ATi still does a half-assed job on its drivers.
Goty - Tuesday, October 17, 2006 - link
Are you guys thinking of doing any testing with either vendor's multi-card AA modes any time soon? I really think the full duplex connection would help there (i.e. the cards may not scale as well with the number of cards, but what about as the image quality increases?)
Rza79 - Tuesday, October 17, 2006 - link
The Tech Report had problems with this motherboard and CrossFire, which made them switch to the Asus P5W DH. You aren't experiencing any problems with this board?
Second thing: why no AA with games like B&W2 and FEAR?
DerekWilson - Tuesday, October 17, 2006 - link
No problems with the motherboard.
AA performance under Black & White 2 and FEAR was excluded because we decided the framerate was already at a minimum for the resolution we were testing.
Spoelie - Tuesday, October 17, 2006 - link
Bit disappointed; I was hoping for 600/700 clocks. I'm curious about the temperatures under load and whether it would easily overclock to at least those speeds. And what about HDCP? But I guess we'll have to wait for retail cards.
If the price is €200 or less, I just might get one to replace my X800 XT :)
Spoelie - Tuesday, October 17, 2006 - link
Apparently, PowerColor clocks all its X1950 Pro cards up to 600/700 and has a 512MB SKU. Plus silent cooling :)
http://www.powercolor.com/global/main_product_seri...
No word on HDCP and price, though :/
DerekWilson - Tuesday, October 17, 2006 - link
HDCP support is optional for vendors, but it seems like ATI is heavily encouraging them to include HDCP on all 1950 PRO cards. Since it's not guaranteed, be sure to check the specifications before you purchase.
The PowerColor 1950 PRO is not passively cooled, but it includes a low-dB fan. It does look like an interesting product, and we intend to acquire one for further investigation.
Goty - Tuesday, October 17, 2006 - link
Go read the review over at bit-tech. They've got prices up, and the Sapphire card they reviewed has HDCP.
MadBadger - Tuesday, October 17, 2006 - link
Thanks for the review :beer:
An observation:
- The price finder at the top of the article seems a bit out of whack. It shows as X1950 512MB (PCI), but it links to the 1950 Pro 256MB for Amazon and to the X1950 XT for the others.