60 Comments
bluebob950 - Tuesday, June 27, 2006 - link
thinking about getting a 7950 but i can't find any monitors that do 2048. my boss says the dell 21" at work does, but i have yet to see it. what do you use? and where can i find them online?

bluebob950 - Tuesday, June 27, 2006 - link
JNo - Wednesday, June 7, 2006 - link
I think the review at Hexus.net bears a very interesting negative conclusion, somewhat in contrast to anandtech (not that I'm dissing anandtech), and provides food for sobering thought...

http://lifestyle.hexus.net/content/item.php?item=5...
Think seriously about what's been presented to you in this article thus far, concerning GeForce 7950 GX2, and you should (if I can do my job properly) come to a conclusion like this: NVIDIA GeForce 7950 GX2 is probably the most caveat-laden graphics purchase yet released.
The conditions that have to be satisfied before it makes sense to get one are pretty much as follows:
* Are you willing to live with less-than-absolute best image quality from the high-end generation right now?
* Are you sure the games you play all have SLI support, or you're at least happy to wait for support to come in a future driver?
* Do you have a PC platform that supports it properly, which is realistically just nForce SLI of some flavour?
* Do you have a very well ventilated PC chassis, able to assist in the significant cooling challenges it presents?
* Do you own a high resolution PC display, since it's built for at least 1600x1200 in current supported games?
* If you run dual displays, are you happy for one to go blank when in multi-GPU mode?
Be sure and really consider the first two questions, and ponder the fact that ATI Crossfire is arguably even worse at satisfying the second, given its lack of a user-adjustable game profiling system. Done so? Good.
Now given yes to all of the above, are you then willing to spend £450 on a single graphics board? You are? Brilliant, they're available today, you'll enjoy the framerates and overall IQ, happy shopping!
Tephlon - Wednesday, June 7, 2006 - link
I just wanted to point out... that list isn't just particular to this card. That's all SLI setups.

All have to watch the heat. Drivers don't allow any SLI setup to do dual-view + multi-GPU. SLI dominates primarily in hi-res configs. And so on and so on.

It's still a great list to consider... but please remember that it isn't limited to just the 7950GX2.
DerekWilson - Wednesday, June 7, 2006 - link
except the 3rd point doesn't apply at all to the 7950 GX2 --

The 7950 has a bunch of advantages over a 7900 GT SLI setup, not the least of which is performance: the lenient platform requirements, power draw less than an X1900 XT, potential for expansion to quad SLI, and the fact that it's really very transparently plug and play (just plug it in, install drivers, and no more tweaking is required for it to work as expected).

It seems there has been a lot of confusion around the support required for this card. It will run on Intel, ATI, VIA, SiS, and NVIDIA chipsets as long as the BIOS supports non-video devices in x16 PCIe slots along with supporting add-in PCIe switches. Surprisingly, many boards support these features as of this week.
Jeff7181 - Tuesday, June 6, 2006 - link
On page three in the chart it says the 7900GTX core clock is 700 MHz (650 for vertex core). Is that accurate? So what I'm seeing and adjusting in the driver settings with coolbits is actually the speed of the vertex core?

DerekWilson - Tuesday, June 6, 2006 - link
you know, I borrowed that chart from NVIDIA's reviewer's guide on the 7950 GX2 ...

I am under the impression that the vertex core of the 7900 GTX is 700MHz. As this is NVIDIA's chart, I'm not sure if vertex clock is listed first or second ...
Jeff7181 - Tuesday, June 6, 2006 - link
Based on the other values in the chart, and the row/column titles, I assumed the number given was supposed to be the core clock, and the one in parentheses was supposed to be the vertex clock. I guess it could be a typo... otherwise I want a new 7900GTX, since the core speed of mine is only 650 MHz and it should be 700 according to that chart. :D

v12 - Tuesday, June 6, 2006 - link
Could you post comparative scores of the X1900XTX as well? Does it beat the 7950 or not? I can't tell from this review.

V12
AnnonymousCoward - Tuesday, June 6, 2006 - link
Thanks for the sweet article. 2 GPUs on a non-SLI board is the way to go... until dual-core GPUs. When are those coming out, anyway?

DerekWilson - Monday, June 5, 2006 - link
Hello all,

Just want to inform everyone that the article is now as it was intended to be. As has been mentioned before, we had some server trouble this morning which distracted me from getting everything posted up quite correctly.
Here's a short list of things added since the article went live --
1) Idle and Load power and power commentary
2) Analysis on each benchmark page for each resolution
3) A corrected typo w.r.t. power draw between the 7950 GX2 and X1900 XT
Sorry for the oversight, but all should be in order now. Please let me know if anything is out of the ordinary.
wilki24 - Tuesday, June 6, 2006 - link
Any chance to include an Oblivion page in the review? I'm thinking of buying one of these beasts, but I'd really like to see that first.
Thanks!
Jeff7181 - Tuesday, June 6, 2006 - link
Based on the rest of the results, looks like it should be just ahead of a couple 7900GT's in SLI... so... figure 5-10% better performance than a pair of 7900GT's.

Regs - Monday, June 5, 2006 - link
I play HL2 EP One at 1600x1200 4x/4x Max and it's perfectly playable with an overclocked 7800GT and a 2.4 GHz AMD. Though with other games at that setting it's a slideshow... so I really have no point. But hey, I posted!

JarredWalton - Monday, June 5, 2006 - link
We show 64.1 FPS at that setting, so yes, the game is imminently playable at 16x12 4x/8x with anything 7800GT/X1800 level or higher.

JarredWalton - Monday, June 5, 2006 - link
"eminently playable" as well. ;)

DerekWilson - Monday, June 5, 2006 - link
let's have a hand for our editor, folks :-)

Fenixgoon - Monday, June 5, 2006 - link
Where's the ATI X1900 XT Crossfire? That would definitely make a more complete benchmark (people buying 7900GT SLI will probably also look at X1900 or X1800 Crossfire).

JarredWalton - Monday, June 5, 2006 - link
Considering this is a $600 single PCIe slot (double-wide) solution, it's fair to compare it to single GPUs. The 7900 GT SLI is thrown in for reference, and you can see how other multi-GPU solutions stack up in other articles. X1900 XT CF (and 7900 GTX SLI) will certainly be faster, but both will also cost at least 50% more. If nothing else, the PCIe switch is an interesting development.

MacGuffin - Monday, June 5, 2006 - link
That point is valid. This is available from XFX for $599 at NewEgg as we speak (in stock). X1900XT Crossfire is at least $400+$450=$850+. But still, I'm sure it would make for a nice shootout (7900GTX SLI, X1900XT CF and this lone warrior).

Quick question: does the 7950GX2 require games to have SLI profiles for it to utilize both GPUs?
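As a quick sanity check on the price math quoted above (2006 street prices taken straight from the comment, illustrative only), the Crossfire pair carries roughly a 42% premium over the GX2 at those minimum figures:

```python
# Prices quoted in the thread (June 2006, USD) -- illustrative only
gx2 = 599              # XFX 7950 GX2 at NewEgg
crossfire = 400 + 450  # X1900 XT Crossfire: master card + second card

premium = crossfire / gx2 - 1
print(f"Crossfire pair costs {premium:.0%} more")  # -> about 42% more
```

At the "$850+" floor the premium is a bit under 50%; with typical street pricing above the floor it clears it.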
JarredWalton - Monday, June 5, 2006 - link
Yes, SLI profiles are used for full utilization of the GX2 card. (AFAIK - Derek can correct me if I'm wrong.)

DerekWilson - Monday, June 5, 2006 - link
SLI profiles are used if available, but SLI profiles are never required to enable multi-GPU support on NVIDIA hardware.

There are some advanced options for enabling multi-GPU or single-GPU rendering in the control panel -- even down to the AFR or SFR mode type (and SLIAA modes as a fallback if nothing else will work for you).
in short -- required: no, used: yes.
araczynski - Monday, June 5, 2006 - link
haven't read the article yet as I didn't see reference to Oblivion benchmarks, and let's be honest, that's the only game out these days that's worth benchmarking (in terms of actually giving the high-end cards an actual workout).

DigitalFreak - Monday, June 5, 2006 - link
It's amazing all the cool stuff you can do with PCI Express.

Sniderhouse - Monday, June 5, 2006 - link
quote: Not since Quantum3D introduced the Obsidian X24 have we seen such a beast (which, interestingly enough, did actual Scan Line Interleaving on a single card).

The Voodoo5 5500 had two GPUs on a single card which did true SLI, not to mention the Voodoo5 6000, which had four GPUs but never really made it to market.
shabby - Tuesday, June 6, 2006 - link
The X24 was also a dual-PCB video card, that's what he meant. Not dual chip or whatever.

timmiser - Monday, June 5, 2006 - link
Exactly what I was thinking!

DerekWilson - Monday, June 5, 2006 - link
Perhaps I should have said successful products ... or products that were available in any real quantity :-)

photoguy99 - Monday, June 5, 2006 - link
From page 1, what limitations are being referred to?

Ryan Smith - Monday, June 5, 2006 - link
DX9 itself has a good deal of overhead in some situations, something Microsoft is changing for DX10. We'll have more on that in our upcoming Vista article later this week.

dug777 - Monday, June 5, 2006 - link
? Seems daft not to include CF in that kinda comparison...

Mclendo06 - Monday, June 5, 2006 - link
One of the concerns I have had regarding this card is the noise level relative to nVidia's other top offering, the 7900GTX. I have seen all over the net that nVidia's large 2-slot heatsink is extremely quiet compared to most others. As the 7950 GX2 has 2 single slot type coolers, I am assuming that the noise is going to be considerably higher, but I haven't seen anything regarding noise levels in any of the reviews I've read so far. Could you weigh in on this? (Just subjective impressions -- I'm not looking for dBA measurements or anything.)

Mclendo06 - Monday, June 5, 2006 - link
Great minds think alike. If you answer VooDooAddict's post, feel free to ignore mine.

VooDooAddict - Monday, June 5, 2006 - link
:)DerekWilson - Monday, June 5, 2006 - link
as for noise, from a subjective perspective, the 7950 GX2 is on a similar level to the 7900 GTX. Both are much much much quieter than an ATI X1900 XT spinning up to full speed.

VooDooAddict - Tuesday, June 6, 2006 - link
Thanks for the update.

VooDooAddict - Monday, June 5, 2006 - link
I see that it's mentioned in the article that the 7950GX2 draws less power than the 7900GTX:

"For those who live on the bleeding edge, this lower power alternative to the 7900 GTX is a solid way to go."

Do you have any numbers to support this? Is that only in comparison to 7900GTX SLI, or does it truly consume less power than a single 7900GTX?

Another thing not mentioned was the noise level or the heat produced. I understand that you can't get numbers for everything ... but things like noise level and heat could be commented on subjectively.

If looking at "high end performance parts," but primarily concerned with power consumption, heat produced, and noise levels ... it's my understanding that the 7900GT and 7900GTX are preferred over the ATI solutions.
DerekWilson - Monday, June 5, 2006 - link
sorry -- there was a twofold problem here.

With the server issues this morning, the power section was accidentally left out at publication.

Unfortunately, I also mentioned the wrong card in the conclusion -- my original references to the 7900 GTX should have been to the X1900 XT. This has been corrected. Sorry for the confusion.
VooDooAddict - Tuesday, June 6, 2006 - link
Thanks for the clarification.

Jojo7 - Monday, June 5, 2006 - link
Derek, tell us how you really feel about content protection. Hah.

Good article. I wasn't aware how elegant this solution was until now. It works regardless of the chipset. Impressive.
Exsomnis - Monday, June 5, 2006 - link
Since when did slapping two PCBs together = single card? *Confused.*

z3R0C00L - Monday, June 5, 2006 - link
Marketing gimmick... It's two GPUs. It's SLI. The fastest single VPU/GPU solution is the X1900XTX (not tested here).

The most advanced GPU/VPU is the X1900XTX as well.

I wonder if these cards will also suffer from the 50% failure rate other 7900 series cards suffer from.
Jojo7 - Monday, June 5, 2006 - link
Haha. 50% failure rate. That's comedy.

Where'd you pull that number from?
HardOCP said BFG reported 3-5%, eVGA reported 0.04-1.9%, and XFX said in the last 2 weeks they saw a 0.5% (half of one percent) increase in RMAs.
Yea. That seems like 50% to me.
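For what it's worth, the vendor figures quoted above can be lined up against the 50% claim directly (rates as reported in this thread, converted to fractions; the numbers are the commenters', not independently verified):

```python
# Vendor-reported 7900-series RMA/failure figures from this thread (as fractions)
reported = {
    "BFG":  (0.03, 0.05),     # 3-5%
    "eVGA": (0.0004, 0.019),  # 0.04-1.9%
    "XFX":  (0.005, 0.005),   # 0.5% increase in RMAs over two weeks
}

# Even the worst end of the worst vendor's range is an order of
# magnitude below the claimed 50% failure rate.
worst_case = max(high for low, high in reported.values())
print(f"worst vendor-reported rate: {worst_case:.0%}")  # prints 5%
```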
Xenoid - Monday, June 5, 2006 - link
50% failure rate might be bullshit but the fact that you completely ignored the other half of his message is also bullshit fanboy-ism.

The X1900 XTX isn't on here. The X1900 XT Crossfire isn't on here either, but the 7900 GT SLI is. This review is missing 2 of the top video cards, and for what reason? It makes this review incomplete and this should be addressed.
Jojo7 - Monday, June 5, 2006 - link
Actually, I agree with both of your points. The X1900 XTX should have been included in this review, both in Crossfire and as a single card. To the same end, the 7900 GTX in SLI should have been included, imo.
Noise comparisons and power draws would have been nice as well.
DerekWilson - Monday, June 5, 2006 - link
this does touch on our motivation --

The 7950 GX2 is a single board solution (for those uncomfortable with the inclusion of 2 PCBs, think of one as a daughterboard or something). We wanted to really focus on the comparison to other single board solutions.

Right now X1900 Crossfire and 7900 GTX SLI are over $1000 investments, and don't really compete with the 7950 GX2 -- unless we look at the 7950 GX2 in SLI. As we couldn't get quad SLI on the 7950 GX2 working for this article, we decided to save the comparison to that competition for later. It does seem pretty clear from these tests that the 7950 GX2 in SLI will be able to trump any other solution in its market segment.

Also, the 7950 GX2 doesn't require an SLI board -- which is a great advantage over current multi-GPU solutions. In many cases, putting two other solutions in SLI won't be an option for users who upgrade to a 7950 GX2.

But please understand that I certainly appreciate the requests for the inclusion of the X1900 XT Crossfire and the 7900 GTX SLI as a reference point to what is currently possible on the highest end of the spectrum. In future articles involving the 7950 GX2 we will address this issue. Thanks very much for your feedback.
poohbear - Thursday, June 8, 2006 - link
50% failure rate? dude, do u know how this percentage thing works?! that would mean 1 in 2 79XX cards fail. please, bs is a great thing and we have plenty of it on the net, but try to at least make your bs somewhat believable.

nullpointerus - Monday, June 5, 2006 - link
No, it isn't. They only wanted to reply to a particular point within his post.
Inkjammer - Monday, June 5, 2006 - link
50% failure rate? Where are you getting those numbers from?

z3R0C00L - Monday, June 5, 2006 - link
I got the number from polling various website forums... including HardOCP.

eVGA, XFX and BFG claim low to non-existent issues. My polls show an avg of 48% failure rate. It's on HardOCP... go and check out the forums.
kilkennycat - Monday, June 5, 2006 - link
Just to reinforce another poster's comments: Oblivion is now the yardstick for truly sweating a high-performance PC system. A comparison of a single GX2 vs dual 7900GTs in SLI would be very interesting indeed, since Oblivion pushes up against the 256MB graphics memory limit of the 7900GT (with or without SLI), and will exceed it if some of the 'oblivion.ini' parameters are tweaked for more realistic graphics in outdoor environments, especially in combo with some of the user-created texture-enhancement mods.

Crassus - Monday, June 5, 2006 - link
That was actually my first thought and the reason I read the article ... "How will it run Oblivion?". I hope you'll find the time to add some graphs for Oblivion. Thanks.

TiberiusKane - Monday, June 5, 2006 - link
Nice article. Some insanely rich gamers may want to compare the absolute high-end, so they may have wanted to see 1900XT in Crossfire. It'd help with the comparison of value.

George Powell - Monday, June 5, 2006 - link
Didn't the ATI Rage Fury Maxx postdate the Obsidian X24 card?

Also, on another point, it's a pity that there are no Oblivion benchmarks for this card.
Spoelie - Monday, June 5, 2006 - link
Didn't the Voodoo5 postdate that one as well? ^^

Myrandex - Monday, June 5, 2006 - link
For some reason pages 1 and 2 worked for me, but when I tried 3 or higher no page would load and I received a "Cannot find server" error message.

JarredWalton - Monday, June 5, 2006 - link
We had some server issues which are resolved now. The graphs were initially broken on a few charts (all values were 0.0) and so the article was taken down until the problem could be corrected.

ncage - Monday, June 5, 2006 - link
This is very cool, but a better idea would be if nvidia used the socket concept where you can change out the VPU just like you can a CPU. So you could buy a card with only one VPU and then add another one later if you needed it...

BlvdKing - Monday, June 5, 2006 - link
Isn't that what PCI-Express is? Think of a graphics card like a slot 1 or slot A CPU back in the old days. A graphics card is a GPU with its own cache on the same PCB. If we were to plug a GPU into the motherboard, then it would have to use system memory (slow) or memory soldered onto the motherboard (not upgradeable). The socket idea for GPUs doesn't make sense.

DerekWilson - Monday, June 5, 2006 - link
actually this isn't exactly what PCIe is ...

but it is exactly what HTX will be with AMD's Torrenza and coherent HT links from the GPU to the processor. The CPU and the GPU will be able to work much more closely together with this technology.