42 Comments
sum1 - Friday, September 22, 2006 - link
I posted this discrepancy twice, days ago; hasn't anyone else noticed yet? It's listed at 540MHz everywhere else in this article (including the benchmarks).
Josh Venning - Saturday, September 23, 2006 - link
It's been fixed. Thanks for pointing this out, and we apologize for not fixing it sooner.
PerfectCr - Thursday, September 21, 2006 - link
Fan Noise? How do I know how loud/quiet the fans are? Do they throttle?
kmmatney - Tuesday, September 19, 2006 - link
It's interesting how the XFX card (the only RoHS card - uses less lead and other hazardous chemicals) uses more power. I wonder if this will be true of other RoHS devices.
yyrkoon - Wednesday, September 20, 2006 - link
I can tell you that the motherboard I use, which is also RoHS 'certified' (ASRock AM2NF4G-SATA2), runs pretty damned cool (sub 95F when ambient is 80F-ish) and doesn't even use active cooling for the chipset either. Regardless of whether it's the actual cause or not, I think it's well worth it in the long run.
Zaitsev - Tuesday, September 19, 2006 - link
On page 3, third line: "NVIDIDA intends for it to be a direct competitor to ATI's X1900 GT"
Zaitsev - Tuesday, September 19, 2006 - link
Sorry, that should be page 2.
Josh Venning - Tuesday, September 19, 2006 - link
fixed, thanks
Howard - Tuesday, September 19, 2006 - link
I believe the bar is there to reduce PCB bending under weight.
DerekWilson - Tuesday, September 19, 2006 - link
It's possible, I suppose ... but it seems to me that you'd want to attach the bar to the PCB at more than two points if this is the case. And you might also want to connect it to the slot cover for the added support of the case screw. Granted, I'm not a mechanical engineer, but it seems to me that connecting one part of the PCB to another like this would just move any moment created by the weight of the HSF somewhere else on the PCB.
I've also never seen a graphics board bend under normal use. Intel motherboards are another story though. :-)
Whether or not it's made for this, I do have a good use for it: having this bar makes it easier to find a place to grab when removing the card. Sometimes it's tough to find a spot on the PCB to grab, and sometimes the HSF solution isn't mounted in such a way that it's stable enough to use either (I distinctly remember the 6600 GT really disliking any contact with the HSF). This doesn't apply to the huge heat-sink-is-bigger-than-my-forearm solutions though -- they're usually bolted on pretty tight.
Bonesdad - Tuesday, September 19, 2006 - link
I'd have to go with the Leadtek card. Near the BFG in almost every level of performance, nearly equal in watt consumption, lower heat output under load, and a couple of (suspect, I admit) games included. The (maybe) $20 more is worth it for the heat output alone to me.
Also, why no noise output comparisons?
Nimbo - Tuesday, September 19, 2006 - link
Why are ATI cards not overclocked in the reviews? Are they bad overclockers? Why are there no factory overclocked versions?
DerekWilson - Tuesday, September 19, 2006 - link
ATI's current-generation GPUs have not been good overclockers. It is also not as easy to find ATI factory overclocked cards.
We will look at ATI overclocking in similar roundups of ATI cards.
formulav8 - Tuesday, September 19, 2006 - link
I was hoping to see a 7600GT included in the mix to see what I would have to gain from a 7900 GS. :(
Jason
Josh Venning - Tuesday, September 19, 2006 - link
The 7900 GS launch article compared the 7900 GT to the stock 7900 GS, which you can take a look at here: http://anandtech.com/video/showdoc.aspx?i=2827. We tested these overclocked 7900 GSs on the same system, so you can compare the numbers directly (with the exception of Oblivion, which we tested with different quality settings for this article).
yyrkoon - Wednesday, September 20, 2006 - link
Hmm, only roughly 5FPS more on the 7900GS vs the 7600GT across the board. That's pretty sad, but I think I know what I'll be doing when I get a Conroe system going: I'll be adding another 7600GT for SLI . . .
DerekWilson - Wednesday, September 20, 2006 - link
"Across the board" is a little off, I think ... in BW2 and Oblivion, yes, the fps difference is low. But when 4.2 fps is the increase over 17 (a 24% difference), you can't ignore it -- it does make a big difference. I would tend to argue that at these very low framerates, a 5 fps difference is much more noticeable than the difference between 60 and 120 fps. In most other tests (especially with AA), frame rate differences were much higher in addition to being higher percent differences.
DerekWilson - Tuesday, September 19, 2006 - link
http://anandtech.com/video/showdoc.aspx?i=2827
sum1 - Tuesday, September 19, 2006 - link
"The BFG 7900 GS OC's core clock is set at 520MHz, a 70MHz increase over the standard NVIDIA 7900 GS"
It's listed at 540MHz everywhere else.
"EVGA"
Is usually written eVGA.
"Something slightly unique about this 7900 GS..."
Uniqueness does not come in shades of grey.
rushfan2006 - Tuesday, September 19, 2006 - link
You are WRONG on all 3 of your points.... Stop being so damn anal for the sake of just busting stones because you are bored.
sum1 - Wednesday, September 20, 2006 - link
Let me clarify:
"The BFG 7900 GS OC's core clock is set at 520MHz, a 70MHz increase over the standard NVIDIA 7900 GS"
It’s listed at 540MHz everywhere else in this article (including the benchmarks).
DerekWilson - Tuesday, September 19, 2006 - link
For some reason, BFG's website lists it as 525MHz. We'll double check our sample, but we listed the speed of the card we reviewed.
http://www.bfgtech.com/7900GS_256_PCIX.html
EVGA is actually EVGA despite the fact that people tend to lowercase the leading 'e'. Check their own press releases on their site.
http://www.evga.com/about/pressrelease/default.asp...
Maybe uniqueness can come in shades of blue if it can't be grey :-) I see your point, but sometimes taking a little liberty with language gives us the ability to succinctly convey something like the idea that "this is mostly the same as everything else with a slight difference in one area".
yacoub - Tuesday, September 19, 2006 - link
I really don't like the line graphs... they're very hard to read when more than three or four lines are close together. I get much more meaningful data much more quickly from traditional bar graphs or a simple table with numbers.
DerekWilson - Tuesday, September 19, 2006 - link
the line graphs include a table -- just ignore the top part :-)
the problem with bar graphs is that they don't clearly show trends between cards over different resolutions, they don't show the impact of increasing resolution for each card, and they take up quite a bit more space.
we'd love to hear more good suggestions on ways we can better present our data though.
Questar - Tuesday, September 19, 2006 - link
Stacked bars do.
DerekWilson - Tuesday, September 19, 2006 - link
I think those are harder to read than line graphs.
VooDooAddict - Tuesday, September 19, 2006 - link
I agree that they are harder to read. Personally, I like the way they show how the cards scale, and it's nice to have all the data right there. Unfortunately, I used to be able to show non-techie friends a page or two from your review to point out the performance advantages of one video chipset over the other.
With the new line graphs and data grids I need to make my own simple bar graphs from your data to show friends info relevant to their purchase decision. (I would never re-publish these graphs with your data... just used to give non-techie friends better direction.)
As a side note, I personally know quite a few people out there debating over the current $100-$240 range: the new X1300 XT (which is basically an X1600 Pro), X1600 XT, 7600GS, 7600GT, X1900GT, 7900GS, and 7900GT (with rebates) ... you might want to take note of these if you ever toss up more low-midrange buying guides.
yacoub - Tuesday, September 19, 2006 - link
"The EVGA 7900 GS's heat sink is slightly longer than the reference 7900 GS's heat sink and it has two gill-shaped cutouts exposing some copper ridges from the inside of the sink. Aside from that, the card has the signature EVGA black coloring, with their logo and card name in clear view on the face."
You could also mention that it covers the RAM chips. That's kind of the point of its design and something nice to have.
Kougar - Tuesday, September 19, 2006 - link
Finally happy to see that eVGA-only cooler get tested; looks like they knew what they were doing designing it and using it on many of their G70 cards...
Should also note the eVGA has had a $20 MIR on it for over a week, tying it with the XFX for best price... but the warranty, non-stock cooler, and HDCP support ought to make it an obvious choice between the two.
mostlyprudent - Tuesday, September 19, 2006 - link
XFX makes a passively cooled 7950GT, but puts a fan on their 7900GS? Maybe it's a price issue (more expensive to passively cool a card and less profit margin at the $200 price range)?
yyrkoon - Tuesday, September 19, 2006 - link
I'd like to see how these cards compare to a 7600GT, as I currently own an eVGA 7600GT KO and will be upgrading my current system to a Conroe, and MAY consider another GFX card, especially one this inexpensive... or maybe I'll just go the 7600GT SLI route . . .
Spacecomber - Tuesday, September 19, 2006 - link
Josh probably would have done well to provide some more specific references to the previous two AnandTech articles on the 7900GS that Derek did, including some links to those articles, since that is where you'll find more information on how these cards compare to a wider array of video cards, including the 7600GT. However, while they tested the new 7900GS in an SLI configuration in one of those prior articles, I don't think they included results from a 7600GT SLI for comparison.
I'm not sure what article might have that in it for reference.
Sc4freak - Tuesday, September 19, 2006 - link
In the Oblivion test, why do the X1800GTO and 7800GT both score 0 in the bar graph, despite their non-zero results in the line graph directly above it?
Josh Venning - Tuesday, September 19, 2006 - link
Thanks for pointing this out. It's been fixed.
Woodchuck2000 - Tuesday, September 19, 2006 - link
...of one of these working in SLI with a 7800GT?
I'm assuming that the answer is roughly 0, but with such similar specifications, is there any hope? I've got a single 7800GT in an SLI board and can't find a second at the moment.
Re fan power consumption, you're unlikely to be looking at more than 1 watt difference across the board.
VooDooAddict - Tuesday, September 19, 2006 - link
Find someone with a 7800GT and offer them one of the 7900GS Overclocked versions in exchange.
DerekWilson - Tuesday, September 19, 2006 - link
At this point, NVIDIA will not support SLI between products with different names -- even if they have the same pipeline configuration.
We have mentioned that this would be quite a good incentive for people to get behind SLI, but it seems like they are worried about implying that it could work in cases where it can't.
Our suggestion is to make SLI between parts that could work together an unsupported option. We haven't been able to figure out how to hack the driver to allow it, and we don't think NVIDIA will allow it.
Martrox - Tuesday, September 19, 2006 - link
Don't you think that testing these cards without any FSAA is being kind of lazy? Anyone knowledgeable enough to actually read this review will most likely be using FSAA, so that kind of makes this a waste....
Also, did you at least turn the drivers up to high quality?
imaheadcase - Tuesday, September 19, 2006 - link
Mmm, most people don't use FSAA. The majority of users can't tell a difference with it on or off.
VooDooAddict - Tuesday, September 19, 2006 - link
I'd have to agree. Most people with 1280x1024 LCDs that I've met prefer to leave FSAA off (if they even mess with the setting) to get the best possible frame rates. While the max framerate might not dip below 60 ... it's the minimum framerate spike that will affect competition.
I used to enjoy turning on FSAA for EverQuest, but for anything more FPS-competition oriented I don't know anyone who uses FSAA unless they have SLI. (Which would be why they got SLI ... to run FSAA without noticeable impact to framerates.)
Josh Venning - Tuesday, September 19, 2006 - link
Thanks for the comment. We didn't include FSAA tests in this article because it isn't a full analysis of 7900 GS performance, but a comparison between different 7900 GS products. For an in-depth look at 7900 GS performance in more games and settings, take a look at the 7900 GS launch coverage (http://www.anandtech.com/video/showdoc.aspx?i=2827...) and the 7950 GT article, which has some 7900 GS SLI numbers (http://www.anandtech.com/video/showdoc.aspx?i=2833...).
As far as driver settings, we test with default driver settings with the exception of vsync, which is disabled. We do the same with ATI hardware; we leave Catalyst AI on its default setting. We find this is the best way to keep our tests consistent for an article like this.
giantpandaman2 - Tuesday, September 19, 2006 - link
I wonder if part of the difference in heat/power consumption had to do with how much power was supplied to the fans. Slow-running fan = low power = high heat. Faster fan = higher power = low heat.
Meh, I'm not sure a tiny fan could ever put a dent in a 20 watt difference though.