29 Comments

  • Leyawiin - Thursday, February 18, 2010 - link

    At this late stage of the game I decided to buy one of these. I game at 1680 x 1050 on XP and at that resolution this seems to be the best performing card for the money. The way it was built (better quality techniques and materials) is just icing on the cake. I don't want to wait for Fermi any longer (and I bet they will be out of my price range when/if they appear) and I don't want to spend $85-100 more for an HD 5850 for the small improvement it would give me on a 22" monitor. Should be a good match for my X4 955.
  • chizow - Monday, October 12, 2009 - link

    While it's understandable why it took so long to do a review like this (the first retail part clocked this high), these kinds of OC scaling results would be much more useful closer to the launch of a product line, to better determine the impact of individual clockspeeds and core/functional unit modifications.

    Derek did a few of these OC scaling drill-downs for the ATI 4890 and GTX 275, I believe, but they were also very late given that the GTX 260/280 and 4850/4870 had already been released for months. They would've been much more helpful if they were done at launch, alongside the main reviews, to give prospective buyers a better idea of the impact of actual hardware differences vs. software/artificial differences like clockspeed.

    The problem is that Nvidia and ATI both mix hardware and clockspeed differences on these parts to obfuscate the actual performance delta between them, which is particularly significant because the ASICs are typically the same sans artificially neutered functional units. At the very least, launch reviews should normalize clockspeeds when possible to give a better idea of the impact of the actual hardware differences.

    For example, with the 4850 vs. the 4870, you have 625MHz vs. 750MHz on the core along with GDDR3 vs. GDDR5. You can't really do much about the bandwidth disparity, but you can try to clock that 4850 up to 750MHz, which would give you a much better idea of the impact of the actual hardware differences and bandwidth. Similarly for the GTX 260 to 275: the original GTX 260s were clocked at 576/1242 and the 275s at 633/1350. Normalizing clocks would then isolate the differences coming from the additional cluster of 8 TMUs and 24 SPs rather than the artificial/binned difference in clockspeeds. (A rough sketch of what I mean is at the bottom of this comment.)

    To bring it full circle, doing these comparisons earlier wouldn't give you a guarantee that a product would run at max overclocks like this factory OC'd part does; it would just set expectations when it actually mattered, so that when an OC'd part like this does come along, you could just refer to the comparison done months ago and say, "Yeah, it performs and scales just as we thought it would when we did this overclocking comparison on GT200 parts 18 months ago."
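
    Just to make the clockspeed normalization concrete, here's a rough back-of-the-envelope sketch (Python). The only inputs are the stock clocks quoted above; the assumption that performance scales linearly with core clock is mine, and it deliberately ignores memory bandwidth and other bottlenecks:

        # First-order estimate: scale a measured result by the core clock ratio.
        # This assumes perfectly linear scaling with core clock, which is a
        # simplification, not a measurement.
        def clock_normalized(fps_measured, clock_measured_mhz, clock_target_mhz):
            """Estimate FPS for the same card running at a different core clock."""
            return fps_measured * (clock_target_mhz / clock_measured_mhz)

        # Hypothetical 60 fps on a stock 4850 (625MHz core), estimated at the
        # 4870's 750MHz. Any remaining gap vs. a real 4870 would then come from
        # GDDR5 bandwidth rather than the core clock.
        print(clock_normalized(60.0, 625, 750))   # ~72 fps

        # Same idea for GTX 260 (576MHz) vs. GTX 275 (633MHz): the leftover delta
        # is the extra TMU/SP cluster, not the binned clocks.
        print(clock_normalized(60.0, 576, 633))   # ~65.9 fps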
  • SirKronan - Monday, October 12, 2009 - link

    There are two conclusions that would matter at all here:

    Performance per dollar
    and
    Performance per watt

    They got the performance per dollar, and the results aren't surprising at all. But this is half an article without the performance-per-watt aspect. As soon as they get a new Kill A Watt (or measure with a UPS that has a meter) and update the results, this article will actually be meaningful.

    An article comparing a lesser card overclocked to a faster card at stock MUST contain power usage comparison or it's missing half its teeth.
  • chizow - Tuesday, October 13, 2009 - link

    I typically don't find vertical price-to-performance comparisons relevant because you're always going to have trouble competing with those dirt cheap options that are free AR. Sure, you may get great FPS per dollar, but what good does that do you when the aggregate FPS aren't playable? Similarly, with CF or SLI, it's very difficult for the higher-end parts to keep ahead of the price and performance that multi-GPU setups of lower-end parts offer.

    Performance per watt in this range of parts isn't hugely useful either, IMO. It's been this way for some time: high-end graphics cards consume a lot of power. There's no panacea that's going to fix this; if a card is in the upper tier of performance for its time, it's probably going to consume a similar amount of power to its predecessors. This only changes as newer processes and faster parts are introduced, where they typically outperform older parts in the same segment using similar power, or slightly less.

    I personally prefer comparisons to be made based on performance; then you can do a lateral comparison of price and power consumption within that performance segment. That way you have an expected performance level and can make an informed decision about other key characteristics, like price and power consumption.
  • shotage - Monday, October 12, 2009 - link

    I agree with this. Power utilization, followed by noise, is something I would be concerned about with overclocked cards.

    Personally, I don't think it's worth the money for a DX10 card anymore, even if it is fast. If I buy a card, it's going to be DX11-capable.
  • Leyawiin - Monday, October 12, 2009 - link

    From the early reviews I've read, there's no contest between the HD 5770 and a stock GTX 260 (216) in all but the most ATI-friendly titles. The gap will be even greater with this card. The fact that it's very cool and quiet for the performance makes it even more compelling. And yes, the HD 5770 will be $160, if you can get it for that price (or get it at all for weeks after its release). I agree with Ryan on this one. Darn good deal for the price.
  • macs - Monday, October 12, 2009 - link

    My stock GTX 260 can overclock to those frequencies as well. I can't see any reason to buy these overclocked (and overpriced) video cards...
  • mobutu - Monday, October 12, 2009 - link

    Yep. At $160 the 5770 will be (probably, we will find out tomorrow) very, very, very close in performance to this card priced at $200.
    So the choice is/will be clear.
  • poohbear - Monday, October 12, 2009 - link

    I'm not quite sure what the point of this review is since the 5800 series has been released. In your conclusion you didn't even mention anything about DX11, or how the GTX 260 isn't even capable of handling DX10.1, let alone the future DX11. For people who keep their graphics cards for two years before upgrading, this is an important factor.
  • Alexstarfire - Monday, October 12, 2009 - link

    I fail to see how DX11 matters since no game is even slated to use it yet, that I've heard of anyway. Might be useful if you keep your card 2 years, but I'm guessing by that time the card probably won't be able to handle DX11 well, much like the first DX10 cards couldn't handle DX10 very well when it debuted. Of course, just going off of pricing and performance I wouldn't get this card anyway, or any nVidia card for that matter. I'm just saying.
  • palladium - Monday, October 12, 2009 - link

    Just wondering, with HAWX, is DX10.1 enabled for ATI cards?
  • Ryan Smith - Monday, October 12, 2009 - link

    No.
  • Nfarce - Monday, October 12, 2009 - link

    I just ask because I bought a stock EVGA 275 and have it overclocked quite nicely, which puts it above the performance of this OC'd 260. Even AT posted about the 275's performance capabilities in an article back on June 4. You aren't really comparing apples to apples here, other than one being purchased factory-overclocked and the others being purchased stock. No serious gamer ever keeps a video card at stock, just like a CPU.
  • Ryan Smith - Monday, October 12, 2009 - link

    Absolutely. User overclocking is by no means guaranteed, whereas factory overclocking is as good as anything else sold.

    As I stated in the article this card is a poor choice if you intend to do any overclocking on your own, but if you're the kind of person that does not do any overclocking (and I do know "serious gamers" that don't touch their card's clocks) then this is just as good as a GTX 275.
  • Abhilash - Monday, October 12, 2009 - link

    It is not worth the 25% premium over a stock GTX 260.

    Where are the power consumption results???
  • Ryan Smith - Monday, October 12, 2009 - link

    My Kill A Watt decided to kill itself during some testing this weekend. There wasn't time to get it replaced and run new tests while still meeting all of the article deadlines this week. It'll be back soon™.
  • SirKronan - Monday, October 12, 2009 - link

    That's what I was wondering from the first page of the review: "OK, so it performs like a 275, but how much power does it consume to do the same amount of work?" The title and conclusion indicate the performance is there for $10 to $20 less, but I kept looking through the review pages for the only thing I really wanted to know: "How do they differ on power?"

    I am typically one who praises Anand's articles, but I wouldn't have even published this without at least some kind of power figures. I understand that your Kill A Watt got "killed" (er... died, heh), but at least give us figures from a UPS that has a wattage meter built in. What was the difference in overall power consumption? That would at least give us an idea of how much extra power the OC'd 260 is going to use versus a 275. If you game enough, the extra power cost might even nearly negate the $10 you save over 2 or 3 years, depending on where you live.
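
    For a sense of scale, here's a quick back-of-the-envelope estimate of what that power difference could cost (every number below is my own assumption for illustration, not a measured figure):

        # Hypothetical electricity cost of one card drawing more than another
        # over two years of gaming. All inputs are assumptions.
        watt_difference = 25        # assumed extra draw under load, in watts
        hours_per_day   = 3         # assumed gaming time per day
        days            = 365 * 2   # two years
        cost_per_kwh    = 0.12      # assumed electricity rate in USD per kWh

        extra_kwh  = watt_difference * hours_per_day * days / 1000
        extra_cost = extra_kwh * cost_per_kwh
        print(f"{extra_kwh:.1f} kWh -> ${extra_cost:.2f}")   # ~54.8 kWh -> ~$6.57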
  • Finally - Monday, October 12, 2009 - link

    Thanks for pointing this out. I was about to ask that.
    I guess that is the card's weak spot that would stand in the way of a "recommendation"...

    Under the rug, under the rug...
  • 7Enigma - Monday, October 12, 2009 - link

    Ryan did mention in the comments above that his Kill A Watt died during testing, so that would explain why the info is not there.

    What should have been mentioned in the article (I may have missed it) was this explanation. Nowhere did I find it, and like most of us, my first thought was: OK, but how much more power is this thing using? That makes a big difference in my personal buying decisions (and it's why the 5850 is so darn likable across the board).
  • Stas - Sunday, October 11, 2009 - link

    As noted, the 4890 is $20 cheaper than this Gigabyte card, with almost equal performance. But don't forget that you can easily get an extra 100-150MHz on the 4890 GPU with stock cooling, and 100-200MHz on the memory, which would make it 5-10% faster. So now we have a card (HD 4890) that's cheaper (by $20) AND faster than the Gigabyte GTX 260 OC. I think it's a no-brainer. Of course, Gigabyte did a great job with this card (I love Gigabyte), but you can only compete so well when the limitation is set by the chip's architecture. Out of all GTX 260 cards, this one is probably the best. But it isn't the best value or performance when compared to the HD 4890.
    P.S. Even with both cards at stock, in games where the GTX 260 prevails, it only does so by 10% or so. Wherever the HD 4890 comes out on top, it beats the other by up to 30%.
  • Jamahl - Sunday, October 11, 2009 - link

    It must be difficult to review fairly right now but this was a nice review.

    I do feel that the overall recommendation should lean more strongly toward the 5850. Yes, it is priced a little higher; yes, it is "only" 25% faster and 30% more costly, but the additional features effectively double that 25% advantage, and we all know this.

    I suspect the 5850 is going to increase the gap as time goes on, and I believe most of us will agree with that too.

    A more forceful point about the pricing of this card (the GTX 260 being reviewed) would have been another alternative. Overall it was a decent and interesting review.
  • DC 10 - Saturday, December 26, 2009 - link

    This review was a long time coming!

    Yet this review almost seems to have a chip on its shoulder. When the card is clearly just as good as a GTX 275-280, I got the impression that the reviewer found it regrettable that this card was too good. Kinda silly to me.

    I have the MSI GTX 260 OCv3 and it's OC'd to 690MHz on the core, 1436MHz on the shaders, and 1200MHz on the memory. It pretty much comes close to a GTX 280.

    For anything in the GTX 275-280 range, this card, or my MSI card (which is the same as this card but with better, non-reference cooling), should easily be considered.

    FYI, MSI did a great job by going with a non-traditional cooling setup for their OCv3 cards, especially for SLI or Tri-SLI. The card reviewed here and my MSI card simply reveal what is at stake: pricing and marketing...

    The review showed what most of us who own these cards already knew: that NVIDIA could slash prices the way AMD has done, instead of being uppity knuckleheads...

    It's good to see AnandTech review something like this. They could not deny what these cards can do price/performance-wise. Hard to ignore...
  • yacoub - Sunday, October 11, 2009 - link

    I dunno, the 5770 is shaping up nicely for a $160 card:
    http://www.rage3d.com/board/showthread.php?t=33953...
    http://www.hardforum.com/showthread.php?t=1458978
    http://www.fudzilla.com/content/view/15876/1/
  • palladium - Monday, October 12, 2009 - link

    Hmm... wonder how SiliconDoc would reply to this.
  • The0ne - Monday, October 12, 2009 - link

    Why would you even care? O.o
  • teohhanhui - Monday, October 12, 2009 - link

    He's got an appointment with the psychiatrist.
  • sparkuss - Sunday, October 11, 2009 - link

    Is there something I should read into this difference in the charts?

    I'm probably going to update to 5850/5870 before a full box upgrade and I'm following all the reviews I can find.
  • Ryan Smith - Sunday, October 11, 2009 - link

    The 5850 data has been added. We pruned some old data to keep the charts smaller, and I pruned a little too much there.

    As for the 5870: Anand and I have matching rigs, but it's not possible to replicate thermal/noise characteristics. He did the noise/thermal testing for the 5800 series articles, while I did this one. As a result, new data was collected based on the cards I have, and at the moment I don't have a 5870.
  • sparkuss - Sunday, October 11, 2009 - link

    Thanks,

    Just wanted to be sure I didn't miss something in the text.
