NVIDIA GeForce GTX 295 Coming in January

by Derek Wilson on 12/18/2008 9:00 AM EST

  • nyran125 - Friday, January 9, 2009 - link

    I don't understand why people even bother buying any of these cards when an 8800 Ultra or 8800 GTS 512MB, which are a lot cheaper than any of these cards, runs the latest games right now with AA on at maximum graphics settings. No game out right now even comes close to Crysis graphically, and an 8800 GTS or Ultra runs it smooth as can be. Why waste your money on these cards when it's going to be a while before we even get any games that look better than Crysis?
  • smlforever - Friday, December 26, 2008 - link

    Hey!!! Why can't I enter Chinese on this website?
  • smlforever - Friday, December 26, 2008 - link

    ????????,HD4850?GTX9800GTX+????
    ATI???NIVIDA?,??????,?NFS12??COD5?
    ?????
  • falko2904 - Friday, December 19, 2008 - link

    I just found the EVGA GeForce GTX 260 SSC at BUY.com for $211.99 with free shipping. An instant coupon good through 01/04/09 takes $30 off, bringing the cost to $181.99; a PayPal purchase gets you $15 cash back from PayPal (not instant, though), bringing the cost to $176.99; and a $10 MIR from EVGA brings the price to $166.99.


    WOW!!

    Here is the link:

    http://www.buy.com/prod/evga-geforce-gtx-260-ssc-8...

    The instant coupon is not evident, but appears when you add the card to the cart. I was considering an EVGA 9800 GTX until I found this. Bought it.
  • falko2904 - Tuesday, December 23, 2008 - link

    I see that I failed math: the discounted prices should be $166.99 after PayPal and $156.99 after the MIR.

    And I see people talking smack about the 192/216/240-core products. When a company brings a product to market, there are initially production problems and defects that hurt yield. One way to overcome this is the ability to disable the failed or questionable sections of the chip. This increases per-wafer yield, reduces waste and lost income, and brings the cost of every bin of the product down to acceptable levels (a rough numerical sketch of the idea follows at the end of this comment). You can hack and re-enable features, taking the chance that you got a part that was misbinned or barely passed the tests. Further along in production, the company is often forced to misbin good parts to meet demand for the lower bins; those can usually be up-featured with a hack with reasonable assurance they will work reliably. These companies are in the business of profit, but for the most part they do not gouge their customers. It is the reality of a business that requires multibillion-dollar investments every time they have to set up a production line for a product on a new process.

    Another way to look at this is that you are paying for a given set of features; if you do not agree with the price for that set of features, then don't buy it. When you buy Windows Vista Home Premium, that DVD contains all Windows Vista versions, but you only get to enable the features you paid for.
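
    As a rough back-of-the-envelope sketch of the salvage-binning idea (every number here - dies per wafer, defect rate, salvage fraction - is made up for illustration, not NVIDIA's real data):

        import random

        # Hypothetical figures purely for illustration.
        DIES_PER_WAFER = 100   # candidate dies on one wafer (made up)
        DEFECT_RATE = 0.30     # chance a die has a defective shader cluster (made up)
        SALVAGEABLE = 0.80     # fraction of defective dies saved by disabling a cluster (made up)

        random.seed(0)
        full = salvage = scrap = 0
        for _ in range(DIES_PER_WAFER):
            if random.random() > DEFECT_RATE:
                full += 1        # all clusters work -> top bin (e.g. 240 SP)
            elif random.random() < SALVAGEABLE:
                salvage += 1     # disable the bad cluster -> lower bin (e.g. 216 or 192 SP)
            else:
                scrap += 1       # unusable

        print(f"top bin: {full}, salvage bin: {salvage}, scrap: {scrap}")
        print(f"sellable yield without salvage bins: {full}%")
        print(f"sellable yield with salvage bins:    {full + salvage}%")

    The point is only that selling partially disabled dies as a lower tier turns would-be scrap into revenue; the real defect rates and binning rules obviously aren't public.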
  • SiliconDoc - Wednesday, December 24, 2008 - link

    You're another one that has bought the idiot's line. I guess you assume that exactly the right number of shaders magically came out defective on the 192, for instance, so they disabled them and released the card as such. I'm sorry you're that stupid.
  • Razorbladehaze - Friday, December 19, 2008 - link

    OK, so I will start out by saying that the performance numbers from five of my usual daily online tech mags look similar. That's good. The performance of the GTX 295 looks to be tops after release (as long as this soft release isn't a facade for retail performance, which I don't think will be the case). I'm impressed.

    Personally, in my experience and my friends' experiences, I will have to agree with what some on here have mentioned: CrossFireX is more stable and usually performs better across a range of uses. But this continues to be a mostly subjective (as opposed to objective, evidence-based) topic.

    But what really made me want to write this post is this...

    The only objective nVidia had in this release was to take back the performance crown. This product was never on a release schedule; it was reactionary. Usually this results in a sloppy product, but let's hope that's not the case. It is obvious that they tested the idea of two GTX 260s / GTX 260-216s (I'm guessing the 280s failed as well, especially on the power/performance ratio) and neither configuration bested the 4870 X2. So once again they tweaked their GPUs and crammed this onto a new process (making me nervous) to make this card competitive.

    So this is really not bad. What is bad... is that Nvidia is not really pushing their own agenda. This speaks really poorly for Nvidia's state of affairs and for what the company stands for.

    Nvidia has been renaming previous products to give them a fresh look (something ATI also did with the 9550/X1050 and, to a lesser degree, with the 2600/2400/3400/3600 hardware), with some slight tweaking, to deceive and take advantage of the consumer.

    Nvidia is also guilty of fraudulent behavior and will not own up to it (Rambus), and has had really bad recent quality-control problems in its manufacturing processes (8600/8800s/8600M/chipsets, in both PCs and MacBooks).

    Being on top is a great thing, but it's also about how you got there.

    Excellence is an art won by training and habituation. We do not act rightly because we have virtue or excellence, but rather we have those because we have acted rightly. We are what we repeatedly do. Excellence, then, is not an act but a habit.
    - Aristotle

    I wish I could send this to Nvidia's top people and counsel them on what it means and how they should apply it to their company.
  • SiliconDoc - Sunday, December 21, 2008 - link

    In other words, sir, the proper DISSING is this:
    The GTX 260 192 is just a crippled GTX 260 216 (or, because of release order, one would say the 260/216 is an uncrippled 260 192).
    We KNOW what they told us - and now we see they have taken the 260 and enabled all 240 shaders...
    So what they DO do is cripple cards, even when not crippling them wouldn't cost a dime more.
    See, they whack their own product to make more "products" so they can get hype and tiers and dollars and bean counters and enthusiasts going gaga over it all.
    Both video card companies do it, and that's what should make us all sick, because they spend a LOT OF MONEY doing it - and it would be MUCH CHEAPER to just make a much cheaper line of higher-end cards without all the whack-daddy crippling and finagling they do.
    Can you even imagine how many bean-counting gurus and cheeseheads it takes to devise all the various flavors, with chops and slices to exactly the right parts of the cores, the BIOS, the PCB design, or whatever they need to whack to get it done?
    See, that's what they are doing.
    Just like Intel: long after their multiple public fits over overclocking, they decided the gigantic "unlocked EE" overclocking mega-$$$$ product was "cool" - and went insane selling it.
    So that's what they are doing - hacking around - and only letting out the "good stuff" at a very high price, even though it doesn't cost more to let higher-performing parts out the door - but chopping and hacking makes for busy-bee work...
    (We've seen the BIOS hacks that unlock crippled features - and there's a whole lot more crippling going on.)
  • Razorbladehaze - Sunday, December 21, 2008 - link

    Okay so the ignorance is clear in that this poster missed the whole message in what I posted.
  • SiliconDoc - Thursday, December 25, 2008 - link

    " Things are starting out pretty well for the new GeForce GTX 295 card - it is able to thwomp the AMD Radeon HD 4870 X2 card in all resolutions tested! At 16x12 the GTX 295 has a 28% performance edge, at 20x15 it has a 23% lead and at 25x16 the new GTX 295 wins by 25% on average frame rate. Just as importantly, the minimum frame rates are also much much higher with the NVIDIA solutions, even the GTX 260+ is ahead of the HD 4870 X2 in this regard."

    Awwww... http://www.pcper.com/article.php?aid=651&type=...

    Notice the last line ?
    " even the GTX 260+ is ahead of the HD 4870 X2 in this regard "

    YES, someone tells the truth.

  • SiliconDoc - Sunday, December 21, 2008 - link

    Jiminy crickets, can I buy you a TWIMTBP dartboard so you can get out some anger?
    I think you missed the boat entirely. I think it's great the two companies are competing, and they have a war going - I bet it's motivational for both developer teams.
    I'd also bet the "sandwich" method NVidia used will be good learning for cards down the line, because it doesn't look like process shrinks are going to keep up with rising transistor counts.
    Since the 260 / 192 and 216 only slightly beat the 4870 / 512 and 1024 - I disagree NVidia "tried it with them" and found it not good enough - they well knew to open it to 240 shaders for more of an advantage in the X2 boost. That's just simple plain common sense, which your analysis in that matter didn't have. A bitter putdown it was.
    Now, the point you missed that is valid is what BOTH card companies do, and so do the CPU makers - and they cover it up by claiming problems with "yield". And the enthusiast public sucks it down like mental-giant candy - oh so smart is the geek end user, he knows why these things happen - and he or she blabbers it to all his or her peers... oh so smart the blabbering corporate-protector geek end user is...
    The bottom line is they CRIPPLE cards, gpu's, cpu's, bios, whatever it is they need to do - cut pipelines, chop shaders, whack cache...
    And somehow the supposedly so bright geek enthusiast squads babble the corporate line "Well, no we uhh.. we have a bad chip come out, and uhh... well we have to disable part of it - err, uhh, we just like disable it and mark it lower..."
    (add the speed binning - which occurs - to validate all the other chanting BS - and it works - for the corporate line)
    When the real truth is THEY PLAN THE MARKET SEGMENT - THEY CHOP AND DISABLE AND SLICE AND CUT AND REMOVE AND ADD BACK IN - TO GET WHAT THEY WANT.
    Yes, there are some considerations - # of ram chips to be on vidcard for example for bit width - but this they decide...
    They don't just shoot it straight out the way they HAVE TO because of "low yields".
    The only part that is valid is SPEED BINNING.
    Somehow though, the excuses for 'em are everywhere, as if by some wonder the CPU or GPU die "has exactly the correct portion magically come out defective" - the portion they want disabled, or changed, for their TIER insertion of exactly the proper product class.
    I can't stand it when BS becomes widespread belief.
  • Razorbladehaze - Sunday, December 21, 2008 - link

    What does this even mean?

    "Since the 260 / 192 and 216 only slightly beat the 4870 / 512 and 1024 - I disagree NVidia "tried it with them" and found it not good enough - they well knew to open it to 240 shaders for more of an advantage in the X2 boost. "

    First of all this is a total contradiction to logic if I'm indeed reading this correctly. I'll even let you keep the idea that 260 performs better (which it never has, although the 216 does border with the 4870 much better).

    But why then does nvidia release another version (the 216) with increased shaders? It's not simply that they suddenly had better yields (which could be part of it); once again it was reactionary. From the overall landscape of nvidia's moves, they would have been happy to go on with their original design.

    Furthermore, the logic once again is not there that nvidia would increase shaders again when releasing this product, and wait for the decrease in process size (probably the larger of the two factors) before releasing the GTX 295 - unless of course they felt they HAD to in order to ensure they would beat out the 4870 X2. If what silicon doc posits in the quote were true, logically nvidia would have wasted no time and likely released a GTX 295 much closer to the release date of the 4870 X2.

    Finally, what the hell are you talking about with "for more of an advantage in the X2 boost"?

    Does this mean the 4870X2 had a boost? Or are you trying to state that there is some special feature on 4870X2 that provides a boost (which there is not unless this is a reference to OC'ing through overdrive). Perhaps though and what I interpret this as a mistake or typo that you really said X2 when referencing the GT295.

    And with that interpretation the whole comment, " they well knew to open it to 240 shaders for more of an advantage in the X2 boost. " only goes to reinforce the logic that nvidia felt they had to make another REACTIONARY change to their own designs to gain the edge.

    Finally, I'm done posting in here, so doc you're more than welcome to flame all you want. I've already spent enough of my precious time replying to this senseless drivel.

  • SiliconDoc - Monday, December 22, 2008 - link

    You're a red freak dude. "Reactionary design". LOL
    roflmao
    You need some help.
    Quoting Aristotle to Nvidia - and you want to send it to their people. Like I said, you've got a huge chip on your shoulder, so you yap out big fat lies.
    Can you stop yourself sometime soon ? I hope so.
  • SiliconDoc - Sunday, December 21, 2008 - link

    You're the one who said they tried it (dual) with the 260 and found it failed, not to mention you claiming (pulling the rabbit out of your hat) that they tried it with the 280 and found the heat or watts untenable... another magic trick of your mind.
    Obviously the reason they opened up to 240 shaders is because they could. There's ZERO room left in the 4870 - none, fella. Zip. Nada.
    I guess you know what Nvidia engineers do because you figured it out all on your own, and they are so stupid, they tried your idiot idea and found it didn't work.
    (NO, YOU SAID IT BECAUSE YOU HAVE A HUGE CHIP ON YOUR SHOULDER - AND YOU'RE OBVIOUSLY WRONG BECAUSE IT'S DUMB, DUMB, DUMB.)
    That's the very type of stuff that we've had to put up with for so long now, - and the next round is already here.
    Endless lying pukehead statements made by the little war beasts.
    I'm sick of it - I don't care if you're not lying, don't care if you're just making up stupid dissing stuff, but to lie up a big can of crud then pretend it's true - and expect others to accept it - NO.
  • SiliconDoc - Sunday, December 21, 2008 - link

    Perhaps you forgot the 4870 had a 512MB version released, then a 1GB version.
    Maybe that would explain it to you, but I doubt it.
  • Razorbladehaze - Sunday, December 21, 2008 - link

    This is funny, "I can't stand it when BS becomes widespread belief. "

    I've noticed that this poster is quite guilty of spinning BS for others to buy into.

  • rocky1234 - Friday, December 19, 2008 - link

    Problems in Far Cry 2 with CrossFire? I have a 4870 X2 & the game has worked fine from the day I got it, & I got it about three days after it was released... I play on maxed-out settings at 1080p without issues & the game always ran smooth. The only reason I stopped playing it is because I got bored of the game.

    This release of the 295 is just a PR move for Novidia, nothing more. They believe they have to be number 1 or nothing, because if they are number 1 they can charge whatever they want for their video cards.

    I am just waiting for all those websites to start doing the reviews, & when the Novidia card wins a few they trash the AMD card because it lost by a few frames, even though the card will let you play pretty much any game at high res & maxed settings. Oh yeah, I did find a use for my old 8800GT/9800GT - I now use it as a slave card for PhysX. The only problem is: where are the games? So far the ones that use it are low in number, & the gameplay does not seem to get better by having it, except that now, when the Novidia card kicks in & does the CPU's work, my quad core can take a small nap... lol

  • SiliconDoc - Tuesday, December 30, 2008 - link

    " This release of the 295 is just a PR move for Novidia nothing more they beleive they have to be Number 1 or nothing because if they are number 1 they can charge what ever they want for their video cards. "

    Oh, so I guess if I (like tens of millions of others) only have a single PCI-E x16 slot, like on the awesome Gigabyte P45 boards - GA-EP45-UD3R, GA-EP45-UD3L, UD3LR - or an ASUS P5Q SE, P5Q, P5Q SE+, or a Biostar T-Force, or an endless number of P35 motherboards, or 965, G31, G33, etc... I should just buy the ATI product, and screw me if I can't stand CCC, or any number of other things - I'll just have to live with the red card for top frames... YOU DON'T WANT ME OR TEN MILLION OTHERS ALLOWED TO BUY A TOP DUAL NVIDIA?!?!

    " I am just waiting for all those websites to start doing the reviews & when they Novidai card wins a few they trash the AMD card because it lost by a few frames even though the card will let you play pretty much any game at high res & maxed settings."

    Well, I am waiting for it too, and I can hardly wait, since those same websites compared 2 GPU cores to one core and, by total BS, declared the 2 cores the winner. If you start screaming that it's the best single-slot solution like dozens have for 6 months, then tell me and mine it's a waste for NVidia to do the same thing, don't mind me when I laugh in your face and consider you a liar, ok?

    " Oh yeah I did find a use for my old 8800GT/9800GT I now use it as a slave card for PhysX"

    Ahh, good. You won't be doing that with a red card anytime soon, and if you do, we see by the news, IT WILL BE NVIDIA with a rogue single developer supporting it ! (not ati - supporting their own cards for it - so much for customer care and cost consciousness)
    _________________________________________________________________

    So, do you think the tens of millions of single-slot x16 users (so no x16/x16, or just x16/x4 PCI-E) deserve a chance at a top-end card other than ATI? Oh, that's right, you already basically said "NO!" - and squawked that it's a PR crown stunt...

    How about SLI board owners who don't want to go ATI? You think they might be interested in DUAL in one slot, or QUAD?
    No, you didn't give it a single brain cell of attentiveness or thought...
    Duhh... drool out a ripping cutdown instead....

    I'm sure I'll hear, I'm the one with the "problem", by another spewing propaganda.
  • initialised - Thursday, December 18, 2008 - link

    [quote=Anand]Making such a move is definitely sensible, but it is at the highest end (2560x1600 with tons of blur (I mean AA, sorry))[/quote]

    No, according to the Geordie translation (that we coined on the way home from CyberPower in a traffic jam), AF is Deblurrin' while AA is Dejaggin'. Please amend the article.
  • JarredWalton - Thursday, December 18, 2008 - link

    No, antialiasing adds some blurriness to images. It smooths out jaggies on edges by blurring those edges. It's one of the reasons I don't think 4xAA or even AA in general should be at the top of the IQ features people should enable; I only turn it on when everything else is enabled, and I start at 2xAA and progress from there. I'm with Derek in that 2xAA with high resolution displays is more than enough for all but the most demanding people.
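
    A toy illustration of why smoothed edges read as slight blur - this is just box-filter averaging of sub-pixel samples, not how any particular GPU's MSAA hardware actually works, and the coverage values are made up:

        # One pixel row crossing a hard black/white edge. Each anti-aliased pixel is
        # shaded by averaging 4 sub-pixel samples (0.0 = geometry, 1.0 = background).
        def shade_pixel(samples):
            return sum(samples) / len(samples)

        no_aa = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]          # 1 sample per pixel: crisp but jagged
        with_aa = [shade_pixel(s) for s in [
            [0.0, 0.0, 0.0, 0.0],                       # fully covered by the geometry
            [0.0, 0.0, 0.0, 0.0],
            [0.0, 0.0, 0.0, 1.0],                       # edge clips one of four samples
            [0.0, 1.0, 1.0, 1.0],                       # pixel mostly past the edge
            [1.0, 1.0, 1.0, 1.0],
            [1.0, 1.0, 1.0, 1.0],
        ]]

        print(no_aa)    # hard step: [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
        print(with_aa)  # intermediate greys near the edge: smoother, and slightly softer

    The intermediate values are exactly the "blur" being traded for the removal of the stair-step.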
  • AdamK47 - Thursday, December 18, 2008 - link

    The clock speeds are way lower than what I had expected, especially since this is 55nm.
  • SuperGee - Saturday, December 20, 2008 - link

    Then you're not taking a lot of factors into account.

    1) The GTX 280 isn't a bigger chip only because of 65nm; it also has about 1.5 times as many transistors as RV770. So it is still the bigger chip, but 55nm makes it less extreme.
    2) Because of that, even a 55nm GT200 at GTX 280 speeds draws a lot more power to beat RV770.
    3) While everyone blames 65nm GT200 as the power-draw king, that doesn't make RV770 a green chip; it also dissipates well over 100 watts.
    4) The 4870 X2, at 275 watts, is just as much a heater as the GTX 280 - even more so - and ATI is pushing it too.
    5) To avoid crowning a new power-and-heat king, 300 watts is the limit (see the power-budget sketch at the end of this comment).
    6) 55nm GT200 is too power hungry to use its full potential in a dual configuration. They could do GTX 285 speeds, but that would land around 365 watts; GTX 280 speeds, around 320 watts.
    7) So you get a low-voltage GT200 x2 at 289 watts - a tad more than its direct competition, and enough performance to beat it by a small margin.

    Common misconceptions:
    A) 55nm is no miracle solution. GT200 would only get down to an RV770-ish size at 40nm. GT200 at 40nm makes sense, to fill the gap to GT300, the possible next-gen DX11 part; nV doesn't need a new 40nm DX10 chip - they have GT200.
    B) RV770 isn't a candidate for environmental prizes either; it still draws a lot of power.

    From my own history: from the 80386SX days at 250 watts to 550 watts now.

    So to me it's just as expected. I speculated about the name GTX 265 X2, but they dropped the X2 suffix for the new number, GTX 295.
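
    A quick sanity check of the 300-watt reasoning above. A PCIe card can draw roughly 75 W from the slot, 75 W from a 6-pin connector, and 150 W from an 8-pin connector, so a 6-pin + 8-pin card tops out around 300 W. The per-configuration wattages below are this comment's estimates, not measured numbers:

        # Rough PCIe power budget (approximate, per spec)
        SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150
        budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 300 W for a 6-pin + 8-pin card

        # Hypothetical dual-GT200 configurations (wattages are the estimates above)
        configs = {
            "2x GTX 285-speed GPUs": 365,
            "2x GTX 280-speed GPUs": 320,
            "GTX 295 as announced":  289,
        }

        for name, watts in configs.items():
            verdict = "fits" if watts <= budget else "exceeds"
            print(f"{name}: {watts} W -> {verdict} the {budget} W budget")

    On those assumptions, only the clocked-down configuration squeezes under the connector budget, which is the whole argument being made here.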
  • SiliconDoc - Sunday, December 21, 2008 - link

    But the 4870 should be compared to either 260, because that's where its stats sit.
    What has already been documented endless times is that the 4870 is 1-3 watts less than the 260 in full 3D use, while the 260 is 30 watts less at idle.
    So the 4870 is the WORSE card when it comes to power consumption.
    Now if you want to compare it to the 280, why, then you're comparing it to the card that beats it soundly, not just a little bit like the 260.
    I saw all the charts with the 4870 supposedly beating the 260 in power consumption, because the 3D consumption was 1-3 watts less, and the 30-watt idle advantage for the 260 was "secondary".
    No, that doesn't make sense to me unless you're gaming 10x-30x more than you're sitting in 2D or on your desktop - and even then it would only be a tie - and people DON'T have their cards in 3D gaming mode for that percentage of time compared to 2D or "idle". (A quick weighted-average sketch follows at the end of this comment.)
    So there was plenty of skew about.
    I don't understand how "judgement" can be so far off, except by thinking the charts I referred to are "auto generated" and use the 3D-mode score ONLY for the ordering. Must be too difficult or too much of a hassle to manually change the order, so the reviewer, instead of apologizing for the chart-generation method, just goes along and makes the twisted explanation.
    Then the "fans" just blabber on repeating it.
    That is exactly what I have seen.
    The 260 uses less power than the 4870, and it beats it slightly overall in game benches.
    Now there's the truth. That's actually exactly what all the data says. Oh well.
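
    To make the idle-versus-load weighting concrete, here is a minimal sketch. The wattage deltas (roughly 2 W less for the 4870 under load, roughly 30 W more at idle) are the figures claimed in this comment, not independent measurements, and the hours of gaming per day are made-up scenarios:

        # Claimed deltas from this comment: the 4870 draws ~2 W less than the GTX 260
        # in 3D, but ~30 W more at idle. Positive numbers = extra watts for the 4870.
        LOAD_DELTA_W = -2
        IDLE_DELTA_W = +30

        def avg_extra_watts(gaming_hours_per_day):
            """Time-weighted extra power the 4870 draws vs the 260 over a 24 h day."""
            load_frac = gaming_hours_per_day / 24.0
            return load_frac * LOAD_DELTA_W + (1 - load_frac) * IDLE_DELTA_W

        for hours in (2, 6, 12, 22):
            print(f"{hours:>2} h gaming/day: 4870 averages {avg_extra_watts(hours):+.1f} W vs GTX 260")

    On those claimed numbers, the two cards only break even if the machine spends nearly the whole day in 3D, which is the point being argued.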
  • s d - Thursday, December 18, 2008 - link

    4 simple words ...

    n o t ... direct ... x ... 11

    :)
  • SuperGee - Saturday, December 20, 2008 - link

    It could take a while before the first DX11 cards come, and they will first have to prove themselves for a long time under DX10.
    The DX11 SDK may come out close to the hardware, but the games take a lot longer.

    It could be six months after the DX11 runtime and hardware release before the first DX11 game drops in.

    Don't expect it in 2009 - more like mid 2010, if DX11 is released at the end of 2009.

    Till then I would enjoy games with a GTX 285.
  • michaelheath - Thursday, December 18, 2008 - link

    I know the engineering process for a new card release starts a few years in advance, but what is Nvidia thinking by producing nothing but high-end cards? Sure, a dual 200-series GPU card is a prize pony you can trot out to say, "Yeah, we got the fastest, prettiest one out there." In this day and age, though, the 'halo effect' goes to the company who can produce a high-performance product without associating a high-performance cost-to-own or cost-to-run.

    Nvidia needs to fill the void left in the 200-series' wake. Price-conscious shoppers might go for the 9x00 cards, but the tech-savvy price-conscious buyers know well enough that the 9x00 cards are nothing but re-named or, at best, die-shrinks of 2 year old technology. ATI, on the other hand, has a full range of cards in the 4xx0-series that is new(er) technology, covers a fuller spectrum of price ($50-300 before spiking to $500 for the X2 cards, and $100-200 buys you very respectable performance), and the new generation consistently out-paces the last-gen products.

    Now if ATI's driver team would spend more time on QA and fix that shadow bug in Left 4 Dead, I'd dump my 8800 GT 512 in a heartbeat for a Radeon 4870 1GB. The only thing that would keep me with the green team is if they die-shrink the GTX 260, bump the clock speeds considerably, and put an MSRP of $200 on it.
  • SiliconDoc - Sunday, December 28, 2008 - link

    PS - Maybe you should have just been HONEST and come out with it:
    "My 8800 GT is hanging really tough, it's still plenty good enough to live with, and going with the same card company won't be a different experience, so the 4870 1GB looks real good, but it's $300 bucks and that's a lot of money when it isn't better than the 260.
    Why can't NV make a 4870 1GB killer FOR ME, for like $200, that makes it worth replacing my HIGH-VALUE 8800 GT that's STILL hanging tough?"
    _________________________________________________________________

    Yeah, good luck. With all the cards out there one could still say the market is "moving rather slowly" - since years-older tech (the 8800 series) still drives most monitors (pushing it at 1600x****) and plays games pretty well.

    You're sitting in a tough spot, the fact is your card has had a lot of lasting value.
  • SiliconDoc - Sunday, December 28, 2008 - link

    Oh wait a minute, I guess I misinterpreted. ATI took their single 4000-series chip and went wackado on it - and made a whole range of cards - and NVidia took their single G80-series core and went wackado on it - and made a whole range of cards - oh, but NV even went a step further and reworked the dies and came up with not just G80 but G82, G84, G90, G92...
    Umm, yeah, did ATI do that? YES?
    _________________________________________

    I REST MY CASE !
  • SiliconDoc - Sunday, December 28, 2008 - link

    So the cards - NV, all the way down to the 8400GS for 20 bucks at the egg... just don't exist according to you.
    I guess NV should take another corporate board/ceo wannabe's net advice and remake all their chips that already match the knockdown versions of the ATI latest great single chip that can't even match the NV one.
    For instance, the 4850 is compared to the 9800 GT, the 9800 GTX and the 9800 GTX+. Why can't ATI make a chip better than NVidia's several-generations-older chips? Maybe that's what you should ask yourself. It would be a lot more accurate than what you said.
    Let me know about these cards "filling the lower tiers".
    GeForce 9400 GT
    GeForce 9500 GT
    GeForce 9600 GT
    GeForce 9600 GSO
    GeForce 9800 GT
    GeForce 8400 GS
    GeForce 8500 GT
    GeForce 8600 GT
    GeForce 8800 GT

    ______________________________-

    Those don't exist, right ? The reason NV doesn't make a whole new chip lineup just to please you, you know, one that is "brand new" like the 6mo. or year old 4000 series now, is BECAUSE THEY ALREADY HAVE A WHOLE SET OF THEM.
    _________________________________

    Whatever, another deranged red bloviator on wackoids.
  • sandman74 - Thursday, December 18, 2008 - link


    There is hard performance data on this card at www.bit-tech.net

    Basically it performs like a 280 in SLI in most cases (or thereabouts) which is pretty good, and does indeed beat the 4870 X2.

    I doubt this card is for me though. Too expensive, too hot to run, too much power. I'm hoping the GTX 285 will be more appropriate for me.

  • SuperGee - Saturday, December 20, 2008 - link

    There are a dozen previews, and I see it does not come near 280 SLI.
    The specs of each GPU fall between the 260 and the 280,
    and the results are just like that.

    Sure, I've seen results where it evens out, but there the scaling is topped off by a CPU bottleneck - everything flattens out, with 3 or 5 cards bunched at the top of the pack.

    // I've seen the benches on that site.
    What I see, in situations where the CPU gives room for SLI scaling,
    is that the GTX 295 is very near GTX 260+ SLI,
    and GTX 280 SLI takes a larger lead over the GTX 295.
    It's more of a GTX 260, with full shaders.
    Could be that some games aren't shader-heavy but fill-rate bound.

    So if you pick out CPU-limited results, look only at 280 SLI and the 295, and ignore everything else, you get the wrong impression.
    Game settings that barely fit in 1GB make the 295 dive, where the 4870 X2 leads.

    So in short: wherever SLI can scale beyond 280 SLI -
    like tri GTX 285 :) -
    the GTX 295 behaves like a GTX 260++ X2 (240 shaders).

    Didn't you notice that wherever the GTX 295 equals GTX 280 SLI, it also equals the 4870 X2 and GTX 260+ SLI and 4870 1GB CF, with small differences?
    Within seconds that gives me the impression that such a game at such settings is CPU-bound with these cards.

  • SiliconDoc - Sunday, December 21, 2008 - link

    Well, if the 295 is CPU-bound, then it has longer legs to go higher.
    I find it funny that the 4870 X2 isn't referred to as CPU-bound...
    So that means it's all legged out.
    It's also absolutely clear this 295 is two GPUs that sit between the 260 and 280 - so the already-completed SLI tests will compare just like the already-completed 4870 CF tests (512MB and 1GB / X2 respectively).
    So the numbers are already out there. Have been for some time, one could say.
    The FPS numbers have said for a long time that the 260 is slightly better than the 4870 - and the 280 is definitely better.
    The 260 wins in more games, not in everything (I noticed the 4870 won Devil May Cry 4 every time - a notable exception to the other game stats), but the cut is clear for those not willing to tell lies. Period.
    Sure it's very close, and a bundle or free game, or just whim, or slickness of look, or whatever it is (type of chipset, wanting a single-card solution, power connectors, length of case, etc.) is plenty good enough to make the choice between the two.
    Now the 295 (essentially an X2 of the 260s, released with extra help from all 240 shaders instead of 192 or 216) is a bit more - and so the stats spread will jump a bit more.
    That's just the way it is going to be. That's the way it already is.
    I know, it's very dangerous saying so. My golly, the world might explode.
  • DigitalFreak - Thursday, December 18, 2008 - link

    I'm thinking $599.
  • Lord 666 - Thursday, December 18, 2008 - link

    Sheer size of the current 200's is what is holding me back from purchasing. While I have a P180 case, all bays are populated with drives for work.

    The current SLI'd 8800GTS 640's just squeeze in there.
  • tejas84 - Thursday, December 18, 2008 - link

    Why don't the AMD fanboys stop criticizing Derek Wilson when he speaks the truth? Crossfire is a load of garbage...

    AMD has the worst driver release schedule and game support, thus SLI with its TWIMTBP support is superior in every way... Oh, and if Nvidia sells the GTX 295 under the 4870 X2 MSRP, then what will the debt-ridden AMD do??

    Plus, in case the AMD fanboys don't know, your beloved AMD/ATI is circling the drain and I doubt they will survive the economic problems. Layoffs all over, consecutive losses for 2 years... this is not a company that is "pwning" Nvidia.


    The only company that has been run as ineptly as AMD is General Motors, and look at their fate... AMD for chapter 11 in 2009.
  • The0ne - Thursday, December 18, 2008 - link

    Nvidia hasn't been doing well themselves. And please don't make comparisons of how bad things are for each company. If the company is not doing well it's NOT doing well.

    Both companies have their share of driver problems. I won't vouch for either at this point in time. My take on all this newer hardware is that the software side (mainly drivers) isn't getting the time it requires to catch up. That's the biggest drawback from my perspective.
  • CadESin - Thursday, December 18, 2008 - link

    Why is it bad that AMD beat Nvidia? I still have friends who tell me, day after day, that the 4870 is going to kill the PC market because it's going to hurt Nvidia, and since AMD is going to die anyway, anything that hurts Nvidia is bad...

    I just can't understand why having AMD back in the game makes so many Nvidia fanboys so upset. Sure, it means the green team is not number 1 100% of the time anymore, but it also means you fanboys got your GTX 280s for a lot less than they would have cost you without the 4800 line of cards.

    I myself run two systems, an X48 with two 4870s and an X58 with two GTX 260s, and to be honest they both play about the same. I will note that I have fewer issues with ATI's drivers than I do with Nvidia's - not FPS-wise in game, but crashing to desktop and driver failures.

    I agree with Derek that SLI is a bit better than CF in SOME games, but I'd also like to point out that CF destroys SLI in some games as well - CoD4, GRID and DiRT to name a few.

    AMD saved us all from $500+ GTX 280s, and that's good news for everyone. But I do see this new card just turning into the next 9800GX2: running too hot, drawing too much power, and totally pwning Nvidia's next high-end card, just the way the 9800GX2 beat the GTX 280. Stop-gap products are great, soft-launched stop-gap products are less great, and soft-launched stop-gap products right before the biggest retail time of the year - that's just being desperate.

    Nvidia knows they are losing both market share and money. AMD proved to them that they will not always be on top, and that's good. Maybe in the end we can get another 8800 GTX out of this, and by that I mean something truly amazing... not the GTX 280, which is something truly not amazing...
  • MadMan007 - Thursday, December 18, 2008 - link

    "I just can't understand why having AMD back in the game makes so many Nvidia fan boys all upset."

    You answer your own question in the same sentence - 'fanboys.'
  • SiliconDoc - Saturday, December 20, 2008 - link

    It seems to me the problem with fanboyism is the LYING that goes along with it.
    I certainly have seen endless threads where reds jump in and start slamming away, and they ALL claim they have never had a single issue with an ATI card, and most of them have some never-seen-anywhere-else-by-me story about some Nvidia card problem.
    Well, that tells me what the problem is right there. I'm not sure what is causing it - I've seen reviews skewed by the same phenomenon - I suppose they are "mad" that NVidia got their money in the not-so-distant past, and now they have a "revenge" company to cling to. I guess that's it. Maybe it's just my-shiny-car-is-better-than-yours syndrome. Maybe if they realize (but can't admit in public) that they made the wrong choice, they have to try to convince everyone they "meet" that they made the right one. It certainly is a wild pile of lies from what I have seen - and imagine the immense nerd arrogance in the front-end enthusiast region that fuels it. Oh, woe unto thee who criticizes the enthusiast "expert of all areas". You might as well call his GF ugly or spit on his muscle car at the show; it would be the same thing as not agreeing his vidcard is the very best... (haha)
    Anyway, a good friend of mine with an amazing amount of tech knowledge for his young age, and a finely skilled overclocker, was treated to my purportedly objective opinion on this whole topic - we had some laughs with others in our techtalk TeamS. - but he had perhaps some lingering red favoritism, it seemed, IMO (unwarranted).
    So he ordered a video card recently (ATI 3650) and waited and waited for it from the great white north... finally it came... and the nightmare began... couldn't overclock - oh, finally got that unglitched - the thing wouldn't do ANY 3D game - I was biting my tongue so hard, but with the hassles he was exploding - lol -
    So anyway, driver changes, this and that utility....
    He left just earlier, after the several-days nightmare, as he discussed his 3-system upgrade for the lawyers' office - and after I suggested a certain board with the ATI 2100 integrated, his comment was "I don't want ATI on any of those boards".
    That's just the way it goes, or rather that's how it goes - all too often. I know, I know, this is the only time a 3650 ever had any kind of issue, and the 8600GT is way, way, way more prone to like totally exploding... it might destroy your case innards... it's probably radioactive and baby seals are killed to produce it.
  • SkullOne - Thursday, December 18, 2008 - link

    Spoken like a true nVidia fanboy. Can you say Hello Kettle, I'm Pot?

    Nobody should be wishing for AMD to go under because there's nobody to pick up the slack. So unless you like $600 video cards and high priced CPU's go ahead and keep that mentality.
  • derek85 - Thursday, December 18, 2008 - link

    I highly doubt NVidia can get into a price war with AMD given their huge die size and poorer yields.
  • SiliconDoc - Sunday, December 21, 2008 - link

    Well how about we all be bright about it, and hope both companies survive and prosper ?
    Golly, that would actually be the SANE THING to do.
    No, no there aren't many sane people around.
    I think I'll start a fanboy wars company, and we'll make two teams, and have people sign up for like 5 bucks each, and then they can have their fan wars, cheering for their corporate master while they pray for the demise of the disfavored one and argue over it.
    Maybe we can get some kind of additional betting company going, like how they bet on who's going to get the political nominations and get elected... and make it a deranged fanboy stock market.
    We can even get carbon credit points for the low power user arguable leader of the quarter, and award a carbon credit bundle when one company is destroyed (therefore adding to saving the green earth).
    I mean why not ? There's so many loons who want one company or the other to expire, or so their raging argument goes...
    I'd like to ask a simple question though.
    If your fanboyism company can "gut their price structure" in order to destroy the other company, as so many fanboys claim, doesn't it follow that you've been getting RIPPED OFF by your favorite corporate profiteer? Doesn't it also follow that the other guy has been getting much more for his or her money?
    LOL
    The answers are YES, and YES.
    I bet the corporate pigs love this stuff... "my fan rage company can destroy yours by slashing prices because they have a huge profit margin to work with, while your company has been gouging everyone !" the fanboy shrieked.
    Uhhhh... if your company, the very best one, has all the leeway in lowering prices, why then they have been the one raping wallets, not the other guy's company.
    So while you're screaming like a 2-year-old about how much more moola your bestest company has from every sale, turn around and keep chasing your own tail, because your wallet has been unfairly emptied - by the very company you claim could wipe the other out with the huge margin they can lower prices with.
    Yeah, you're claiming you got robbed blind, and the sad thing is, if there is so much room, why then they COULD HAVE lowered prices a long time ago and wiped out the competition.
    But they didn't; they took that extra money from your fanboy wallet, and you thanked them, then you screamed at the other guy how great it is - and made the empty threat of destruction... and called the enemy the gouger... (when your argument says your company is the one gouging, with all its extra profit margin...)
    Oh well.
    So much for being sane.
    __________________________________________________

    Sanity is : hoping both companies survive and prosper.
  • Razorbladehaze - Sunday, December 21, 2008 - link

    This comment is senseless drivel. It is so ambiguous that it doesn't say anything useful. What can be discerned is a very odd hypothetical situation that turns into venting about something.

    This comment should be stricken from the record.
  • SiliconDoc - Saturday, December 27, 2008 - link

    Ahh, so you bought the ATI, and now realize that you got "robbed" and they made a big profit screwing you.
    Sorry about that.
    Just stay in denial, it's easier on you that way.
  • chizow - Thursday, December 18, 2008 - link

    They've done a pretty good job of leading price cuts despite the Holiday Inn economics logic employed around various forums. After the initial price cuts in June, GTX 260 has consistently been priced lower than the 512MB 4870 and the GTX 260 c216 has been consistently priced lower than the 1GB 4870. Now that everyone is on a more expensive 55nm process, I'd say any price differences are easily negated by AMD's use of much more expensive GDDR5.
  • SuperGee - Saturday, December 20, 2008 - link

    It's about profits.
    The smaller the die size, the more chips out of a wafer. The more chips on a wafer, the better the binning and yield possibilities, and the less wafer waste - a wafer is round, chips are rectangular (rough numbers below).

    With 55nm equal for both, nV still has the bigger chip: about 1.4 billion transistors, while RV770 is about 950 million. That's roughly a 1.5x difference.

    RV770 is still cheaper to produce. GDDR5 is a tad expensive, but that's not AMD's cost - it's the board makers' (IHVs') problem.
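
    A rough sketch of the die-size point, using the common gross-dies-per-wafer approximation and ballpark published die areas (55nm GT200b around 470 mm², RV770 around 256 mm² - treat both as approximate). Defects and binning are ignored here:

        import math

        WAFER_DIAMETER_MM = 300.0

        def gross_dies(die_area_mm2, wafer_d=WAFER_DIAMETER_MM):
            """Approximation: wafer area / die area, minus a correction for edge loss
            (round wafer, rectangular dies)."""
            r = wafer_d / 2.0
            return int(math.pi * r * r / die_area_mm2
                       - math.pi * wafer_d / math.sqrt(2.0 * die_area_mm2))

        gt200b = gross_dies(470.0)   # ~55nm GT200, ballpark die area
        rv770  = gross_dies(256.0)   # RV770, ballpark die area

        print(f"GT200b-sized dies per 300 mm wafer: ~{gt200b}")
        print(f"RV770-sized dies per 300 mm wafer:  ~{rv770}")
        print(f"ratio: ~{rv770 / gt200b:.1f}x more candidate dies for the smaller chip")

    Even before yield differences, the smaller die gets roughly twice as many candidates per wafer on these assumptions, which is what "cheaper to produce" mostly comes down to.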
  • SkullOne - Thursday, December 18, 2008 - link

    HardOCP has a brief preview of the GTX 295, and I don't find it terribly exciting. It's nothing but a way to try to get a few more people to hold off buying cards until after the holidays, and to market Quad-SLI a little bit more.

    I do agree with some of the previous comments about how AMD doesn't allow users to create their own AFR profiles. There is a rumor that it is in the works so as a HD4870 Crossfire user I hope it is true and AMD has been very good at listening to their users. The manual fan control is a good example. Then again Crossfire works on all the games I currently play so I don't really have an issue.
  • magreen - Thursday, December 18, 2008 - link

    Do we get nice mesquite bbq flavored char-grilled video?
  • SiliconDoc - Saturday, December 20, 2008 - link

    Ok, now you've got my record re-keyed. What we might get is 2 videos side by side: the video on the left has a barbeque, but the coals aren't crackling and popping, the sauce isn't splashing about, and the flame lickage is missing. That's no PhysX - SUPPOSEDLY.
    On the right side of the screen is the "wonderful PhysX display", with flames licking about, sauce dripping and coals crackling off PhysX sparks!
    Yes, some things (planned deception), in fact, are "not good for end users".
  • strikeback03 - Thursday, December 18, 2008 - link

    The image at the bottom of the page? I'd guess it is probably a reflection in the back plate, maybe the lens was close to the product when that was shot.

    Though a video card that could double as a Foreman grill is possible with a 289W TDP.
  • Goty - Thursday, December 18, 2008 - link

    Sure, SLI scales across a few more games than crossfire does, but the scaling is nearly identical, so I'd probably stay away from comments about the level of scaling.

    If you look here (http://www.bit-tech.net/hardware/2008/12/17/xmas-2...), bit-tech just ran a test over eight of the newest popular games out there, and the HD4870 gained an average of 67.5% when adding a second card while the GTX260 gained 70% (and the 280 gained less than either). I'd call that pretty even.
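
    For reference, "scaling" here is just the per-game gain from adding a second card, averaged across the test suite. A minimal sketch of that calculation - the FPS numbers below are invented placeholders, not bit-tech's data:

        # Hypothetical single-card vs dual-card FPS for a few games (made-up numbers).
        results = {
            "Game A": (50.0, 88.0),
            "Game B": (40.0, 66.0),
            "Game C": (62.0, 103.0),
        }

        gains = [(dual / single - 1.0) * 100.0 for single, dual in results.values()]
        for (name, (single, dual)), gain in zip(results.items(), gains):
            print(f"{name}: {single} -> {dual} fps = +{gain:.1f}%")

        print(f"average scaling: +{sum(gains) / len(gains):.1f}%")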
  • DerekWilson - Thursday, December 18, 2008 - link

    that's not the real issue.

    i've said before that though sli seems to scale with more games, crossfire seems to scale better when it does work. we've seen this in past articles.

    yes part of the issue is that AMD's highest end part doesn't perform like a highest end part should in some games. but ...

    the problem is when catalyst drivers break crossfire support for existing titles (like they did for bioshock and a couple others in the past 8 months or so) and when it takes 2+ months to get proper working crossfire driver support on a new platform or on a new game (like core i7 or far cry2 ... the combination about made me want to kill myself).

    crossfire is not as desirable a solution because, while performance is great when it works, it really doesn't "just work" as often as it should. and in those cases where either crossfire or sli don't work out of the box, nvidia gives you a way to force sli while amd doesn't offer any sane way to force a crossfire mode.

    i have no problem recommending sli over crossfire until AMD fixes the way catalyst is developed and deployed.

    i hate to sound like a broken record, but after years of this nonsense, it is just time for things to change.
  • TantrumusMaximus - Thursday, December 18, 2008 - link

    Agreed on the comments regarding paper vs. hard launch; however, nVidia can usually back up their performance claims. I'm sure this will trounce a 4870 X2 when released, as from what I've seen SLI scales better across more games than ATI CrossFire.

    Can't wait to see some benchmarks and more details. I am building a new system in early Feb. and was tentatively doing SLI GTX260 216 boards.
  • Xietsu - Thursday, December 18, 2008 - link

    "That said, how pathetic is that NVIDIA had to go and push this press info out there 3 weeks before availability just to try and slow AMD's momentum during the holiday season."

    Well, it's doubtful that such a niche piece of information will affect sales that much, but I'd replace pathetic with prudent (and then add "it" after "is")! Hehe.
  • Demne - Friday, December 26, 2008 - link

    Xietsu is obviously an ATI fanboy. Who cares if Nvidia pushes their latest graphics card to stall the momentum of ATI cards? It is called business, and that is what drives sales toward each platform and creates interest in a product.

    AMD/ATI are not the kings of the block they once were, and if any Nvidia card will beat out an ATI card, I want to hear about it and how it will perform against its competition. Intel has the smoking processors, and if Nvidia has the king of the cards and wants to announce it to halt sales of ATI... good for them.
  • DerekWilson - Thursday, December 18, 2008 - link

    while it may not affect volume sales, it absolutely will impact those looking to buy at the high end ... read: it will impact those looking to spend the most money on the part that provides the highest profit margin.

    i'm a big believer in looking at the landscape right now and making decisions based on current reality. we can't know what the future will bring. sometimes this means i'll err on the cautious side and miss out on a deal. and sometimes people willing to sacrifice time for the possibility of a deal will be rewarded.

    it could be that those who wait for the GTX 295 are making a good decision. it could be that the only thing they end up doing is wasting a month they could have spent enjoying their 4870 X2 or something. we won't really know until January.

    it certainly is prudent from a marketing perspective. but we aren't interested in marketing, we are interested in the consumer and our readers. hard launches are good for those buying the cards because they know they can get their hands on what is being talked about. this kind of thing is good for NVIDIA because it stirs up interest and anticipation and people will have expectations.

    but as another commenter pointed out, it is entirely possible that nvidia, like they have done in the past, could decide to not actually release any product called a GTX 295 ever or anything remotely similar to this part we are reporting on. that is the major danger of something that is not a hard launch. it has happened before and it does negatively impact consumers, the competition, and ultimately the company that did it in the first place (though this impact takes longer to materialize).

    ati isn't innocent either ... they've done the same thing in the past. and we absolutely are not being biased here -- if amd were to do the same thing we would react the same way: grateful for the information but warning against the dangers of releasing this info to press before availability for the sake of our readers.


  • chizow - Thursday, December 18, 2008 - link

    Wow Derek, hypocrite much? I don't remember reading similar protest when AMD paper-launched the 4870X2. The difference is you actually had hard numbers accompanying text in that print space! Did Nvidia not send you an ES this time around? Didn't feel as special being one of only 10 in the world like with the 4870X2 paper launch so you felt it wasn't worth putting up preview numbers? How quickly we forget.....
  • SiliconDoc - Saturday, December 20, 2008 - link

    Chizow, wouldn't it be great if someone held off on Santa splurging because they heard about the 3-week friendly heads-up announcement? Guess what else?
    Wouldn't it be great if it drove 4870 X2 prices down for that Tiny Tim who wanted one for Xmas but just couldn't squeeze out the last few bucks - but now maybe can?
    How about the rest of the merry who, thank their lucky stars, now don't run out and buy two of card X, because they know they have a single-slot solution coming (wouldn't that include a lot of SLI board owners? I guess it's just TERRIBLE for them *dripping sarcasm*)?
    Yes, well thank you for posting your points.
    For a minute there I thought the whole idea stated in the article of being "for our readers and the end user enthusiasts" and therefore against this early HEADS UP was just some sort of corporate doublespeak...( how the reviewer would be paid or compensated, I do not know...)
    Then I almost dropped dead when I read that this 295 "might just not even come out" (OMG - talk about freakish wishful thinking)...
    So thanks to those who posted links to others WHO GOT THE CARD and already posted actual results.
    So let me see, I'm supposed to believe that Nvidia hasn't already been running a STREAM of these cards at their contracted facilities? Supposedly Jan 8th is the date they finally "mass produce"?
    NO, SORRY! It's so INSANE at this point to even suggest this card won't hit the stores for end users - I mean, WTH is going on...
    Ok, well, if it's a few words this way or that - trying to recall past occurrences and add depth to the storyline, and a phrase here or there gets a bit out of hand... well ok... it's just that it is absolutely clear it is more than just that. Maybe the order from on high, or from pressure, was to scramble the other direction after the ATI driver demands for special end-user configs and the demand for more quality releases instead of monthly "somewhat brokens".
    ___________________________

    Ok, so whatever the pressures, let me just finish by saying this early announcement is WONDERFUL for end users, for readers, for ALL OF US.
    It can help drive down the price for 4870 X2 "want-to-purchasers" - and is a FAIR and REASONABLE thing for NVidia to do, considering the season and the certain hit they'd take if they did not.
    This helps ALL OF US.
    PS - I've already seen the red fanboys screaming that the 295 reviews are not correct and that at some point soon someone will post one with reasonable numbers that don't stomp the 4870 X2 so badly - in other words, the review numbers must be lies - which of course absolutely indicates that NVidia would have to be LOONS not to have this card out on the market soon.
    It also means it was VERY NICE of them to point out what they *didn't have ready in time for Xmas!! shame shame!! (valid criticism)*, so that people - we people wanting to upgrade, fan readers of AnandTech and the like - can make a wise choice holding off a bit.
    From the actual hands-on reviews I've seen, thanks to helpful fellow readers here and elsewhere, it is well worth the 3-week wait if one was thinking of a 4870 X2. NOT being a fanatic of either company makes that easily understandable, and I reckon it gets harder if other conditions exist.
  • DerekWilson - Thursday, December 18, 2008 - link

    You have a terrific point about the fact that we didn't do the same thing with R700. I actually didn't think of it that way, but I definitely see the merit of your assertion and I apologize.

    I can tell you it's not because AMD sent us hardware (NVIDIA sends us hardware too -- that's not an issue at all), and it's not because we are biased for or against one or the other. I just honestly thought these were different circumstances.

    I don't mean to sound like I'm making excuses -- I'm not: I accept responsibility, and I'm sorry.

    To clarify, R700 was a package deal with RV770. We learned about both at the same time and shared information on the direction AMD was going with our first 3 articles that talked about the RV770/R700 architecture and performance. Neglecting a single monolithic high end chip and going with single-card multi-GPU as the sole ultra high end solution was a corporate direction more than a promise of a product. This wasn't some unexpected part, and we were very focused on whether this would be a good direction for the company even more so than would this be a good product.

    Had AMD slipped on R700 and not released it or released it at poor performance, it would have been much bigger and different than missing a part in the lineup -- it would have been a disaster of epic proportions leaving no competition with NVIDIA's high end solutions and proving that AMD's future direction was fundamentally flawed. The success or failure of AMD's RV770 gamble and possibly even the future direction of GPU architecture at AMD rested on whether the strategy paid off top to bottom.

    The GeForce GTX 295 isn't something we knew about or expected before this month, and it honestly doesn't seem vital to NVIDIA's strategy. It is well timed in marketing terms, but sometimes that's just coincidence. The point is though, that you don't know what you are getting until you are able to buy it (and I did, at least, mention that at the end of the R700 preview). Hard launches are better for consumers.

    Honestly, I still feel that these circumstances are different, but if I had it to do over again, I would have emphasized the point that the R700 preview was bigger than a product, that the preview was much more about corporate direction and design philosophy, and that we want to see companies stick to hard launches. I do apologize for not coming through on that in the R700 preview.
  • chizow - Thursday, December 18, 2008 - link

    Well, I do agree that the 295 preview is clearly a marketing ploy to sway holiday buyers to wait until after New Year's. And I also agree that the GTX 295 isn't integral to NV's strategy; it's just a band-aid with only one real purpose: to take back the single-card crown. I don't think you can really fault Nvidia for that, though - they're clearly very good at more than just designing GPUs.

    I also don't fault you for not condemning the 4870X2 soft launch as you were one of the few select sites to receive one. Obviously there's pressure in your business just as any other to not fall behind the competition. I just figured AT had changed their stance due to industry pressures, which is why I was surprised by some of the comments here.

  • SiliconDoc - Sunday, December 28, 2008 - link

    So for the endless thousands of people without 2 pci-e x16 slots on their motherboards, this was just a war for top crown with ATI ?
    I guess choice is a big fat zero in our new socialist economy.
    I suppose everyone here has 2 or 3 pci-e 16x slots on their motherboards.
    Yes, what a pig of a thing for NVidia to do, actually making a card that will offer THE MAJORITY OF CONSUMERS WHO BOUGHT MOTHERBOARDS - those without 2 PCI-E x16 slots - the best possible framerates.
    What terrible crown freaks they are.
  • jordanclock - Thursday, December 18, 2008 - link

    No, it is pathetic. Both companies have been doing just fine with hard launches in the past few years (I don't know if you recall the days of soft launches a month in advance of retail availability, or when an announced product never showed up at all) and for either company to take a step back with a paper launch is just stupid.

    If AMD did the same thing, I'm sure they would have been called stupid too. Rightfully so, I might add.

    My problem with this is that nVidia has yet to release a mid-range product in how long? The only way to get a mid-range card from nVidia is to buy the upper end of a previous generation. I would really like to see a release of sub-$200 MSRP 200-series card. That would be news worthy. Not another $500 card that even nVidia doesn't expect to sell all that well.
  • SiliconDoc - Friday, December 19, 2008 - link

    I'm just thrilled by the 3-week-early announcement. I've been attacked relentlessly by red fanboys for merely telling the truth and then providing the link.
    So the madder they are about an early announcement, the happier I am. But that's not the only reason. Despite the red fans screaming unfair, and crying that their base is unfairly going down because of "marginalized" Xmas purchases... LOL... I'd be one ticked-off hombre if NVidia kept their claptrap shut and on Jan. 8th announced their new 4870 X2 killer - and of course, if the whiners were honest about anything at all, they'd say so as well. That ONE consideration outweighs ANYTHING ELSE THEY WHINE ABOUT, PERIOD.
    So, I have to say there are so many I consider complete raving loons, because they are so far off the mark they don't even have their own personal pocketbook in mind - which of course is akin to self-immolation. Yes, they are fired up. Burning, burning down the house. They just forgot to step outside first and remember that, my golly, that is their own house they might be upgrading.
    So, I certainly HOPE that ATI releases a gigantic secret cheap upgrade card on Jan. 8th, after all the red fanboys splurged their Xmas cookie monies on something that DIDN'T have an "early announced release that is 'harmful' to the end user".
    YES, WHAT A CROCK.
  • Mr Perfect - Thursday, December 18, 2008 - link

    Yes, the lack of new cards below the 260 is disappointing. They have already renamed 8000-series parts to make the 9000-series parts, and there were reports of the 9000-series parts being renamed as GT 100 parts. Hopefully that was just a rumor.

    You know there is at least one poor sap out there who's going to replace his 8800GT with a 9800GT, and then upgrade again to a GTS 150. The last time you could make a whole chain of "upgrades" and get essentially the same thing was what? The GeForce 2MX/4MX/4000 string?
  • RagingDragon - Sunday, December 21, 2008 - link

    How about Geforce 7600 -> 8600 -> 9500?
  • mczak - Thursday, December 18, 2008 - link

    A GeForce 4 MX is in no way a renamed 2 MX. While it's true that it doesn't have the feature set of the "real" GeForce 4, it is indeed a very different chip from the 2 MX (with faster performance).
  • StevoLincolnite - Thursday, December 18, 2008 - link

    The GeForce 4 MX was basically a GeForce 2 on steroids, with its enhanced memory controller and advanced (for back then) bandwidth-saving technologies.

    It gained dual-monitor support and a multi-sampling anti-aliasing unit from the Ti series, plus the improved 128-bit DDR memory controller, which was crucial to solving the bandwidth limitations that appeared on the GeForce 2 chips.

    It allowed gamers to play games on the GeForce 4 MX that were originally almost unplayable on a GeForce 2.

    It was also eventually released with PCI-E support and a wider memory bus; the funny thing was how the card outperformed the GeForce FX 5200 despite having an inferior feature set.
  • larson0699 - Friday, December 19, 2008 - link

    P.I. and way overborrowed from http://en.wikipedia.org/wiki/GeForce_4_Series

    In architecture, the 4MX was an original design, not a rebrand as in 8800/9800. For need of a reasonable depiction of its features, however, it *is* most closely related to the 2Ti and performs almost proportionally better (which would lead some to think of it as a derivative).

    The refresh in 2004 put this GPU on a PCI-E card with a bridge chip in between, which I wouldn't call "PCI-E support". It's a damn smart move economically when you're trying to sell the rest of your chips during a transition to a new platform, but I know not a single being who ever splurged on such a mediocre design (at least not the PCX series -- I've got a fried 6600 AGP in my hands from someone who didn't know the wiser).

    And I don't know where you're getting your data, but the FX5200 dusted the MX440 by about 2:1 in most of everything. I had a Radeon 8500 once and was playing on a LAN alongside a buddy with a Radeon 7500, and his card outperformed mine because it wasn't rendering everything mine was. But when I disabled the difference in effects, my card was well ahead. When I hear of an inferior new GPU, I think GMA 3100 < GMA 950, not this.

    Besides, GP's point wasn't about the GF4 but rather the twisted path of "upgrades" we're seeing more and more.
