80 Comments
abs0lut3 - Tuesday, October 13, 2009 - link
When is the GT 240 coming out, and when are you going to review it? I had expected the GT 220 to be as low as it comes (reaaallly low end); however, I saw some preliminary reviews on other forums of the GT 240, the supposedly new Nvidia 40nm mainstream card with GDDR5, and was quite fascinated with the results.
MegaSteve - Tuesday, October 20, 2009 - link
No one is going to buy one of these cards by choice - they are going to be thrown into HP, Dell and Acer PCs under a pretty sticker saying they have POWERFUL GRAPHICS or some other garbage. Much the same as them providing 6600 graphics cards instead of 6600GTs; then again, I would probably rather have a 6600GT, because if the first DirectX 10 cards released were any indication, this thing will suck. I am sure this thing will play Blu-ray...
Deanjo - Tuesday, October 13, 2009 - link
"NVIDIA has yet to enable MPEG-4 ASP acceleration in their drivers"Not true, they have not enabled it in their Windows drivers.
They are enabled in the linux drivers for a little while now.
ftp://download.nvidia.com/XFree86/Linux-x86_64/190...
VDP_DECODER_PROFILE_MPEG4_PART2_SP, VDP_DECODER_PROFILE_MPEG4_PART2_ASP, VDP_DECODER_PROFILE_DIVX4_QMOBILE, VDP_DECODER_PROFILE_DIVX4_MOBILE, VDP_DECODER_PROFILE_DIVX4_HOME_THEATER, VDP_DECODER_PROFILE_DIVX4_HD_1080P, VDP_DECODER_PROFILE_DIVX5_QMOBILE, VDP_DECODER_PROFILE_DIVX5_MOBILE, VDP_DECODER_PROFILE_DIVX5_HOME_THEATER, VDP_DECODER_PROFILE_DIVX5_HD_1080P
* Complete acceleration.
* Minimum width or height: 3 macroblocks (48 pixels).
* Maximum width or height: 128 macroblocks (2048 pixels).
* Maximum macroblocks: 8192
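[Editor's note: for readers who want to check this on their own card, here is a rough sketch of querying MPEG-4 ASP decode support through VDPAU, the API the profile list above comes from. The entry-point names and the query signature are written from memory against vdpau.h/vdpau_x11.h and should be checked against the installed headers; the build command is an assumption. The vdpauinfo utility reports the same information.]

```c
/* Sketch: ask the VDPAU driver whether MPEG-4 ASP decode is accelerated.
 * Assumed build: gcc check_mpeg4.c -lvdpau -lX11 */
#include <stdio.h>
#include <stdint.h>
#include <X11/Xlib.h>
#include <vdpau/vdpau.h>
#include <vdpau/vdpau_x11.h>

int main(void) {
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "no X display\n"); return 1; }

    VdpDevice device;
    VdpGetProcAddress *get_proc = NULL;
    if (vdp_device_create_x11(dpy, DefaultScreen(dpy), &device, &get_proc) != VDP_STATUS_OK) {
        fprintf(stderr, "could not create VDPAU device (driver too old?)\n");
        return 1;
    }

    /* Look up the decoder capability query entry point. */
    VdpDecoderQueryCapabilities *query = NULL;
    get_proc(device, VDP_FUNC_ID_DECODER_QUERY_CAPABILITIES, (void **)&query);

    VdpBool supported = 0;
    uint32_t max_level = 0, max_mb = 0, max_w = 0, max_h = 0;
    if (query(device, VDP_DECODER_PROFILE_MPEG4_PART2_ASP,
              &supported, &max_level, &max_mb, &max_w, &max_h) == VDP_STATUS_OK) {
        printf("MPEG-4 ASP decode: %s (max %ux%u, %u macroblocks)\n",
               supported ? "accelerated" : "not supported", max_w, max_h, max_mb);
    }
    return 0;
}
```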
Deanjo - Tuesday, October 13, 2009 - link
I should also mention that XBMC already supports this in Linux.
Transisto - Tuesday, October 13, 2009 - link
zzzzzzzzzzzzzzzzz...............
Souleet - Monday, October 12, 2009 - link
I guess the only place that's actually selling Palit right now is Newegg: http://www.newegg.com/Product/ProductList.aspx?Sub...
MODEL3 - Monday, October 12, 2009 - link
Great prices, lol (either they have old 55nm stock, or the 40nm yields are bad, or they are crazy - possibly the first).
Some minor corrections:
G 210 ROPs should be 4 not 8 (8 should be the Texture units, GT220 should have 8 ROPs and 16 Texture units)
http://www.tomshardware.co.uk/geforce-gt-220,revie...
(Not because Tom's Hardware is saying so, but because otherwise it wouldn't make sense for NV architects to design such a bandwidth-limited GPU - and based on past architecture design logic.)
G 210 standard config GPU core clock is 589MHz, shaders 1402MHz.
(check Nvidia's partner sites)
9600GSO (G94) Memory Bus Width is 256bit not 128bit.
http://www.nvidia.com/object/product_geforce_9600_...
58W should be the figure NV is giving when the GT 220 is paired with GDDR3; with DDR3 the power consumption should be a lot less.
Example for GDDR3 vs DDR3 power consumption:
http://www.techpowerup.com/reviews/Palit/GeForce_G...
http://www.techpowerup.com/reviews/Zotac/GeForce_G...
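[Editor's note: the bus-width and memory-type corrections above matter mostly because of peak memory bandwidth, so here is the usual back-of-envelope arithmetic as a small sketch. The clock figures are made-up examples for illustration, not specs quoted in the article or the comment.]

```c
/* Back-of-envelope peak memory bandwidth: effective transfer rate times bus width.
 * The clock numbers below are illustrative examples only, not official specs. */
#include <stdio.h>

static double peak_gbps(double mem_clock_mhz, int transfers_per_clock, int bus_bits) {
    /* MT/s * bytes per transfer -> MB/s -> GB/s */
    return mem_clock_mhz * transfers_per_clock * (bus_bits / 8.0) / 1000.0;
}

int main(void) {
    /* DDR-type memories move data twice per clock; GDDR5 is usually quoted
     * at four transfers per base clock. */
    printf("900 MHz GDDR3, 128-bit: %.1f GB/s\n", peak_gbps(900.0, 2, 128));
    printf("900 MHz GDDR3, 256-bit: %.1f GB/s\n", peak_gbps(900.0, 2, 256));
    printf("500 MHz DDR2,  128-bit: %.1f GB/s\n", peak_gbps(500.0, 2, 128));
    return 0;
}
```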
Souleet - Monday, October 12, 2009 - link
I'm sure there is a cooling solution, but it will probably hurt your wallet. I love ATI, but they need to fire their marketing team and hire some more creative people. Nvidia needs to stop underestimating ATI and crush them; right now they are just giving ATI a chance to steal some market share back.
Zool - Monday, October 12, 2009 - link
It's 40nm and has only 48 SPs, 8 ROPs/16 TMUs, and still only a 1360MHz shader clock. Is TSMC's 40nm this bad or what? The 55nm 128SP GTS 250 has 1800MHz shaders.
Could you please try out some overclocking?
Ryan Smith - Tuesday, October 13, 2009 - link
We've seen vendor overclocked cards as high as 720MHz core, 1566MHz shader, so the manufacturing process isn't the problem. There are specific power and thermal limits NVIDIA wanted to hit, which is why it's clocked where it is.
Joe90 - Monday, October 12, 2009 - link
Last week I bought a Medion P6620 laptop. It's fitted with an Nvidia 220M graphics card with 512MB of GDDR3 RAM. The blurb on the box said it was a DirectX 10.1 card. Have NV released any details about the laptop 220M as well as the desktop 220? All I could find out from Wikipedia & Notebookcheck.net was that the 220M was a 40nm version of the 9600M GT, but the Anandtech review suggests that things are more complicated than this.
My only contribution to the debate is that I picked up this laptop for a paltry £399. Given that my machine has a T6500 and a 320GB HD, all I can think is that NV must be giving away the 220M to OEMs for pennies!
Finally, has anyone seen any sign of Windows 7 drivers for the 220M? I got a freebie upgrade, but I'm a bit scared of using it in case there are no drivers for a graphics card that NV doesn't recognise on their own website.
JarredWalton - Monday, October 12, 2009 - link
I'm assuming these won't work? (http://www.nvidia.com/object/notebook_winvista_win...) I know they don't list 220M, but they're the latest official drivers from NVIDIA; otherwise you're stuck with whatever the notebook manufacturer has delivered. I'd guess NVIDIA will have updated mobile notebook reference drivers relatively soon, though, to coincide with the official Win7 launch.
Joe90 - Tuesday, October 13, 2009 - link
Many thanks for the link. I think you're right. Although this NV driver page refers to the '200 series', it doesn't explicitly mention the 220M card. Until it does, I'll hang back from doing the Windows 7 upgrade.
As it happens, I've only just made the leap from XP to Vista. Despite all the bad stuff I've read, Vista seems OK, so I might just stick with it for a while. I'm finding I can play games OK with the Vista/220M combination. I'm currently playing FEAR 2 at 1366 x 768 with all the detail settings on maximum and the game plays fine. Given the low price of the laptop, I'm more than happy with the graphics performance.
JarredWalton - Tuesday, October 13, 2009 - link
I've never had a true hate for Vista, but there are areas I dislike (several clicks just to get to resolution adjustment, for example). Windows 7 is better in pretty much every way as far as I can tell, although it's not like Vista is horrible and Win7 is awesome. It's more like Vista reached a state of being "good enough" and 7 addresses a few remaining flaws.
Now, if someone at MS would fix the glitch where my laptop power settings keep resetting.... (Every few tests, my battery saving options will reset to "default" and turn screen saver, system sleep, etc. on after I explicitly disabled it for testing. Annoying!)
plonk420 - Monday, October 12, 2009 - link
It's AMAZING for this gen's "lowest end" (at retail)... beats the pants off the 9400GT currently in my HTPC, and probably even equals or betters my once nearly-high-end, now two-generations-old HD3850.
This is going in my HTPC in a heartbeat when it hits $30-35 (unless power consumption is stupidly high, which I don't think it is; I THINK I read about this somewhere else, and someone was moaning about it using 4 watts more than some ATI part it was being compared to).
ltcommanderdata - Monday, October 12, 2009 - link
Given the more competitive desktop landscape, these new 40nm DX10.1 chips are not impressive at all. However, their mobile derivatives were announced months ago, and in the mobile space where mainstream GPUs seem to be made up of many, many combinations of nVidia's 32SP GPUs, a 40nm 48SP DX10.1 GPU would actually bring something to the table. It'd be great if we can get a review of a notebook with the GT 240M for instance.
JarredWalton - Monday, October 12, 2009 - link
It looks like there are laptops with GT 240M starting at around $1100, and the jump from there to a 9800M card (96SPs, 256-bit RAM) is very small. Unless you can get GT 240M laptops for around $800, I don't see them being a big deal. Other factors could change my mind, though - battery life perhaps, or size/form factor considerations.
apple3feet - Monday, October 12, 2009 - link
For some people, the most important detail of any new NVIDIA card is the CUDA Compute Capability. Will it run my scientific simulation code? Only if the CUDA Compute Capability is 1.3 - so that it supports double precision floating point arithmetic. Could we have just one line somewhere to tell us this vital info?
jasperjones - Monday, October 12, 2009 - link
LOL - for dgemm (double-precision matrix multiplication), the GTX 285 is only about twice as fast as a Core i7 920 (using CUBLAS and Intel MKL, respectively).
This suggests you shouldn't run your double precision code on a GT 220. Hell, even running CUDA code on your CPU using emulation might be faster than running it on the GT 220.
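[Editor's note: for anyone who wants to reproduce a comparison like the one above, here is a minimal sketch of a cuBLAS DGEMM call. It uses the later handle-based cuBLAS API rather than the 2009-era legacy interface, omits timing and error checking, and the matrix size is an arbitrary example, so treat it as illustrative only.]

```c
/* Minimal DGEMM sketch: C = alpha*A*B + beta*C on the GPU via cuBLAS. */
#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>
#include <cublas_v2.h>

int main(void) {
    const int N = 2048;                         /* example size, not from the article */
    const size_t bytes = (size_t)N * N * sizeof(double);
    double *hA = malloc(bytes), *hB = malloc(bytes), *hC = malloc(bytes);
    for (int i = 0; i < N * N; ++i) { hA[i] = 1.0; hB[i] = 2.0; hC[i] = 0.0; }

    double *dA, *dB, *dC;
    cudaMalloc((void **)&dA, bytes);
    cudaMalloc((void **)&dB, bytes);
    cudaMalloc((void **)&dC, bytes);
    cudaMemcpy(dA, hA, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dB, hB, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dC, hC, bytes, cudaMemcpyHostToDevice);

    cublasHandle_t handle;
    cublasCreate(&handle);
    const double alpha = 1.0, beta = 0.0;
    /* Column-major N x N multiply, no transposition. */
    cublasDgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N, N, N, N,
                &alpha, dA, N, dB, N, &beta, dC, N);
    cudaDeviceSynchronize();

    cudaMemcpy(hC, dC, bytes, cudaMemcpyDeviceToHost);
    printf("C[0] = %f (expect %f)\n", hC[0], 2.0 * N);

    cublasDestroy(handle);
    cudaFree(dA); cudaFree(dB); cudaFree(dC);
    free(hA); free(hB); free(hC);
    return 0;
}
```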
apple3feet - Wednesday, October 14, 2009 - link
You're wrong. Given the right problem, tackled using the right algorithm, and well-written code, TESLAs can do 20x the throughput of a Nehalem - even in double precision.
Ryan Smith - Monday, October 12, 2009 - link
I don't have that information at this moment. However, this is very much the wrong card if you're doing scientific work for performance reasons.
apple3feet - Wednesday, October 14, 2009 - link
Well, as a developer, I just need it to work. Other machines here have TESLAs and GTX280s, but a low-end, cool-running card would be very useful for development machines.
I believe that the answer to my question is that it's 1.2 (i.e. everything except double precision), so no good for me.
jma - Monday, October 12, 2009 - link
Ryan, if you run 'deviceQuery' from the CUDA SDK, it will tell you all there is to know.
Another goodie would be 'bandwidthTest' for those of us who can't figure out the differences between the various DDR and GDDR types and what the quoted clocks are supposed to imply...
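[Editor's note: for anyone without the SDK samples handy, here is a rough sketch of the same check using the CUDA runtime API. The 1.3 threshold for double precision matches the discussion above; the device loop and printed fields are just an illustrative example, compiled with nvcc.]

```c
/* Sketch: report CUDA compute capability, similar in spirit to deviceQuery. */
#include <stdio.h>
#include <cuda_runtime.h>

int main(void) {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        printf("No CUDA-capable device found\n");
        return 1;
    }
    for (int dev = 0; dev < count; ++dev) {
        struct cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, dev);
        printf("Device %d: %s, compute capability %d.%d, %d multiprocessors\n",
               dev, prop.name, prop.major, prop.minor, prop.multiProcessorCount);
        /* GPUs of this era need compute capability 1.3+ for double precision. */
        int has_fp64 = (prop.major > 1) || (prop.major == 1 && prop.minor >= 3);
        printf("  double precision support: %s\n", has_fp64 ? "yes" : "no");
    }
    return 0;
}
```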
vlado08 - Monday, October 12, 2009 - link
My HD 4670 idles at 165MHz core and 249.8MHz memory clock, with GPU temps of 36-45 degrees (passively cooled), as reported by GPU-Z 0.3.5. Is there a possibility that your card didn't lower its clocks during idle?
vlado08 - Monday, October 12, 2009 - link
My question is to Ryan, of course.
Ryan Smith - Monday, October 12, 2009 - link
Yes, it was idling correctly.
KaarlisK - Monday, October 12, 2009 - link
The HD4670 cards differ. I've bought three:
http://www.asus.com/product.aspx?P_ID=Z9qnCFnOUNDM...
http://www.gigabyte.com.tw/Products/VGA/Products_O...
http://www.asus.com/product.aspx?P_ID=g6LDXHUo0EzV...
The first one would idle at 0.9v.
The second would idle at 1.1v. When I edited the BIOS for lower idle voltages, I could not get it to be stable.
The third one turned out to have a much cheaper design - not only did it have slower memory, it had no voltage adjustments and idles at 1.25v (but the correct idle frequency).
vlado08 - Monday, October 12, 2009 - link
Is the GT 220 capable of DXVA decoding of H.264 at High Profile Level 5.1? And of videos with more than 5 reference frames? The ATI HD4670 can only do High Profile (HiP) Level 4.1 - Blu-ray compatible.
Also, what is the deinterlacing like on the GT 220 if you have a monitor and TV in extended mode? Is it vector adaptive deinterlacing?
These questions are important for this video card because it is obviously not for gamers but for HTPC use.
On the HD 4670, when you have one monitor you get vector adaptive deinterlacing, but if you have two monitors (or a monitor and a TV) in extended mode, you only get "bob" deinterlacing.
I'm not sure if this is a driver bug or a hardware limitation.
Ryan Smith - Monday, October 12, 2009 - link
I have another GT 220 card due this week or early next. Drop me an email; if you have something I can use to test it, I will gladly try it out. I have yet to encounter anything above 4.1 though; it seems largely academic.
MadMan007 - Monday, October 12, 2009 - link
So my inner nerd that just has to know is confused. Are these truly GT200-based or G9x-based? Different sources say different things. In a way the GT200 series was an improvement on G9x anyway, but with enough significant low-level changes to make it different. The article calls these GT200 *series*, but that could be in name only. It's not clear if that means a smaller-process, cut-down GT200-based die, or a G9x-based die with added features.
Inquiring nerds want to know!
Silverel - Monday, October 12, 2009 - link
It doesn't really matter though, does it?
nVidia has you confused, and thus their plan has succeeded. It's really the price/performance ratio that makes any difference. Don't bother yourself with details on the renaming schemes. It's a new shiny!
MadMan007 - Monday, October 12, 2009 - link
lol. "These aren't the details you're looking for" *waves hand* Yeah, I know it's just a nitty gritty detail and the performance is what matters. I'd still like to know though :)
Seramics - Monday, October 12, 2009 - link
Upon checking, it seems that there is indeed a 48 SP spec for the 9600 GSO, but its proper name is 9600GSO 512. So NV took the same exact thing (8800GS) and renamed it to another product (9600GSO) without improving anything. And now they quietly change the 9600GSO, cut the SPs in half, and don't even change the name? Why don't they release a 120 SP GTX 280? Or simply rename the 9800 GTX to GTX 280?
Lonyo - Monday, October 12, 2009 - link
Actually, they did take a 9800 and release it as a GTX280, after a fashion.
The mobile GTX280 is just an 8800/9800 card rebadged and with all its SPs enabled (128). The mobile 8800/9800 had only either 96 or 112 (I can't remember), so they made a 128 SP version and called it the GTX280-M.
Seramics - Monday, October 12, 2009 - link
Why is my favorite site, Anandtech, making such lousy, silly mistakes? Ryan Smith, where did your 9600 GSO come from? Its spec is all wrong. It is a renamed 8800 GS with the same G92 core as the 8800GT/8800GTS/9800GTX. It basically has 96 SPs with a 192-bit memory bus. Even Nvidia's website is correct for a change. Look: http://www.anandtech.com/cpuchipsets/showdoc.aspx?...
Tell me, enlighten me, where did your 9600 GSO come from??????
Ryan Smith - Monday, October 12, 2009 - link
There are 2 9600GSOs. The old one was G92 based and had 96SP. The new one is G94 based (9600GT) and has 48SP. The old one is no longer produced, while the new one is the current 9600GSO, and is the GSO NVIDIA and its partners are referring to when they compare the GT 220 to the 9600GSO.
We actually tested an old model 9600GSO, but that's only because it's the slowest thing we have on-hand that's above a 9500GT.
Seramics - Monday, October 12, 2009 - link
Thanks for replying, Ryan. I just can't help thinking Nvidia has sunk to another low yet again. This new product lineup is basically too little, too late, too slow and too expensive. People looking for a low-end card can get their needs met by going for equivalently priced ATI cards. Despite releasing such a slow card so late to the market, they still refuse to sell it at a lower price. How can the GT220 be worth USD 69-79? A Radeon HD4670 can easily outperform it while costing similar or less (depending on your location). And what is that G210 crap? 16 SPs? Nvidia must be joking, and must be laughing at every ignorant customer who would purchase crap like that for, what, 40-50 dollars? Gotta be kidding me, man. It isn't even worth half that amount. Maybe if it were 10 dollars, I would recommend it to people with a 10-dollar budget for a graphics card.
gwolfman - Monday, October 12, 2009 - link
To me, this looks like nVidia's trial run of some GT300 technology (audio over the PCIe bus, for example) before it's released.
samspqr - Monday, October 12, 2009 - link
Well, to me this looks like Nvidia taking too long to finish a product that was nearly done three quarters ago.
By Nvidia's 2009 standards, you can expect GT300 to come out around 2010Q2.
(I know they'll have some sort of launch much earlier, but I'd expect it to be just press samples, with less than spectacular clocks and a dustbuster fan, sitting somewhere in between 5870 and 5870x2, for a price that's irrelevant because of lack of availability... until some new respin comes around, as I said, close to 2010Q2)
yacoub - Monday, October 12, 2009 - link
Two years after releasing the 800GT, NVidia releases a card with... half the performance!
lol. What a waste. So how's the 5770/5750 review coming along? That'll be more interesting.
chizow - Monday, October 12, 2009 - link
Of course, discerning consumers know better and demand new architectures! It wouldn't make any sense to accept old parts and rebadges that offer 2x the performance at a lower price!
yacoub - Monday, October 12, 2009 - link
er, 8800GT. fingers...
poohbear - Monday, October 12, 2009 - link
Man, I never thought I'd be saying this, but Nvidia needs to get their shiat together!!! We need competition!!! They're getting trashed by AMD - what happened to 'em?
Souleet - Monday, October 12, 2009 - link
Everybody knows NVIDIA is holding back until they see how well Windows 7 does. Plus, ATI is releasing the 5800 series with DX11 (software/games are not going to be compatible with it until 2011).
formulav8 - Monday, October 12, 2009 - link
Huh? The new Radeons are compatible with every game the old Radeons and GeForce cards are, and are much better/faster.
And there is a small list of games with DX11 features being released very soon that ONLY the new Radeons can take advantage of.
And nVidia isn't downplaying anything. They simply DO NOT have an answer to ATI's new cards at this time. And apparently it won't be till the first part of next year that they will have their answer.
Why I keep seeing people trying to downplay Nvidia's faults is beyond me.
Jason
Souleet - Monday, October 12, 2009 - link
Well, I guess we have to wait and see. You cannot assume they do not have an answer; it may just not be the right time to release something of that caliber yet. I'm not biased, but it seems people are saying that ATI has won when there are no facts/comparisons. Sure, you can compare the ATI 5800 series to the GTX 295/275 (old graphics), but I think everybody wants to see the GT300 series face off with the 5800 series. Remember what happened to ATI when NVIDIA came out with SLI? ATI released CrossFire (not innovative) just to try to match NVIDIA instead of creating something more innovative. ATI never had a solution to beat SLI, and that is a fact.
JarredWalton - Monday, October 12, 2009 - link
I wouldn't say ATI has "won", but they are currently leading. NVIDIA isn't releasing Fermi right now because they can't -- they don't have the hardware ready. The card shown was a mock-up part, and you don't use a fake card if you have real product ready. All signs are Jan/Feb 2010 for the GT300 release. That gives ATI a full four months of being the ONLY DX11 GPU supplier, right at a major buying time for consumers. NVIDIA isn't out by any stretch of the imagination -- just as ATI wasn't out with the 2000 and 3000 series, and NVIDIA weathered the FX 5000 times. Short-term, though, this has to be hurting.
On the other hand, I can say that NVIDIA is the way to go on virtually any gaming laptop right now. ATI has some competitive parts, yes, but I wouldn't touch them until they get reference drivers for all major parts on their site. Depending on laptop manufacturers for driver updates is a really bad idea, and NVIDIA thankfully addressed that area a while back.
brybir - Monday, October 12, 2009 - link
Your statement is only partially true.
There are several games with some DirectX 11 features out right now. Perhaps the more accurate thing to say is that DirectX 11 will not see feature set adoption en masse until sometime in 2011.
I think ATI somewhat admits this, as they spent a good deal of time tweaking some of their driver and hardware features to boost the performance of DirectX 9 engine games. There was something on Anand about that a few weeks ago.
BenSkywalker - Monday, October 12, 2009 - link
You show the 9600GSO winning the majority of the benches you decided to allow it to take part in, it is cheaper than the 4670, and the 4670 is the clear winner?
Why do you bother quoting the price of the 9600GT when you refused to show benchmarks for it?
Right now on the Egg you can get a 9600GSO for $40 AR, $60 before rebate. The article may be right in terms of the parts that are launching being a bad value, but more than anything that is because of how soundly they are bested by nV's existing parts - which are already faster and cheaper than the 4670.
Ryan Smith - Monday, October 12, 2009 - link
Ryan Smith - Monday, October 12, 2009 - link
It's one of those 96SP GSOs based on G92. We include it for reference only; you can't buy them any more (and the 96SP model listed on Newegg is wrong, it's a 48SP model).
chizow - Monday, October 12, 2009 - link
Is this the one you're referring to, Ryan? http://www.newegg.com/Product/Product.aspx?Item=N8...
From all reports and feedback I've seen, that Asus part is 96SP for $40 AR (was actually $35 at one point!) just as Ben stated.
I don't really care about this segment of parts for actual 3D so you may be right. I was only interested in that Asus 96SP because it was identified by many users as "the card" to get for dedicated PhysX on a budget. I was hoping to find a low-power consumption candidate on 40nm for dedicated PhysX, but it seems these parts have too few SP to be viable.
BenSkywalker - Monday, October 12, 2009 - link
Why not include benches of the nV part that you quoted the price for, then? (BTW - Newegg is also listing one of the 384MB 96SP 9600GSOs for $56.49.)
It is rather misleading, to put it mildly, that you decide to include benches for one part that is quite a bit cheaper, and quite a bit slower, and then quote the price of the next rung up on the performance ladder. It would be akin to showing benches for the GTX275 and lamenting it because the GTX285 was so expensive; it doesn't make any sense.
Ryan Smith - Monday, October 12, 2009 - link
Which part? The 9600GT, or the 48SP GSO?
What you see are all of the low-end cards we were able to get our hands on for this article.
jordanclock - Monday, October 12, 2009 - link
The graphs on the "Temperature and Noise" page are sorted high-to-low. Usually they're the opposite, following the idea that lower numbers are better. I know, I know, it's a lame nitpick.
As for the card: if they added this checkbox feature for OEMs (which I believe), then why bother allowing for retail distribution? This card really doesn't bring anything that isn't already done by cards they already have available. Seems like a rather silly release.
Ryan Smith - Monday, October 12, 2009 - link
Fixed. Thanks.
jonup - Monday, October 12, 2009 - link
Ryan, the Palit table shows their 1GB card's frame buffer as 512MB.
strikeback03 - Monday, October 12, 2009 - link
Seems recently there have been a lot of low-end cards with 1GB of memory onboard. Is this just a marketing ploy, or is there actually some scenario where it would be useful to have lots of memory but not much processing power?
Concillian - Monday, October 12, 2009 - link
The 1GB on low-end cards is a marketing thing. The "average consumer" equates more memory with performance and generally has little to no knowledge about the importance of GPU type, speed, or graphics memory bandwidth.
nVidia and ATi have been taking advantage of this to help add profit margin since at least the GeForce 4 MX generation.
Zingam - Monday, October 12, 2009 - link
This will be one craptastic graphics card for the masses! Yay!
Good luck, NVIDIA!
Dante80 - Monday, October 12, 2009 - link
This could have been a killer HTPC card...one year ago.
I understand the need for nvidia to come up with something for this market segment, and I very much like both the improved audio and the low power consumption.
But let's be honest here. This card has no place in the current retail market. It merely looks like a project to max margins for OEM use. OEMs can market a "brand new card with a smaller process, DX10.1 support and 1 whole GB of memory goodness", but retail vendors and nvidia will have a hard time convincing anyone to buy this instead of going for a 9600 or a 4670.
Since margins seem godly, nvidia should lower the price and try to saturate the market segment with these cards on an HTPC (suicide) mission. That would at least make sense to consumers.
uibo - Saturday, October 17, 2009 - link
"This could have been a killer HTPC card...one year ago. "... for me to poop on.
VooDooAddict - Monday, October 12, 2009 - link
I'd love to see this around $40 and making its way into inexpensive OEM desktops. The GT 220 would be perfect for low-end desktops if the price gets low enough. This would enable cheap eMachines to play MMOs and keep PC gaming alive.
But as it stands right now, the 4650 and 4670 are the two cards I currently use to build low-end desktops for others.
Lonyo - Monday, October 12, 2009 - link
A joke launch for what I can only hope is supposed to be a joke product.
Mind you, at least NV have launched _something_ at 40nm for the general market rather than OEMs only.
yoyojam - Monday, October 12, 2009 - link
Glad to see AMD pretty much destroying Nvidia on the graphics front (for now). Their processor division is doing so badly, and we need our CPU competition...
MegaSteve - Tuesday, October 20, 2009 - link
I doubt that AMD is destroying or will ever destroy Nvidia in any respect. What they are doing is providing them with credible competition, and that is what we all benefit from. Although I will say this whole business of renaming cards and providing meagre performance increases is wearing a little thin.
I enjoy the fact that I can choose between similar performance levels in cards from either camp for less money than when the 8800 series came out (I am referring to the GTX/Ultra parts). Now I know I can see your skin turning a solid shade of AMD Red, but remember we have no answer from Nvidia yet, so the only thing that has occurred here is that ATI has beaten them to the market - perhaps Nvidia would rather be late and avoid another 5800 Vacuum Cleaner launch; I am sure that cost them dearly.
AMD/Nvidia really have to make sure they spend time ensuring their products can rival the Larrabee/DX11 threat, which very well could clean them both up. I would rather install a graphics card and a processor of my choice than have Intel as the single provider of everything in my PC. As you can see, Nvidia is trying to push multiple uses for their GPU, which seems pretty smart to me.
Souleet - Monday, October 12, 2009 - link
They are not destroying NVIDIA. I think we have to wait and see NVIDIA's big guns (the GT300 series) before we can truly judge who is beating up on whom. I consider this a defensive move until the Windows 7 release, and to see how the market reacts to the ATI 5800 series. Remember what happened with the ATI 9700/9800 series; we all know what happened after that. :)
teldar - Monday, October 12, 2009 - link
Tool.
So because nVidia is going to bring out something in 2-4 months, AMD isn't doing better?
And how is an overpriced budget card a response to brand new, high end cards?
They already know how the market has responded to the 5800 cards. As many as they can make are being sold.
Souleet - Monday, October 12, 2009 - link
They only responded because that's, like, the newest ATI technology in a long-ass time. When you are down, the only way is up or bankrupt. SLI was a knockout blow to ATI, and only just now are they barely catching up.
RubberJohnny - Monday, October 12, 2009 - link
Ahh Souleet, you are a candidate for the most uninformed reader on Anandtech.
SLI a knockout blow? HAHAHA, that's the funniest thing I've read today!
Souleet - Tuesday, October 13, 2009 - link
Really? Then prove to me that SLI wasn't a knockout blow. After SLI came out, who bought ATI? I love ATI, but those are the facts. Show me some proof that AMD stock is up since the merger.
Guspaz - Tuesday, October 13, 2009 - link
Errm, Valve's latest hardware survey shows that only 2.39% of gamers are using 2+ GPUs with SLI or Crossfire. ATI has a 27.26% market share.
Of those who did buy multi-GPU solutions, some may be "hidden" (GTX295, the various X2 solutions), in which case it had no impact whatsoever (since it's presented as a single card). Some may have used it as an upgrade to an existing card, in which case SLI/Crossfire may not have driven their decision.
It's true that SLI (2.14%) has greatly outsold Crossfire (0.25%), but that's such a tiny market segment that it doesn't amount to much.
ATI has managed to hold on to a respectable market share. In fact, their 4800 series cards are more popular than every single nVidia series except for the 8800 series.
So, I think I've sufficiently proven that SLI wasn't a knockout blow... It was barely a tickle to the market at large.
Seramics - Tuesday, October 13, 2009 - link
When SLI came out? Stop mentioning ancient news. Right now, SLI and Crossfire both about equally suck. Heard of Hydra? That's the cool stuff, dude. And yeah, Nvidia is very innovative indeed: renaming old products to look new to deceive customers, shaving half the spec off a product while keeping the same name (9600GSO), releasing crappy products and selling them overpriced.... MAN! That's really innovative, don't you think?
Souleet - Tuesday, October 13, 2009 - link
Are you ignorant or something, ATI fanboy? The GT220 is 40nm and the 9600GSO is 65nm. How can you say they just changed the name? I thought so...
gx80050 - Monday, October 12, 2009 - link
Die painfully okay? Prefearbly by getting crushed to death in a
garbage compactor, by getting your face cut to ribbons with a
pocketknife, your head cracked open with a baseball bat, your stomach
sliced open and your entrails spilled out, and your eyeballs ripped
out of their sockets. Fucking bitch
I really hope that you get curb-stomped. It'd be hilarious to see you
begging for help, and then someone stomps on the back of your head,
leaving you to die in horrible, agonizing pain. Faggot
Shut the fuck up f aggot, before you get your face bashed in and cut
to ribbons, and your throat slit.
gx80050 - Monday, October 12, 2009 - link
Fuck off and die retardSeramics - Monday, October 12, 2009 - link
Let's face it: Nvidia is NOT competitive on any front at any price point. From ultra low end to mid range to ultra high end - tell me, at which price point is Nvidia competitive?
Well, of course I believe Fermi will be something different. I truly believe so. In fact, given the HD5870's slightly below-par performance for its spec (very likely because it's memory bandwidth limited), and Fermi being on a much larger die with a higher transistor count, I EXPECT Nvidia's next-gen Fermi to easily outperform the HD5870 - just like the GTX285 outperforms the HD4890. But by how much? Almost 100 USD more for just a 5-10% improvement? I believe this will likely be the case with Fermi vs the 5870. Sure it's faster, but you may be paying 100% more to get 25% extra fps.
CONCLUSION: Even if Nvidia retakes the top single-GPU performance crown, they were never a winner in price-to-performance at ANY price point. They care more about profits than they care about you.
Souleet - Monday, October 12, 2009 - link
I agree with your conclusion. On price, ATI has definitely always been on top of their game, but NVIDIA's innovation is what sets the two apart. But who knows - maybe one day ATI/AMD comes out with a CPU/GPU solution that will change the technology industry. That would be cool.
formulav8 - Monday, October 12, 2009 - link
NVidia brought out the FX5800 Ultra??
TRIDIVDU - Tuesday, September 21, 2010 - link
My son plays GTA, FIFA, POP, Tomb Raider, NFS etc. on my P4 3.06 GHz WinXP machine with an Nvidia 9400 GT (MSI) 1GB card, without any problem, on a 19-inch LCD monitor. Now that I am planning to replace the 4-year-old machine with a new i5 650 3.2 GHz Win7 machine fitted with a GT220 1GB card, please tell me whether he will find the new machine better for playing games.
Thatguy97 - Tuesday, June 30, 2015 - link
Nvidia's mid range was shit back then.