43 Comments
Wiz33 - Wednesday, June 4, 2008 - link
Intel has no bargaining power in the gamer circle. Even if they withheld licensing for the next-gen platform, gamers would just stay with the current-gen chipset for nVidia SLI, since games are usually much more GPU-bound than CPU-bound.

In my case, I'm a serious gamer (though light on FPS titles). I've clocked over 40 hours on Mass Effect for PC since installing it last Thursday evening. With my current setup of an E6750 and an 8800GTS, I still have plenty of upgrade paths in both CPU and GPU without moving to the next Intel platform.
sugs - Sunday, May 11, 2008 - link
As an IC designer, I can tell you right away that 3D graphics on the scale of the products NVidia/ATI produce is not easy. Just look at the demise of Matrox, S3 and others.

I think Intel is going to have problems getting the performance of their offerings to a competitive level in the near future, but they do have a lot of resources and it might be different 5 years down the line.
kenour - Tuesday, April 15, 2008 - link
Dear Jen-sun,

All Intel wants is SLI on their chips (AS DO A LOT OF GAMERS)... so neck up, you arrogant little prick, and licence it to them! Don't come out with your little chest puffed out playing the tough guy! If you leased SLI technology to Intel so their high-end chipsets supported it officially (without having to use hacked drivers) for, say, US$50, and Intel SLI-enabled all their X38/X48 boards, imagine the money that would come in. But you're too busy trying to hold on to the pathetic market share of your pathetic chipsets. There are so many gamers like me out there who would gladly purchase a second high-end nVidia card and SLI them, but won't, because there is no way we would use an nVidia chipset... I would pay a US$50 premium on a mobo to have SLI on an Intel chipset, and then I would buy another high-end card. So put your pride aside and give them (AND US) what they want! More money for you, a better gaming platform for us.
Lots of Love,
Kenour.
P.S. Yes, I'm still pissed off about the rumour that SLI would be available on the X38 :P It was reported here and at Tom's, from memory, then retracted a week later... That was the happiest week of my life :P (well, as far as the PC world goes).
ielmox - Wednesday, April 16, 2008 - link
I think nVidia is holding on to SLI as a marketing gimmick, because SLI doesn't make economic sense except for an extremely small market of wealthy and elitist gamers. I don't see any real value in SLI aside from the bragging rights of somewhat increased performance at a huge cost, and I think nVidia's strategy is guided by this knowledge.

SLI uses a lot more power, generates much more heat, is buggier, and is harder to set up, all while offering diminishing returns compared with a dual-GPU or even single-GPU card. In fact, unless you're SLI'ing the latest and greatest cards, you are better off with a non-SLI setup. Realistically, only a tiny minority of gamers would ever go for an SLI setup, so I'm guessing nVidia understands there is not much potential for financial gain.
SLI is a bit of a white elephant to most people.
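The diminishing-returns argument is easy to put into rough numbers. A quick back-of-the-envelope sketch in Python (the ~1.7x scaling factor and the $300 card price are illustrative assumptions, not benchmarks):

```python
# Rough performance-per-dollar comparison of a single card vs. an SLI pair.
# All figures here are illustrative assumptions, not measured results.

def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Performance units bought per dollar spent."""
    return relative_perf / price

single = perf_per_dollar(1.0, 300.0)   # one $300 card, baseline performance
sli = perf_per_dollar(1.7, 600.0)      # two cards, ~1.7x scaling on a good day

print(f"single card: {single:.5f} perf/$")
print(f"SLI pair:    {sli:.5f} perf/$")
print(f"the SLI pair delivers {sli / single:.0%} of the single card's value")
```

With those assumed numbers, the second card buys 70% more performance for 100% more money, i.e. 85% of the single card's performance per dollar, before even counting the extra power and heat.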
gochichi - Monday, April 14, 2008 - link
The Intel/nVidia combo is totally the "it" combo in computer gaming and has been for some time. AMD is working on "tidy little packages" with their new integrated graphics platform that can just about "game" right now, not in 2010.

Nvidia, not Intel, is the one that needs to be working on an Intel-platform equivalent in the integrated sector.
I am glad to be an Nvidia customer, and I am also glad to see they're not taking cheap shots at AMD. They even came out kind of defending AMD, which is understandable; both are smaller companies and both respect each other's products.
I can just picture it now: AMD laptops with synergy for $500 or less and no equivalent Intel solution due to a lack of cooperation with Nvidia.
perzy - Monday, April 14, 2008 - link
Well, the thing is, I think Intel has no choice. The x86 CPU is DEAD. The heat wall keeps the frequency down (seen any 4 GHz chips lately?), and they can't keep adding another core forever. Intel is in dire PANIC, believe me. They must branch out, and the GPU, the PPU, and maybe a little audio-PU are the chips with any development years left in them.
And no there are no quantum or laser chips yet...
Come on, if a blond guy from Sweden like me can understand this, why don't you spell it out for everybody?
Galvin - Monday, April 14, 2008 - link
Actually, hitting 4 GHz would be easy for Intel. Hell, a lot of people get those things to 4 GHz on air.

So yeah, they could do 4 GHz if they wanted to :)
perzy - Tuesday, April 15, 2008 - link
So do you think that Intel is content and everything is going according to plan? We should be at 10 GHz by now according to that plan, and using the NetBurst architecture...

The 3.8 GHz P4 ran so hot that Intel had to ship it with expensive high-end thermal paste. Otherwise it would throttle constantly.
It's strange to me that everybody (hardware sites, for example) seems to think this heat thing is a little snag, a bump in the road. It isn't!
'Oh lookey, now I get 2 cores for the price of one. How nice!'
The chipmakers are trying to hide the crisis they're in. (Stock prices...)
Why else do they buy GPU and PPU-makers?
Galvin - Tuesday, April 15, 2008 - link
I don't think Intel has a leg to stand on in the graphics market.

The point I was making is that if Intel wants to sell a Core Duo at 4 GHz, it's very doable, since people can clock these to 4 GHz today on air cooling. That's the only point I was making.
Galvin - Sunday, April 13, 2008 - link
I listened to the whole presentation.

Nvidia has a whole computer on a chip. I didn't even know they had this. I was impressed; this will be nice for mobile devices. We'll have to wait and see where this goes.

I've known about CUDA for as long as anyone else. I can't wait till compressors for zip, encoding, etc. all become real time. That's something no CPU will ever pull off.

We all know Intel is weak in graphics, but Intel has tons of cash. I don't think Nvidia is going anywhere, and they'll most likely get bigger in time.

There are only two companies in the world that can make this kind of graphics technology: AMD and Nvidia. To claim that Intel can just magically make a competitive GPU in a few years is crazy, IMO.
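The reason compression and encoding are such natural GPU targets is that the input splits into independent blocks that can all be processed at once. A minimal CPU-side sketch of the same idea using only Python's standard library (the block size and payload are arbitrary choices for illustration; a CUDA version would farm the blocks out to thousands of GPU threads instead of a process pool):

```python
import zlib
from multiprocessing import Pool

def compress_block(block: bytes) -> bytes:
    # Each block is compressed independently of the others, which is
    # exactly the property that lets the work spread across many cores
    # (or, on a GPU, across thousands of threads).
    return zlib.compress(block)

def parallel_compress(data: bytes, block_size: int = 64 * 1024) -> list:
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    with Pool() as pool:
        return pool.map(compress_block, blocks)

if __name__ == "__main__":
    payload = b"a highly repetitive payload " * 10_000
    compressed = parallel_compress(payload)
    ratio = sum(len(c) for c in compressed) / len(payload)
    print(f"{len(compressed)} blocks, compression ratio {ratio:.3f}")
```

Note the trade-off: block-wise compression loses a little ratio versus one big stream, because each block starts with a fresh dictionary.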
segerstein - Saturday, April 12, 2008 - link
I read the article, but I wasn't wholly convinced by the arguments made by the CEO. As we have seen with the Eee and other low-cost computers, the current technology was about serving the first billion people. Most people still don't have computers, because they are too expensive for them.

Nvidia, not fully addressing even that first billion because of its expensive discrete solutions, will see its market share shrink. Besides, there are many consumer electronics devices that would benefit from a low-powered "system-on-a-chip".

Intel has Atom plus its chipset; AMD bought ATI precisely because it wanted to offer a low-powered "system-on-a-chip" (but also multicore high-performing parts).

It would only make sense for Nvidia to buy VIA. VIA's Isaiah processor seems promising. This way they could cater to a smaller high-end market with discrete solutions and to a growing market for low-cost integrated solutions.
BZDTemp - Saturday, April 12, 2008 - link
Seems Nvidia does not like being on the receiving end.

I do remember Nvidia spreading lies about PowerVR's Kyro 3D cards back when it looked like they might have a chance to be the third player in 3D gaming hardware.

With ATI/AMD in crisis I think it's great that Nvidia and Intel are starting to compete, even though I sincerely hope ATI/AMD comes back strong and kicks both their asses. After all, I can't recall the red/green guys using unfair tactics, and I'd like to see integrity rewarded.

Finally, I would like AnandTech to be more critical when reporting from such venues. Try googling Kyro, Nvidia and PDF to find the old story, or just check out the PDF directly: ftp://ftp.tomshardware.com/pub/nvidia_on_kyro.pdf
duron266 - Saturday, April 12, 2008 - link
"Jensen is known as a very passionate, brilliant and arrogant guy but going against Intel on a frontal full scale might be the worst thing that they ever decided. Nvidia went from close to $40 to current $19.88 which means that the company has to do something to fix this but this is simply too much."

duron266 - Friday, April 11, 2008 - link
NVIDIA is too high-profile; if they were to vanish,
Jen-Hsun would be the number one to blame...
anandtech02148 - Friday, April 11, 2008 - link
There's a huge difference between audio being processed on a many-core CPU like Intel's and on a standalone PCI card. Put the PCI card in and you can feel the CPU being less bogged down, and the motherboard chipset generating less heat.

An integrated GPU, audio, and many cores doesn't solve the problem; there will be bandwidth issues too.

Nvidia should hit Intel hard with a low-powered, high-performance GPU, to prove a point.
epsilonparadox - Friday, April 11, 2008 - link
NVidia will never be able to compete with Intel in the low-power arena. Intel just has a better process and fabs for that process, while NVidia has other companies building its chips. Plus, graphics chips don't move to a new process the way CPUs do.

poohbear - Friday, April 11, 2008 - link
Very nice article, but how many of us are gonna understand the analogy: "someone's kid topping of a decanted bottle of '63 Chateau Latour with an '07 Robert Mondavi."
WTF is that?!? I'm guessing he's talking about wine, but whatever.
kb3edk - Friday, April 11, 2008 - link
Well, of course it's a wine reference; consider the audience: institutional investors. These are people who are much more likely to spend $500 of disposable income on a bottle of Chateau Something-Or-Other than on a GeForce 9800 GTX.

Also note the Mondavi reference: they're in Napa, just on the other side of San Francisco from nVidia's HQ in Silicon Valley.
And it's still a bit odd seeing such strong words from nVidia against Intel considering that nVidia/Intel is the main enthusiast platform out there these days (as opposed to an all-AMD solution).
Khato - Friday, April 11, 2008 - link
Really quite enjoyed this; it makes me all the more confident in the products Intel is currently developing.

I mean really, how similar does NVIDIA's ranting sound compared to AMD's back when they were on top? No question that they're more competent than AMD, but they've done just as good a job of awakening a previously complacent beast in Intel. Heh, and they've never had to compete with someone that has a marked manufacturing advantage before...
tfranzese - Sunday, April 13, 2008 - link
Intel is no beast in these parts. Their track record in the discrete segment, and in drivers to this day, is complete failure. Until they execute on both the hardware and the software, both monumental tasks, they'll continue to be right where they are in the discrete market (i.e. nowhere).

Griswold - Friday, April 11, 2008 - link
I bet it gave you wet dreams.

jtleon - Friday, April 11, 2008 - link
Griswold... you need to return to Wally World! LOL
jtleon
Synosure - Friday, April 11, 2008 - link
It seems like everyone is just ignoring AMD and their hybrid solution. It would have been nice to hear his opinion on it.

Griswold - Friday, April 11, 2008 - link
It's easier to talk about the blue FUD machine than about the other (albeit troubled) competitor that is actually competing with your own company on all fronts.

Ananke - Friday, April 11, 2008 - link
Intel and AMD are aiming at two things:
1. Integrated low-power graphics, implemented in mobile computerized devices: laptops, UMPCs, smartphones, video/audio players, etc. These markets have the fastest growth.
2. Parallel processing; the design expertise, and thus the know-how, of the present GPU players in parallel processing is immense. Such solutions would be suitable for financial, military and scientific modeling, markets which command hefty profit margins.
These are the reasons why AMD bought ATI.
My point: corporations do things which will accelerate margins or accelerate growth. Financial analysts are not interested in nominal values only.
Intel had to choose between acquisition and internal development of products. It seems they chose the internal approach, since ATI was already bought and an Nvidia purchase is too expensive and complicated to make financial sense. Sun Microsystems and IBM are already well situated in the high-margin parallel-processing market. However, IBM was recently hit with a government ban on orders, and since they moved so many strategic operations overseas, I don't see them easily coming back to the big-margin market. HP abandoned their PA-RISC line a long time ago, so they rely on AMD and Intel for chip supply now. So, exciting times for Intel and AMD in grabbing new market territories.
Nvidia is left with the discrete graphics market only. It is a popular market among gamers, magazines and general consumers, but it is not the market where the big money is made. And I don't see a collision between Intel's and Nvidia's interests, except in the mobile market. What investors are being warned about is that the big guys have curbed the opportunities for revenue and profit growth.
joegee - Friday, April 11, 2008 - link
"You already have the right machine to run Excel. You bought it four years ago... How much faster can you render the blue screen of death?" -- Jen-Hsun Huang

Given that this was in response to questions about nVidia's Vista driver problems, I don't know that this helps nVidia's case. Apparently the devices best able to render the BSoD quickly are those made by nVidia. This is not a benchmark any vendor would care to win.

I would like a video card that will run both Excel *and* the latest games, Mr. Huang.
-Joe G.
chizow - Friday, April 11, 2008 - link
Those Steam figures look familiar, Derek. ;) I'm surprised JH didn't bring up the Microsoft class-action suit as another example of Intel integrated chipsets failing miserably. Nice peek into the current market climate, although there wasn't as much discussion about the future as I had hoped.

DerekWilson - Friday, April 11, 2008 - link
Heh, yeah... but the Steam numbers still say absolutely nothing about the state of the market in early 2007. They are a good indicator of what is happening now, and I never meant to imply otherwise.

I'd love to see something more forward-looking as well...
Genx87 - Friday, April 11, 2008 - link
I don't see how a company with 0% market share above integrated graphics is going to motivate devs to write game engines that do ray tracing instead of rasterization. John Carmack gave an interview about this two months ago; he wasn't impressed with what Intel has and wasn't convinced that ray tracing is better than rasterization at everything. He felt it would be a hybrid situation at best and that Intel is dreaming.

Pyrosma - Saturday, April 12, 2008 - link
John Carmack wasn't very impressed with Half-Life when he first saw it, either. And it was built with his game engine. Oops.

panfist - Friday, April 11, 2008 - link
There is a special place in my heart and in gaming history for John Carmack, but I don't think he's necessarily the one to trust when it comes to forecasting the industry anymore.

Doom 3, the single-player game, was disappointing, and the engine never really had a big hit game, either.
Now maybe if Valve or Epic weighed in with similar comments...
StormEffect - Friday, April 11, 2008 - link
It was called Prey, and it was fairly successful.

Sunrise089 - Friday, April 11, 2008 - link
In addition, while there wasn't one crazy breakthrough hit (and on the PC, what really is these days?), I would guess that the total installed base of Doom 3, Quake 4, Prey, and Quake Wars is pretty competitive with some of the other contemporary engines.

Conroe - Friday, April 11, 2008 - link
If Intel could integrate a GPU that could actually run games, what do you think would happen to nvidia? He sounds a little frightened to me.

jtleon - Friday, April 11, 2008 - link
Why is it in Jen-sun's best interest to draw attention to Intel's failed IGP?

Consider the end-user experience: I tried using Intel's IGP and became so horribly frustrated that I abandoned it altogether in disgust! As a competitor, Jen-sun cannot buy a more powerful motivator to drive customers to nVidia (or ATI), right?

Jen-sun should be praising Intel for their IGP and encouraging them to continue the "good" work for nVidia! Don't ridicule Intel. Don't dare them to beat you.
Jen-sun mis-managed this Financial Meeting and cannot retract his indignation - He has challenged Intel to a Dual, and he cannot win!
Regards,
jtleon
Griswold - Friday, April 11, 2008 - link
"He has challenged Intel to a Dual, and he cannot win!"

A dual what? Dual-core, maybe?
It's spelled d-u-e-l.
jtleon - Friday, April 11, 2008 - link
Thanks, Griswold... I saw the mistake as I hit the Post button. Unfortunately, this site does not offer an "edit" after the fact!

poohbear - Friday, April 11, 2008 - link
Thanks for pointing out the obvious to all of us with a grade-3-and-above education, Griswold. Now, do us all a favor and go "fuk" yourself, and don't tell me how to spell fuk on the internet. Thank you very much.

jtleon - Friday, April 11, 2008 - link
No doubt Jen-sun is very afraid; Intel could buy his entire engineering team should they so choose.

However, such fear is a vital ingredient (it always has been) in generating true innovation. We should be worried if Jen-sun is not afraid.
Regards,
jtleon
Lonyo - Friday, April 11, 2008 - link
Intel is arguably a long-term company.

It may be that no one can see anything happening in the near future, but give it time and we will see things shifting, I am sure.
They are in it for the long haul, but they also want to show they are making short term steps to get there.
The Atom is by no means a finished platform, nor does it operate where Intel are aiming for, but it's a start on the road.
zsdersw - Friday, April 11, 2008 - link
Prognosticators, no matter how well qualified or respected, are very often wrong.

UNHchabo - Friday, April 11, 2008 - link
You only think this is true because the ones who are wrong are often the only ones you remember. Example:
"Spam will be a thing of the past in two years' time." -Bill Gates, 2004
zsdersw - Saturday, April 12, 2008 - link
Umm, no. Predicting the future is rarely entirely accurate or precise, no matter how much of an expert you may be. Prognosticators who are experts are wrong as often as they're right. Experts are just as fallible as anyone else, if not more so.