I find this whole thing laughable. I remember Intel's previous attempts at graphics. I remember people saying the same things back then: "Intel is going to dominate the next gen graphics" etc. It is for this reason I have pretty much ignored everything to do with Larrabee. I figured it would be as successful as their previous attempts were.
The nVidia guy is right. GPUs are hard to make. Look at all of the work they and AMD have put into graphics. Intel doesn't have the patience to develop a real GPU. After a while they're forced to give up.
A lot has changed, though. Back in the early days, everything was hardwired, and a GPU had very little to do with a CPU. GPUs would only do very basic integer/fixed-point math, and a lot of the work was still done on the CPU.
These days, GPUs are true processors themselves, and issues like IEEE floating-point processing, branch prediction, caching and such are starting to dominate GPU design.
These are exactly the sort of thing that Intel is good at.
nVidia is now trying to close the gap with Fermi: delivering double-precision performance that is more in line with CPUs (previous nVidia GPUs and current AMD GPUs have a single- to double-precision ratio of roughly 5:1, versus the 2:1 nVidia is aiming for), adding true function calls, a unified address space, and that sort of thing.
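Just to make the ratio talk concrete, here is the arithmetic for a hypothetical card with a 1 TFLOPS single-precision peak (the 1 TFLOPS figure is purely an assumption for illustration, not the spec of any real part):

```latex
\text{DP peak} = \frac{\text{SP peak}}{r}, \qquad
r = 5:\ \frac{1000\ \text{GFLOPS}}{5} = 200\ \text{GFLOPS}, \qquad
r = 2:\ \frac{1000\ \text{GFLOPS}}{2} = 500\ \text{GFLOPS}
```

So going from a 5:1 design to a 2:1 design means 2.5x the double-precision throughput for the same single-precision peak, which is exactly the HPC-oriented gap-closing being described.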
With the introduction of the GeForce 8800 and Cuda, nVidia clearly started to step on Intel's turf with GPGPU. Fermi is another step closer to that.
If nVidia can move towards Intel's turf, why can't Intel meet nVidia halfway? Because some 15 years ago, Intel licensed video technology from Real3D which didn't turn out to be as successful as 3dfx' Voodoo? As if that has ANYTHING at all to do with the current situation...
You make excellent points. I will happily acknowledge that things have changed a lot. However, Intel has not. Like I say, I doubt they have the patience to develop a fully functional, competitive GPU. It takes too long. A couple of business cycles after all the grand press releases, someone up in the business towers decides it's taking too long and isn't worth the investment.
I think when they decide to do these things, the boardroom people think they'll be #1 inside of a year, and when it doesn't turn out that way, they pull the plug. THAT's why they won't pull it off.
Well, it's hard to say at this point.
Maybe it was a good decision to not put Larrabee into mass production in its current form. That saves them a big investment on a product that they don't consider to have much of a chance of being profitable.
They haven't cancelled the product; they just want another iteration before they put it on the market.
Who knows, they may get it right on the second take. I most certainly hope so, since it will keep nVidia and AMD on their toes.
___________________________________________________________
"Do I believe the 48-core research announcement had anything to do with Larrabee's cancelation? Not really. The project came out of a different team within Intel. Intel Labs have worked on bits and pieces of technologies that will ultimately be used inside Larrabee, but the GPU team is quite different."
___________________________________________________________
It wasn't the Larrabee team or any other product development team that canceled Larrabee, from what I gather. It was management, and management sits above the development teams and is the entity responsible for making product cancellations as well as deciding which products go to market. I don't think the 48-core research announcement had anything directly to do with Larrabee's cancellation, either, because that 48-core chip doesn't itself exist yet. But I surely think the announcement was made just prior to the Larrabee cancellation in order to lessen the PR impact of Larrabee getting the axe, and I also don't think Larrabee being canceled just before the week-end was an accident, either.
________________________________________________________
"With the Radeon HD 5870 already at 2.7 TFLOPS peak, chances are that Larrabee wasn't going to be remotely competitive, even if it came out today. We all knew this, no one was expecting Intel to compete at the high end. Its agents have been quietly talking about the uselessness of > $200 GPUs for much of the past two years, indicating exactly where Intel views the market for Larrabee's first incarnation.
Thanks to AMD's aggressive rollout of the Radeon HD 5000 series, even at lower price points Larrabee wouldn't have been competitive - delayed or not."
_______________________________________________________
Well, if Larrabee wasn't intended by Intel to occupy the high end, and wasn't meant to cost >$200 anyway, why would the 5870's 2.7 TFLOPS benchmark peak matter at all? It shouldn't have mattered at all, should it, as Larrabee was never intended as a competitor? Why bring up the 5870, then?
And, uh, what were the "real-time ray tracing" demos Intel put on with Larrabee software simulators supposed to mean? Aside from the fact that the demos were not impressive in themselves, the demos plainly said, "This is the sort of thing we expect to be able to competently do once Larrabee ships," and I don't really think this fits the "low-middle" segment of the 3D GPU market. Any GPU that *could* do RTRT at 25-30 fps consistently would be, by definition, anything but "low-middle end," wouldn't it?
That's why I think Larrabee was canceled simply because the product would have fallen far short of the expectations Intel had created for it. To be fair, the Larrabee team manager never said that RTRT was a central goal for Larrabee--but this didn't stop the tech press from assuming that anyway. Really, you can't blame them--since it was the Intel team who made it a part of the Larrabee demos, such as they were.
_______________________________________________________
"It's not until you get in the 2013 - 2015 range that Larrabee even comes into play. The Larrabee that makes it into those designs will look nothing like the retail chip that just got canceled."
_______________________________________________________
Come on, heh...;) If the "Larrabee" that ships in 2013-2015 (eons of time in this business) looks "nothing like the retail chip that just got canceled"--well, then, it won't be Larrabee at all, will it? I daresay it will be called something else entirely.
Point is, if the Larrabee that just got canceled was so uncompetitive that Intel canceled it, why should the situation improve in the next three to five years? I mean, AMD and nVidia won't be sitting still in all that time, will they?
What I think is this: "Larrabee" was a poor, hazy concept to begin with, and Intel canceled it because Intel couldn't make it. If it is uncompetitive now, I can't see how a Larrabee by any other name will be any more competitive 3-5 years from now. I think Intel is aware of this and that's why Intel canceled the product.
I really think it's a shame. I was hoping for a discrete Intel Larrabee card. It would be perfect for someone like me who isn't a die-hard gamer but still wants decent performance. I don't think the "software" route is a bad one; people don't seem to realize that that's exactly what AMD/ATi and nVidia are doing. Video cards don't understand DX and OGL by themselves. Plus, video cards are used for so much more than DX and OGL these days: shaders that help in other apps, OpenCL, code written for specific GPUs, and so on.

Also, Intel's Linux support is rock solid. A discrete GPU from Intel with open-source drivers could really be all it can be, something the others wouldn't spend the same amount of resources trying to accomplish. Even AMD server boards use Intel network chips, because Intel can actually write software.

But at least we can look forward to Arrandale/Clarkdale and the H55. I was also hoping for something more resembling workstation graphics, but at least Arrandale/Clarkdale should help bring multimedia to alternative systems, even though that will never fully happen in a patent-ridden world. It's easier to fill the gap with good drivers that can handle video, though. I also see no reason why Apple wouldn't go back to Intel chipsets, including IGPs.
I was hoping they'd release this as an add-in card for servers. Since it uses general-purpose processors, having an add-in card with 32 or so mediocre CPUs that could be allocated to virtual servers would be amazing.
I figured that Intel would cancel Larrabee because of cost. An Intel Core 2 Duo Wolfdale processor with 6MB of L2 cache has 410 million transistors and easily sells for $200. My cheap GeForce 9600 GSO would probably sell for $70, and its GPU alone has 505 million transistors. I figure that even if Intel makes competitive GPUs, they would barely eke out a profit selling these Larrabee GPUs.
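Rough back-of-the-envelope math with those numbers (treating retail price as a crude proxy for what the silicon brings in, and ignoring that the $70 card price also covers the board and memory):

```latex
\frac{\$200}{410\ \text{M transistors}} \approx \$0.49\ \text{per million transistors (Wolfdale CPU)}, \qquad
\frac{\$70}{505\ \text{M transistors}} \approx \$0.14\ \text{per million transistors (9600 GSO card)}
```

That is roughly a 3.5x gap per transistor, which is the margin problem being described for a big x86-based GPU die.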
At this point I wouldn't be all that surprised to hear an announcement for a new discrete graphics microarchitecture. Intel has an advantage with nVidia and AMD both being fabless, and both using TSMC. A whole article could be written about that alone.
Larrabee was heckled in nVidia/AMD circles over the notion of a software-driven, general-purpose GPU. I wanted to believe that Intel had an ace up its sleeve, but nVidia and AMD have been doing this for a while. You couldn't just dismiss what they were saying.
The whole "fusion" architecture seems to be the future of the CPU / GPU, so it would surprise me if Intel dropped out of the discrete business completely. It makes sense to invest in graphics, and it makes sense to offset your R&D costs by selling your product in another market (i.e. mobile graphics -> discrete).
If I were to guess, I'd assume that after performance profiling, their engineers came around to "If we had purpose-specific hardware we could..." You can use your imagination there.
Anyway, I would expect a new microarchitecture to be announced in the next 6 months or so. That might explain why they didn't kill Larrabee completely. We'll probably see a general-purpose, many-core architecture with purpose-specific instructions (like SSE). x86 would be good, and I can see where Intel would benefit from that, but I wouldn't put it past Intel to go with a new instruction set specific to GPUs. I could see that instruction set quickly becoming a standard, too.
x86 instructions are good, but that instruction set shines at multitasking branch-heavy threads, integer computing, and general I/O. GPUs have a very heavy FP workload that's usually streaming, and on the x86 side that kind of work has been handled with extensions. I can see x86 support being a burden on the overall architecture, so it seems only natural to take the lessons learned and start over.
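As a small illustration of the "streaming FP handled with x86 extensions" point, here is what a simple streaming multiply-add looks like when written directly against SSE intrinsics. This is only a sketch; the function name is made up and real code would handle tails where n isn't a multiple of 4:

```cpp
#include <xmmintrin.h>  // SSE intrinsics
#include <cstddef>

// y[i] = a * x[i] + y[i], four floats per iteration.
// Sketch only: assumes n is a multiple of 4.
void saxpy_sse(float a, const float* x, float* y, std::size_t n) {
    const __m128 va = _mm_set1_ps(a);                // broadcast a into all 4 lanes
    for (std::size_t i = 0; i < n; i += 4) {
        __m128 vx = _mm_loadu_ps(x + i);             // load 4 floats from x
        __m128 vy = _mm_loadu_ps(y + i);             // load 4 floats from y
        vy = _mm_add_ps(_mm_mul_ps(va, vx), vy);     // multiply-add across the lanes
        _mm_storeu_ps(y + i, vy);                    // store 4 results back to y
    }
}
```

Wider vector units (AVX, or Larrabee's 16-wide vectors) mostly just widen that loop, which is why this kind of throughput work keeps getting bolted onto x86 through extensions rather than the base instruction set.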
Intel knows instruction sets. Intel knows compilers. Intel knows how to fab. My disappointment in not seeing an Intel GPU this year is mitigated by my excitement to see what Intel will be coming out with in its place.
Larrabee already HAS purpose-specific instructions (LRBni).
It's not that different from what nVidia does with Fermi, really. It's just that nVidia doesn't use the x86 ISA as their basis. Doesn't matter all that much when you use C++ anyway.
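A tiny sketch of the "doesn't matter when you use C++" point: the loop below is the same computation as the intrinsics example earlier, written as plain C++ with no ISA mentioned anywhere; it's the toolchain that decides which vector hardware it lands on. Purely illustrative, no vendor-specific API implied:

```cpp
#include <cstddef>

// Plain C++, no intrinsics: a vectorizing compiler is free to map this
// onto SSE, AVX, LRBni, or a GPU's vector lanes without source changes.
void scale_and_accumulate(float a, const float* x, float* y, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i) {
        y[i] = a * x[i] + y[i];
    }
}
```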
I don't know about you, but I'm not surprised, nor will I be if they get investigated. But then, neither am I surprised by other companies trying to win market share. Dell does it, AMD does it, MS does it, Apple is the king of it all, etc.
It's not really related to this topic anyway; it's a huge subject by itself.
Intel has a problem, and that is that while they can do a VERY good job when focused on making ONE very good product, they don't do a very good job in multiple areas.
Look at processors....Intel has been focused on making the best processors out there, but really, they don't make ANYTHING else that is really amazing. Chipset quality is good, because that SUPPORTS their CPU business. GPU designs are pathetic and so far behind AMD and NVIDIA that I won't recommend ANY system with an Intel graphics product. Their networking products are also less than amazing.
So, does Intel do ANYTHING really well other than processors and chipsets? Even when they try to branch out into other areas and invest a LOT of money, the results have yet to impress anyone.
Even when they have tried to come up with an all-new design, it has flopped EVERY time. The last successful all-new design was the Pentium Pro (even though the original flopped, the design ended up in the Pentium II and eventually in the Core and Core 2 lines of processors)... so Intel has come up with THREE really original designs: the old 8086/8088, the Pentium, and the Pentium Pro. Itanium flopped, so what else will Intel do?
Uhhh... the 286, 386 and 486 were all completely new architectures as well, and all were very successful.
As for Intel's IGP... Sure, Intel goes for a design that is as cheap and simple as possible, but in that light they do well. Intel's IGPs are VERY cheap, and they support everything you might need... They have DX10 support, video acceleration, DVI/HDMI/HDCP and all that.
Given the amount of transistors that Intel spends on their IGPs and the number of processing units inside their IGPs, they are in line with what nVidia and AMD offer. They just apply the technology on a smaller scale (which is good for cost and for power consumption/heat, which matter more to a lot of people than overall performance).
OK, so Intel GPU's can't touch ATI/AMD or nVidia's best, we know this, we've always known this. Who has more GPU market share?
You want a list of other things that Intel does well?
- Network controllers: some of the best money can buy.
- Consumer-level SSDs: theirs are the best on the market in the benchmarks that matter most for performance (random R/W). AFAIK, even the X25-E doesn't have a super-cap or other cache protection, so I don't consider that to be a truly "enterprise" drive.
- NAND Flash memory
Just to name a few examples.
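For anyone wondering what a "random R/W" test actually does, here is a bare-bones random-read sketch in C++. The file name and sizes are made up, and a real benchmark would use unbuffered/direct I/O and a working set much larger than RAM so the OS page cache doesn't hide the drive:

```cpp
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <random>
#include <vector>

int main() {
    const std::size_t block  = 4096;        // 4 KiB, the usual random-I/O block size
    const std::size_t blocks = 256 * 1024;  // assumes a ~1 GiB test file already exists
    const int reads = 10000;

    std::FILE* f = std::fopen("testfile.bin", "rb");  // hypothetical pre-created test file
    if (!f) return 1;

    std::vector<char> buf(block);
    std::mt19937_64 rng(42);
    std::uniform_int_distribution<std::size_t> pick(0, blocks - 1);

    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < reads; ++i) {
        std::fseek(f, static_cast<long>(pick(rng) * block), SEEK_SET);  // jump to a random block
        std::fread(buf.data(), 1, block, f);                            // read 4 KiB
    }
    auto t1 = std::chrono::steady_clock::now();
    std::fclose(f);

    double secs = std::chrono::duration<double>(t1 - t0).count();
    std::printf("%.0f random 4K reads/sec\n", reads / secs);
    return 0;
}
```

A drive that keeps that number high under load is what makes a desktop feel fast, which is why random I/O gets weighted so heavily in SSD reviews.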
Yes, that's why Intel spent a billion-plus on Larrabee, for no reason at all, because they already ship the highest numbers of GPUs. Intel loves to just toss money to the wind; it doesn't matter at all.
What? Talk about a straw man argument. The GP said nothing to imply that Larrabee didn't matter, but was just responding to the ludicrous assertion by the GGP that Intel didn't do anything else well/successfully.
Why so much shit toward Intel? Their IGPs are not for you, that's fine, but there's no reason to call them a bad product out of rage. The IGP just works for MOST computer users. If that weren't the case, EVERYONE would have a discrete solution, and Intel wouldn't have such a huge market share in graphics; no one would allow it, not the OEMs, not MS.
They do work, just not for you, not for gaming. Sometimes people forget that they are not the majority. I'd say anyone reading AnandTech doesn't fall into the category of users who have always been fine with Intel's IGPs, or at least have been getting along fine with them.
Intel's announcement wasn't too surprising or devastating, it just makes things a bit less interesting.
I am a long-time reader of AnandTech (from its inception) and am not surprised by the above statement coming from this site. But I beg to differ; Intel has proven in recent news that they are indeed the top nominee for the "evil empire" award, perhaps for as long as 10 years running. If all the worldly claims ring true, Intel is at best an inhibitor of technology (except their own). What the world would be like without Intel's presence is always interesting to ponder: it would spark competition and innovation, once again, out of the (figurative) two-car garages where PCs began.
I also asked about Haswell in the comments of the last Larrabee cancelation article, because it was my understanding that some future version of Larrabee was planned to be integrated into the Haswell architecture as a GPU/AVX2 hybrid.
Anand himself mentioned something like this in his IDF 2009 Larrabee Demo article: "Nehalem was architected to be very modular, Jasper Forest is just another example of that. Long term you can expect Nehalem cores (or a similar derivative) to work alongside Larrabee cores, which is what many believe Haswell will end up looking like."
..never mind the fact they've had to pull firmware releases TWICE in a row to fix major data loss issues. Yeah, Intel's got a great track record with SSD technology right now. :p
Or how about substantiated claims about Intel's X25-E SSD having hardware/firmware problems which cause performance slowdowns and operational time-outs -- and at least one documented account of an enterprise storage provider who has kicked Intel to the curb, Pillar Data Systems Inc?
quote: Instead, the first version of Larrabee will exclusively be for developers interested in playing around with the chip. And honestly, though disappointing, it doesn't really matter.
This can't be taken seriously, really. Larrabee is a very, very important product for Intel. Without it, they will have nothing with which to fight AMD and Nvidia in the inevitable HPC push toward some form of GPU acceleration. It's going to happen, which is why Larrabee was born in the first place. To say that "it doesn't matter" is asinine at best; of course it matters! It matters a whole hell of a lot, and the failure of Intel to make Larrabee a sellable, viable product is a huge setback, financially and otherwise. Intel didn't sink vast amounts of money into Larrabee just to have it barely survive as some sort of developer platform. This was Intel's full-on foray into making a real graphics chip.
How does Intel expect to fight Fusion now? They will have nothing that comes even remotely close. This whole article is a big, fat attempt at softening the blow; it's beyond silly and honestly fairly insulting to the readers. Sites like this are supposed to tell it like it is, and the truth is Larrabee is a giant, multi-billion dollar flop. Yes, Intel can absorb it, but that doesn't make the failure any less significant in its own right.
I would also like to add that AMD gets flogged and beaten with the point that they don't have a processor that can hang with Intel's best. But at least they have a processor that is fairly close. Turn things around, Intel has zero in graphics that can compete with AMD, yet Intel gets a pass.
Intel should stop fooling around trying to make an x86-based graphics chip and design one properly, or buy up the IP to do it. Larrabee is essentially Intel trying to use x86 to monopolize the graphics space, but they found out the hard way that x86 is terrible at doing graphics, which is the reason real GPUs exist in the first place.
IMHO they are building this on x86 because sometime in the future you will just have the 100+ core array seen in the Intel slides, on which each core can be used for graphics or as a CPU. So you no longer have any distinction between CPU and GPU.
This would be pretty nice, since in theory the system could be balanced so that you're neither GPU- nor CPU-limited.
You seem to miss the part where Intel isn't canceling Larrabee as a consumer part permanently, they are just delaying consumer release. Pretty much everything you just said is irrelevant if you look past the one line you quoted.
quote: You seem to miss the part where Intel isn't canceling Larrabee as a consumer part permanently, they are just delaying consumer release.
Who cares? Something that exists in a lab or in obscurity for years on end doesn't do Intel any good, never mind the consumer. You can't buy Larrabee, and won't be able to for who knows how long, maybe never. Which makes it a failure, period. No amount of spin and excuses can change that. Apply the same logic to any other product: well, it's not available, and won't be for years, but that doesn't matter because it isn't actually canceled! All is well, nothing to worry about.
Pure nonsense. Suppose Fermi is never released, and Nvidia announces in 3 months that it won't be a real product but will live on as an internal project. Will anyone really be saying it doesn't matter? Of course not. The only reason Intel can get away with this is because it's not their core business. But that is going to hurt them going forward, because the GPU and CPU are on a collision course; it's only a matter of time. Which, again, is exactly the reason Larrabee was conceived in the first place.
It does Intel a TON of good if they can get functional, realistic hardware into the hands of developers to play with and help them refine their product. I doubt Intel wants to release a part that no one is willing to buy because of the inevitable problems it will have upon launch, regardless of how well or poorly it performs. By holding back and letting the developers pick it apart they have more time to improve Larrabee and are more likely to be able to limit/counter the difficulties they will encounter when it's finally launched.
quote: It does Intel a TON of good if they can get functional, realistic hardware into the hands of developers to play with and help them refine their product.
So you're saying that up until this point, Intel had no Larrabee cards in the hands of developers?
You know what does even more good? Having an actual product on the market AND in the hands of developers. That's how hardware becomes successful.
quote: By holding back and letting the developers pick it apart they have more time to improve Larrabee and are more likely to be able to limit/counter the difficulties they will encounter when it's finally launched.
How is holding back going to somehow magically make Intel able to improve Larrabee at a faster rate than they were working at before? It's not.
MORE developers.... Sony tried to get game studios to develop for the Cell, but a lack of development meant that there were few reasons to buy a PS3 in the first year or two after its release (unless you wanted a Blu-ray player). Intel could have a more successful launch if they can familiarize as many developers as possible with Larrabee and receive feedback from them.
And I never said that holding back will "improve Larrabee at a faster rate"; I said that they'll have fewer problems with the product when they do launch it.
Seriously, you seem to be trying too hard. Intel has working hardware but they don't feel it's ready. I'm willing to bet their major hurdle right now is software more than hardware, but either way if it isn't ready then it isn't ready. Trying to break into a new market with a half-done product would be a stupider move than delaying it until it IS ready.
"Intel could have a more successful launch if they can familiarize Larrabee with and receive feedback from as many developers as possible."
Here's the rub with this hypothesized plan. Familiarizing oneself with a product isn't some simple matter. It requires taking people away from productive projects simply to learn another option. What business in its right mind would do this? It would amount to Intel getting free testers. Moving personnel over to new chipsets is an expensive endeavor, and companies won't simply test Larrabee, especially since the when, or even if, of its release at mass scale is unknown. As it stands, "getting familiar" is either dumping cash into a huge hole or completely upside down once the present value of the investment is considered.
This is why it took developers a long time to get ramped up on the PS3 and Cell. Sony had to foot the bill to get companies like Naughty Dog, Insomniac, and Konami to put out major games that actually took advantage of Cell. Cell is proving to be totally dead in the water everywhere else. Consumer computing, the big dollars, hasn't picked up on it and likely never will. So Sony had to show developers personally that money could be made by investing in the skills to properly use Cell, even though after the PS3 it's likely a dead technology.
This is the only way Intel can get Larrabee off the ground. The notion that software development companies are going to waste valuable resources beta testing an unknown quantity needs to be dropped. Intel has to fund and market a product that uses Larrabee and prove that it can make it, because otherwise all we have to work on is Intel's history, which is far from something Intel wants to advertise when it comes to graphics processing.
The thing is that with a full x86 core + graphics they will never, EVER catch a pure GPU like the Radeon 5800. They will ALWAYS be behind on size and cost; it could only work if people didn't care about price (and that won't happen). It's like running graphics through some emulator: they will always need more resources than a pure GPU. Not to mention that performance in new games will be highly driver-dependent; every game would need a specific driver approach to run at full speed.
You know, there are companies out there that use IGPs to run their machines, and some of these companies are quite big. There isn't a huge need for graphics power, just a means to display something on the screen.
Case in point: at a previous company I worked for years ago, we had a contract with Arrow/Intel to purchase their PCs for our manufacturing needs. We bought hundreds monthly. Why didn't we want a better graphics card? Well, for one, we were running OS/2 Warp :) But most importantly, there wasn't a need at all... not in the banking/financial industry. Even now, the servers I built to process all the financial transactions use plain old IGPs.
To make matters even worse, we were paying for hardware that exceeded the requirements of the system itself, hahaha. What we really needed was more bandwidth, which is why fiber optic was designed in as the bus. But the CPU speed, the RAM, the graphics, and especially the storage were all ridiculously unneeded. But you can't get "older" generations, so you pay for what's current. It sucks when you're in that position.
All I'm saying is that it's good Intel wants to better the chipset, but it's even better that they are taking their time to work things out. The market will be there when it does come out; it's not going anywhere.
Larrabee selling as a 100% discrete graphics part was FAIL from day one.
Larrabee as discrete graphics + an HPC CPU could actually work.
So they just canceled the pure discrete graphics thing to save themselves from another Itanic.
Intel would need to buy nVidia to compete with AMD Fusion, especially several years out. Intel tried to do it themselves, and they couldn't pull it off. They probably have much more respect for nVidia now.
Have you been living under a rock? nVidia doesn't want to be purchased, and certainly not by Intel. And there are so many regulatory hurdles that would have to be jumped that it would never get past regulators. Realize that they'd own 80% of the GPU market plus a huge part of the chipset market. And once AMD bought ATI, they stopped making Intel chipsets.
[QUOTE]Intel hasn't said much about why it was canceled other than it was behind schedule. Intel recently announced that an overclocked Larrabee was able to deliver peak performance of 1 teraflop. Something AMD was able to do in 2008 with the Radeon HD 4870.[/QUOTE]
The overclocked Larrabee demonstration was an actual, achieved 1 TFLOP on an SGEMM 4K x 4K calculation, compared to a theoretical peak of 2 TFLOP, no? One article regarding the demonstration stated that the same calculation run on an NVIDIA Tesla C1060 (GT200) comes in at 370 GFLOP, while an AMD FireStream 9270 (RV770) manages only 300 GFLOP. So it's not something AMD was able to do in 2008, unless CrossFire is taken into consideration, at which point it might be possible. Anyway, the point is that the hardware isn't comparable to the GT200/RV770 generation.
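For context, the textbook FLOP count for an N x N matrix multiply (one multiply plus one add per inner-product term) makes those numbers easy to sanity-check; the figures below are just implied by the quoted rates, not separate measurements:

```latex
2N^3 = 2 \times 4096^3 \approx 1.37 \times 10^{11}\ \text{FLOPs}, \qquad
\frac{1.37 \times 10^{11}\ \text{FLOPs}}{10^{12}\ \text{FLOP/s}} \approx 0.14\ \text{s per multiply}, \qquad
\frac{1\ \text{TFLOP achieved}}{2\ \text{TFLOP theoretical}} = 50\%\ \text{utilization}
```

In other words, the demo ran at about half of Larrabee's theoretical peak on that workload.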
Nice find; it really goes to show how finicky all of these GPGPU applications tend to be at present. Going from 300 GFLOP on unoptimized code to a bit over 1 TFLOP with properly optimized code a bit over a year later is indeed quite nice. Though it also raises the question of whether the LRB1 hardware can be coaxed into that kind of utilization. If not, then hopefully the results gathered will at least be used to remove those bottlenecks from LRB2.
"Instead, the first version of Larrabee will exclusively be for developers interested in playing around with the chip. And honestly, though disappointing, it doesn't really matter."
Well I think it does matter as all Intel will have for the foreseeable future is the same crappy graphics they have had for years.
And it looks like Apple seems to think Intel's graphics are so bad they want nothing to do with them:
What's really funny is seeing Anand and Charlie doing their revisionist writing, claiming the Larrabee consumer part cancellation was always expected or doesn't really matter.
I don't think it will matter. Larrabee is not a waste. Forget graphics for a minute: there are many other uses for a Larrabee-like chip in the embedded space. Think of it as an MLC flash controller, a TCP/IP acceleration chip, an I/O processor, etc. The same silicon could be used for all of these with just a firmware change. Larrabee technology could make all of these application-specific controllers much cheaper and easier to produce.
If Arrandale is really power hungry (i.e. more than the 25W TDP in current Pxxxx CPUs), the move makes a lot of sense. Apple is probably looking towards lower power parts. Even the lowest power i7-720QM uses a lot more power than most Core 2 parts; how much would half the core count and 32nm help? Enough to make Arrandale competitive with Core 2, probably, but I am skeptical that the initial parts will actually be significantly better for battery life. Again, waiting for something else on Apple's part would make sense.
There's also a question of whether Apple might not use Arrandale + discrete with the ability to switch between the two. They do that already on the MacBook Pro, only with 9400M and 9600M. If OS X will run on Intel's IGP, which it should, I'd guess it's more likely we'll see hybrid GPUs rather than an outright shunning of Arrandale.
It's funny how Intel bungled pretty much everything else they ever tried to dabble in:
- networking
- graphics (they've been trying for a long time either IGP or discrete, and never made anything that didn't suck... I seem to remember Dell's DGX in 1995 ?)
- non x86 CPUs
It's almost like they subconsciously don't WANT anything but x86 to succeed. Or they just can't succeed when they don't have a mile-long head start. So let's once and for all accept the fact that they got lucky that x86 got chosen by IBM for PCs, have managed to more or less (with gentle prodding from AMD at times) not mess up their stewardship of the x86 architecture too badly, but can't get anything else right.
Let's stop wasting time following Intel. We'll get faster procs that consume less every few months, at different performance-versus-power levels. Thanks, guys.
"It's almost like they subsconsciously don't WANT anything but x86 to succeed."
funny you mention that, because they insisted larabbe would be x86. Yet there was no reason to, larabbe gained nothing and lost a lot from being forced to be an x86 chip. Intel is shooting itself in the foot here by trying to fit a round object into a square hole.
Rather than bash or praise Intel, might it make more sense to look at the larger context? Could products and opportunities linked to Light Peak in (multi-)multi-core x86s have (temporarily?) eclipsed or sidetracked Intel's efforts in discrete GPUs? Similarly, could Intel have decided to put more effort into getting iPhone-capable CPUs into mass production rather than divert energy and capital into GPUs? In short, is Intel now aiming less at high-end graphics and focusing more on shorter-term solutions to provide a lot more graphics power for the average user... while delaying its re-invention of the GPU wheel?
Intel network cards and network chips are really good. They were more or less the best you could get onboard, or for a reasonable price as a standalone card.
And they may not produce the best IGPs around when talking about 3D performance, but they have done really well in that segment, with about 50% market share in graphics. They were also first with interesting tech such as 7.1 LPCM over HDMI.
""But they have done really wel in that segment, with about 50% marketshare in graphics.""
That is like saying Microsoft has done really with in the browser market with IE because they have a >50% market share. Anything couldn't be further from the truth.
They have given it away for nearly free or below cost in order to sell their CPU's to major OEM's on more then one occasion in order to push their CPUs.
Just looking at their IGP chipset prices they don't cost anymore than the non-IGP versions, around $3-5 more from what I know.
I'm with you on the network front; I absolutely love Intel wireless chipsets over most of the other vendors', as well as their wired Ethernet implementations.
I also love Intel networking but don't like their server side at all. It didn't pan out so well, and it was a rough time working with them (ServerWorks) at the time (10 years ago, I believe).
filotti - Wednesday, December 16, 2009 - link
Guess now we know why Intel cancelled Larrabee...Coolmike890 - Thursday, December 10, 2009 - link
I find this whole thing laughable. I remember Intel's previous attempts at graphics. I remember people saying the same things back then: "Intel is going to dominate the next gen graphics" etc. It is for this reason I have pretty much ignored everything to do with Larrabee. I figured it would be as successful as their previous attempts were.The nVidia guy is right. GPU's are hard to make. Look at all of the work they and AMD have put into graphics. Intel doesn't have the patience to develop a real GPU. After awhile they are forced to give up.
Scali - Friday, December 11, 2009 - link
A lot has changed though. Back in the early days, everything was hardwired, and a GPU had very little to do with a CPU. GPUs would only do very basic integer/fixedpoint math, and a lot of the work was still done on the CPU.These days, GPUs are true processors themselves, and issues like IEEE floating point processing, branch prediction, caching and such are starting to dominate the GPU design.
These are exactly the sort of thing that Intel is good at.
nVidia is now trying to close the gap with Fermi... delivering double-precision performance that is more in line with CPUs (where previous nVidia GPUs and current AMD GPUs have a ratio of about 1:5 for single-precision vs double-precision, rather than 1:2 that nVidia is aiming for), adding true function calling, unified address space, and that sort of thing.
With the introduction of the GeForce 8800 and Cuda, nVidia clearly started to step on Intel's turf with GPGPU. Fermi is another step closer to that.
If nVidia can move towards Intel's turf, why can't Intel meet nVidia halfway? Because some 15 years ago, Intel licensed video technology from Real3D which didn't turn out to be as successful as 3dfx' Voodoo? As if that has ANYTHING at all to do with the current situation...
Coolmike890 - Saturday, December 12, 2009 - link
You make excellent points. I will happily acknowledge that things have changed a lot. However, Intel has not. Like I say, I doubt they have the patience to develop a fully-functional competitive GPU. It takes too long. A couple of business cycles, after all of the grand press releases, someone up in the business towers decides that it's taking too long, and it's not worth the investment.I think when they decide to do these things, the boardroom people think they'll be #1 inside of a year, and when it doesn't turn out that way, they pull the plug. THAT's why they won't pull it off.
Scali - Monday, December 14, 2009 - link
Well, it's hard to say at this point.Maybe it was a good decision to not put Larrabee into mass production in its current form. That saves them a big investment on a product that they don't consider to have much of a chance of being profitable.
They haven't cancelled the product, they just want another iteration before they are going to put it on the market.
Who knows, they may get it right on the second take. I most certainly hope so, since it will keep nVidia and AMD on their toes.
erikejw - Thursday, December 10, 2009 - link
"Larrabee hasn't been let out of Intel hands, chances are that it's more than 6 months away at this point. "Some developers have working silicon(slow) since 6 months.
That is all I will say.
erikejw - Thursday, December 10, 2009 - link
3 months.I love the editing feature.
WaltC - Tuesday, December 8, 2009 - link
___________________________________________________________"Do I believe the 48-core research announcement had anything to do with Larrabee's cancelation? Not really. The project came out of a different team within Intel. Intel Labs have worked on bits and pieces of technologies that will ultimately be used inside Larrabee, but the GPU team is quite different."
___________________________________________________________
It wasn't the Larrabee team or any other product development team that canceled Larrabee, from what I gather. It was management, and management sits above the development teams and is the entity responsible for making product cancellations as well as deciding which products go to market. I don't think the 48-core research announcement had anything directly to do with Larrabee's cancellation, either, because that 48-core chip doesn't itself exist yet. But I surely think the announcement was made just prior to the Larrabee cancellation in order to lessen the PR impact of Larrabee getting the axe, and I also don't think Larrabee being canceled just before the week-end was an accident, either.
________________________________________________________
"With the Radeon HD 5870 already at 2.7 TFLOPS peak, chances are that Larrabee wasn't going to be remotely competitive, even if it came out today. We all knew this, no one was expecting Intel to compete at the high end. Its agents have been quietly talking about the uselessness of > $200 GPUs for much of the past two years, indicating exactly where Intel views the market for Larrabee's first incarnation.
Thanks to AMD's aggressive rollout of the Radeon HD 5000 series, even at lower price points Larrabee wouldn't have been competitive - delayed or not."
_______________________________________________________
Well, if Larrabee wasn't intended by Intel to occupy the high end, and wasn't meant to cost >$200 anyway, why would the 5870's 2.7 TFLOPS benchmark peak matter at all? It shouldn't have mattered at all, should it, as Larrabee was never intended as a competitor? Why bring up the 5870, then?
And, uh, what were the "real-time ray tracing" demos Intel put on with Larrabee software simulators supposed to mean? Aside from the fact that the demos were not impressive in themselves, the demos plainly said, "This is the sort of thing we expect to be able to competently do once Larrabee ships," and I don't really think this fits the "low-middle" segment of the 3d-gpu market. Any gpu that *could* do RTRT @ 25-30 fps consistently would be, by definition, anything but "low-middle end," wouldn't it?
That's why I think Larrabee was canceled simply because the product would have fallen far short of the expectations Intel had created for it. To be fair, the Larrabee team manager never said that RTRT was a central goal for Larrabee--but this didn't stop the tech press from assuming that anyway. Really, you can't blame them--since it was the Intel team who made it a part of the Larrabee demos, such as they were.
_______________________________________________________
"It's not until you get in the 2013 - 2015 range that Larrabee even comes into play. The Larrabee that makes it into those designs will look nothing like the retail chip that just got canceled."
_______________________________________________________
Come on, heh...;) If the "Larrabee" that ships in 2013-2015 (eons of time in this business) looks "nothing like the retail chip that just got canceled"--well, then, it won't be Larrabee at all, will it? I daresay it will be called something else entirely.
Point is, if the Larrabee that just got canceled was so uncompetitive that Intel canceled it, why should the situation improve in the next three to five years? I mean, AMD and nVidia won't be sitting still in all that time, will they?
What I think is this: "Larrabee" was a poor, hazy concept to begin with, and Intel canceled it because Intel couldn't make it. If it is uncompetitive now, I can't see how a Larrabee by any other name will be any more competitive 3-5 years from now. I think Intel is aware of this and that's why Intel canceled the product.
ash9 - Wednesday, December 9, 2009 - link
Larrabee Demo:Did anyone note the judder effects of the flying objects in the distance?
Penti - Monday, December 7, 2009 - link
I really think It's a shame. I was hoping for a discreet Intel larrabee-card. It would be perfect for me that aren't a die hard gamer but still wants decent performance. I don't think the "software" route is a bad one, people don't seem to realize that that's exactly what AMD/ATi and nVidia is doing. Vidcards don't understand DX and OGL. Plus Video-cards is for so much more then DX and OGL now days. It's shaders that help in other apps, it's OpenCL, it's code written for specific gpus and so on. Also Intels Linux support is rock solid. An discreet GPU from Intel with Open source drivers would really be all it can be. Something the others wouldn't spend the same amount of resources trying to accomplish. Even the AMD server boards uses Intel network chips because they can actually write software. But at least we can look forward to the Arrandale/Clarkdale / H55. But I was also looking forward to something more resembling workstation graphics. But at least I hope Arrandale/Clarkdale helps to bring multimedia to alternative systems. Even though that will never fully happen in patent-ridden world. But at least it's easier to fill the gap with good drivers that can handle video. I also see no reason why Apple wouldn't go back to Intel chipsets including IGPs.Christobevii3 - Monday, December 7, 2009 - link
I was hoping they'd release this as an add in card for servers. Since it uses general processors, having an add in card with 32 or what not mediocre cpu's and being able to allocate them to virtual servers would be amazing.pugster - Monday, December 7, 2009 - link
I figured that Intel would cancel Larabee because of cost. An Intel core2duo wolfdale processor with 6mb of l2 cache has 410 million transistors and would easily sell for $200. My cheap geforce 9600gso would probably sell for $70 and the gpu itself has 505 transistors. I figure that even if Intel makes competitive gpus, they would barely eek out a profit selling these larabee gpus.Robear - Monday, December 7, 2009 - link
At this point I wouldn't be all that surprised to hear an announcement for a new discrete graphics microarchitecture. Intel has an advantage with nVidia and AMD both being fabless, and both using TSMC. A whole article could be written about that alone.Larabee was heckled around the nVidia/AMD circles at the notion of a software-driven, general purpose GPU. I wanted to believe that Intel had an ace up its sleeve, but nVidia and AMD have been doing this for a while. You couldn't just dismiss what they were saying.
The whole "fusion" architecture seems to be the future of the CPU / GPU, so it would surprise me if Intel dropped out of the discrete business completely. It makes sense to invest in graphics, and it makes sense to offset your R&D costs by selling your product in another market (i.e. mobile graphics -> discrete).
If I was to guess, I would assume that after performance profiling their engineers came about to "If we had purpose-specific hardware we could ..." You can use your imagination there.
Anyway, I would expect a new microarchitecture to be announced in the next 6 months or so. That might explain why they didn't kill larrabee completely. We'll probably see a GP, many-core architecture with purpose-specific instructions (Like SSE). x86 would be good, and I can see where Intel would benefit from that, but I wouldn't put it past Intel to go with a new instruction set specific to GPUs. I could see that instruction set quickly becoming a standard, too.
x86 instructions are good, but this instruction set fares well with multi-tasking branch-heavy threads, integer computing, and general I/O. GPUs have a very heavy FP workload that's usually streaming, and this has been handled with x86 extensions. I can see x86 support being a burden on the overall architecture, so it seems only natural to take lessons learned and start over.
Intel knows instruction sets. Intel knows compilers. Intel knows how to fab. My disappointment in not seeing an Intel GPU this year is mitigated by my excitement to see what Intel will be coming out with in its place.
Scali - Tuesday, December 8, 2009 - link
Larrabee already HAS purpose-specific instructions (LRBni).It's not that different from what nVidia does with Fermi, really. It's just that nVidia doesn't use the x86 ISA as their basis. Doesn't matter all that much when you use C++ anyway.
Shadowmaster625 - Monday, December 7, 2009 - link
Did anybody catch the innuendo in their choice of game? ... Enemy Territory!~ har har harUltraWide - Monday, December 7, 2009 - link
What about the angle that intel is still being investigated for anti-competitive business practices? No one has talked about this tidbit.The0ne - Monday, December 7, 2009 - link
I don't know about you but I am not surprise nor will be if they get investigated. But then neither am I about other companies trying to win market share. Dell does it, AMD does it, MS does it, Apple is the king of it all, etc.It's a topic that's not related to this topic as it's huge by itself.
Targon - Monday, December 7, 2009 - link
Intel has a problem, and that is that while they can do a VERY good job when focused on making ONE very good product, they don't do a very good job in multiple areas.Look at processors....Intel has been focused on making the best processors out there, but really, they don't make ANYTHING else that is really amazing. Chipset quality is good, because that SUPPORTS their CPU business. GPU designs are pathetic and so far behind AMD and NVIDIA that I won't recommend ANY system with an Intel graphics product. Their networking products are also less than amazing.
So, does Intel do ANYTHING really well other than processors and chipsets? Even when they try to branch out into other areas and invest a LOT of money, the results have yet to impress anyone.
Even when they have tried to come up with an all new design it has flopped EVERY time. The last successful all new design was the Pentium Pro(even though the original flopped, the design ended up in the Pentium 2 and eventually into the Core and Core 2 line of processors)....so, Intel has come up with THREE really original designs. The old 8086/8088, the Pentium, and the Pentium Pro. Itanium flopped, and what else will Intel do.
Scali - Tuesday, December 8, 2009 - link
Uhhh... the 286, 386 and 486 were all completely unique architectures aswell, and all were very successful.As for Intel's IGP... Sure, Intel goes for a design that is as cheap and simple as possible, but in that light they do well. Intel's IGPs are VERY cheap, and they support everything you might need... They have DX10 support, video acceleration, DVI/HDMI/HDCP and all that.
Given the amount of transistors that Intel spends on their IGPs and the number of processing units inside their IGPs, they are in line with what nVidia and AMD offer. They just apply the technology on a smaller scale (which is good for cost and for power consumption/heat, which matter more to a lot of people than overall performance).
somedude1234 - Monday, December 7, 2009 - link
OK, so Intel GPU's can't touch ATI/AMD or nVidia's best, we know this, we've always known this. Who has more GPU market share?You want a list of other things that Intel does well?
- Network controllers: some of the best money can buy.
- Consumer-level SSDs: Theirs are the best on the market in the benchmarks that matter the most for performance (random R/W). AFAIK, even the X25-E doesn't have a super-cap or other cache protection, so I don't consider that to be a truly "enterprise" drive.
- NAND Flash memory
Just to name a few examples.
AnandThenMan - Monday, December 7, 2009 - link
Yes, that's why Intel spent a a billion+ on Larrabee, for no reason at all, because they already ship the highest #s of GPUs. Intel loves to just toss money to the wind, it doesn't matter at all.
kkwst2 - Wednesday, December 9, 2009 - link
What? Talk about a straw man argument. The GP said nothing to imply that Larrabee didn't matter, but was just responding to the ludicrous assertion by the GGP that Intel didn't do anything else well/successfully.EnzoFX - Monday, December 7, 2009 - link
Why so much shit towards Intel. Their IGP's are not for you, that's fine, but there's no reason to call it a bad product out of rage. The IGP itself just works for MOST computer users. If this wasn't the case EVERYONE would have a discrete solution, and Intel wouldn't have such a huge marketshare in the Graphics market, no one would allow it, not the OEM's, or MS.They do work, just not for you, not for gaming. Sometimes people forget that they are not the majority. I'd say anyone reading Anandtech does not fall in this category, who have always been fine with Intel's IGP, or at least getting along fine with it.
- Monday, December 7, 2009 - link
Intel's announcement wasn't too surprising or devastating, it just makes things a bit less interesting.I am a long time reader of AnandTech (from its inception) and am not surprised by the above statement coming from this site. But I beg to differ; Intel has proven in recent news that they are indeed top nomine for the 'evil empire" award-perhaps for as long as 10 years. If all worldly claims ring true, Intel is at best an inhibitor of technology (except their own). What will the world be like without Intel’s presence is always interesting- it will spark competition and innovation, once again, out of 2 car garages (figurative) where PC’s began.
asH
stmok - Monday, December 7, 2009 - link
...Call it Larrabee 1.0 ;)Gen 1 = Prototype.
Gen 2 = Learn from Gen 1.
Gen 3 = Start again based on what you've learned from Gen 1 and 2.
Gen 4 = Refinement of Gen 3.
etc...
I think I'll stick to ATI or Nvidia solutions for a good while yet.
Mike1111 - Monday, December 7, 2009 - link
I also asked about Haswell in the comments of the last Larrabee cancelation article. Because it was my understanding that some future version of Larrabee was planed to be integrated into the Haswell architecture as a GPU/AVX2 hybrid.Anand himself mentioned something like this in his IDF 2009 Larrabee Demo article: "Nehalem was architected to be very modular, Jasper Forest is just another example of that. Long term you can expect Nehalem cores (or a similar derivative) to work alongside Larrabee cores, which is what many believe Haswell will end up looking like."
At least, I was one of many ;-)
haukionkannel - Monday, December 7, 2009 - link
Intel don't need anything faster than their today IGP's...Most consumer are happy with even that. But yeah to PC-players this is not so good news.
MamiyaOtaru - Monday, December 7, 2009 - link
because that's by far the most important benchmark for SSDs amirite?MrDiSante - Monday, December 7, 2009 - link
And random reads/writes that blow anything else out of the water. I recommend you actually read an article or two on this fine site about SSDs.Devo2007 - Monday, December 7, 2009 - link
..never mind the fact they've had to pull firmware releases TWICE in a row to fix major data loss issues. Yeah, Intel's got a great track record with SSD technology right now. :pMrPoletski - Wednesday, December 9, 2009 - link
Those major flaws that Intel pulled its SSDS for were ni reality fairly minor and unlikely to affect Joe Public.If it had been another company, the existance of a fault would probably have been denied completely.
Devo2007 - Monday, December 7, 2009 - link
Just to clarify: Intel SSDs are generally pretty good, but they have to run further tests on the firmware before releasing it.bradley - Monday, December 7, 2009 - link
Or how about substantiated claims about Intel's X25-E SSD having hardware/firmware problems which cause performance slowdowns and operational time-outs -- and at least one documented account of an enterprise storage provider who has kicked Intel to the curb, Pillar Data Systems Inc?http://www.computerworld.com/s/article/9138130/Pil...">http://www.computerworld.com/s/article/...SSD_to_t...
AnandThenMan - Monday, December 7, 2009 - link
This can't be taken seriously, really. Larrabee is a very very important product for Intel. Without it, they will have nothing to fight AMD and Nvidia in the inevitable HPC sector using some form of GPU acceleration. It's going to happen, which is why Larrabee was born in the first place. To say that "it doesn't matter" is asinine at best, of course it matters! It matters a whole hell of a lot, and the failure of Intel to make Larrabee a sellable and viable product is a huge setback, financially and otherwise. Intel didn't sink a vast amounts of money into Larrabee just to have it barely survive as some sort of developer platform. This was Intel's full on fray into making a real graphics chip.
How does Intel expect to fight Fusion now? They will have nothing that will come even remotely close. This whole article is a big fat attempt at trying to soften the blow, it's beyond silly and honestly fairly insulting to the readers. Sites like this are supposed to tell it like it is, and the truth is Larrabee is a giant, multi-billon dollar flop. Yes Intel can absorb it, but that doesn't make the failure any less significant in its own right.
AnandThenMan - Monday, December 7, 2009 - link
I would also like to add that AMD gets flogged and beaten with the point that they don't have a processor that can hang with Intel's best. But at least they have a processor that is fairly close. Turn things around, Intel has zero in graphics that can compete with AMD, yet Intel gets a pass.Intel should stop fooling around trying to make an x86 based graphics chip and design one properly, or buy up the IP to do it. Larrabee is essentially Intel trying to use x86 to monopolize the graphics space, but found out the hard way that x86 is terrible at doing graphics, the reason real GPUs exist it he first place.
beginner99 - Monday, December 7, 2009 - link
IMHO the are building this in x86 because sometime in the future you will just have the 100+ core array seen the Intel slides on which each core can be used for graphics or as CPU. So you actually don't have any distinction anymore between CPU and GPU.This would be pretty nice since the system can be balanced very nicely theoretically that your neither GPU nor CPU limited.
smartalco - Monday, December 7, 2009 - link
You seem to miss the part where Intel isn't canceling Larrabee as a consumer part permanently, they are just delaying consumer release. Pretty much everything you just said is irrelevant if you look past the one line you quoted.AnandThenMan - Monday, December 7, 2009 - link
Who cares? Something that exists in a lab or in obscurity for years on end doesn't do Intel any good, never mind the consumer. You can't buy Larrabee, and won't be able to for who knows how long, maybe never. Which makes it a failure, period. No amount of spin and excuses can change that. Apply the same logic to any other product, and say, well it's not available, won't be for years, but it doesn't matter it isn't actually canceled! All is well, nothing to worry about.
Pure nonsense. Suppose Fermi is never released, and Nvidia announces in 3 months that it won't be a real product, but will live on as an internal project. Will anyone really being saying, it doesn't matter? Of course not. The only reason Intel can get away with this is because it's not there core business. But that is going to hurt them going forward, because the GPU and CPU are on a collision course, it's only a matter time. Which again, is exactly the reason Larrabee was conceived in the first place.
Minion4Hire - Monday, December 7, 2009 - link
It does Intel a TON of good if they can get functional, realistic hardware into the hands of developers to play with and help them refine their product. I doubt Intel wants to release a part that no one is willing to buy because of the inevitable problems it will have upon launch, regardless of how well or poorly it performs. By holding back and letting the developers pick it apart they have more time to improve Larrabee and are more likely to be able to limit/counter the difficulties they will encounter when it's finally launched.AnandThenMan - Monday, December 7, 2009 - link
So you're saying that up until this point, Intel had no Larrabee cards in the hands of developers?
You know what does even more good? Having an actual product on the market AND in the hands of developers. That's how hardware becomes successful.
How is holding back going to somehow magically make Intel able to improve Larrabee at a faster rate than they were working at before? It's not.
Minion4Hire - Monday, December 7, 2009 - link
MORE developers.... Sony tried to get game studios to develop for the Cell, but a lack of development meant that there were few reasons to buy a PS3 in the first year or two after its release (unless you wanted a Blu-Ray player). Intel could have a more successful launch if they familiarize as many developers as possible with Larrabee and receive their feedback. And I never said that holding back will "improve Larrabee at a faster rate"; I said that they'll have fewer problems with the product when they do launch it.
Seriously, you seem to be trying too hard. Intel has working hardware but they don't feel it's ready. I'm willing to bet their major hurdle right now is software more than hardware, but either way if it isn't ready then it isn't ready. Trying to break into a new market with a half-done product would be a stupider move than delaying it until it IS ready.
LaughingTarget - Monday, December 7, 2009 - link
"Intel could have a more successful launch if they can familiarize Larrabee with and receive feedback from as many developers as possible."Here's the rub with this hypothesized plan. Familiarizing ones self with a product isn't some simple matter. This requires taking people away from productive products to simply learn another option. What business in its right mind would do this? This would amount to Intel getting free testers. Moving personnel over to new chipsets is an expensive endeavor and companies won't simply test Larrabee, espeically since the when, or even if, of its release on a mass scale. As it stands, "getting familiar" is either dumping cash into a huge hole or completely upside down when present values of the investment are considered.
This is why it took developers a long time to get ramped up on the PS3 and Cell. Sony had to foot the bill to get companies like Naughty Dog, Insomniac, and Konami to put out major games that actually took advantage of Cell. Cell is proving to be totally dead in the water everywhere else. Consumer computing, the big dollars, hasn't picked up on it and likely never will. As such, Sony had to show developers personally that money could be made by investing in the skills to properly use Cell, even though after the PS3 it's likely a dead technology.
This is the only way Intel can get Larrabee off the ground. The notion that software development companies are going to waste valuable resources beta testing an unknown quantity needs to be dropped. Intel has to fund and market a product that uses Larrabee and prove that it can succeed, because otherwise all we have to go on is Intel's history, which is far from something Intel wants to advertise when it comes to graphics processing.
Zool - Monday, December 7, 2009 - link
The thing is that with a full x86 core + graphics they will never EVER catch a pure GPU like the Radeon HD 5800. They will ALWAYS be behind in size and cost; it could only work if people didn't care about price (and that won't happen). It's like running graphics through an emulator: they will always need more resources than a pure GPU. Not to mention that performance in new games will be highly driver dependent. Every game would need a specific driver approach to run at full speed.
The0ne - Monday, December 7, 2009 - link
You know, there are companies out there that use IGPs to run their machines. And some of these companies are quite big. There isn't a huge need for graphics power, just a means to display something on the screen.
Case in point: in a previous company I worked for years ago, we had a contract with Arrow/Intel to purchase their PCs for our mfg needs. We had hundreds monthly. Why didn't we want a better graphics card? Well, for one we were running OS/2 Warp :) But most importantly, there wasn't a need at all... not in the banking/financial industry. Even now the servers I built to process all the financial transactions use plain old IGPs.
To make matters even worse, we were paying for hardware that exceeded the requirements of the system itself, hahaha. What we really needed was more bandwidth, which is why fiber optics were designed in as the bus. But the CPU speed, the RAM, the graphics, and especially the storage were all ridiculously unneeded. But you can't get "older" generations, so you pay for what's current. Sucks when you're in that position.
All I'm saying is it's good that Intel wants to better the chipset but it's even better that they are taking their time to work things out. The market will be there when it does come out, they're not going anywhere.
Zool - Monday, December 7, 2009 - link
Larrabee selling as a 100% discrete graphics part is a FAIL from day one. Larrabee as discrete graphics + an HPC CPU could actually work.
So they just canceled the pure discrete graphics thing to save themselves from another Itanic.
rolodomo - Sunday, December 6, 2009 - link
Intel will probably have to buy Nvidia to compete with AMD Fusion, especially several years out. Intel tried to do it themselves, and they couldn't pull it off. They probably have much more respect for Nvidia now.
AMD/ATI v. Intel/Nvidia
dagamer34 - Monday, December 7, 2009 - link
Have you been living under a rock? nVidia doesn't want to be purchased, and certainly not by Intel. And there are so many regulatory hurdles that would have to be jumped that it would never get past regulators. Realize that they'd own 80% of the GPU market plus a huge part of the chipset market. And once AMD bought ATI, they stopped making Intel chipsets.
JHBoricua - Monday, December 7, 2009 - link
Nvidia has 80% of the GPU market? Only in their PR department. Otherwise, citations needed.
martinw - Monday, December 7, 2009 - link
I think the previous poster meant that Intel+Nvidia together have around 80% of the GPU market.
Khato - Sunday, December 6, 2009 - link
"Intel hasn't said much about why it was canceled other than it was behind schedule. Intel recently announced that an overclocked Larrabee was able to deliver peak performance of 1 teraflop. Something AMD was able to do in 2008 with the Radeon HD 4870."
The overclocked Larrabee demonstration was an actual sustained performance of 1 TFLOPS on a 4K x 4K SGEMM calculation, compared to a theoretical peak of 2 TFLOPS, no? One article regarding the demonstration stated that the same calculation run on an NVIDIA Tesla C1060 (GT200) comes in at 370 GFLOPS, while an AMD FireStream 9270 (RV770) manages only 300 GFLOPS. So it's not something AMD was able to do in 2008, unless CrossFire is taken into consideration, at which point it might be possible. Anyway, the point is that the hardware isn't comparable to the GT200/RV770 generation.
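(For context, here is a quick back-of-the-envelope sketch of where utilization figures like these come from. The peak-throughput numbers in the snippet are approximate values assumed for illustration, not official specs; the 2 TFLOPS Larrabee peak is simply the figure cited in the comment above.)

```python
# Back-of-the-envelope SGEMM utilization math (illustrative only).
# Peak throughput figures below are assumed approximations, not official specs.

N = 4096
flops_per_sgemm = 2 * N**3  # one multiply and one add per inner-loop step
print(f"Work per 4K x 4K SGEMM: {flops_per_sgemm / 1e9:.1f} GFLOP")  # ~137.4 GFLOP

# name: (reported sustained GFLOP/s, assumed single-precision peak GFLOP/s)
chips = {
    "Larrabee demo (overclocked)": (1000, 2000),
    "NVIDIA Tesla C1060 (GT200)":  (370,   933),
    "AMD FireStream 9270 (RV770)": (300,  1200),
}

for name, (sustained, peak) in chips.items():
    print(f"{name}: {sustained / peak:.0%} of theoretical peak")
```

On those rough numbers the Larrabee demo lands around 50% of its claimed peak, which is why the raw 1 TFLOPS figure says as much about software maturity as about the silicon.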
br0wn - Monday, December 7, 2009 - link
Actually Anand is correct. Here is a link to an actual performance of 1 Teraflop SGEMM using Radeon HD 4870: http://forum.beyond3d.com/showthread.php?t=54842
The quoted 300 GFlops for RV770 from BSN is based on unoptimized code.
Khato - Monday, December 7, 2009 - link
Nice find, it really goes to show how finicky all of these GPGPU applications tend to be at the moment. Going from 300 GFLOPS with unoptimized code to a bit over 1 TFLOPS with properly optimized code roughly a year later is indeed quite nice. Though it also raises the question of whether or not the LRB1 hardware can be coaxed into that level of utilization. If not, then hopefully the results gathered will at least be used to remove those bottlenecks from LRB2.
HighTech4US - Sunday, December 6, 2009 - link
"Instead, the first version of Larrabee will exclusively be for developers interested in playing around with the chip. And honestly, though disappointing, it doesn't really matter."Well I think it does matter as all Intel will have for the foreseeable future is the same crappy graphics they have had for years.
And it looks like Apple seems to think Intel's graphics are so bad they want nothing to do with them:
Apple ditches 32nm Arrandale, won't use Intel graphics
http://www.brightsideofnews.com/news/2009/12/5/app...
IntelUser2000 - Sunday, December 6, 2009 - link
"Apple ditches 32nm Arrandale, won't use Intel graphics
http://www.brightsideofnews.com/news/20...randale2..."
I don't believe this. The reason is that when you plug in an external GPU, the IGP gets disabled anyway.
Plus, BSN has been wrong on so many things, it's surprising people take most of its articles as more than just humor.
HighTech4US - Monday, December 7, 2009 - link
BSN was very right about Larrabee:
10/12/2009 An Inconvenient Truth: Intel Larrabee story revealed
http://brightsideofnews.com/news/2009/10/12/an-inc...
What's really funny is seeing Anand and Charlie doing their revisionist writing, claiming the Larrabee consumer part cancellation was always expected or doesn't really matter.
I haven't stopped laughing yet.
Jaybus - Monday, December 7, 2009 - link
I don't think it will matter. Larrabee is not a waste. Forget graphics for a minute. There are many other uses for a Larrabee-like chip in the embedded space. Think of it as an MLC flash controller, a TCP/IP acceleration chip, an I/O processor, etc. The same silicon could be used for all of these with just a firmware change. Larrabee technology could make all of these application-specific controllers much cheaper and easier to produce.
IntelUser2000 - Monday, December 7, 2009 - link
I'm not saying they are wrong about everything, but... write enough crap and you are bound to get a few right.
HighTech4US - Monday, December 7, 2009 - link
I somehow missed the topic change to Charlie and his Semi-Accurate site, because "crap and bound to get a few right" is more like his drivel.
JarredWalton - Sunday, December 6, 2009 - link
If Arrandale is really power hungry (i.e. more than the 25W TDP in current Pxxxx CPUs), the move makes a lot of sense. Apple is probably looking towards lower power parts. Even the lowest power i7-720QM uses a lot more power than most Core 2 parts; how much would half the core count and 32nm help? Enough to make Arrandale competitive with Core 2, probably, but I am skeptical that the initial parts will actually be significantly better for battery life. Again, waiting for something else on Apple's part would make sense.
There's also a question of whether Apple might not use Arrandale + discrete with the ability to switch between the two. They do that already on the MacBook Pro, only with the 9400M and 9600M. If OS X will run on Intel's IGP, which it should, I'd guess it's more likely we'll see hybrid GPUs rather than an outright shunning of Arrandale.
/Speculation. :)
StormyParis - Sunday, December 6, 2009 - link
It's funny how Intel bungled pretty much everything else they ever tried to dabble in:- networking
- graphics (they've been trying for a long time, both IGP and discrete, and never made anything that didn't suck... I seem to remember Dell's DGX in 1995?)
- non x86 CPUs
It's almost like they subconsciously don't WANT anything but x86 to succeed. Or they just can't succeed when they don't have a mile-long head start. So, let's once and for all accept the fact that they got lucky x86 got chosen by IBM for PCs, that they managed to more or less (with gentle prodding from AMD at times) not mess up their stewardship of the x86 architecture too badly, but that they can't get anything else right.
Let's stop wasting time following Intel. We'll get faster procs that consume less every few months, at different performance and power levels. Thanks guys.
Gasaraki88 - Friday, December 11, 2009 - link
Intel network cards are some of the best out there. I choose them for my ESX servers.
taltamir - Tuesday, December 8, 2009 - link
"It's almost like they subsconsciously don't WANT anything but x86 to succeed."funny you mention that, because they insisted larabbe would be x86. Yet there was no reason to, larabbe gained nothing and lost a lot from being forced to be an x86 chip. Intel is shooting itself in the foot here by trying to fit a round object into a square hole.
sanzong - Wednesday, December 9, 2009 - link
Shouldn't it be "trying to fit a round object into an oval hole"? :-)
FJD - Monday, December 7, 2009 - link
Rather than bash or praise Intel, might it make more sense to look at the larger context? Could products and opportunities linked to Light Peak in (multi-)multi-core x86s have (temporarily?) eclipsed or sidetracked Intel's efforts in discrete GPUs? Similarly, could Intel have decided to put more effort into getting iPhone-capable CPUs into mass production rather than divert energy and capital into GPUs? In short, is Intel now aiming less at high-end graphics and focusing more on shorter-term solutions that provide a lot more graphics power for the average user... while delaying its re-invention of the GPU wheel?
cdillon - Monday, December 7, 2009 - link
Intel bungled networking? Intel makes some of the best NICs you can buy, wired and wireless. Or did you mean something else?
CharonPDX - Monday, December 7, 2009 - link
"Networking", "Dabbled in"?Intel is generally considered to be the gold-standard of PC/serer network hardware for everything but the ultra-high-end "mission critical" segment.
(Yes, their attempt to get into network switches has been pretty much halted at the mid-range level; but even at that level, they're rock solid.)
knirfie - Monday, December 7, 2009 - link
What do you mean, bungled? Intel network cards and network chips are really good. They were more or less the best you could get onboard, or for a reasonable price as a standalone card.
And they may not produce the best IGPs around when talking about 3D performance, but they have done really well in that segment, with about 50% market share in graphics. They were also first with interesting tech such as 7.1 LPCM over HDMI.
Kougar - Monday, December 7, 2009 - link
""But they have done really wel in that segment, with about 50% marketshare in graphics.""That is like saying Microsoft has done really with in the browser market with IE because they have a >50% market share. Anything couldn't be further from the truth.
Jaybus - Monday, December 7, 2009 - link
It's nothing like IE. They don't give away IGP hardware for free. 50% of that market is a substantial profit for Intel.
Kougar - Thursday, December 10, 2009 - link
They have given it away nearly free or below cost to major OEMs on more than one occasion in order to push their CPUs. Looking at their IGP chipset prices, they don't cost much more than the non-IGP versions, around $3-5 more from what I know.
Jedi2155 - Monday, December 7, 2009 - link
I'm with you on the networking front; I absolutely love Intel wireless chipsets over most of the other vendors', as well as their wired Ethernet implementations.
The0ne - Monday, December 7, 2009 - link
Also love Intel networking, but I don't like their server side at all. It didn't pan out so well, and it was a rough time working with them (ServerWorks) at the time (10 years ago, I believe).
n7 - Sunday, December 6, 2009 - link
Yeah those SSDs they've made are just terrible...
jeffrey - Sunday, December 6, 2009 - link
You bring up some interesting points. I would like to add Itanium. Tukwila is running a little late, just like all of the other members of the family.
Anand, any chance of an in-depth status update on the Intel Itanium architecture?
Taft12 - Monday, December 7, 2009 - link
Uhh, he DID include Itanium when he mentioned non-x86 CPUs.
I don't know of Intel screwing up any networking business. Their Ethernet cards are regarded as some of the best available.