Intel CEO Blames Company's Obsessive Focus on Capturing 90% CPU Market Share For Missing Out on Other Opportunities (wccftech.com)
Intel chief executive Bob Swan says he's willing to let go of the company's traditional dominance of the CPU market in order to meet the rising demand for newer, more specialized silicon for applications such as AI and autonomous cars. From a report: Intel's Bob Swan blames the company's focus on 90% CPU market share for missed opportunities and transitions, and envisions Intel holding 30% of the all-silicon TAM instead of a majority of the CPU TAM. Just a few years ago, Intel owned more than 90% of the x86 CPU market. Many financial models used Intel's revenue as a proxy for the Total Available Market (TAM) of the CPU sector; with full-year revenue of $59.4 billion in 2017, you can estimate the TAM of the CPU market at roughly $66 billion (2017 est.). Bob Swan believes that this mindset of protecting a majority share on the CPU side has led to Intel becoming complacent and missing out on major opportunities. He even went as far as to say that he is trying to "destroy" the thinking of having a 90% market share on the CPU side, and instead wants people to come into the office thinking Intel has 30% market share in "all silicon."

Swan on how Intel got to where it is now: "How we got here is really kind of threefold. One, we grew a lot faster than we expected: the demand for CPUs and servers grew much faster than we expected in 2018. You'll remember we came into 2018 projecting 10% growth and we grew by 21%, so the good-news problem is that demand for our products in our transformation to a data-centric company was much higher than we expected. Secondly, we took on 100% market share for smartphone modems and we decided that we would build them in our fabs, so we took on even more demand. And third, to exacerbate that, we slipped on bringing our 10nm to life, and when that happens you build more and more performance into your last generation -- for us, 14nm -- which means a higher core count and larger die size. So those three -- growing much faster than we thought, bringing modems inside, and delaying 10nm -- resulted in a position where we didn't have flexible capacity."
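As a back-of-envelope check of the $66 billion figure above: dividing Intel's reported revenue by its assumed market share gives the implied TAM. A minimal sketch in C, assuming revenue is a clean proxy for ~90% of the market (it isn't exactly, hence "est."):

```c
/* Back-of-envelope TAM estimate: if Intel's 2017 revenue represents
 * ~90% of all CPU revenue, the implied market size is revenue/share. */
#include <stdio.h>

int main(void) {
    double intel_revenue_2017 = 59.4; /* billions USD, full-year 2017 */
    double assumed_share = 0.90;      /* assumed ~90% x86 CPU market share */
    printf("Implied CPU TAM (2017 est.): ~$%.1f billion\n",
           intel_revenue_2017 / assumed_share);
    return 0;
}
```

59.4 / 0.90 comes out to $66 billion, matching the estimate quoted above.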
Re: (Score:3, Interesting)
Low power got them Core to begin with. It's not a bad strategy. I think they also need to focus on chiplet designs. They have yield problems, and AMD solved theirs with chiplets. Intel could get back into the game in a few release cycles if they worked on a chiplet design. Finally, they should drop their crap 10nm process and just go to a new process node. Follow the AMD blueprint here and they can actually catch up.
What CAUSED Intel's insufficient management? (Score:4, Insightful)
However, what is needed more is a discussion of the conditions that caused Intel not to do well. I posted this comment earlier this year: Intel management seems insufficient. [slashdot.org] Part of that comment:
"The Intel CEO, Robert Swan Intel CEO, Robert Swan [intel.com], apparently has little or no technical knowledge."
The CEO doesn't understand the technical details of managing the company? To me, that is an underlying reason companies don't do well.
Re: (Score:2)
The most knowledgeable CEO Intel ever had on process technology and fabs was Brian Krzanich, who CAME from leading TMG, the division responsible for that. AND YET the failings of the 10nm process happened on his watch. You could make a good argument that the design goals for 10nm were too aggressive because he was enthusiastic about getting a really superior node, and that a less technical, less knowledgeable but more business-savvy CEO might have been more risk-averse and asked TMG to prioritize safety and timeliness.
Overall reasons for Intel's poor management? (Score:2)
It's not possible for someone who doesn't understand the business in depth to be fully "business-savvy".
You are directing our attention to important details. But what are the overall reasons?
Re:Overall reasons for Intel's poor management? (Score:5, Insightful)
I disagree. The leader certainly needs to be familiar with the business's core competency, but they need not be a subject matter expert in it. They need to know enough to be able to ask the right questions so they can make informed decisions. The job of Intel's CEO is not to develop the process technology themselves; it's to put the right people in positions of power who can, and to hold them accountable. Having too much technical knowledge and not enough business knowledge can be just as bad for a CEO as the opposite problem. They can get stuck in the trenches where they don't belong and lose track of where they are directing the company as a whole.
MOD Parent UP! (Score:3)
Quoting, with corrections and editing:
The leader needs to be capable of understanding the core competency but does not need to be an expert in it.
Technology company leaders need to know enough to be able to ask the right questions so they can make informed decisions.
Another quote:
Having
Re: (Score:2)
Swan has been CEO since January of this year. How do you explain Intel's problems before that? How many of Intel's problems started only this year?
Re: Here's another focus area: low power (Score:2)
Re: (Score:2)
That's ... really bad. TSMC's 5nm started "risk production" last March, and they are targeting 2020 for volume production, ramping up in the first half of the year. They started risk production of their 7nm+ enhancement in August of last year, which I mention because it adds EUV lithography to the mix; AMD's Zen 3 is scheduled to use it. Intel's "10nm" node is purely 193nm UV lithography; it would appear they don't have "real world" EUV experience yet.
I also read [tsmc.com] from their website that they have a 7nm
Go AMD! (Score:5, Interesting)
AMD had been lagging behind in the x86 CPU area until a couple years ago, when they started churning out glorious CPUs. All my machines are slowly switching to AMD CPUs. My fiancee's mITX PC has a Ryzen 3600, my sister's build will have either 3950X or 2950X (whichever becomes available first when I start buying the components), and my main PC will have a Threadripper 3xxx (will upgrade sometime during fall next year). My freshly built NAS has a Ryzen 3 1200 in it as well. All my friends who upgraded their PCs during the last couple years have chosen AMD.
Intel's dominance in the CPU market has two reasons, neither of which is related to performance. The first is their multi-year exclusive deals with companies such as Lenovo, Dell, HP, and Toshiba (using Intel CPUs in laptops), and the second is capacity. They can churn out a lot more CPUs.
Re:Go AMD! (Score:4)
I'll probably buy AMD for my next system anyway because Intel honestly deserves what they're getting right now and I like that they don't change sockets every ten seconds. Maybe if Intel gets crushed hard enough, they'll start doing things the right way like AMD has been all along and we'll have two legitimate competitors.
Re: (Score:3)
Re: (Score:2)
Intel only has an edge in clock speeds at the very top end of their product line; for the rest of their chips, Intel is behind in terms of performance. Think about it: you hear about the 9900K and the 9700K, but for the rest of the Intel products, for the price, you end up with performance that isn't all that great. Reduced overall performance from security mitigations (due to all the security problems with Intel chips) means that in another two years, even the 9900K may end up being slower than the
Re: (Score:1)
BZZZZT! AMD's high-performance, Out-of-Order (OoO) chips have the same class of Spectre bugs that every other OoO design and company baked into their CPUs: ARM, IBM (both mainframe and POWER), MIPS, and SPARC. That was official in the very first Spectre paper (see the sketch below).
No one has really cared that much about AMD's chips be
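For readers unfamiliar with the bug class being referenced: below is a minimal sketch of the Spectre v1 "bounds check bypass" gadget pattern described in the original Spectre paper; the array names and the 4096-byte stride follow the paper's illustrative conventions and are not any vendor's actual code.

```c
/* Minimal sketch of a Spectre v1 (bounds check bypass) gadget, the
 * pattern from the original Spectre paper. If the branch is
 * speculatively taken with an out-of-bounds x, the secret byte
 * array1[x] leaves a cache footprint in array2 that an attacker can
 * recover via a timing side channel. Illustrative only. */
#include <stdint.h>
#include <stddef.h>

uint8_t array1[16];
uint8_t array2[256 * 4096];
size_t array1_size = 16;

void victim_function(size_t x) {
    if (x < array1_size) {                             /* bounds check...   */
        volatile uint8_t t = array2[array1[x] * 4096]; /* ...speculatively
                                                          bypassed          */
        (void)t;
    }
}

int main(void) {
    victim_function(0); /* an in-bounds call; attacks first "train" this branch */
    return 0;
}
```

The point the parent is making: this pattern depends only on speculative out-of-order execution, which is why every high-performance OoO design, AMD's included, was affected.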
Re: (Score:2)
Most PC games are multithreaded now, since all the consoles have been multicore since the 2000s. It's just that there tends to be one thread that is the limiting factor (if not the GPU), so single-threaded performance can help... for peak FPS.
If you look at average FPS and the 99th percentile (a tail statistic over frame times; see the sketch below), AMD chips are very competitive. Also, when you look at bang-for-buck metrics, you can't beat AMD.
For most people the best thing would probably be to get an AMD CPU and spend the savings on a better GPU, then a few years later
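For context on the percentile metric mentioned above: a 99th-percentile figure is just a tail statistic over captured per-frame times. A minimal sketch with synthetic numbers (nearest-rank method; real benchmark tools differ in interpolation details):

```c
/* Sketch: compute a 99th-percentile frame time from a capture of
 * per-frame times (synthetic data; illustrative only). */
#include <stdio.h>
#include <stdlib.h>

static int cmp_double(const void *a, const void *b) {
    double x = *(const double *)a, y = *(const double *)b;
    return (x > y) - (x < y);
}

int main(void) {
    double frame_ms[] = { 16.6, 16.9, 17.1, 16.7, 33.4, 16.8, 17.0, 16.6 };
    size_t n = sizeof frame_ms / sizeof frame_ms[0];
    qsort(frame_ms, n, sizeof frame_ms[0], cmp_double);
    size_t rank = (99 * n + 99) / 100;  /* nearest rank: ceil(0.99 * n) */
    double p99 = frame_ms[rank - 1];
    printf("99th-percentile frame time: %.1f ms (~%.0f FPS)\n",
           p99, 1000.0 / p99);
    return 0;
}
```

A single 33ms stutter barely moves the average FPS but shows up immediately in the tail, which is why reviewers quote the percentile alongside the average.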
Re: (Score:3)
Intel's dominance in the CPU market has two reasons (...) the second is capacity. They can churn out a lot more CPUs.
Not in the last decade. During the bad years between Bulldozer and Zen, AMD had to pay several penalties because they couldn't sell enough chips to buy all the wafers they had committed to; they had the capacity but not the customers. Intel played dirty too, but from 2006 (Core 2) to 2011 (Sandy Bridge) they made huge [anandtech.com] strides technologically; it's only after they saw that Bulldozer was a dud that they started taking their foot off the gas pedal. I loved the Athlons of the early 2000s as much as anybody but th
Re: (Score:2)
The inability to command fab capacity is one of the implications of their decision to go fabless
Re: (Score:2)
Re: (Score:3)
Until, as a hypothetical, the external manufacturer says something like, sorry, Apple has already reserved the extra capacity you now desire, we'll pencil you in for half of what you want next year.
AMD's most recent really big success period was due to superior high level architecture decisions (Hypertransport + direct memory attachment to CPUs vs. a FSB, and the AMD64 macroarchitecture when Itanium AKA IA-64 faile
Re: (Score:2)
Before their good chips AMD would just have wasted lots of money on their own fabs because they would have been underutilized. The fine they paid is probably not so much compared to that cost. And they would have been unable to finance the upgrade to "7nm", and then be stuck with a good processor design but not enough production capacity to make mon
Re: (Score:1)
Please "show your work", how does being the most technologically advanced happen only because of a financial advantage, which you're absolutely sure no one else was able to muster during this history (once upon a time VCs funded hardware as well as software companies)? Intel is showing right now that all the money in the world can't make their "10 n
Re: (Score:2)
Upgrading to new nodes has become more and more expensive over time. Can you show me a company that jumped ahead of everyone with a new node without investing huge amounts of money? And why should that suddenly happen for AMD? Fab upgrades are not really their area of expertise.
The money to pay for these upgrades must come from somewhere. It requires production volume, or a really revolutionary chip. Having investors does not change that; AMD w
Re: (Score:2)
Many things could make this a temporary advantage, like Intel succeeding in their "7 nm" node (but I'm not betting on that one)
They could strike back, but I'm pretty sure the window of opportunity where Intel could get so far ahead as to drive AMD out of business has closed. There aren't enough silicon generations left before both Intel and TSMC hit some kind of physical limit -- a silicon lattice cell is about 0.5nm wide, so the shrinking simply can't go on -- and then I suspect they'll end up more like Airbus and Boeing: both making airplanes, but neither so revolutionary that it takes the whole market.
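A rough sanity check of that lattice argument (a minimal sketch; silicon's lattice constant is about 0.543nm, and marketing node names no longer correspond to any single physical feature size, so the widths below are purely illustrative):

```c
/* How many silicon lattice cells (~0.543 nm each) fit across a given
 * feature width. Node names stopped tracking real dimensions long ago,
 * so these widths are illustrative, not actual transistor features. */
#include <stdio.h>

int main(void) {
    const double lattice_nm = 0.543; /* silicon lattice constant */
    const double widths_nm[] = { 10.0, 7.0, 5.0, 3.0, 2.0 };
    for (size_t i = 0; i < sizeof widths_nm / sizeof widths_nm[0]; i++)
        printf("%5.1f nm ~ %4.1f lattice cells\n",
               widths_nm[i], widths_nm[i] / lattice_nm);
    return 0;
}
```

Once a dimension is only a handful of lattice cells wide, there is not much room left to shrink, which is the Airbus/Boeing point above.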
Re: (Score:2)
I think they were sort of forced into it by very bad decisions and mistakes they made right after beating Intel across the board. But whatever the reasons, their ability to bid against others for TSMC capacity will be a ceiling for what they'll be able to accomplish. Of course, it's their competitor being entirely unable to use its equivalent process node that gives them something of an advantage for now,
intel did Shit like raid keys caping pci-e lanes i (Score:2)
Intel did shit like RAID keys and capping PCIe lanes in high-end CPUs.
BS like: last gen the $500 CPU had full lanes; next gen the $500 one was capped and you had to get the $900 one to get all the lanes.
Re:Go AMD! (Score:5, Interesting)
> All my friends who upgraded their PCs during the last couple years have chosen AMD.
Same. Everyone I talk to is super excited about Ryzen and Threadripper; none of my computer friends are talking about Intel chips. All we have seen from Intel is incremental upgrades -- quite a few of us are still on the 3770K / 4770K / 5770K era and have seen ZERO reason to upgrade -- until now! I predict the R9 3900X is going to become the new go-to enthusiast chip -- similar to how the i7 4770K was an extremely popular quad-core.
I just bought another Threadripper 1920X (12C/24T for $200!!!) and have a TR 3960X on (pre)order.
> Intel's dominance in the CPU market has two reasons
I would humbly add Lies of Marketing as reason #3.
Intel's shenanigans have included outright lying [youtube.com] (Intel's Disgraceful Marketing Gets Worse). When even Linus flames Intel [youtube.com] over the Core i9 10980XE, you know it's bad. Best YT comment was this:
Re: (Score:2)
quite a few of us are still on the 3770K / 4770K / 5770K era and have seen ZERO reason to upgrade -- until now!
God damn it! I was hoping for another couple years of not paying for CPU/Mobo updates. Still on DDR3 here, so probably full build will be necessary.
Re: (Score:2)
Even though the i7 3770K / 4770K / 5770K are getting a little old, they are still decent chips. It REALLY depends on what you're doing.
The biggest problem with the R9 3950X, TR 3960X, and TR 3970X is availability! (Some might argue the high price(s), but these are HEDT -- they are NOT meant for casual users / gamers. I will admit the price of the 3rd-gen Threadripper mobos being north of $500 is annoying. People eBaying them is just outright robbery.)
Basically bookmark this price summary and availability [nowinstock.net] meta
Re: (Score:2)
Re: (Score:2)
But yeah the trinity of hardware upgrades is CPU, RAM, and Motherboard IMHO
PSU as well, in most cases, because you really don't want your new hardware running on a 7-year-old 450W PSU, which might or might not provide all the specialized pinouts, such as the 8-pin CPU power connector. Add a new case too if you cheaped out in the past.
Re: (Score:2)
A new case is definitely a nice QoL upgrade!
From my experience I haven't had any issues with PSUs, as I tend to buy 750W - 1,000W units at 80+ Gold or better. E.g., I just ordered a 1,000W Titanium PSU for the TR 3960X.
Corsair has an older but good PSU guide on efficiency [corsair.com].
Re: (Score:2)
I have never found a fanless Mini-ITX board with an AMD CPU.
Re: (Score:2)
What does this have to do with anything?
Re: (Score:2)
It's another reason to buy an Intel CPU.
Re: (Score:2)
Intel's dominance in the CPU market has two reasons, none of which is related to performance.
Sorry, but that is just false. Intel's dominance has a lot to do with performance as well. CPUs and computers aren't just tossed out like iPhones once a year. AMD being "back in the game" is a recent phenomenon, and there's a shitton of Intel CPUs from the days when AMD had no viable performing alternative that will keep ticking for many years. And as it stands, Intel still holds the single-threaded performance crown, though that is becoming less and less relevant.
Capacity also has little to do with it. You say
Re: (Score:2)
Re: (Score:2)
Out of over a dozen AMD systems, only one motherboard of mine ever died, and that was a Soyo Dragon, which was a shit motherboard for overclocking while I was pushing it very, very hard, and it was also from the bad-capacitor era.
You sound like the guy in the comments of a recent story who picked up a Gigabyte Aorus board and apparently never bothered to check the reviews. The Aorus line is quite possibly the worst line of motherboards in history, and it took me all
Re: (Score:2)
Last time with the FX-8150
That was a long time ago. Things have improved tremendously since.
Re: (Score:1)
Stagnation (Score:5, Interesting)
Yeah, he is saying that because they are getting massacred by AMD’s EPYC 2nd Gen Rome processors and they have no viable way to compete with them for the foreseeable future. For instance, one of the road maps I saw said that Intel wasn’t even contemplating a migration from PCIe 3.0 until 2021 at the earliest. AMD already has PCIe 4.0, which is 252 gigabit/s for x16 lanes. This is a problem for people like me because I’m already deploying 200 GbE cards.
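As a back-of-envelope check of that 252 gigabit/s figure (a minimal sketch; PCIe 4.0 signals at 16 GT/s per lane with 128b/130b line coding, ignoring higher-level packet overhead):

```c
/* PCIe 4.0 x16 usable line rate: 16 GT/s per lane, 128b/130b encoding. */
#include <stdio.h>

int main(void) {
    double gt_per_s = 16.0;          /* PCIe 4.0 raw rate per lane */
    double encoding = 128.0 / 130.0; /* 128b/130b line-code efficiency */
    int lanes = 16;
    double gbps = gt_per_s * encoding * lanes;
    printf("PCIe 4.0 x16: ~%.0f Gbit/s (~%.1f GB/s)\n", gbps, gbps / 8.0);
    return 0;
}
```

That works out to roughly 252 Gbit/s (about 31.5 GB/s). A PCIe 3.0 x16 slot at 8 GT/s tops out near 126 Gbit/s, which is why a 200 GbE card needs PCIe 4.0.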
Re: (Score:2)
AMD may have PCIe 5.0 by the time Intel is just starting on 4.0.
Re: (Score:2)
Re: (Score:1)
Who is currently selling 200 gigabit Ethernet cards?
Re: (Score:3)
Supermicro and Mellanox, for two.
Re: (Score:3)
More examples (Score:2)
These single-focus programs turn big companies into idiots*. When Google made overpowering Facebook their main goal, their applications started cross-leaking personal info like the diaper of a 50 lb baby.
When Microsoft became "tablet UI or bust", they made the Frankenstein OS Windows 8 that did neither finger nor mouse well: "Minger".
And once IBM made revenue streams their primary goal, customer service took a shot through the heart as a result. Sure, they got a short-term revenue boost, but customers star
Re: (Score:2)
monoculture disasters (Score:2)
Turns out it happens to silicon plants too.
Re: monoculture disasters (Score:1)
Potatoes are a new world crop. The explorers brought back potatoes. The Irish said "hey, cool monoculture! Let's go for it." History tells the rest of the story.
Re: (Score:1)
That's a potential defect in Adam Smith's/David Ricardo's "comparative advantage" specialization. If your economy depends on a few narrow specialties, and change or problems render those specialties useless, then the population can suffer heavily until replacements are formed.
They brought back Haswell (Score:2)
A microarch from 6 years ago because they are still dealing with supply issues in 14nm. What is going on? First they had problems making 10nm viable and slipped schedules. Now the working 14nm node is unable to keep up with demand, which has actually gone down in some segments due to AMD's competition and remained stagnant in others. They can supply fewer 14nm chips today than they could in 2017. It's almost as if their fabs are being sabotaged.
Re: (Score:2)
See my first comment [slashdot.org]; Intel has a very strong self-sabotage program with stack ranking. And there are more reasons which can't be discussed on a forum like Slashdot.
But the main reason would be Intel inside not admitting that their "10 nm" node just doesn't work at all, and it's now clear won't ever, and converting their "14 nm" fab lines, either to produce support and other chips, since "10 nm"
Re: (Score:2)
But the main reason would be Intel inside not admitting that their "10 nm" node just doesn't work at all, and it's now clear won't ever, and converting their "14 nm" fab lines, either to produce support and other chips, since "10 nm" was supposed to replace them for CPUs, or maybe to switch to "10 nm" or "7 nm".
I'll have to review ASML's press releases, but I didn't think they had shipped many EUV scanners to Intel yet. TSMC and Samsung have bought over 50 between the two of them, though. Still, an overzealous conversion to 10nm sounds consistent with the habits of a company that has rapidly changed nodes for the last 40 years.
See my first comment [slashdot.org]; Intel has a very strong self-sabotage program with stack ranking. And there are more reasons which can't be discussed on a forum like Slashdot.
Not surprising. They've had this problem for decades. Vinod Dham didn't stick around long after Pentium.
Re: (Score:2)
But the main reason would be Intel inside not admitting that their "10 nm" node just doesn't work at all, and it's now clear won't ever
To be clear, Intel's failure on 10nm had to do with blowing 5+ years trying to get their 3D Trigates to be practical at 10nm. Even at 14nm their yields were bad with Trigates. One can only imagine how bad they were at 10nm.
To be even more clear, Intel's 14nm no longer uses 3D Trigates either. After wasting 5+ years trying to make Trigates something other than a dead end, they had to suck it up, and are now desperately playing catch-up, as more than one rent-a-fab has better FinFETs than Intel now.
The
Re: (Score:2)
Re: (Score:2)
A "3D Trigate" is a finFET. OP has no idea what they are talking about.
Specifically, trigate means the gate is run up both sides and also across the top of the fin, and they are all connected together. The other producers of finFET processes do that as well, and their fins look roughly the same. Theoretically you could avoid running the gate across the top; then the two sides could be electrically isolated and you could even drive them separately, but no one really does that at volume.
Re: (Score:2)
It's almost as if their fabs are being sabotaged.
Meredith probably went behind Bob's back and changed the specs on the Malaysian production lines.
Strikes me as garbage (Score:5, Interesting)
Nothing will save you if you completely blow a transition to the next process node, as Intel has done with their "10 nm" node (all 193nm lithography, but more aggressive than TSMC's initial all 193nm "7 nm" node).
Nothing will save you if you're completely incompetent at doing other types of chips, as I've read Intel has been with cellular modems. They've never been able to meet their promises to customers, and have finally given up on the market.
Nothing will save your efforts to broaden your product lines if you earn a reputation for giving up on non-core ones before they have a chance to make it. Intel's done this at least a few times; I don't know how much trust they've lost in the process.
Nothing will save you if you adopt stack ranking to fire 10% or more of your employees every year, we're seeing the end game of this with GE, and we know about its sad history at Microsoft after Ballmer removed all the safety mechanisms Gates had built into the system. It means employees can't plan on having a career at the company, for one bad relationship with a new manager means you're out. If the granularity is small enough, it means you can't assemble teams of superstars, because no matter what their manager wants, he will have to fire 10% of them every year. All in all it makes employees focus on the politics of survival instead of doing what the company ostensibly is supposed to do.
Re:Strikes me as garbage (Score:4, Interesting)
Agreed. They only retroactively decided they were too obsessive on CPU market share when AMD came along to disrupt that.
The problem is that despite earnest attempts to diversify, they always failed: IoT chips, mobile devices, networking chips, complete systems, wireless modems, Omni-Path, Optane, McAfee, HPC software. I would in fact say I've seen more evidence of the opposite: they were so distracted by trying to diversify, secure in their unassailable PC market position, that they failed to keep up with the CPU technology.
The truth is that their non-CPU efforts have sucked. Generally speaking it's not from lack of dedication or investment; it's because they simply lack the competencies, and they don't even have a way of knowing how to get them. Their failures have led to a reputation for giving up, and that has hamstrung any future legitimate advances, but the efforts failed in the first place because of more fundamental limitations than an obsession with CPUs.
Re: (Score:2)
I have to wonder about their embedded efforts, which they abruptly exited not long ago: the Galileo and Edison. They are conceptually enough like their high-end CPUs and support chips that they should have had the competencies necessary to make them worthwhile to use, but
Re:Strikes me as garbage (Score:5, Interesting)
We were given several Galileo boards for educational evaluation. Over a year's time, I repeatedly emailed Intel to try to obtain long-promised add-on boards that never showed up. Every time I did, the person I had been corresponding with previously had moved on to another group, and a new person said the boards would be there soon. And on and on ...
Apparently the number one priority of every engineer in the Galileo product group was to get out of it as quickly as possible. It didn't surprise me in the least when Intel abandoned the entire product line.
Re: Strikes me as garbage (Score:2)
I would point out Intel have a long, very long line of failed new CPU architectures, Itanium being the most prominent. They got lucky with x86 and used the profits to move ahead of everyone else's fab process, making up for the rubbish CPU architecture in the meantime. Note that even some of their x86 architectures were rubbish too (here's looking at you, NetBurst). Intel are and have always been a one-trick pony, and that pony is x86 processors.
Re: (Score:2)
You're confounding macro with microarchitectures. Not counting microcontrollers, their macroarchitectures are the 4004 through AMD64, iAPX 432 (first 32 bit in 1981, failed), i860 and i960 (RISC, first failed, second successful), and Itanium (VLIW, another concept that failed due to the "a sufficiently smart compiler" conceit).
Lots of these succeeded entirely on their own merits, like the early microprocessors which among other things launched the personal computer with the 8080, and the i960 which was ver
Re: (Score:2)
One thing is that Moore's law is slowing, but another thing is absolutely nothing is happening on the arch/feature side. True for both AMD and Intel. Buy a CPU today and get the sa
Re: (Score:2)
There's clearly some very nasty internal politics going on, including the unwarranted optimism you guess at, preventing the microarchitecture people from putting the next Ice Lake major m
Typical interview candidate answer (Score:3)
Sir, I work too hard and neglect my family. I often forget to file expense reports. I have this nasty habit of immediately doing stuff the boss orders without first worrying about ethical guidelines and other such issues.
Great news for AMD! (Score:1)
Re: Great news for AMD! (Score:1)
Take over what? All the Fabs that they rent?
Intel CEO announces he has no idea what he is doin (Score:1)
Re: (Score:2)
Hence the adage (Score:2)
I'll believe it when I see it (Score:3)
It's one thing for Intel to declare that CPU dominance doesn't matter so much, and another thing entirely to translate those words into actions.
Intel lives to sell high-margin chips. Their forays into low-cost computing (e.g. Galileo) have been an unmitigated disaster. The entire corporate culture revolves around rewarding those who design and sell high-end products, not commodity components.
For Intel to give up their obsession with CPU dominance in favor of low-margin products makes as much sense as Mercedes-Benz declaring that they will compete in the economy car market. It's not going to happen.
Re: (Score:1)
For Intel to give up their obsession with CPU dominance in favor of low-margin products makes as much sense as Mercedes-Benz declaring that they will compete in the economy car market. It's not going to happen.
Yes, the A series never happened.
Re: (Score:2)
With an entry-level MSRP of nearly $33K, the A series subcompacts are not economy cars, except in comparison to other Mercedes-Benz models.
Let me know when Mercedes-Benz starts competing in the same market as the Honda Fit and the Toyota Yaris.
Re: (Score:1)
Re: (Score:2)
The A series is the least-expensive Mercedes, sure.
And you can get a Honda or Toyota for half the price.
I don't think Mercedes is focused on the economy car market.
Re: (Score:1)
Blame Microsoft (Score:2)
specialized silicon chips
The Microsoft philosophy, which Intel adopted, was to implement everything with (proprietary) O/S drivers. All you need is the bare minimum A/D hardware and everything else will be taken care of by the CPU.
"Willing to let go"? (Score:5, Interesting)
More like desperately outclassed and knows it. It is absolutely astonishing that AMD, with a far smaller budget, fewer people, and no fabs of its own, can humiliate Intel in this way. (And they just did it for the 2nd time, too.) Intel is fat, lazy, and stupid, and does not care one bit about its customers. There is no other explanation.
Also, Intel is traditionally a memory company, while AMD grew with signal processors. May explain why AMD engineering is so often better, for example with a CPU-integrated memory controller years ahead of Intel. The only edge Intel ever had was a better manufacturing process and some fundamental dishonesty that allowed them to cut corners that should not be cut.
Now, I do not hope Intel dies; that would probably be too much incentive for AMD to get lazy as well. But cutting them down to the size they deserve, at around 30% of the CPU market, with another 30% for AMD and the rest from others, would be an excellent long-term situation. That is, if they ever manage to create a competitive, secure CPU design again.
Re: (Score:2)
There are different levels of engineering that are relevant here. Intel for a very long time has done very poorly at the highest levels of engineering architecture decisions, like their long obsession that they would have trouble att
Re: (Score:2)
Now that's an interesting angle. Suppose AMD realized they were completely failing with the successors to the K8 (no K9 ever made it out the door), and bought ATI to tide them over until they could get their CPU act together. Which didn't actually happen until more than a decade later.
Re: (Score:2)
Yeah, this sounds like CEO speak for "We screwed up, and let AMD leapfrog us on desktop and server processors. Now we need to greatly increase our R&D budget to catch up, and that's going to hurt our quarterly earnings for the next year or so"
Re: (Score:2)
Pretty much so, yes.
Re: (Score:2)
It is absolutely astonishing that AMD with a far smaller budget, less people and no own fabs can humiliate Intel in this way.
It's not astonishing at all. If you outsource a large part of production and R&D and then simply exclude those numbers, the comparison may look bad, but the reality is that unless you combine AMD and TSMC together, you are comparing apples to the entire bloody apple tree.
Also, Intel is traditionally a memory company, while AMD grew with signal processors. May explain why AMD engineering is so often better, for example with a CPU-integrated memory controller years ahead of Intel.
It's debatable just how "ahead" they actually are. It was only a few weeks ago that AMD actually managed to surpass Intel in memory bandwidth at the upper end, and as for memory latency, even their latest Ryzens are still well behind Intel on that fron
Re: (Score:2)
You really have no clue what you are talking about. That is not "calling my bullshit", that is just disgracing yourself publicly.
Translation: We were so greedy, it ruined us. (Score:2)
Holy hell, I was afraid the CPU market might become a monopoly, but I didn't think it would be AMD winning out.
Nor did I think that I would wish Intel to make a comeback to keep the market from that.
Re: (Score:1)
Intel has more than enough resources to develop a new architecture and return to the fight. We might not see them for 2 years, but they'll be back. The desktop market is too important to just cede.
Re: (Score:2)
We might not see them for 2 years, but they'll be back.
With turnaround times of about 5 years between initial design and production, you are right about 2 years, but only for the transition to a smaller node. Intel realized its failure to solve the yield problem about 3 years ago, as evidenced by their massive layoffs and P.R. about their "New Cloud Strategy."
Yes, Intel knew it before AMD released their first chiplet designs. These companies can't keep secrets from each other. AMD was ~4 years into chiplets before Intel's first major layoffs, where the ~4
Started with iPhone miss, cloud finished it (Score:1)
Re: (Score:2)
Every single Out-of-Order (OoO) design has Spectre bugs, and that name was chosen because these bugs will be haunting us for a very long time. (Even the Meltdown mistake was shared by ARM and IBM, both mainframe and POWER.)
For cloud vendors, there's plenty of differences between AMD and Intel, and AMD isn
Re: (Score:2)
At least they got will.i.am to be Director of Creative Innovation...
Intel deserves scorn (Score:4, Interesting)
When I was at Intel, I was working on the Infotainment project, getting Intel chips into cars for GPS, radio, streaming, etc. I found a bug in the FM hardware that would only tune in every 3rd station. I got proof of the domain of the bug and reported it to the manager (one of the old-line good old boys). He told me to go up to the top floor, find any station that worked, forget the bug, and send the dev kits off to the customers. Although Intel culture says to get things right, the reality was that my small pushback got me on the shit list and out of the group in short order. The project apparently died anyway, as most all non-CPU things do there. Intel is a pit of backstabbing vipers and marketing weasels. The firehose of CPU money is the only thing that keeps them alive.
There was a digital signage project whose only purpose was to dump overstocked low-binned Core 2 CPUs on an unsuspecting market. The dev kit -never- worked. It was so slow, it couldn't get out of its own way. Yet Intel told customers they were custom-built chips made for signage. Lies.
On the other hand, I was in the lab with a guy tasked with bringing up the Atom. I got to see the first time one ever booted. He used Doom as the first test suite.
The Intel Chopper (Score:2)
They spent over $2 mil to have Orange County Choppers build an Intel bike.
They kept it behind glass at the Chandler AZ facility. The electronics were all empty fakes.
It did have a nice paint job. I wonder where it ended up?
Paul Otellini was a buffoon.
Diversify! Wait, first sell this cellular business (Score:3)
So the message here is: get into other businesses, but sell them off before they can deliver, and stick to CPUs. Got it. I think...
"90%" market share (Score:2)
They also "forgot" to mention that their market share has shrunk to mere 18% in some segments, such as: https://wccftech.com/amd-decim... [wccftech.com]. The OEMs are slower in their moves than people making their own builds, but one can see more and more AMD-based computers on the market.
When going to pretty much any dealer and checking the most popular CPUs, the top 2-5 spots are almost always AMD's. So they didn't only forget other goals than 90% CPU market share, they also slipped on that goal, and it looks like their