Intel's Dow Status Under Threat As Struggling Chipmaker's Shares Plunge (reuters.com) 72

Intel's slumping share price could cost it a spot in the Dow Jones Industrial Average. Reuters reports: Analysts and investors said Intel was likely to be removed from the Dow, pointing to a near 60% decline in the company's shares this year that has made it the worst performer on the index and left it with the lowest stock price on the price-weighted Dow. The chipmaker's shares slid about 7% on Tuesday amid a broader market selloff, with the Philadelphia SE Semiconductor index (.SOX) down nearly 6%, following reports of lower chip sales globally in July.

A removal from the index would hurt Intel's already bruised reputation. The company has missed out on the artificial intelligence boom after passing on an OpenAI investment, and losses are mounting at the contract manufacturing unit the chipmaker has been building out in hopes of challenging TSMC. To fund a turnaround, Intel suspended its dividend and announced layoffs affecting 15% of its workforce during its earnings report last month. But some analysts and a former board member believe the moves might be too little, too late for the chipmaker.

  • by ceg97 ( 976736 ) on Tuesday September 03, 2024 @07:51PM (#64760132)
    Feels a lot like Kodak.
    • Feels a lot like Kodak.

      I've long been a fan of AMD, but realistically their only big advantage is the piles of cheap cores. Now that's awesome when you have something that parallelizes well, but the majority of stuff is still single threaded. I don't know if it's their compiler work or their Linux kernel contributions, but in all but the most trivially parallelized workloads my Intel i5 beats my 3rd gen threadripper.
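The single-thread-vs-many-cores tradeoff in the comment above is just Amdahl's law. As a rough sketch (all speed numbers here are hypothetical, not benchmarks of any real chips):

```python
# Amdahl's law: overall speedup from n cores when a fraction p of the
# workload can be parallelized.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Hypothetical comparison: a chip with 6 faster cores vs one with 24
# slower cores, on a workload that is only 50% parallelizable.
p = 0.5
fast_core = 1.3  # assumed relative single-core speed of the 6-core part
slow_core = 1.0  # baseline single-core speed of the 24-core part

print(fast_core * amdahl_speedup(p, 6))   # ~2.23 with 6 fast cores
print(slow_core * amdahl_speedup(p, 24))  # ~1.92 with 24 slow cores
```

With mostly-serial workloads the per-core speed dominates, which matches the experience described above; set p to 0.95 and the 24-core part wins easily.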

      • I've long been a fan of AMD, but realistically their only big advantage is the piles of cheap cores.

        Depends on the chip. Just today, I needed to do a custom build to run a piece of software that doesn't do multithreading very well, so I wanted a chip with the highest single-core performance. AMD had a chip on the shelf with a 4.7GHz clock speed; the fastest Intel chip was 3.7GHz.

        Sure, GHz isn't everything, but in this particular case, it makes a difference, and AMD still has CPUs that reliably beat Intel in single-core performance on the Ryzen side, in addition to their Threadripper CPUs that blow Intel out

      • Considering that the majority of the chip makers profits comes from high margin sales to businesses/enterprises/cloud providers, maybe it's smart of AMD to have focused on scaling up number of cores at the expense of single thread speed.

        Intel chips are less power efficient, and have had more security problems also. I use AMD in 3 out of the 6 computers I built and use at home for various purposes. I deeply regret my most recent 2 Intel purchases of Intel 12400 and 13400 CPUs for my 2 HTPCs. This is more due

        • Considering that the majority of the chip makers profits comes from high margin sales to businesses/enterprises/cloud providers, maybe it's smart of AMD to have focused on scaling up number of cores at the expense of single thread speed.

          Sure, except that AMD now has better single thread performance on anything that takes less wattage than a goddamn toaster, and it's damned close even then (and while using far less power.)

      • by m00sh ( 2538182 )

        Feels a lot like Kodak.

        I've long been a fan of AMD, but realistically their only big advantage is the piles of cheap cores. Now that's awesome when you have something that parallelizes well, but the majority of stuff is still single threaded. I don't know if it's their compiler work or their Linux kernel contributions, but in all but the most trivially parallelized workloads my Intel i5 beats my 3rd gen threadripper.

        Yes!

        Now that Intel has more cores in their CPUs than AMD, that advantage is gone. Intel has 24 cores and AMD has 16 on their highest ends.

        Time to go all in on INTC.

        • by Kazymyr ( 190114 )

          24 cores where more than half are Atom-based "efficiency" cores that struggle under any real load. Whereas AMD's cores are all the same.

          • by m00sh ( 2538182 )

            24 cores where more than half are Atom-based "efficiency" cores that struggle under any real load. Whereas AMD's cores are all the same.

            Yeah, but they are in two different chips. Then, Intel would have 48 cores per two chips.

            All in on INTC.

      • I've long been a fan of AMD, but realistically their only big advantage is the piles of cheap cores.

        That hasn't been true since the FX. AMD has had decent single-thread performance since then, and now they have an even better weapon available to them — big piles of cheap cache. They are getting great performance with their throw-cache-at-it approach. I remember even way back in the Super Socket 7 days, their processors with L2 cache on it made your mainboard cache into L3 and for certain programs it made kind of a hilariously huge difference.

        • I've long been a fan of AMD, but realistically their only big advantage is the piles of cheap cores.

          That hasn't been true since the FX. AMD has had decent single-thread performance since then, and now they have an even better weapon available to them — big piles of cheap cache. They are getting great performance with their throw-cache-at-it approach. I remember even way back in the Super Socket 7 days, their processors with L2 cache on it made your mainboard cache into L3 and for certain programs it made kind of a hilariously huge difference.

          And yet they're slower [cgdirector.com].

          I've long been an AMD fan, mostly because I love underdogs and hate a monopoly. But for the average user Intel seems to be better option [tomshardware.com].

          • And yet they're slower.

            They're slightly slower on synthetic benchmarks, they're barely slower in the real world, they're much cheaper, motherboards for them are cheaper because the chipsets are also cheaper, and they use much less power which means the TCO is less.

            I've long been an AMD fan, mostly because I love underdogs and hate a monopoly. But for the average user Intel seems to be better option.

            You mean the same intel that currently has RMAs for their defective CPUs several months out? Fell off the turnip truck last night, huh?

      • I've long been a fan of AMD, but realistically their only big advantage is the piles of cheap cores

        How did you fail to notice that Intel has been doing contortion acts for years, in a futile attempt to catch up with AMD's power efficiency?

    • Nonsense. Kodak was steamrollered by, first, digital image capture (invented by, wait for it... Eastman Kodak), leaving EK with enormous unused coating capacity, and then by the growing ubiquity of phones with cameras. Intel's pickle is plain old-fashioned bad management and fab problems, combined with the industry's habit of taking an architecture and "improving" it by shoving more power and cycles down its throat. I read something recently about AMD having some issues as well with a recent chip iteration. Will be inter
    • by AvitarX ( 172628 )

      I don't think Intel has refused to do AI, they just haven't (and may not ever) succeed there.

      Kodak, on the other hand, invented the digital camera and refused to market or push it, because they used the razor-and-blades model and digital photography didn't need blades.

    • Big difference. We don't need film anymore. We do need semiconductor fabrication.

      When all semiconductor manufacturing is maxed out at around 1.7 angstroms (roughly the smallest feature size at which a perfect silicon crystal can still exhibit the semiconductor effect), new companies will be able to purchase entire fabs in a box (a really big box) from China. The Chinese will subsidize a price war to collapse the Taiwanese economy, and Intel will face domestic competitors.

      Intel can probably survive on patents for 25 years.

      That said, if Intel were t
      • by m00sh ( 2538182 )

        Big difference. We don't need film anymore. We do need semiconductor fabrication.

        When all semiconductor manufacturing is maxed out at around 1.7 angstroms (roughly the smallest feature size at which a perfect silicon crystal can still exhibit the semiconductor effect), new companies will be able to purchase entire fabs in a box (a really big box) from China. The Chinese will subsidize a price war to collapse the Taiwanese economy, and Intel will face domestic competitors.

        Intel can probably survive on patents for 25 years.

        That said, if Intel were to cave to market pressure and make huge arrays of fast cores, they'd see an Intel Core revival again.

        I use and love AMD CPUs, but their chipsets still suck... Bad.... Literally. I have no idea why, but they just can't manage power. I never have these problems with Intel, but every AMD system I've used just doesn't stop producing unnecessary heat. (not sure about laptops, haven't tried. Intel is just too good there to bother)

        Either way, Kodak's business got obsoleted. There simply wasn't a need for them. Intel just needs a new Andy Grove

        Yes!!

        AMD CPUs are such power hogs. Who bothers with AMD CPUs when you need any sort of power efficiency anyways? Intel always wins on that.

        Time to go all in on INTC.

    • Came here to say Boeing. This is what happens when public companies are run by accountants and CEOs incentivised to increase stock prices.

      There is no doubt that money is critical to every business. However, if profits are the overriding factor in decisions, more important things like customer satisfaction and long term health are compromised and open a company up to competition and failure.

      Kodak declined because they rode their cash cow instead of adapting to a changed marketplace. Intel is doing the sa
  • Pat "ticktock" Gelsinger is basically an idiot whose fundamental incompetence was masked in the past by Intel's illegal market control. Now, anybody who still has a shred of respect for him should consider the sad trail of destruction he has left behind him.

  • Then we'd have some unique two-cent stocks.

    • by sodul ( 833177 ) on Tuesday September 03, 2024 @08:36PM (#64760192) Homepage

      Boeing and Intel are probably both Too Big To Fail, if not for their actual size (shrinking like a snowman in the spring), at least for their strategic importance to the USA and their allies. The reliance on TSMC is too risky.

      On the other hand, there are other airplane manufacturers for the army, and GlobalFoundries is probably good enough that the loss of Intel would not be that bad.

      • Intel is not. If they had a bigger presence in manufacturing in the United States, then yeah, they would be, but they don't. So they can easily be replaced by whoever wants to make high-end ARM CPUs.

        What's going to suck is that enthusiast PCs and gaming PCs are basically going to go the way of the dodo if ARM takes over. I don't care what anyone is telling you, ARM CPUs cannot hang with Intel and AMD when it comes to raw power. It's just that that power comes at the price of at least 65 watts. Still I like being ab
        • by anoncoward69 ( 6496862 ) on Tuesday September 03, 2024 @11:32PM (#64760452)
          If Intel/AMD disappeared, I'm sure ARM manufacturers could ramp up to take their place. The reason ARM processors aren't as powerful is that they mainly focus on the low-power and mobile markets that Intel/AMD don't care about. If there were a sudden void in that market for an ARM manufacturer to step in and make a monster high-performance CPU, I'm sure they would.
          • The reason ARM processors aren't as powerful is that they mainly focus on the low-power and mobile markets that Intel/AMD don't care about.

            People are trying, and have been trying, to make ARM processors that are as powerful as amd64. It just never works. Apple has made a decent try, but they've come up with chips that have unfortunate limitations, and then they make them even more limited by pairing them with inadequate cooling, so people literally resort to running them inside their refrigerator while doing heavy number crunching, which is hilarious. It just works... in the arctic.

            Further, Intel and AMD have both made numerous attempts to get i

            • Apple's core products are all battery-dependent. Even in the days when they weren't, they still couldn't pull this off with their IBM partnership on PowerPC. They tried to claim their G4 was the fastest PC, except it wasn't. So they gave up and switched to Intel. That threw their fans into a tailspin, because they had all bought into the propaganda shit Apple sold them, only to have Apple turn around and say "oh yeah, Intel was faster the whole time, we just told you that shit because we knew you'd buy it anyways".

              Apple

              • The dual G4 and G5 were legitimately impressive, and possibly the fastest desktop systems you could just go and buy at the time. If you wanted a more powerful PC, you had to use a server-class product and add parts to it. Apple undeniably has achieved significant and impressive performance with their new processors, and if they can get over the architecture hurdles that Intel and AMD crossed over in the area of off-chip bandwidth then they will really have something impressive.

                Conversely, maybe PCs will st

                • The dual G4 and G5 were legitimately impressive, and possibly the fastest desktop systems you could just go and buy at the time.

                  I recall the Opterons already being ahead of them. Intel was still playing the long-pipeline "more MHz wins" game with that crappy NetBurst architecture, and PPC wasn't far behind. AMD had already hopped off that bandwagon, and Intel later changed course with the Core series. Meanwhile Apple was still holding the PowerPC bag after Steve Jobs came back and re-licensed OS 9, so that Apple, with its whopping 6 million desktop users, was once again the only one buying PowerPC, and it never caught up. Apple's p

                  • You guys keep repeating this claim here, but it's all BS, as I've been saying for years. The distance-related differences are so insignificant that by the time the refresh cycle synchronizes, you've long since lost any supposed speed gains. Anybody who claims otherwise has no idea how synchronous DRAM works.

                    How synchronous DRAM works: Latency is added to account for the fact that even in the best case, the paths to the RAM do not have the same latency, so access to the RAM is slower when it is off-chip.

                    Maybe, just maybe if you try really hard with something like GDDR6 you can get slightly tighter timings, but as he mentions with DDR4, this is not at all the difference you're making it out to be

                    It clearly makes a large real-world difference.

                    Cache is SRAM. That's a whole other slice of cheese.

                    Cache is near-processor. That's the similarity.

                    You remind me of kids on the playground during the Nintendo vs Sega days talking about how many bits their next console would have, making it better.

                    More bits are better, all else being equal. The Intellivision was the most capable console of its generation, despite running at under 1 MHz, in part because it was 16-bit.

                    Linux is now getting this nice influx of users thanks to Microsoft being dbags.

                    That or valve is heavily pushing gaming on Linux.

                    It's no doubt both things, but I am seeing

                    • How synchronous DRAM works: Latency is added to account for the fact that even in the best case, the paths to the RAM do not have the same latency, so access to the RAM is slower when it is off-chip.

                      No, this is false. 20 years ago AMD made a major change to its own architecture that gave the best speed improvement we're ever going to see in this regard. You know what that was? They put the memory controller in the CPU instead of in the northbridge. Intel started doing the same shortly after, and other architectures followed suit. That is the only place where distance actually mattered, and it's already as close as it's going to get. Once the commands have been sent to the controller
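The on-chip-vs-off-chip latency argument in this subthread can be made concrete with the textbook average-memory-access-time model. All latencies below are made-up round numbers for illustration, not measurements of any real part:

```python
# AMAT (average memory access time) = hit_time + miss_rate * miss_penalty.
# A bigger last-level cache mostly helps by cutting the miss rate, i.e. how
# often you pay the full off-chip DRAM latency at all.
def amat_ns(llc_hit_ns: float, llc_miss_rate: float, dram_ns: float) -> float:
    return llc_hit_ns + llc_miss_rate * dram_ns

small_cache = amat_ns(10.0, 0.20, 80.0)  # 10 + 0.20*80 = 26.0 ns
big_cache   = amat_ns(12.0, 0.08, 80.0)  # 12 + 0.08*80 = 18.4 ns
print(small_cache, big_cache)
```

This is why the "throw cache at it" approach can beat shaving a few nanoseconds off the DRAM path itself: the big win is paying that path less often.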

        • Shows how much overhead there is in modern operating systems when it takes a decent computer to play a two-decade-old video game, having to employ complex techniques like run-ahead to determine the next frame's state without adding latency.

          Can I ask why not just use a PS2?

      • Re: (Score:3, Insightful)

        by geekmux ( 1040042 )

        Boeing and Intel are probably both Too Big To Fail, if not for their actual size (shrinking like a snowman in the spring), at least for their strategic importance to the USA and their allies. The reliance on TSMC is too risky.

        On the other hand, there are other airplane manufacturers for the army, and GlobalFoundry is probably good enough that the loss of Intel would not be that bad.

        President Biden signed the CHIPS and Science Act, which provided $280 billion in funding, including $39 billion in domestic chip manufacturing subsidies. As a taxpayer, don’t talk to me about what’s Too Big until you first tell me where the fucking money went.

        That Act was signed two years ago. Guess it’s already time to follow the money.

        • There is a big difference between approving the money and spending it. The management of the $50B was assigned to the National Institute of Standards and Technology (NIST), which was a $1B agency with a lot of work already on its plate. Do you know how hard it is to manage that much money? All of our scientists who wanted to do any management have been vacuumed up by the CHIPS sub-agency, and are working as hard as they can to figure out appropriate uses of the money. Unfortunately, that also means the res

          • Most of the money hasn't gone anywhere, yet. Also, the spend period was 5 years, not instantaneous.

            Maybe gullible taxpayers should stop believing “moar money” is the answer every time, after watching scientists make excuses about not being accountants for two damn years, which is hardly “instantaneous”. Maybe we should start DOING something with the funds before some politician lies and says they need moar to steal. Or before some failing company makes a Too Big to Fail excuse.

          • Most of the money hasn't gone anywhere, yet. Also, the spend period was 5 years, not instantaneous.

            Intel postponed its Columbus fab and is now planning one for Ireland, on top of the new fabs it's building in Poland and Israel. TSMC postponed its second Arizona fab and is now planning to build a second in Japan instead. Samsung announced this April that it would be expanding its facilities in Texas using $6 billion in Chips Act funding, but the majority of its investments are going to stay in Korea.

            Now why did they pull back from here? Because of all the "diversity" requirements the Chips Act codi

        • Add on to previous comment: The $280B was the infrastructure package and everything else put together. CHIPS was (only?) $50B.

      • by CEC-P ( 10248912 )
        Well, technically some high end chips would help Boeing calculate how ****ed they are.
  • Recently some dude on Reddit took his inheritance and went all-in on Intel. He's something of a legend now on WallStBets. It's probably all his fault. /s

    • by Anonymous Coward
      I can relate. In 2008, I took my life savings that I made slinging mail sacks on turnaround shifts at a mail processing plant, went more than all-in at 50% margin, and put it all on INTC. Watched it plunge in half. Had to get paycheck advances to ward off the margin calls (the first two I met; on the third, the broker forcibly sold me out). I had been under the delusion that stocks always go up (should've bought an index).
  • by Anonymous Coward

    The company has missed out on the artificial intelligence boom

    That's an interesting way to spell "bubble".

  • People don't like buying self-baking chips (as oddly delicious as that sounds) or chips with built-in security holes (with the added bonus of 30% performance hits).
  • The company has missed out on the artificial intelligence boom after passing on an OpenAI investment

    At least they're not buying into the bubble in what would have been a wildly desperate throwing of bags of cash onto Altman's hype train.

    • Yup, AI is a fad. Being logical says that not investing in it is smart. But investors are sadly not logical... They were also demanding TV makers buy into the curved and 3D TV crazes.

      • Yup, AI is a fad. Being logical says that not investing in it is smart. But investors are sadly not logical... They were also demanding TV makers buy into the curved and 3D TV crazes.

        Perhaps we should stop feeling sad for idiot, er I mean “investors”. It’s odd finding someone with money who fails to grasp basic math. Then again, the US stock market hasn’t made sense in over a decade. We don’t deserve a mere crash this time around. It’s that far fucked, because of “investors” defining “value” where there is none. Just viral hype. Not unlike “news” casters..

      • AI is a fad. Being logical says that not investing in it is smart.

        AI is a fad which is not over, and throwing into the processors some functional units designed specifically for AI processing is not particularly difficult or expensive. All of these processors already have similar logic in them, and have been getting more and more of it over the years as it is, so adding a little more and slapping "AI" on the name is a minor hardship. As long as the fad persists, printing that sticker is printing money, too. When the fad stops, they can simply stop doing that

    • Look how fat Nvidia has become. AI may be a bubble, but it's a highly caloric bubble.

  • by dcooper_db9 ( 1044858 ) on Tuesday September 03, 2024 @10:08PM (#64760336)
    This is the kind of value imbalance that was taking place in the late 90s, just before the dotcom bubble burst. This is the AI bubble. Intel will bounce back after the market crashes.
    • Not this time. Intel has been mismanaged for too long.

      • Not this time. Intel has been mismanaged for too long.

        I agree. My thoughts were that it would be a slow burn after 10nm failed.

        Now they have to use excess capacity that AMD doesn't use at TSMC. Really? What?!?!

        Stick a fork in them... they are done. Should have shorted the stock 5 years ago.

    • This is the AI bubble. Intel will bounce back after the market crashes.

      What makes you think they will bounce back? Do you think they are skilled enough at abusing their near-monopoly position? They certainly are not engineering anything that consumers want to buy.

  • OpenAI is losing money hand over fist, is being propped up by deep pockets at Apple, NVidia and Microsoft and Intel would have no realistic ability to liquidate their holdings except praying another big investor would take their bags from them.

    Intel should have invested heavily in GPUs, TPUs, etc. to have a viable route to riding the AI hype wave.

    Going "nahhhh bruh" to that offer from OpenAI was literally the one "missed opportunity" that was actually wise for them.

  • Currently, the highest weighted stock - UnitedHealth Group (UNH.N) - is priced about 29 times higher than Intel.

    There is something really wrong with the US Economy when this is true.
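The 29x figure follows directly from how a price-weighted index works: each component's weight is its share price divided by the sum of all component prices, regardless of company size. A toy sketch with made-up prices (not the actual Dow components):

```python
# In a price-weighted index like the Dow, weight_i = price_i / sum(prices).
# Hypothetical three-stock "index" showing how a low-priced share barely
# moves the average, no matter how large the company is.
prices = {"HighPrice": 580.0, "MidPrice": 200.0, "LowPrice": 20.0}
total = sum(prices.values())
weights = {name: price / total for name, price in prices.items()}
print(weights)  # LowPrice carries only 20/800 = 2.5% of the index
```

A market-cap-weighted index like the S&P 500 avoids this quirk, which is why a beaten-down share price threatens Dow membership specifically.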

    • The market is showing where the money is. Right now it is being made in the health care sector. Or I should say, health care is where all the greed and corruption is right now.

      People can avoid buying a new computer, but they can't avoid needing health care. Health care is like a public utility, everyone should be able to afford it - but there are no price restrictions, ownership restrictions, or profit restrictions. This invites pure capitalism.
      • there are no price restrictions, ownership restrictions, or profit restrictions

        The insurance companies' profits are limited to a percentage of the price, which is why they are happy for costs to rise.
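The capped-margin incentive described above is easy to see with a little arithmetic. The cap fraction below is a generic assumption for illustration (rules like the ACA's 80/20 medical-loss-ratio requirement are in this spirit, but the exact numbers vary by market):

```python
# If an insurer's overhead-plus-profit is capped at a fixed fraction of
# premiums, the absolute dollars it may keep grow whenever total premiums
# grow, so rising medical costs raise the ceiling rather than squeeze it.
def max_keep(premiums: float, cap_fraction: float = 0.20) -> float:
    return premiums * cap_fraction

print(max_keep(1_000_000))  # 200000.0 allowed on $1M of premiums
print(max_keep(2_000_000))  # 400000.0 once costs (and premiums) double
```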

  • This may be an over-reaction to their reduced profit. If you look at the numbers they still produced $12.8B last quarter. That's a lot of chips.
    • This is why I hate wall st and I think companies would be better off staying private more often. "They are only going to grow by 3% instead of 5%!? SELL SELL!"

  • After all the anticonsumer crap and illegal actions against AMD, I couldn't be happier to be a step closer to a world free of Intel.

    AMD and ARM can take over, and RISC-V and POWER can rise to the challenge.

  • As punishment for this severe failure the CEO will only get $100 million in compensation as they leave instead of $150 million.
