IBM AI Technology

IBM Unveils Power10 Processor for Big Data Analytics and AI Workloads (venturebeat.com) 63

At the Hot Chips 2020 conference, which was held virtually this year, IBM announced the IBM Power10. From a report: It's the successor to the Power9 and represents the next generation of the company's processor family. IBM claims that the Power10 delivers up to three times greater efficiency than its predecessor while also delivering higher workload capacity and container density. The Power10 was designed over five years and has the distinction of being IBM's first commercial 7-nanometer processor. (In 2015, IBM, Samsung, and other members of IBM's Research Alliance produced the first test chips as part of a $3 billion R&D investment.) There will be multiple configurations, and while the specifics aren't yet being disclosed, the maximum single-chip-module offering won't exceed 15 SMT8 cores and the dual-chip-module offering won't exceed 30 SMT8 cores, according to IBM distinguished engineer and Power10 architect William Starke.
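
For scale, SMT8 means eight hardware threads per core, so those caps translate directly into maximum thread counts. A quick back-of-the-envelope sketch (the core counts are the caps IBM quoted, not announced shipping configurations):

    # Thread counts implied by the quoted core maximums.
    THREADS_PER_CORE = 8  # SMT8 = 8-way simultaneous multithreading

    for module, max_cores in [("single-chip module", 15), ("dual-chip module", 30)]:
        print(f"{module}: up to {max_cores} cores -> "
              f"{max_cores * THREADS_PER_CORE} hardware threads")

That works out to as many as 120 hardware threads per single-chip module and 240 per dual-chip module.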
  • by Anonymous Coward
    That's a euphemism for spying on you, right?
    • How tone deaf do you have to be to use a term like that? That's like Exxon saying we're ramping up "Big Oil Pollution". That's not going to make you popular.
      • trash talk is cheap.
        how will it stack up at the verdansk joint armistice training facility
      • Then maybe DON'T FUCKIN POLLUTE our planet?
        How about that?

        How sociopathic do you have to be to call someone basically stupid for not thinking that the solution is lying to people while continuing to harm them?

        If you can't be honest about your actions, there's your hint that what you are doing may be wrong.

  • The x86 ran out of steam, so a Power comeback could generate some new excitement.
    • by guacamole ( 24270 ) on Monday August 17, 2020 @10:09AM (#60410273)

      Damn, now Apple will be forced to switch back to PowerPC architecture only six months after completing transition to the ARM architecture.

      • Also, there will be a great 3rd party market for fashionable backpacks to carry the power and cooling unit needed for the new Power10 based MacBooks.

        • by mlnelson ( 58842 )

          Any excuse for Apple to sell another accessory / cable / adapter / etc....

        • by chill ( 34294 )

          Hmmm... there was an article earlier today about the recently declassified CIA project with a bird drone using a radioisotope power system. That could have applications with a Power10-based laptop.

        • Well, it does say 'Power' right there in the name...

    • Next up, the Itanium Zombie will rise.

      But could it play Doom?

      And could we get an AI to write Duke Nukem Forever on it?

    • Coming back on personal computers? Maybe not. Meanwhile, at our shop, we run a bunch of heavy-load LPARs (virtual machines) very reliably and seamlessly on our existing Power machines (AIX). It's nice hardware, but at least for our primary app, it will probably be replaced by x86 SUSE machines before long. I hope that reliability doesn't take a turn for the worse on commodity hardware.

      • How does it compare to, say, a modern Epyc, and what are the differences?

        I mean, what caught your eye?

        • We're using Power8, which is a few years old now, and I have no idea how it compares to any of the Epyc processors.

      • by dwywit ( 1109409 )

        I'd be keen to hear about that, say, 12 months into the new hardware.

        IBM midrange and mainframe hardware has a pretty good reputation vs. commodity.

    • What makes you think anything like that? IBM will continue iterating on POWER so long as they can profit from it. POWER11 is "in development" and may well come out regardless of the state of the rest of the market. x86 hasn't "run out of steam". Keep your eye on AMD.

      • POWER11 is "in development"

        "Most POWERs only go up to 10, but . . . "

        Does POWER have the equivalent of the infamous Intel Management Engine?

        . . . just wondering . . .

        • Who knows? IBM isn't talking. Well okay, maybe they are:

          https://www.crn.com/news/compo... [crn.com]

          NOT the same as IME (which was originally billed as a remote management system), but allowing full read/write of system memory over an InfiniBand connection presents an interesting attack vector (and a potential backdoor).

    • POWER machines are very expensive for the amount of performance they provide.
  • by backslashdot ( 95548 ) on Monday August 17, 2020 @10:04AM (#60410245)

    Look at this, Intel: everyone and their mother has a 7nm processor.

    • Figured a 5-digit user would know by now that those numbers mean very little.

      • Seriously? You realize that Intel's 10nm COAG technology failed, so they couldn't claim a die-area win over 7nm. Therefore, there's no spec on which Intel's 10nm beats either Samsung's or TSMC's 7nm(+) transistor. Measure any useful spec -- transistors per mm^2, power consumption, or anything like that. How is it credible to believe that Intel's 10nm is superior to the latest 7nm from Samsung/TSMC?

    • and PCIe Gen4

    • Look at this, Intel: everyone and their mother has a 7nm processor.

      So does Intel. They just call it 10nm because the name has nothing at all to do with any actual dimension.

      Intel 10nm: 100.76 Million transistors/mm^2
      TSMC N7FF: 96.5 Million transistors/mm^2
      Samsung 7LPE: 95.3 Million transistors/mm^2
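
      On the quoted figures alone, Intel's 10nm is in fact the densest of the three. A toy comparison using only those peak densities (which ignores power, clock speed, and yield entirely):

          # Quoted peak transistor densities, millions of transistors per mm^2.
          densities = {
              "Intel 10nm": 100.76,
              "TSMC N7FF": 96.5,
              "Samsung 7LPE": 95.3,
          }
          baseline = densities["TSMC N7FF"]
          for node, mtr in sorted(densities.items(), key=lambda kv: -kv[1]):
              print(f"{node}: {mtr} MTr/mm^2 ({mtr / baseline:.0%} of TSMC N7FF)")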

  • ...when they went to the dark side (little-endian).

    • by sconeu ( 64226 )

      I thought Power was endian-selectable?

      • You are absolutely correct here. In supervisor mode, the kernel can direct which endian mode the rest of the system runs in. Additionally, on IBM z/OS and IBM i, LPARs can be configured to run in a different mode. This has been a thing since POWER8.
        • by bws111 ( 1216812 )

          z/OS (and Z in general) has absolutely nothing to do with POWER. Z is always big-endian.

          • I didn't say z/OS wasn't. I said LPARs that are created in z/OS can choose. You can have an LPAR running Linux in little-endian mode even though the host is big-endian. The hardware takes care of it, rather than the hypervisor shifting back and forth; z/OS and IBM i can enable that virtualization when starting up the LPAR. Linux itself as a hypervisor cannot, which is why you cannot run z/OS with a Linux host but can run Linux with a z/OS host.

            If there was any confusion about what I meant, I hope that clears it up.

            • by bws111 ( 1216812 ) on Monday August 17, 2020 @02:52PM (#60411637)

              I think you are a little confused on terminology. LPARs are 'built in' to the system. They are managed by an internal hypervisor. On 'Z' machines, you can create LPARs to host the z/OS operating system (a CP partition), the Linux operating system (an IFL partition), or various other partition types. The difference between CP partitions and IFL partitions (on Z) is not endianness (they are always big-endian). IFL partitions do not support certain instructions that are required to run z/OS, but CP partitions can run Linux or anything else. On certain machines (LinuxONE), CP partitions are not available, so only Linux can be run.
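
              For anyone following along who hasn't dealt with byte order, this is the distinction being argued about, at the byte level. A minimal Python sketch (it demonstrates the two byte orders generically; nothing here is POWER- or LPAR-specific):

                  import struct
                  import sys

                  value = 0x0A0B0C0D  # one 32-bit value, two possible memory layouts
                  print(struct.pack(">I", value).hex())  # big-endian:    0a0b0c0d
                  print(struct.pack("<I", value).hex())  # little-endian: 0d0c0b0a

                  # A bi-endian CPU like POWER can run with either convention;
                  # which one applies is chosen when the OS or partition comes up.
                  print(sys.byteorder)  # byte order of whatever host runs this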

  • Properly, there are supposed to be periods after each of the two letters, you know. You can argue that omitting the periods has become more or less a colloquial way to refer to it, but such a colloquial representation has no more place in a news headline than an emoji.

    The first thing I usually think of when I see these kinds of headlines is "Who's Al?" before I realize what is actually being talked about.

    </rant> #grammarnazi

    • Properly, there are supposed to be periods after each of the two letters, you know. You can argue that omitting the periods has become more or less a colloquial way to refer to it, but such a colloquial representation has no more place in a news headline than an emoji.

      The first thing I usually think of when I see these kinds of headlines is "Who's Al?" before I realize what is actually being talked about.

      </rant> #grammarnazi

      If you're an American who's been around for a minute or two, I highly doubt you need "U.S.A." dotted out for you. You see acronyms for "US" or "USA" and you accept them almost instinctively, no dotted notation needed.

      And in 99.9% of cases, when "AI" is used in a document, context provides 100% clarity for that acronym, usually within the first sentence or two.

      TL;DR -- #ShitYou'reUsedTo. Not all acronyms are created equal.

      • by mark-t ( 151149 )

        USA is quite unambiguous. US usually isn't ambiguous either, although cases exist where it might be.

        "AI" is highly ambiguous... particularly online where you may have no control over the font the information is presented in.

        I'm not going to argue that it's sometimes so obvious from context that no disambiguation should be necessary, but the fact of the matter is that something as small as a headline isn't always going to have that context. Really, would it hurt that much to be explicit in at least the very first thing that a person is likely to read about it? At the very least, it gets the person on the right mental page so that later usage of "AI" will usually be contextually clear.

        • USA is quite unambiguous. US usually isn't ambiguous either, although cases exist where it might be.

          "AI" is highly ambiguous... particularly online where you may have no control over the font the information is presented in.

          I'm not going to argue that it's sometimes so obvious from context that no disambiguation should be necessary, but the fact of the matter is that something as small as a headline isn't always going to have that context.

          Really?

          "Next Generation AI"..."AI in the 21st Century"..."AI Assisted Learning"...Just made those titles up, and yet all of them give a hint to the average 20 to 50-year old what the article is about in just two or three words.

          The word "set" literally has hundreds of definitions, as many words do. Ambiguity is defeated with context and an educated reader.

          Really, would it hurt that much to be explicit in at least the very first thing that a person is likely to read about it? At the very least, it gets the person on the right mental page so that later usage of "AI" will usually be contextually clear.

          Artificial Intelligence is a concept well over 50 years old. When exactly do you think we should drop the dotted notation in that acronym? Needs another 50 years?

          • by mark-t ( 151149 )

            The ambiguity doesn't arise because of a limit on how much the reader knows about A.I., but because the capital letter 'I' and the lowercase 'l' are visually indistinct in some sans-serif fonts. Even someone who knows tons about A.I. can still be aware of the name Al, and may always make a stronger mental connection to that name than to A.I.

            It could end when Al stops being a person's name, but not simply when A.I. knowledge is ubiquitous.

            Even better for differentiating them in the general case would be to just keep the periods.

            • The ambiguity doesn't arise because of a limit on how much the reader knows about A.I., but because the capital letter 'I' and the lowercase 'l' are visually indistinct in some sans-serif fonts. Even someone who knows tons about A.I. can still be aware of the name Al, and may always make a stronger mental connection to that name than to A.I.

              It could end when Al stops being a person's name, but not simply when A.I. knowledge is ubiquitous.

              Yes, or it could simply be when "AI" goes far more mainstream, which it certainly will. 15 years ago, the only people commonly using the word "tweet" were bird watchers. Today, it's so normalized even the local elderly news anchor reports about "tweets" on the regular. AI will eventually eclipse every Albert in the world, regardless of font struggles.

              As far as acronym ambiguity in general, good luck with that. When a friend of mine was first introduced to the "BLM" movement, he was wondering why the Bureau of Land Management was suddenly all over the news.

          • Someone should make a machine learning system that generates parodies of popular songs. They could call it "Weird AI".
      • Yeah, well that's all good for US!
    • It's not a grammar fail, it's a font fail. Sans-serif fonts often require context (usually implicit, not explicit) to determine the glyph, rather than offering a uniquely identifiable glyph, trading the supposed eye strain of serifs for the higher cognitive load of figuring out what technical terms and numbers mean.

      Whenever I write technical documents, I push as hard as I can to get a serif font for this reason. Sadly, my company has all its templates in a sans-serif font...
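
      The ambiguity is easy to demonstrate: "AI" (capital I) and "Al" (lowercase L) can render identically in a sans-serif font while being entirely different strings. A trivial Python check:

          a, b = "AI", "Al"  # capital I vs. lowercase L
          print(a == b)                     # False
          print([hex(ord(c)) for c in a])   # ['0x41', '0x49'] -> A, I
          print([hex(ord(c)) for c in b])   # ['0x41', '0x6c'] -> A, l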

    • #grammarnazi

      When a Grammar Nazi rants against common and generally accepted usage, it sort of makes him like the actual Nazis: extinct, pointless, without any actual power, and hated by all.

    • by bws111 ( 1216812 )

      There is no such 'rule'; it is a matter of style. Some publications (like the New York Times) always put periods between initials (they would write I.B.M. and A.I.) while other (most?) publications do not.

  • by lobiusmoop ( 305328 ) on Monday August 17, 2020 @12:18PM (#60410809) Homepage

    If this image [venturebeat.com] from TFA is anything to go by, the chip die must be at least 20x20mm, and on a 7nm process it must have an epic transistor count.

    • Also, if it's not chiplet-based, a horrifyingly low yield and an absolutely brutal price... unless they got much lower defect rates than TSMC, let alone Intel, which would be major news, far more worthy of an article than anything else here.
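
      To put rough numbers on that: combining the parent's ~20x20mm guess with the densities quoted upthread and a classic Poisson yield model gives a feel for the economics. This is a sketch only; the die size is an eyeball estimate, and the defect densities below are illustrative assumptions, not published foundry figures:

          import math

          die_area_mm2 = 20 * 20  # ~400 mm^2, eyeballed from the photo
          density = 96.5          # MTr/mm^2, TSMC N7FF peak, quoted upthread
          # Upper bound only: real dies mix logic, SRAM, and IO at lower density.
          print(f"transistor ceiling: {die_area_mm2 * density / 1000:.1f} billion")

          # Poisson yield model: Y = exp(-A * D0), A in cm^2, D0 in defects/cm^2.
          area_cm2 = die_area_mm2 / 100
          for d0 in (0.1, 0.3, 0.5):  # assumed defect densities
              print(f"D0={d0}/cm^2 -> yield ~{math.exp(-area_cm2 * d0):.0%}")

      At a pessimistic 0.5 defects/cm^2 that is roughly a 14% yield on a 4 cm^2 die, which is where the 'brutal price' intuition comes from.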

  • by smist08 ( 1059006 ) on Monday August 17, 2020 @02:17PM (#60411471)
    Will consider it when I can get it in a $100 SBC.
