Technology

Nvidia Takes the Wraps off Hopper, Its Latest GPU Architecture (venturebeat.com) 58

After much speculation, Nvidia today at its March 2022 GTC event announced the Hopper GPU architecture, a line of graphics cards that the company says will accelerate the types of algorithms commonly used in data science. Named for Grace Hopper, the pioneering U.S. computer scientist, the new architecture succeeds Nvidia's Ampere architecture, which launched roughly two years ago. From a report: The first card in the Hopper lineup is the H100, containing 80 billion transistors and a component called the Transformer Engine that's designed to speed up specific categories of AI models. Another architectural highlight is Nvidia's MIG technology, which allows an H100 to be partitioned into seven smaller, isolated instances to handle different types of jobs. "Datacenters are becoming AI factories -- processing and refining mountains of data to produce intelligence," Nvidia founder and CEO Jensen Huang said in a press release. "Nvidia H100 is the engine of the world's AI infrastructure that enterprises use to accelerate their AI-driven businesses."

The H100 is the first Nvidia GPU to feature dynamic programming instructions (DPX), "instructions" in this context referring to segments of code containing steps that need to be executed. Developed in the 1950s, dynamic programming is an approach to solving problems using two key techniques: recursion and memoization. Recursion in dynamic programming involves breaking a problem down into sub-problems, ideally saving time and computational effort. In memoization, the answers to these sub-problems are stored so that the sub-problems don't need to be recomputed when they're needed later on in the main problem. Dynamic programming is used to find optimal routes for moving machines (e.g., robots), streamline operations on sets of databases, align unique DNA sequences, and more.
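
For readers who haven't run into the technique, here is a minimal illustrative sketch in plain Python -- not Nvidia's DPX instructions, just the general idea the summary describes: a recursive edit-distance calculation (the kind of sub-problem structure that shows up in DNA sequence alignment) where memoization caches each sub-problem's answer so it is computed only once. The function name, the example strings, and the use of functools.lru_cache are illustrative choices, not anything from Nvidia's announcement.

    from functools import lru_cache

    def edit_distance(a: str, b: str) -> int:
        """Minimum number of single-character edits (insert, delete,
        substitute) needed to turn string a into string b."""

        @lru_cache(maxsize=None)   # memoization: each (i, j) sub-problem is solved once
        def solve(i: int, j: int) -> int:
            # Base cases: one string is exhausted, pay for the rest of the other.
            if i == len(a):
                return len(b) - j
            if j == len(b):
                return len(a) - i
            # Recursion: break the problem into smaller sub-problems.
            if a[i] == b[j]:
                return solve(i + 1, j + 1)              # characters match, no cost
            return 1 + min(solve(i + 1, j),             # delete a[i]
                           solve(i, j + 1),             # insert b[j]
                           solve(i + 1, j + 1))         # substitute a[i] with b[j]

        return solve(0, 0)

    print(edit_distance("GATTACA", "GCATGCU"))  # prints 4

Without the cache, the recursion would revisit the same (i, j) pairs exponentially many times; with it, the work is proportional to the number of distinct sub-problems, which is the property that makes dynamic programming workloads attractive to accelerate.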

Comments Filter:
  • by Rosco P. Coltrane ( 209368 ) on Tuesday March 22, 2022 @12:58PM (#62380509)

How buggy will the Linux driver be?

    • You joke....

      but less than five minutes ago, I just finished up cleaning a botched NVIDIA driver update on a SUSE machine, which took the better part of 30 minutes.

      • I am most certainly not joking. I have 3 laptops with Nvidia hardware here that I can't upgrade the kernel of because if I do, I break the "magic" kernel version number + Nvidia driver version combination that "only" crashes or fucks up the panel's backlight controls once every 5 times I close the lid and the machine suspends.

I don't recall even installing a trouble-free Nvidia driver. And that's going back to the early 2000's. Nvidia's drivers epitomize everything that's wrong with closed-source drivers under Linux.

        • Interesting. I typically use Gentoo, and in my experience, it has 'just worked' for the last decade or two. This was actually a one-off for me, and likely my own fault.

        • I don't recall even installing a trouble-free Nvidia driver. And that's going back to the early 2000's. Nvidia's drivers epitomize everything that's wrong with closed-source drivers under Linux.

It's not Nvidia that's the problem; it's Linux's over-reliance on source code being the answer to everything. Windows has well-designed interfaces for hardware vendors to target. Linux has ad-hoc APIs that change and break on a whim.

          • Re: (Score:2, Informative)

            by guruevi ( 827432 )

            Laughs in Windows 10 1607.

            The fact is most people run the equivalent of the nouveau drivers in Windows after some period of time, nVIDIA or AMD just says "no more support" and you're stuck on whatever branch of Windows you got stuck on. If you run RHEL, then nVIDIA drivers are phenomenal, they'll even support a mixture of Quadro and GeForce (unlike their Windows counterparts where it literally will refuse to install both drivers at the same time). But as soon as you're on CentOS Stream (basically the equiva

Microsoft makes changes that break old Windows drivers all the time. They're known ahead of time, but so are the Linux changes, so no difference there.

            Problems with resume from suspend and such have historically been mostly Microsoft's fault, because they created the "standard" and also the tools used to create the tables that fucked up power saving on Linux.

            I've been using Linux with GPUs as long as there's been Linux support for GPUs. My first Linux machine had a Trident 8900D 1MB ISA VGA card, and I chis

            • Hah- I got hit by the Pop!_OS bug too.
              Funny enough, it only happened in Hybrid mode for me. Switch to pure NV mode, and the problem is gone.
Also want to add that neither Ubuntu nor Fedora does it on that same machine, and they run only in hybrid mode. Only Pop, and only when not in "NVIDIA only" mode.

                I don't think the problem actually has anything to do with the NV.
                I think something in Pop is fucking with xrandr settings and messing up the PRIME pipe in its attempt at making their OS more laptop-power friendly, and it's backfiring.
I gave up on laptops with nvidia. Now that there are decent Linux drivers for AMD graphics hardware, it's a lot less hassle to go that route. But my only laptop right now is also a budget device with very little graphics hardware (I think the iGPU has like 2 or 3 cores; it'll barely run vanilla Minecraft), so my opinion probably doesn't matter much. At least the video works, though, even if the backlight doesn't shut off.

        • by xwin ( 848234 )
This is anecdotal, but I maintain a server with an NVIDIA A100 GPU that I use for ML work. I do not upgrade the kernel on that machine very often, but I have never had an issue where the driver would not work or the machine would not boot. This is on Ubuntu 20.04 LTS. I am always concerned that a driver update or an OS update will break some dependencies in my ML stack and I would have to recover that machine. It has never happened so far.
        • Sorry, your nv is not in control of your panel backlight. You're making shit up.

          Been using nv in my laptops for many years now. Last radeon laptop I had was a Southern Islands.
          nv drivers have been a charm.
        • What on earth hardware are you using? I'm responsible for several dozen heterogeneous pieces of hardware, and well over half of them bear nVidia GPUs. We also have a number of systems with AMD GPUs, and Intel DG-1s. We have AMD, Intel and IBM CPUs. And exotic ultra-fast network cards (including some that run their own OS), and almost all of it has IBM's GPFS parallel filesystem and the required kernel driver.

          And let me tell ya honey, the problem factor is NEVER the nVidia drivers. If typing "./NVIDIA_Lin
    • by Junta ( 36770 )

      It'll probably be fine, since this adapter is almost exclusively going to be used in Linux machines. This architecture is only for datacenter use.

  • Named for Grace Hopper

    Nvidia simultaneously announces a new COBOL-based shader language.

  • by IWantMoreSpamPlease ( 571972 ) on Tuesday March 22, 2022 @01:52PM (#62380721) Homepage Journal

when I can buy their video cards for MSRP (new, not the ragged-out ones used by miners).

• The Nvidia chip has 2,758 times the number of transistors.
  • When do we start calling it AIPU?
