
The Rise of China GPU Makers (tomshardware.com) 78

The number of GPU startups in China is extraordinary as the country tries to gain AI prowess as well as semiconductor sovereignty, according to a new report from Jon Peddie Research. From a report: In addition, the number of GPU makers grew worldwide in recent years as demand for artificial intelligence (AI), high-performance computing (HPC), and graphics processing increased at a rather unprecedented rate. When it comes to discrete graphics for PCs, AMD and Nvidia maintain the lead, whereas Intel is trying to catch up.

Dozens of companies developed graphics cards and discrete graphics processors in the 1980s and the 1990s, but cut-throat competition for the highest performance in 3D games drove the vast majority of them out of business. By 2010, only AMD and Nvidia could offer competitive standalone GPUs for gaming and compute, whereas others focused either on integrated GPUs or GPU IP. The mid-2010s saw the number of China-based PC GPU developers increase rapidly, fueled by the country's push for tech self-sufficiency as well as the advent of AI and HPC as high-tech megatrends.

In total, there are 18 companies developing and producing GPUs, according to Jon Peddie Research. Two of them develop SoC-bound GPUs primarily with smartphones and notebooks in mind, six are GPU IP providers, and 11 are developers focused on GPUs for PCs and datacenters, including AMD, Intel, and Nvidia, which design graphics cards that end up in our list of the best graphics cards. In fact, if we added other China-based companies like Biren Technology and Tianshu Zhixin to the list, there would be even more GPU designers. However, Biren and Tianshu Zhixin are solely focused on AI and HPC for now, so JPR does not consider them GPU developers.


The Rise of China GPU Makers

  • by BytePusher ( 209961 ) on Tuesday December 27, 2022 @12:22PM (#63161376) Homepage
    US semiconductor sanctions are actually driving domestic innovation in China; instead of hindering development, they're accelerating it.
    • So far it's domestic innovation in scams: 2021 set a record for stolen semiconductor-development funds.
      https://asia.nikkei.com/Busine... [nikkei.com]
      https://interconnected.blog/ch... [interconnected.blog]
      https://www.youtube.com/watch?... [youtube.com]

    • The goal wasn't so much to hinder development as to deprive them of the latest equipment right now. They were always developing their own GPUs, and they were always going to accelerate their own production as demand increased. No doubt recent events have sped things up slightly, but this is not a fundamental change. Chinese companies have been implementing other people's GPU cores for as long as they've been able, which has obviously involved a certain amount of technology transfer.

    • by haruchai ( 17472 )

      definitely accelerating their theft of tech from Taiwan

  • by DrMrLordX ( 559371 ) on Tuesday December 27, 2022 @12:43PM (#63161450)

    GPUs aren't ideal for AI anyway. Companies like NV piggy-backed it onto their graphics accelerator business.


    • by Tablizer ( 95088 )

      Dedicated AI chips are still too expensive to gain enough market share to have economies of scale. Anyone want to guess when that'll change?

  • OK dumb question (Score:5, Interesting)

    by argStyopa ( 232550 ) on Tuesday December 27, 2022 @12:44PM (#63161454) Journal

    Why do we persist in calling them GPUs?

    I mean, sure, I get it: originally the highest-performance designs were indeed GPUs, but it just seems funny now that:
    - GPUs are used for Bitcoin mining
    - GPUs are used for AI research
    ...are they still really "GPUs", semantically?

    Or is it just that economies of scale, driven by the demand for task-specific GPUs, are the only way we (the world) get our hands on bleeding-edge processing horsepower at reasonable prices, and even though they're not really designed for those other tasks, the economics outweigh any resulting inefficiencies?

    • by znrt ( 2424692 )

      they are still gpu, and graphics rendering is still their main purpose.

      to your second question ... these applications probably get more headlines than they deserve. as for mining, there are already custom asic chips that are more efficient than gpus, but there is no clear indication that this industry has a solid future worth investing r&d in anyway. regarding ai, it's too early and the state of the art is too diverse to research specific chips when current hw can do fine. unlike crypto mining, that w

    • by Tablizer ( 95088 )

      First, what do you suggest calling them, and second, a lot of words are based on obsolete concepts, so why should chips be different? English is a big cluster-hack.

      For example, most "eyeglasses" are not made of glass anymore. But if you go around calling them "eyeplastics" or "mounted corrective lenses" you'll get that funny look we slashdotters often get.

      • They are only called eyeglasses in America; the rest of the world doesn't need to be told they are for your eyes, we just call them glasses.
        • by haruchai ( 17472 )

          "we just call them glasses"
          even when not made out of glass? why?

        • by Tablizer ( 95088 )

          True or not, that still doesn't solve the "problem".

          Reminds me of a lame joke:

          "Waiter, I don't want this fly in the soup!"

          So the waiter comes over, takes the fly out, puts it on a plate, then takes the soup back into the kitchen.

          Waiter: "There, it's not in the soup anymore, just like you asked."

    • The main difference between a GPU and a CPU is parallelization (graphics processing benefited from this from the beginning; nowadays AI benefits from it as well).
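
      A minimal sketch of that difference in practice (a hypothetical illustration, assuming PyTorch and a CUDA-capable card; the array size is arbitrary): the same elementwise operation either runs on a handful of CPU cores or is launched as one data-parallel kernel across thousands of GPU lanes.

      import torch

      n = 10_000_000
      x = torch.rand(n)

      # CPU: the work is spread over a few cores at best.
      y_cpu = x.sqrt()

      if torch.cuda.is_available():
          x_gpu = x.to("cuda")
          # GPU: the same elementwise op runs across thousands of ALU lanes at once.
          y_gpu = x_gpu.sqrt()
          torch.cuda.synchronize()  # kernels launch asynchronously; wait before comparing results
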
    • Why do we persist in calling them GPUs?

      You can buy them without video connectors, optimized for compute rather than graphics. In that case they are sometimes still called GPUs (or "GPGPUs"), or sometimes "stream processors". If they have a video connector and are still generally optimized for realtime graphics, then they are best described as a GPU.
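
      A minimal sketch of that compute-only ("GPGPU") use (hypothetical, assuming PyTorch and a CUDA device; it works the same on a headless board with no video output):

      import torch

      device = "cuda" if torch.cuda.is_available() else "cpu"

      # No framebuffer, no display: just batched linear algebra on the stream processors,
      # which is what compute-oriented boards are built for.
      a = torch.randn(4096, 4096, device=device)
      b = torch.randn(4096, 4096, device=device)
      c = a @ b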

    • Nobody mines Bitcoin on a GPU anymore except a handful of hobbyists and criminals planning a rug pull. It turned out virtually all GPU mining was Ethereum, and when that stopped, GPU mining stopped with it.

      AI research is still very much a niche. That leaves graphics. Everybody thinks about 3D graphics but they forget about video encoding and workstation 3D. Nvidia and AMD both sell expensive workstation cards that are in high demand. If you're not in the industry you kind of forget ab
    • Nvidia, if you consider them somewhat representative of the shape of the GPU market, breaks down its revenue like this: https://www.investopedia.com/h... [investopedia.com] It's fair to say, as @znrt does below, that the "majority" is still graphics. From a die-area perspective, though, the even bigger majority of the chip is general-purpose ALUs; ironically, there you are right. But to quote a GPU architect I once spoke to: "The rasterizer part is so small it's not even worth ripping out." So yeah. They are GPUs.
    • inb4 they get "rebranded" as "General Purpose Units". Oh wait, we already have the CPU term. /s

      Seriously though, GPUs have (slowly) been expanding into CPU territory for the past decade+, but they are still vastly different from CPUs: notably, random access is still hideously slow (though masked with caches). Depending on the problem, sometimes a homogeneous solution is better and sometimes a heterogeneous one is; it really depends on the problem space. For example, GPUs support "half float" and float8 formats.
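
      A minimal sketch of those reduced-precision formats (hypothetical, assuming PyTorch; float8 dtypes exist only in newer releases, so they are left as a comment):

      import torch

      x = torch.randn(1024, 1024)       # float32: 4 bytes per element

      x_fp16 = x.to(torch.float16)      # classic "half float": 2 bytes per element
      x_bf16 = x.to(torch.bfloat16)     # wider-exponent half, common for AI training

      print(x.element_size(), x_fp16.element_size(), x_bf16.element_size())  # 4 2 2
      # Newer PyTorch builds also expose float8 dtypes such as torch.float8_e4m3fn.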

    • by AmiMoJo ( 196126 )

      Probably because "vector processor with graphics-, AI-, and video-encoding-specific sub-modules" is a bit of a mouthful.

      What would you call them?

  • by gurps_npc ( 621217 ) on Tuesday December 27, 2022 @12:57PM (#63161500) Homepage

    In order to become a technological leader in a field that advances so quickly they coined a proverb for it, "Moore's Law" (IC density doubles every 2 years), all it takes is for a central bureaucracy to decide to invest in it.

    Communist China is very good at copying stuff. That just takes some well educated people willing to ignore patents and copyrights. It is not as good at innovation, which takes a bit more.

    I look forward to China having great new chips for 2 years that they claim are the best, then having the 2nd best chips two years later, then the "are they still making chips?" 4 years later.

    • Communist China is very good at copying stuff.

      ...and at making new stuff as well (the "communist" label is your own: calling China "communist" stopped making sense decades ago...)

    • I don't know, it worked for the moon landing.

      If you want innovation, somebody has to be the moneybags, and that's never going to be private business, because private business wants money now and innovation takes years or even decades to pay off. If ever.

      I mean how much of your life savings would you be willing to put into an investment that isn't likely to pay off for another 50 years?
      • China has not put a man on the moon; they just landed machines. They have in no way matched the USA's greatest achievement.

        Investing for 50 years does not create the newest tech. It creates the best 50-year-old tech.

    • Communist China is very good at copying stuff.

      The Chinese are producing a ton of research papers. Your statement was true 30 years ago, but the world has changed...

      I look forward to China having great new chips for 2 years that they claim are the best

      Honestly, I don't remember China saying they are the best at X (or it didn't reach me); I think it's more of a United Statesian thing.

      • by haruchai ( 17472 )

        Research papers don't magically translate into manufacturing prowess.
        Just ask Soviet / modern Russia. A lot of the best manufacturing in China is either done by Taiwanese companies or by Western "joint" ventures who've essentially had their tech stolen by the state.

    • Re: (Score:2, Troll)

      by AmiMoJo ( 196126 )

      This is why you keep losing. Even after all the losing you already did, you are convinced that China can only copy.

      When they out-innovate you and own all the patents in 6G and are churning out high-end ICs that you can't match, I expect you will be shocked yet again and demand a ban on time travelling to steal future US inventions.

    • Comment removed based on user account deletion
  • More competition is always better, and with the most popular GPU in the Steam survey being the GTX 1050, there is room for brands like Intel and Innosilicon to bring offerings at reasonable prices that are NOT the latest and greatest. Anything above a 1060 will do. Intel is there, with the Arcs being on par with variants of the 3080, and Innosilicon is far above the 1060 too....

    A brand new card (with warranty) with performance akin to a 2080 (or more, like Intel) with more memory would be a boon to u

    • by serviscope_minor ( 664417 ) on Tuesday December 27, 2022 @06:43PM (#63162384) Journal

      But, alas, in this corner of the world, we only get AMD, Nvidia and Intel...

      It shows in part how hard it really is, I think.

      Intel clearly knows how to make this sort of thing quite effectively: they've been shipping serviceable iGPUs for years with rock-solid drivers (on Linux at any rate), and they have close to the best fab tech (though they lost the edge), but they still can't crack the GPU market. Their drivers for Arc, for example, didn't do a good job on older games. AMD finally put in the legwork with drivers and opened them up, and the quality is now decent. But they haven't put in the massive amount of legwork required to support the compute community and have not cracked the deep learning market.

      To crack this market you need a top-notch design, a top-notch fab, and top-notch software. Those are all really hard.

  • I am sure it's a unique design, not derivative of others' intellectual property. HAHHAHA!
  • Using stolen tech, if not outright fake parts.

    And you cannot say that isn't true, given their track record in all other industries.
