Technology

Synopsys Intros AI-Powered EDA Suite To Accelerate Chip Design and Cut Costs (anandtech.com)

Synopsys has introduced the industry's first full-stack AI-powered suite of electronic design automation tools, covering all stages of chip design from architecture to implementation to manufacturing. From a report: The Synopsys.ai suite promises to radically reduce development time, lower costs, improve yields, and enhance performance. The suite should be especially useful for chips built on leading-edge nodes such as 5nm, 3nm, 2nm-class, and beyond. As chips gain complexity and adopt newer process technologies, their design and manufacturing costs escalate to unprecedented levels. Designing a reasonably complex 7nm chip costs about $300 million (including ~40% for software). In contrast, the design cost of an advanced 5nm processor exceeds $540 million (including software), according to International Business Strategies (IBS) estimates. At 3nm, a complex GPU will cost about $1.5 billion to develop, including roughly 40% for software.

  • by Anonymous Coward

    But it's going to leave a lot on the table, because the added complexity is going to obfuscate numerous opportunities to "radically reduce development time, lower costs, improve yields, and enhance performance".

    Honestly, if you really want better chips, you're going to have to reduce complexity. And that means a rather different approach to chip design. [github.io]

    • by gweihir ( 88907 )

      If you want better engineering of any kind, you have to reduce complexity. Complexity kills basically every positive quality. (The same applies in politics and law, incidentally.) Yes, sure, the software field has still not learned that lesson, and it looks like some people in chip design have forgotten it.

      • It was supposedly Einstein who said "things should be made as simple as possible, but no simpler." I dunno if he actually said it, but I believe it's correct. There are a lot of advanced concepts that genuinely aren't simplifiable, but the human brain tries to simplify them anyway and makes wrong decisions as a result. For example, we like to classify people as either male or female [based on genital presence .. except genitals in certain rare cases are not well defined at birth]. Anyway, not to go off

        • by gweihir ( 88907 )

          Sure. But I highly doubt that letting Artificial Ignorance have a go at things makes them as simple as possible. It is far more likely to introduce complexity that is nicely hidden and hard to find.

      • by tlhIngan ( 30335 )

        If you want better engineering of any kind, you have to reduce complexity. Complexity kills basically every positive quality. (The same applies in politics and law, incidentally.) Yes, sure, the software field has still not learned that lesson, and it looks like some people in chip design have forgotten it.

        Well, the problem with chip design is that floorplanning is an NP-complete problem, so optimization is one of the slowest parts of the design process. Using AI is a common way to attack such problems.
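
        For a rough sketch of what such a heuristic looks like (emphatically not Synopsys's actual method; the blocks, nets, grid, and annealing schedule below are all invented for illustration), here is simulated annealing on a toy placement problem, in Python:

        import math
        import random

        blocks = ["cpu", "cache", "io", "dsp"]                   # hypothetical blocks
        nets = [("cpu", "cache"), ("cpu", "io"), ("dsp", "io")]  # hypothetical connectivity
        slots = [(0, 0), (0, 1), (1, 0), (1, 1)]                 # 2x2 grid of legal positions

        def wirelength(placement):
            # Total Manhattan distance over all nets -- a stand-in cost function.
            total = 0
            for a, b in nets:
                (xa, ya), (xb, yb) = placement[a], placement[b]
                total += abs(xa - xb) + abs(ya - yb)
            return total

        def anneal(steps=10000, temp=2.0, cooling=0.999):
            placement = dict(zip(blocks, random.sample(slots, len(blocks))))
            cost = wirelength(placement)
            for _ in range(steps):
                a, b = random.sample(blocks, 2)        # propose a move: swap two blocks
                placement[a], placement[b] = placement[b], placement[a]
                new_cost = wirelength(placement)
                # Always accept improvements; accept regressions with Boltzmann
                # probability, which is what lets the search escape local minima.
                if new_cost <= cost or random.random() < math.exp((cost - new_cost) / temp):
                    cost = new_cost
                else:
                    placement[a], placement[b] = placement[b], placement[a]  # undo the swap
                temp *= cooling
            return placement, cost

        print(anneal())

        Real placers juggle millions of cells plus legality and timing constraints, but accepting an occasional worse move is the same basic trick that lets heuristics, learned or otherwise, beat exhaustive search on NP-complete layout problems.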

    • I remember you guys in the 80s and 90s claiming computers would NEVER beat humans at chess. After that happened, it was Go that computers would never win at, and then recently a human supposedly "beat" a computer at Go (turns out he didn't; a computer told him how to do it, so it was actually computer vs. computer). Chip design is perfect for computers because the rules and the end goal (lowest power consumption, etc.) are well defined and don't require much in the way of understanding and reasoning.

  • by 93 Escort Wagon ( 326346 ) on Friday March 31, 2023 @02:48PM (#63415054)

    This is one area where I'd hope no one would entertain the validity of the (largely AI-fanboi-inspired) concept of "usefully wrong".

    • by gweihir ( 88907 )

      My guess is that after a few costly disasters, this will get thrown on the trash where it belongs.

    • Considering all designs are programmatically validated, it really can't be wrong.

      What it can do is replicate existing patterns as well as determine how to distribute heat better. AI is just a tool to accelerate tedious work. It won't be a magic bullet, but it should do exactly what they claim: cut costs and reduce design time.
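
      To make "programmatically validated" concrete: a design-rule check is just geometry math run over the whole layout, indifferent to whether a human or a model drew it. A toy minimum-spacing check might look like this Python sketch (the shapes and the 3-unit rule are invented; production rule decks run to thousands of rules):

      from itertools import combinations

      MIN_SPACING = 3  # hypothetical same-layer spacing rule, in grid units

      shapes = {  # name -> (x_min, y_min, x_max, y_max)
          "m1_a": (0, 0, 10, 2),
          "m1_b": (0, 4, 10, 6),    # only 2 units from m1_a: violates the rule
          "m1_c": (0, 10, 10, 12),
      }

      def gap(r1, r2):
          # Axis-aligned gap between two rectangles (0 if they touch or overlap).
          dx = max(r1[0] - r2[2], r2[0] - r1[2], 0)
          dy = max(r1[1] - r2[3], r2[1] - r1[3], 0)
          return (dx * dx + dy * dy) ** 0.5 if dx and dy else max(dx, dy)

      violations = [(a, b) for (a, r1), (b, r2) in combinations(shapes.items(), 2)
                    if gap(r1, r2) < MIN_SPACING]
      print("DRC violations:", violations)  # -> [('m1_a', 'm1_b')]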

  • by manu0601 ( 2221348 ) on Friday March 31, 2023 @05:08PM (#63415402)

    2nm-class, and beyond

    I wonder how much room there is beyond 2nm. We're talking about tracks that are less than 20 atoms wide.
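
    Back-of-envelope, taking the roughly 0.235 nm Si-Si bond length as the yardstick (and keeping in mind that node names are marketing labels, so real drawn features are wider than the name suggests):

    SI_SI_BOND_NM = 0.235  # approximate silicon-silicon bond length
    for feature_nm in (7, 5, 3, 2):
        print(f"a literal {feature_nm} nm track is ~{feature_nm / SI_SI_BOND_NM:.0f} silicon atoms wide")

    A literal 2 nm track would be under ten atoms across; in practice, "2nm-class" metal pitches are an order of magnitude larger than the node name.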

  • Designing a reasonably complex 7nm chip costs about $300 million (including ~40% for software). In contrast, the design cost of an advanced 5nm processor exceeds $540 million (including software), according to International Business Strategies (IBS) estimates. At 3nm, a complex GPU will cost about $1.5 billion to develop, including roughly 40% for software.

    Don't worry: once Huawei gets its EDA stack running, even if it takes another few years, they'll be doing this at a tenth of the US cost. And yet again the sanctions will have bought a short-term gain at the price of a massive long-term loss.

"Facts are stupid things." -- President Ronald Reagan (a blooper from his speeach at the '88 GOP convention)

Working...