Intel Reveals Specs of Arc GPU (windowscentral.com)
Intel has dripped out details about its upcoming Arc graphics cards over the last few months, but until recently, we didn't have full specifications for the GPUs. That changed when Intel dropped a video and a post breaking down the full Arc A-series. From a report: The company shared the spec sheets of the Arc A380, Arc A580, Arc A750, and Arc A770. It also explained the naming structure of the new GPUs along with other details. Just about the only major piece of information we're still missing is the release date for the cards. At the top end of the range, Intel's Arc A770 will have 32 Xe cores, 32 ray-tracing units, and a graphics clock of 2100MHz. That GPU will be available with either 8GB or 16GB of memory. Sitting just below the Arc A770, the Arc A750 will have 28 Xe cores, 28 ray-tracing units, and 8GB of memory. The Intel Arc A580 will sit in the middle between the company's high-end GPUs and the Intel Arc A380.
Smell of brain farts in the morning (Score:1)
Don't love it.
Re: (Score:2)
Agreed. Nvidia's CUDA is an amazing SDK for scientific work. Both AMD and Intel need something to compete since it is my understanding that OpenCL is basically dead at this point.
Re: (Score:2)
since it is my understanding that OpenCL is basically dead
My understanding is that your understanding sucks, and OpenCL is broadly used for a wide variety of tasks.
OpenCL is more broadly supported. CUDA is preferred by many corporations because it is proprietary.
Re: (Score:2)
The issue with OpenCL is that it isn't sufficiently hardware-specific: it doesn't let you write code that is well enough optimized for the specific hardware you are targeting.
CUDA being proprietary makes no difference. You can download the compiler for free, and Nvidia's compilers are well maintained.
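To make the hardware-specific point concrete, here's a minimal CUDA sketch (my own illustration, not from the article; the __launch_bounds__ value and block size of 256 are arbitrary tuning choices):

#include <cstdio>
#include <cuda_runtime.h>

// Minimal vector add. __launch_bounds__ caps threads per block so the
// compiler can budget registers for a chosen occupancy target, which is
// exactly the sort of per-hardware tuning knob being discussed here.
__global__ void __launch_bounds__(256)
vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    float *a, *b, *c;
    // Unified memory keeps the sketch short; production code would
    // likely use cudaMalloc plus explicit cudaMemcpy instead.
    cudaMallocManaged(&a, n * sizeof(float));
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&c, n * sizeof(float));
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);  // expect 3.000000
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}

Compile with nvcc. OpenCL can express the same kernel, but per-architecture knobs like __launch_bounds__ are the kind of tuning the parent comment says it lacks.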
Re: pytorch on intel (Score:2)
One aspect of proprietary software that has downstream effects for corporations: regardless of whether the product is good or bad now, if it was ever good in the past, it has likely been adopted to a degree that far surpasses the customers' ability to rid themselves of the dependency. Being proprietary strongly reinforces this, with incentives from all sides to keep the status quo. The same cannot be said for more open technology options.
Re: pytorch on intel (Score:1)
For that to work, Intel and AMD would have to compete in the scientific/enterprise market. These cards top out at 16GB, which is rather low even just for modern VR gaming.
AMD tried, but it still doesn't have VFIO support on any modern cards, and its cards top out at just 32GB, which is barely enough for your average AI model.
Meanwhile, nVIDIA can put 96GB on a card, with double that coming soon, and things like VFIO work flawlessly. And they're also loads faster in every respect.
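For what it's worth, checking whether a model would even fit in VRAM is a one-call affair with the CUDA runtime. A minimal sketch, assuming a CUDA toolkit install; the 4GB model footprint is a made-up number, not anything from this thread:

#include <cstdio>
#include <cuda_runtime.h>

int main() {
    size_t freeBytes = 0, totalBytes = 0;
    // Query free/total memory on the current device.
    if (cudaMemGetInfo(&freeBytes, &totalBytes) != cudaSuccess) {
        fprintf(stderr, "no CUDA device available\n");
        return 1;
    }
    // Hypothetical model footprint: ~2B params in fp16 is roughly 4GB
    // of weights alone, before activations and workspace.
    const size_t modelBytes = 4ull << 30;
    printf("VRAM: %.1f GB free of %.1f GB total\n",
           freeBytes / 1e9, totalBytes / 1e9);
    printf("a %zu-byte model %s fit in free VRAM\n",
           modelBytes, modelBytes <= freeBytes ? "would" : "would NOT");
    return 0;
}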
Re: (Score:2)
Tiled? (Score:2)
Anybody know if this is a tiled renderer?
Re: (Score:2)
Was wondering the same thing myself: is it a tile-based renderer?
What's aggravating is that the hardware [pcmag.com] is called a Tile GPU. [futurecdn.net] ARGH!
lost cause (Score:2)
Intel has tried and failed at graphics in the past.
Facing Arm and RISC-V competition,
failing in the memory space,
losing to AMD in the x86 space,
Intel is smelling a bit dead, even if not dead yet.
Re: (Score:3)
But it's important for the market that they try. The bitter competition between AMD and Nvidia is the reason we have the frankly astonishing graphics capabilities we do. Throwing MORE competition into the mix can only intensify that progress and force AMD and Nvidia into reining in prices.
Of course that assumes they succeed, which is far from clear at the moment.
Plus, giving Intel a taste of being the underdog at something they can't just leverage their market position to control is probably good for American technology.
Re: (Score:2)
Which will happen first: the ultimate demise of Intel or the year of Linux on the desktop? Both have been predicted as imminent for about the same amount of time.
Re: (Score:2)
Intel is smelling a bit dead, even if not dead yet.
Yes, pretty much. Of course, they still have fanbois who will pay more for less value just to see the Intel logo on the product, but that will stop at some point as well.
Does anybody really care at this point? (Score:3)
The 1st generation Arc GPU line is so far behind schedule that the Nvidia 30X0 cards that it was designed to compete against will be replaced by 40X0 series cards by the time they are available at retail.
Intel had a chance to become a real contender in the graphics space if they had released their products during the GPU shortage in 2021. Instead, they're going to be obsolete by launch day. They'll probably end up liquidated for less than half their MSRP next year on daily-deal sites like Woot, and people STILL won't want to buy them at that point because of their mediocre driver support.
Re: (Score:2)
The 1st generation Arc GPU line is so far behind schedule that the Nvidia 30X0 cards that it was designed to compete against will be replaced by 40X0 series cards by the time they are available at retail.
That depends. The cheapest 3060 cards are still commanding around $380, with 3080s still camping out around $800. If Intel can deliver 3080 performance at 3060 price points, or if 3060 performance can be had under $200 (where nVidia is still selling 1650s), then Intel doesn't have to win benchmarks. 80% of the nVidia 4xxx performance at 40% of the price works out to 0.8/0.4 = 2x the performance per dollar, and that will get Intel cards plenty of market share.
Re: Does anybody really care at this point? (Score:1)
Intel couldn't have released in time for the shortage. It takes years to develop a new product. Intel saw the shortage as an opportunity and rehashed an old design (Xeon Phi), optimizing it to go on a graphics card. By the time they got the products to market, the shortage was over.
They are slowly recovering ground with their new CPUs, especially in the pro and server markets, but it will take a few years to re-establish themselves.
This cycle repeats every few years. Remember Opteron? Same thing: AMD beats them fo
The zaku zeta (Score:2)
Thanks to Intel releasing a usable video card, you can now build the most bizarre and amusing computer possible: one with an AMD CPU and an Intel GPU, instead of the common Intel CPU/AMD GPU combo of the past.
Re: (Score:2)
Thanks to Intel releasing a usable video card, you can now build the most bizarre and amusing computer possible: one with an AMD CPU and an Intel GPU, instead of the common Intel CPU/AMD GPU combo of the past.
It won't match the Intel CPU with integrated AMD graphics [extremetech.com].
Cancelled? (Score:1)