Intel Arc B580 Battlemage Tested: $250 Graphics Cards Are Worthy Once Again (hothardware.com)
MojoKid writes: Today's release and review launch of the new Intel Arc B580 marks a second-gen effort from the company, with a fully refreshed Intel Xe2 graphics architecture, aka Battlemage, that promises big gains in performance and efficiency. Comparing Arc B580 to its Arc Alchemist ancestors, you can see that it's somewhat of a smaller GPU. It has fewer of nearly everything, and yet its performance specifications don't look too far off. A lot of this comes down to massive architectural improvements with an eye toward efficiency and making better use of the resources that were already there.
With 12GB of GDDR6 memory at 19Gbps, the Arc B580 delivers performance that typically beats a GeForce RTX 4060, and even an RTX 4060 Ti in spots, especially when its extra 4GB of frame buffer memory comes into play. All in, Intel's latest Arc graphics offering is a strong contender in the $250 graphics card segment, and it should sell well in the months ahead based on its value proposition, improved ray tracing performance, and advanced upscaling technologies.
If this gets good Linux FOSS drivers (Score:5, Insightful)
Re:If this gets good Linux FOSS drivers (Score:4, Informative)
Intel is moving to a new driver package for Linux so the jury's still out on that one.
Highlights the failure of vertical integration (Score:3)
What worries me is those layoffs (Score:2)
So there's a high probability the guys who write the drivers for this card are going to be on the chopping block and the card is basically going to be useless for modern games.
Which sucks, because the hardware itself is absolutely solid.
Re: (Score:3)
Companies generally don't lay off people who are needed to continue bringing in existing cash flow. To say that they're going to lay off the driver developers is practically the same as claiming that they're abandoning the video card market.
Re: (Score:2)
Makes me wonder what Intel considers their "core business" to be these days. Apparently there are rumours about selling their CPU business to Arm, so wtf.
Maybe they are going to sell everything that they cannot brand as "AI".
Cherrypicked baseline? (Score:3)
I have an RTX 3060, and both it and the AMD near-equivalent were under $250 six months ago when I bought it. That's new; I paid less for one used, specifically because it was EVGA, and for some reason I thought it would be nice to have one of EVGA's last-hurrah cards, from before they decided it wasn't worth living by nVidia's rules any longer. In any case, the 3060 has 12 GB, and I thought the 4060 did as well, because it's using half the memory bus of the 4090, which has 24 GB. So I don't see where this "extra 4 GB" is coming from; they should be comparing 12 GB cards to other 12 GB cards. Did they go even lower and use a low-profile 3060 or 4060? That doesn't make sense either, because those have only 6 GB. Or are they comparing this one spec only to a 3070/4070, which does have only 8 GB (as an artifact of the way the memory bus is built)?
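For what it's worth, GDDR6 capacity follows mechanically from bus width: each 32-bit memory channel gets one chip, and the chips ship in 1 GB or 2 GB densities. A quick sketch (the function name and density list here are mine, for illustration only):

```python
# Sketch: GDDR6 VRAM capacity options implied by a card's memory bus width.
# Each 32-bit channel carries one chip; chips come in 1 GB or 2 GB densities.
def vram_options_gb(bus_width_bits, chip_densities_gb=(1, 2)):
    channels = bus_width_bits // 32
    return [channels * d for d in chip_densities_gb]

# RTX 4090: 384-bit bus -> 12 channels -> 12 or 24 GB (ships with 24 GB)
print(vram_options_gb(384))  # [12, 24]
# RTX 4060: 128-bit bus -> 4 channels -> 4 or 8 GB (ships with 8 GB)
print(vram_options_gb(128))  # [4, 8]
# RTX 3060: 192-bit bus -> 6 channels -> 6 or 12 GB (shipped with 12 GB)
print(vram_options_gb(192))  # [6, 12]
```

Halving the 4090's 384-bit bus gives 192 bits (the 3060's bus, hence 12 GB); the 4060 actually has a 128-bit bus, a third of the 4090's, which is why it lands at 8 GB. Clamshell boards that put two chips per channel (like the 4060 Ti 16GB) double these figures.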
Re: (Score:3)
Right now, tech reviewers are using the NVidia 4060 and AMD Radeon RX 7600 as their comparison cards, because they are (or likely will be) in the same price or performance range. For some reason, NVidia went down market for the 4060, making it an 8 GB card when the 3060 was a 12 GB card. The 4060 probably should have been called a "4050" instead, and perhaps the 4060 Ti 16 GB should have been the "real" 4060.
Re: (Score:2)
The NVIDIA 4060 has less memory than the 3060 https://videocardz.net/nvidia-geforce-rtx-4060 [videocardz.net]. The 4070 is in a different price bracket, so I think they have chosen their comparison point reasonably.
Re: (Score:2)
The base 4060 and even one model of 4060 Ti use 8GB. The better 4060 Ti uses 16GB. The base 4060 is adequate for 1080p, the 4060 Ti is good at it, and the 4060 Ti 16GB can do 4k60 OK if you decrease quality slightly (still better than half), but it is worthless for higher refresh rates at 4k. All of them have the 8-lane PCIe 4.0 bus, so they are also worthless as an upgrade card for a system with a PCIe 3 bus. But the 4060 Ti 16GB is a reasonable intro LLM engine and video editing accelerator, so it kind of has a reason to exist.
Re: (Score:3)
When pushing generative AI models out to the VRAM, I'm reading off a 4x NVMe drive to start with. I don't see why only having 8 lanes to receive four lanes' worth of data would be a huge problem. I can see where it might become a bottleneck for certain workloads, but the GPU itself would be an even bigger bottleneck. Typically, I'll see load times of 30 to 40 seconds pushing the model out to VRAM (then not again unless I have to change models, so this is a one-time cost), but 300 to 800 seconds actually generating output.
Re: (Score:1)
When pushing generative AI models out to the VRAM, I'm reading off a 4x NVMe drive to start with. I don't see why only having 8 lanes to receive four lanes worth of data would be a huge problem.
It's not, and I never said it was. The bus bandwidth is only an issue for gaming. Even then it's a non-issue for 1080p, but it does become relevant for 4k.
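Rough numbers back this up. A sketch using nominal per-direction PCIe 4.0 bandwidth (the model size below is a hypothetical fp16 7B checkpoint, and real-world throughput sits well below these ceilings):

```python
# Sketch: time to move a model over nominal PCIe 4.0 link bandwidths.
# Figures are per-direction theoretical maximums; real throughput is lower.
PCIE4_PER_LANE_GBPS = 2.0                 # ~2 GB/s per PCIe 4.0 lane
NVME_X4_GBPS = 4 * PCIE4_PER_LANE_GBPS    # ~8 GB/s: Gen4 x4 NVMe drive
GPU_X8_GBPS = 8 * PCIE4_PER_LANE_GBPS     # ~16 GB/s: x8 GPU slot link

def load_seconds(model_gb, link_gbps):
    """Idealized transfer time: size divided by link bandwidth."""
    return model_gb / link_gbps

model_gb = 14  # hypothetical ~7B-parameter model in fp16
print(load_seconds(model_gb, NVME_X4_GBPS))  # 1.75 (NVMe side)
print(load_seconds(model_gb, GPU_X8_GBPS))   # 0.875 (GPU side)
```

By these ceilings, even a tens-of-gigabytes model would move in a few seconds, so load times in the 30-40 second range are dominated by storage, filesystem, and deserialization overhead rather than the x8 GPU link.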
hello mod troll (Score:2)
Ride this dick forever
Re: (Score:2)
Everyone cherry picks. That's what a comparison is.
But why would Intel's card be compared to a 12gb 4060 model which costs almost 40% more than their card? That doesn't seem like a good comparison to me.
Picking a card solely based on having the same RAM seems like a much worse cherry-picked comparison. If they're gonna do that, then why not compare against that 4-year-old 3060 of yours? A new 12gb 3060 goes for $280-300, which is still more than this card. To find one under $250 as you've mentioned is definitely an outlier.
Re: (Score:2)
There's nothing cherrypicked about using the current-gen bottom-tier GPU as a baseline. The RTX3060 is no longer a current product; people buying new GPUs won't be comparing it to the B580. They are comparing entry-level cards to entry-level cards, nothing more. The underlying specs are irrelevant. Or are you going to start saying we can only compare cards with 3072 CUDA cores, 96 texture mapping units, 48 ROPs, and 24 ray tracing acceleration cores to other cards with exactly those same specs?
Re: (Score:2)
If you think that is bad, wait until you see the (alleged) amounts of memory in the 5000 series, excepting the 5090, which will likely have 32 GB.
When they are the high-end company, NVIDIA is going to stick it to you.
Not that good on Linux yet... (Score:3)
The B580 drivers seem to be in good shape for Windows if the reviews are to be believed, but sadly the same isn't true for Linux at all. The only outlet who bothered testing the B580 on Linux was, as ever, Phoronix [phoronix.com], who found that for Linux gaming the card was 9.5% slower than the RTX4060 and 11.7% slower than the RX7600, had an idle power usage of at least 36W (vs. 10W for AMD/Nvidia cards), and that you can't yet monitor the GPU temperature in Linux either.
If you're a Linux gamer, my advice is to hold off on purchasing a Battlemage card until the Linux drivers significantly improve.