Intel Shares 48 Benchmarks To Show Its Arc A750 Can Compete With an RTX 3060 (theverge.com)
Intel has released 48 benchmarks that show its upcoming Arc A750 GPU should be able to trade blows with Nvidia's RTX 3060 in modern games. From a report: While Intel set its expectations low for its Arc GPUs last month, the company has now tested its A750 directly against the RTX 3060 across 42 DirectX 12 titles and six Vulkan games. The results look promising for what will likely be Intel's mainstream GPU later this year. Intel has tested the A750 against popular games like Fortnite, Control, and Call of Duty: Warzone, instead of the cherry-picked handful of benchmarks the company released last month. "These are all titles that we picked because they're popular," explains Intel fellow Tom Petersen in Intel's benchmark video. "Either reviewers are using them, or they're high on the Steam survey, or new and exciting. These are not cherry-picked titles."
We'll have to wait for independent benchmarks, but based on Intel's testing, the A750 looks like it will compete comfortably with Nvidia's RTX 3060. "You'll see we're kinda trading blows with the RTX 3060," says Petersen. "Sometimes we win, sometimes we lose." At 1080p, Intel's performance is, on average, 3 to 5 percent better than Nvidia's in the titles where it wins. On the 1440p side, Intel wins more of the benchmarks, with an average win of about 5 percent across the 42 games. Intel has also tested six Vulkan titles, where it seems to be trading blows with the RTX 3060 once again.
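Intel's "about 5 percent on average" figure is an aggregate across the 42 titles. As a minimal sketch (with entirely made-up FPS numbers, not Intel's data), this is roughly how such cross-game averages are often computed, using a geometric mean of per-game FPS ratios so one outlier title can't dominate the headline number:

```python
import math

# Hypothetical (card_a_fps, card_b_fps) pairs, one per benchmarked game.
fps_pairs = [
    (120, 114), (88, 90), (143, 135), (60, 62), (101, 95),
]

# Per-game performance ratio of card A relative to card B.
ratios = [a / b for a, b in fps_pairs]

# Geometric mean: average the ratios in log space, then exponentiate.
geomean = math.exp(sum(math.log(r) for r in ratios) / len(ratios))

print(f"card A is {100 * (geomean - 1):+.1f}% vs card B on average")
```

A plain arithmetic mean of FPS values would let a single high-framerate title (say, an esports game running at 300+ FPS) skew the result, which is why reviewers typically average ratios instead.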
Conflict of interest (Score:5, Insightful)
Unless those benchmarks are compiled by an independent set of experts who are not affiliated with you in any way, shape or form, they mean absolutely nothing.
Re:Conflict of interest (Score:5, Informative)
Gamers Nexus have been covering ARC a lot lately. It's bad. Performance is acceptable at the price point in some newer games. Older games perform a lot worse, because Intel doesn't have the resources to go back and optimise for older versions of DirectX and older game engines.
The bigger issue is the state of the drivers. They just don't work. A lot of features cause game breaking bugs like corrupt graphics or horrible tearing. Some completely kill performance.
I'd recommend giving them a few generations before even considering Intel ARC. They might just decide to drop it entirely, or the drivers might never reach maturity.
Re: (Score:1)
I have a laptop with an Arc-A in it that I bought last month.
For every game I care to play (including newer ones released as recently as two years ago), I've been able to maintain consistently more-than-playable framerates (excepting a Zandronum-based DOOM II port with Complex Doom Invasion, which simply needs Vulkan support for the draw call issue it has). Dead Cells, Super Animal Royale, Dead by Daylight, and more all play just fine (and I got my rank-1 killer achievement in DBD last night).
Re: (Score:3)
Gamers Nexus uses shitty benchmark software (and shitty choice of horridly-unoptimized games/game engines) so I take literally EVERYTHING with about ten grains of sodium chloride (and a few grains of potassium chloride just to level out the heart activity.)
And which benchmarks do you use and why are your benchmarks better?
Run the tests for yourself instead of relying upon idiot sites that are paid to game shit for ad dollars. YOU CAN RETURN PRODUCTS YOU DEEM DEFECTIVE.
That requires me to buy the product and then test it against other systems I may not have. Then, according to you, I should return the system even though it is not defective, just because it does not perform the way I want.
Re: (Score:2)
"And which benchmarks do you use and why are your benchmarks better?"
I actually play the games and chart the FPS. Most engines do have an FPS counter you can enable, if you have games from competent developers!
Re: (Score:2)
I actually play the games and chart the FPS. Most engines do have an FPS counter you can enable, if you have games from competent developers!
So you perform your own tests for games which you like in highly variable scenarios. How is that benchmarking? That's like me saying one car is better than the other based on the random roads I take.
Re: (Score:2)
Your list of games includes two games that have basically no meaningful GPU performance requirements being 2D games, and third one has requirements listed as GTX 460/HD 6850 with 1GB of VRAM.
You could run those on integrated graphics from AMD. Heck, you could probably run them on Intel's integrated, though DBD would probably have some stutter.
Re: (Score:2)
"Your list of games includes two games that have basically no meaningful GPU performance requirements being 2D games"
They're still running on a 3D composited engine utilizing everything except polygons. This requires OpenGL hardware acceleration. Otherwise, they could run in pure software - they do not.
"You could run those on integrated graphics from AMD"
The Ryzen 3 laptop I have struggles to run Dead Cells or The Binding of Isaac at consistent framerates, going between 25 and 40 (and Binding of Isaac has insan
Re: (Score:2)
If you run on the lowest possible CPU, you're obviously going to run into issues. That's not just about graphics; it's also about the CPU being very slow, the memory likely being low and slow, and so on.
For example, I can run Dead Cells on a mid-tier i5 from half a decade ago with an Intel iGPU, no problems. It has basically no GPU requirements worth mentioning. Notably, that's not my opinion but that of the developers: they list the ancient GTS 450 as the minimum requirement. That's a low-end discrete GPU from 12 years ago. Any modern iGPU ou
Re: (Score:2)
Oh, Super Animal Royale is vector-based, not 2D pixel-based. Much faster with a 3D card than pure CPU software rendering. Also, again, on a 3D composition engine.
Re: (Score:2)
Irrelevant nitpicking. Neither has meaningful GPU requirements beyond integrated GPUs. CPU software rendering hasn't been an option in the consumer space in over a decade at least.
Re: (Score:2)
because Intel doesn't have the resources
>In the years 2011-2015, Intel ... lavished shareholders with $36b. in stock buybacks
>From 2016 through 2020, Intel ... $45b. as buybacks.
>Last year the company ... used $2.4 billion ... to repurchase 39.5 million shares of stock.
Re: (Score:2)
Re: (Score:2)
I know words are really hard for slashdotters these days, but spending it on an optional, unrelated luxury is not the same as not having the resources.
You may or may not play video games, but you're not a nerd and never were one.
Re: (Score:2)
spending it on an optional, unrelated luxury is not the same as not having the resources
It's your opinion that this is an "optional, unrelated luxury". While I would agree with that, the point that you are missing is that many modern CEOs seem to think that this is neither optional nor a luxury.
Re: (Score:2)
It's your opinion
Horse. Shit.
Re: (Score:2)
Re: Conflict of interest (Score:2)
Re: (Score:2)
Shame there's only outrage farmers on the internet.
Re: (Score:2)
Has that rumor been confirmed?
We better get cheaper prices. (Score:2)
Re: We better get cheaper prices. (Score:1)
Re: (Score:2)
RTX 3060 12GB is about $40 over msrp (Score:2)
Re: (Score:2)
No, they are not. They're about 40% higher than 5 years ago, for very, VERY LITTLE actual performance gain excepting raytracing, at least on nVidia's side. And the only thing new there is the raytracing and a couple of effects shaders (and those shaders could be done in Vulkan or OpenGL on any sufficient hardware, though OpenGL has draw call issues/limitations). All other accelerated use-cases have LAGGED, outside of streaming video output, for attention-whoring/content providing, and current games sh
Re: (Score:2)
5 years back gets you a newish 1080 Ti (March 2017). Looking at a few games and composite benchmarks, a 3080 Ti produces about double the FPS in regular (non-raytraced) games.
Re: (Score:2)
Back in 2019 - 3 years ago - I got a GTX 1660 Ti for 250 GBP.
Wow, that's only 7% slower than the 1070 I just got used for USD 150. To be fair though, you have been able to get mediocre performance with it for the last few years.
Re: (Score:2)
Cards that were released 5 years ago and are still sold today are *more expensive now* than they were at release. Including the ones that were panned as being over-priced at release!
Oranges not needed here; the apples-to-apples comparison shows a price increase over time, where you'd expect the older technology to have gone down significantly in price. Used prices on years-old models are higher now than new prices were at release.
Ok? (Score:1)
Re: (Score:3)
To be fair the money is in the mid-range cards and not flagship cards. The volume is huge for cards that are good enough for all the new games at a moderate resolution, but are not the most expensive card possible.
More competition is better for consumers. Assuming Intel's new card is real competition and not hot air.
Re: (Score:2)
The 4060 and RX7500 are mid range cards.
Re: (Score:2)
New cards are coming in hot and Intel missed their opportunity.
Re: (Score:2)
Considering the mass of reports on overwhelmed warehouses with all the unsold current gen GPUs, all signs point to "coming in hot" being a fairly distant thing for next gen GPUs beyond top tier halo models that don't really compete with current gen models.
Re: (Score:2)
The original poster (and a later comment in this thread) is referring to NVIDIA RTX 40-series and AMD RX 7000 series cards discussed last month. Try to keep up.
Re: (Score:2)
Which will remain unreleased for SKUs competing with those that sit in storage unsold until said warehouses are sold out. Try to keep up.
Re: (Score:3)
If it competes favorably to a 3060, and comes in at a way cheaper price point, that could be a big win for consumers and gamers who don't need the latest and greatest flagships. I'm all for more competition and a return to there being an actual entry and mid-level price point GPU good enough for gaming. I remember when $200-$250 would get you a pretty damn good card that was probably 80% of cards costing $700 and up. I miss those days.
Re: (Score:2)
The problem is that for those of us who would buy something like that, we need it to run older graphically intensive games well.
And that is something Intel's new offerings just don't do and will not do any time soon if Petersen's statements on what they're focusing on with driver development are to be believed.
There's also the supposed thing with Intel scaling better with resolution, but 4k gaming on lower end models is still awful. It's just slightly less awful on Intel. Not a selling point, especially con
Re: (Score:2)
I bet you nuclear weapons to dollars that if the Demoscene got hold of Intel's internal documentation, they'd RAPE AMD and nVidia in graphical capability.
Re: (Score:2)
I bet you nuclear weapons to dollars that if the Demoscene got hold of Intel's internal documentation, they'd RAPE AMD and nVidia in graphical capability.
If only you could make a dollars to dollars comparison. If the demoscene got ahold of everyone's internal documentation, Intel would be a distant also-ran in the GPU performance department, just like they are today.
Re: (Score:2)
Re: (Score:2)
What the Demoscene can accomplish with very, very little actual computing power would astound you.
Re: (Score:2)
What kind of bugs though? (Score:3)
Re: (Score:2)
Re: (Score:2)
Why is ray tracing that important?
Because both AMD and nvidia's latest GPUs do it, so if theirs don't, they will get pounded directly in the ass in marketing. AMD and nvidia will literally pay devs to use the functionality and provide them engineering assistance to get it working if necessary to make Intel look like a bag of lames.
Re: (Score:2)
Re: (Score:2)
https://www.pcgamesn.com/intel... [pcgamesn.com]
They may not even last that long.
nVidia's GPU allows for ray tracing/DLSS (Score:2)
"Nothing is free in 3D" and it's up to the consumer to decide if those features are worth it to them. Ray tracing is a hit on frame rate on a 3060, even though it has the dedicated transistors for that, but DLSS in its latest form is for most people always going to be a no-brainer on a RTX 3060, when the option to use the latest version of it is present.
Shadow of the Tomb Raider, for example, was iffy at best with DLSS originally, but the latest iteration of that rendering method has shown its worth, and so
It gets crushed in older games though (Score:2)
Intel normally has very, very good driver support. So as long as they don't just give up on this (which they might; pretty sure they got into this for a piece of the crypto pie), then in a year or two we'll have 3 viable GPU manufacturers
Does it kill apple silicon as well? (Score:2)
Does it kill apple silicon as well?
Re: (Score:2)
Quite probably.
https://www.notebookcheck.net/M1-Max-vs-RTX-3080-Laptop-GPU-Apple-SoC-excels-in-synthetic-benchmarks-but-flounders-alongside-the-M1-Pro-in-gaming-tests.575086.0.html
Re: (Score:2)
What those reviews don't show you is whether the programs being benchmarked are native ARM or running under Rosetta.
Anything running under emulation is going to perform much worse than native code.
Re: Does it kill apple silicon as well? (Score:2)
Re: (Score:2)
It's the best at what it does - emulation of a totally alien processor architecture.
Emulation is still slower than native code running on the same hardware.
Now... (Score:2)
How will it fare:
-When benchmarks are run by independent parties
-When the RTX 4060 is out (which will probably happen shortly after the A750 comes out)
-In the benchmarks other than the ones Intel hand picks
Give them a year or two (Score:2)
Average FPS is meaningless (Score:3)
What little independent testing has been done shows that Intel's cards can, in a very limited set of games, produce decent average framerates, but TERRIBLE 0.1% and 1% lows. Meaning that they have terrible frame time consistency.
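For readers unfamiliar with the metric: a "1% low" is the FPS implied by the slowest 1% of frames in a capture, so a card can post a healthy average while still stuttering badly. A minimal sketch with made-up frame-time data (milliseconds per frame, not from any real test):

```python
def percentile_low_fps(frametimes_ms, fraction):
    """FPS implied by the average of the slowest `fraction` of frames."""
    worst = sorted(frametimes_ms, reverse=True)  # slowest frames first
    n = max(1, int(len(worst) * fraction))
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

# 990 smooth frames at ~60 FPS plus 10 long 50 ms stutters.
frametimes = [16.7] * 990 + [50.0] * 10

avg_fps = 1000.0 / (sum(frametimes) / len(frametimes))
one_pct_low = percentile_low_fps(frametimes, 0.01)

print(round(avg_fps, 1))      # average barely notices the stutters
print(round(one_pct_low, 1))  # 1% low exposes them clearly
```

Here the average stays near 59 FPS while the 1% low collapses to 20 FPS, which is exactly the "decent average, terrible lows" pattern described above.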
Re: (Score:2)
Which is interesting, because historically, with the same (non-Intel) GPU, the same amount of dollars would get you higher minimum frame rates with Intel but higher maximum frame rates with AMD (and more cores). So here Intel is, failing at minimum frame rates with their GPU...
Re: (Score:2)
It's not really interesting because interactions between GPU and CPU are a whole different beast from GPU itself handling the load.
Most likely culprit is the drivers. It took both nvidia and ATI/AMD many years before theirs were up to snuff for gaming. The only real surprise is that intel didn't expect drivers to be a significant enough problem to put a lot of people on it ASAP, when it became clear they would be making discrete GPUs, to trawl through the most-played stuff on Steam for the last few years and make sure
Re: (Score:2)
There's speculation that there are hardware issues with the chips that may be responsible for many of their woes beyond what drivers can fix. They require Resizable BAR (ReBAR) support from the motherboard, for example. Without it, their performance falls off a cliff to the point of being useless. I can't remember exactly; I think it was like a ~40% reduction in average framerate, and "terrible frametime consistency" turning into "massive stuttering problems". The problem is that because Intel's going to be competing in mos
Re: (Score:2)
Thing is, their discrete GPUs are just like their 12th gen CPUs. If you don't run win 11 (with resizable bar support and new scheduler that supports different types of cores), they utterly suck and have horrid problems.
But this also shows it to be fundamentally a software problem. There may be something beyond that, and there are indeed rumors about it. But before those potential hardware problems, software problems are by far the worst aspect of intel's new hardware. Both in CPU and GPU space.
Intel Just Needs to Release the 7xx Cards (Score:2)
All these press junkets and releases are useless, since no tech enthusiast ever takes the performance numbers that any manufacturer releases at face value. Intel just needs to release the d*mn cards ASAP at a "we'll ignore the sh*tty drivers" price point and let the good tech sites publish unbiased benchmarks.
None of the info released by Intel is doing them any favors since it's lower-middle of the pack performance when the top performers get all the attention (a fact that AMD/ATI knows all too well.)