
Intel Shares 48 Benchmarks To Show Its Arc A750 Can Compete With an RTX 3060 (theverge.com)

Intel has released 48 benchmarks that show its upcoming Arc A750 GPU should be able to trade blows with Nvidia's RTX 3060 in modern games. From a report: While Intel set its expectations low for its Arc GPUs last month, the company has now tested its A750 directly against the RTX 3060 across 42 DirectX 12 titles and six Vulkan games. The results look promising for what will likely be Intel's mainstream GPU later this year. Intel has tested the A750 against popular games like Fortnite, Control, and Call of Duty: Warzone, instead of the cherry-picked handful of benchmarks the company released last month. "These are all titles that we picked because they're popular," explains Intel fellow Tom Petersen in Intel's benchmark video. "Either reviewers are using them, or they're high on the Steam survey, or new and exciting. These are not cherry-picked titles."

We'll have to wait for independent benchmarks, but based on Intel's testing, the A750 looks like it will compete comfortably with Nvidia's RTX 3060. "You'll see we're kinda trading blows with the RTX 3060," says Petersen. "Sometimes we win, sometimes we lose." Intel's performance is, on average, 3 to 5 percent better than Nvidia's when it wins on titles running at 1080p. Over on the 1440p side, Intel wins more of the benchmarks, with an average advantage of about 5 percent across the 42 games. Intel has also tested six Vulkan titles, where it seems to be trading blows with the RTX 3060 once again.
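To make the kind of averaging Intel describes concrete, here is a minimal sketch of how per-title results could be rolled up into an "about 5 percent on average" figure. The FPS numbers are made up for illustration and are not Intel's data; reviewers also often prefer the geometric mean of FPS ratios so a single outlier title doesn't skew the result.

```python
# Minimal sketch: aggregating per-title GPU results into an average advantage.
# The (title, A750 FPS, RTX 3060 FPS) numbers below are invented for
# illustration; they are NOT Intel's published figures.
from statistics import geometric_mean

results = [
    ("Title A", 104.0, 99.0),
    ("Title B", 88.0, 86.0),
    ("Title C", 121.0, 118.0),
    ("Title D", 72.0, 75.0),   # a loss for the A750
]

ratios = [a750 / rtx3060 for _, a750, rtx3060 in results]

# Arithmetic mean of per-title percentage differences.
arith_pct = sum(r - 1.0 for r in ratios) / len(ratios) * 100.0

# Geometric mean of the ratios, the usual way composite FPS indexes are built,
# since it is less sensitive to a single high-FPS outlier.
geo_pct = (geometric_mean(ratios) - 1.0) * 100.0

print(f"arithmetic mean advantage: {arith_pct:+.1f}%")
print(f"geometric mean advantage:  {geo_pct:+.1f}%")
```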


Comments Filter:
  • by devslash0 ( 4203435 ) on Thursday August 11, 2022 @04:07PM (#62781310)

    Unless those benchmarks are compiled by an independent set of experts who are not affiliated with you in any way, shape or form, they mean absolutely nothing.

    • by AmiMoJo ( 196126 ) on Thursday August 11, 2022 @04:23PM (#62781382) Homepage Journal

      Gamers Nexus have been covering ARC a lot lately. It's bad. Performance is acceptable at the price point in some newer games. Older games perform a lot worse, because Intel doesn't have the resources to go back and optimise for older versions of DirectX and older game engines.

      The bigger issue is the state of the drivers. They just don't work. A lot of features cause game breaking bugs like corrupt graphics or horrible tearing. Some completely kill performance.

      I'd recommend giving them a few generations before even considering Intel ARC. They might just decide to drop it entirely, or the drivers might never reach maturity.

      • by Khyber ( 864651 )

        I have a laptop with an Arc-A in it that I bought last month.

        For every game I care to play (including newer titles released as recently as two years ago) I've been able to maintain consistently more-than-playable framerates (excepting the Zandronum-based DOOM II port with Complex Doom Invasion, which simply needs Vulkan support for the draw call issue it has). Dead Cells, Super Animal Royale, Dead by Daylight, and more all play just fine (and I got my rank-1 killer achievement in DBD last night).


        • Gamers Nexus uses shitty benchmark software (and shitty choice of horridly-unoptimized games/game engines) so I take literally EVERYTHING with about ten grains of sodium chloride (and a few grains of potassium chloride just to level out the heart activity.)

          And which benchmarks do you use and why are your benchmarks better?

          Run the tests for yourself instead of relying upon idiot sites that are paid to game shit for ad dollars. YOU CAN RETURN PRODUCTS YOU DEEM DEFECTIVE.

          That requires me to buy the product, then test it against other systems that I may not have. Then, according to you, return the system even though it is not defective but simply does not perform the way I want.

          • by Khyber ( 864651 )

            "And which benchmarks do you use and why are your benchmarks better?"

            I actually play the games and chart the FPS. Most engines do have an FPS counter you can enable, if you have games from competent developers!
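            For what it's worth, a rough sketch of what "charting the FPS" yourself might look like, assuming a hypothetical frametimes.csv log with one frame time in milliseconds per line (the file name and format are illustrative, not tied to any particular game or tool):

```python
# Rough sketch: summarize a per-frame log into simple FPS numbers.
# Assumes a hypothetical frametimes.csv with one frame time (ms) per line.

def load_frame_times_ms(path: str) -> list[float]:
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

times = load_frame_times_ms("frametimes.csv")
total_s = sum(times) / 1000.0
avg_fps = len(times) / total_s            # overall average FPS
slowest_fps = 1000.0 / max(times)         # FPS implied by the single slowest frame

print(f"{len(times)} frames over {total_s:.1f}s: "
      f"avg {avg_fps:.1f} FPS, slowest frame {slowest_fps:.1f} FPS")
```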

            • I actually play the games and chart the FPS. Most engines do have an FPS counter you can enable, if you have games from competent developers!

              So you perform your own tests for games which you like in highly variable scenarios. How is that benchmarking? That's like me saying one car is better than the other based on the random roads I take.

        • by Luckyo ( 1726890 )

          Your list of games includes two games that, being 2D games, have basically no meaningful GPU performance requirements, and the third one has its requirements listed as a GTX 460/HD 6850 with 1GB of VRAM.

          You could run those on integrated graphics from AMD. Heck, you could probably run them on Intel's integrated graphics, though DBD would probably have some stutter.

          • by Khyber ( 864651 )

            "Your list of games includes two games that have basically no meaningful GPU performance requirements being 2D games"

            They're still running on a 3D composited engine utilizing everything except polygons. This requires OpenGL hardware acceleration. Otherwise, they could run in pure software - they do not.

            "You could run those on integrated graphics from AMD"

            The Ryzen 3 laptop I have struggles to run Dead Cells or The Binding of Isaac at consistent framerates, going between 25-40 (and Binding of Isaac has insan

            • by Luckyo ( 1726890 )

              If you run on the lowest possible CPU, you're obviously going to run into issues. That's not just about graphics; it's also about the CPU being very slow, the memory likely being low and slow, and so on.

              For example, I can run Dead Cells on a mid-tier i5 from half a decade ago with an Intel iGPU, no problems. It has basically no GPU requirements worth mentioning. Notably, that's not my opinion but that of the developers: they list the ancient GTS 450 as the minimum requirement. That's a low-end discrete GPU from 12 years ago. Any modern iGPU ou

          • by Khyber ( 864651 )

            Oh, Super Animal Royale is vector-based, not 2D pixel-based. Much faster with a 3D card than pure CPU software rendering. Also, again, on a 3D composition engine.

            • by Luckyo ( 1726890 )

              Irrelevant nitpicking. Neither has meaningful GPU requirements beyond integrated GPUs. CPU software rendering hasn't been an option in the consumer space in over a decade at least.

      • because Intel doesn't have the resources

        >In the years 2011-2015, Intel ... lavished shareholders with $36b. in stock buybacks

        >From 2016 through 2020, Intel ... $45b. as buybacks.

        >Last year the company ... used $2.4 billion ... to repurchase 39.5 million shares of stock.

        • So your point is that they don't have the resources because they are spending it all on stock buybacks?
          • I know words are really hard for slashdotters these days, but spending it on an optional, unrelated luxury is not the same as not having the resources.

            You may or may not play video games, but you're not a nerd and never were one.

            • spending it on an optional, unrelated luxury is not the same as not having the resources

              It's your opinion that this is an "optional, unrelated luxury". While I would agree with that, the point that you are missing is that many modern CEOs seem to think that this is neither optional nor a luxury.

      • The benchmarks by Gamers Nexus were for a completely different line, and were done with drivers still in beta. After the benchmarks, Intel found a bug which was really hampering performance. Let's just wait a while for the drivers to improve and for the actual price.
    • Shame there are only outrage farmers on the internet.

  • Let's see if competition drives some prices down.
    • Prices are already down to normal levels or even lower...
      • Not until a new mid-range card is in the $200-250 range. They are still sitting in the $400-500 range.
      • by Khyber ( 864651 )

        No they are not. They're about 40% higher compared to 5 years ago, for very, VERY LITTLE actual performance gain excepting raytracing, at least on nVidia's side. And the only thing new there is the raytracing and a couple of effects shaders (which could be done in Vulkan or OpenGL on any sufficient hardware, though OpenGL has draw call issues/limitations). All other accelerated use-cases have LAGGED, outside of streaming video output for attention-whoring/content providing, and current games sh

        • by JMZero ( 449047 )

          5 years back gets you a newish 1080 Ti (March 2017). Looking at a few games and composite benchmarks, a 3080 Ti produces about double the FPS in regular (not raytraced) games.

          • Cards that were released 5 years ago and are still sold today are *more expensive now* than they were at release. Including the ones that were panned as being over-priced at release!

            Oranges are not needed here; the apples-to-apples comparison shows a price increase over time, where you'd expect the older technology to have gone down significantly in price. Used prices on years-old models are higher now than new prices were at release.

  • Even if it does... It's not going to compete with a 4060 or RX7500. New cards are coming in Hot and Intel missed their opportunity.
    • To be fair the money is in the mid-range cards and not flagship cards. The volume is huge for cards that are good enough for all the new games at a moderate resolution, but are not the most expensive card possible.

      More competition is better for consumers. That assumes Intel's new card is real competition and not hot air.

      • The 4060 and RX7500 are mid-range cards.

        • New cards are coming in Hot and Intel missed their opportunity.

          • by Luckyo ( 1726890 )

            Considering the mass of reports on overwhelmed warehouses with all the unsold current gen GPUs, all signs point to "coming in hot" being a fairly distant thing for next gen GPUs beyond top tier halo models that don't really compete with current gen models.

            • The original poster (and a later comment in this thread) is referring to NVIDIA RTX 40-series and AMD RX 7000 series cards discussed last month. Try to keep up.

              • by Luckyo ( 1726890 )

                SKUs competing with the cards sitting unsold in storage will remain unreleased until said warehouses are sold out. Try to keep up.

    • by Scoth ( 879800 )

      If it competes favorably with a 3060, and comes in at a much cheaper price point, that could be a big win for consumers and gamers who don't need the latest and greatest flagships. I'm all for more competition and a return to there being actual entry- and mid-level GPUs good enough for gaming. I remember when $200-$250 would get you a pretty damn good card that was probably 80% of cards costing $700 and up. I miss those days.

      • by Luckyo ( 1726890 )

        The problem is that for those of us who would buy something like that, we need it to run older graphically intensive games well.

        And that is something Intel's new offerings just don't do and will not do any time soon if Petersen's statements on what they're focusing on with driver development are to be believed.

        There's also the supposed thing with Intel scaling better with resolution, but 4k gaming on lower end models is still awful. It's just slightly less awful on Intel. Not a selling point, especially con

    • by Khyber ( 864651 )

      I bet you nuclear weapons to dollars that if the Demoscene got hold of Intel's internal documentation, they'd RAPE AMD and nVidia in graphical capability.

      • I bet you nuclear weapons to dollars that if the Demoscene got hold of Intel's internal documentation, they'd RAPE AMD and nVidia in graphical capability.

        If only you could make a dollars to dollars comparison. If the demoscene got ahold of everyone's internal documentation, Intel would be a distant almost-ran in the GPU performance department just like they are today.

      • I fail to see how getting Intel's internal documentation equates to a needlessly violent depiction of capabilities against AMD or NVidia offerings. The current information put out by Intel shows their new, unreleased midrange card might be equal to the lowest midrange card NVidia released a year ago. Yay?
        • by Khyber ( 864651 )

          What the Demoscene can accomplish with very, very little actual computing power would astound you.

          • And you are assuming that Intel has not shown the best that their GPU can do and that some hidden secret can be gleaned from internal documentation. Maybe a secret SET_TURBO_ON = TRUE was missed by Intel engineers.
  • by skovnymfe ( 1671822 ) on Thursday August 11, 2022 @04:18PM (#62781358)
    I mean, what Intel seems to be doing best these days is shipping their chips with all kinds of bugs in them, so I might as well just ask outright: what bugs have they put in this time, and how much of a performance penalty can we expect when they've been patched?
    • From what others have said, the bugs are typical first-generation-product bugs where things are not optimized as they should be. Also, some of them are probably a consequence of adding too many features to the drivers instead of focusing on core functionality. For example, adding in ray tracing, which seemed to break things. Why is ray tracing that important? Even for NVidia's flagship GPUs, ray tracing has minimal benefits. Yet in the first drivers of a first-generation GPU, Intel is trying to add it.
      • Why is ray tracing that important?

        Because both AMD and nvidia's latest GPUs do it, so if theirs don't, they will get pounded directly in the ass in marketing. AMD and nvidia will literally pay devs to use the functionality and provide them engineering assistance to get it working if necessary to make Intel look like a bag of lames.

        • You would think that engineers would know better than to give in to marketing, but Intel is far from the engineering company it used to be. But my point is not that they should never add ray tracing. I question why it needs to be in the first generation of drivers, especially when it breaks other things and the ray tracing, if it worked, would be so marginal as to barely be a benefit.
      • https://www.pcgamesn.com/intel... [pcgamesn.com]

        They may not even last that long.

  • "Nothing is free in 3D" and it's up to the consumer to decide if those features are worth it to them. Ray tracing is a hit on frame rate on a 3060, even though it has the dedicated transistors for that, but DLSS in its latest form is for most people always going to be a no-brainer on a RTX 3060, when the option to use the latest version of it is present.

    Shadow of the Tomb Raider, for example, was iffy at best with DLSS originally, but the latest iteration of that rendering method has shown its worth, and so

  • sometimes by 30-40%. So they've got a ways to go on drivers. If you're an early adopter and the price isn't too bad, I could see going for it on a second rig. The price is likely to be in the $300-$400 range, assuming nothing goes crazy in the crypto markets (knock on wood).

    Intel normally has very, very good driver support. So as long as they don't just give up on this (which they might, pretty sure they got into this for a piece of the crypto pie), then in a year or two we'll have 3 viable GPU manufacturers.
  • Does it kill Apple silicon as well?

    • by CAIMLAS ( 41445 )

      Quite probably.

      https://www.notebookcheck.net/M1-Max-vs-RTX-3080-Laptop-GPU-Apple-SoC-excels-in-synthetic-benchmarks-but-flounders-alongside-the-M1-Pro-in-gaming-tests.575086.0.html

      • by Bert64 ( 520050 )

        What those reviews don't show you is whether the programs being benchmarked are native ARM or running under Rosetta.
        Anything running under emulation is going to perform much worse than native code.

  • by Junta ( 36770 )

    How will it fare:
    - When benchmarks are run by independent parties
    - When the RTX 4060 is out (which will probably happen shortly after the A750 comes out)
    - In benchmarks other than the ones Intel hand-picks

  • I'm sick of NVidia being the main show in town. Some serious competition would do everyone the world of good.
  • by Guspaz ( 556486 ) on Thursday August 11, 2022 @06:23PM (#62781822)

    What little independent testing has been done shows that Intel's cards can, in a very limited set of games, produce decent average framerates but TERRIBLE 0.1% and 1% lows, meaning that they have terrible frame time consistency.
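    For context, one common way those figures are derived is to take the slowest 1% (or 0.1%) of frames in a capture and report the average FPS over just those frames, which is why a card can post a fine average while stuttering badly. A minimal sketch with made-up frame times, not real capture data:

```python
# Sketch: one common way "1% low" / "0.1% low" FPS figures are computed.
# The frame times below are invented for illustration, not real capture data.

def percentile_low_fps(frame_times_ms: list[float], fraction: float) -> float:
    """Average FPS over the slowest `fraction` of frames (0.01 -> 1% low)."""
    worst = sorted(frame_times_ms, reverse=True)      # slowest frames first
    n = max(1, int(len(worst) * fraction))
    return 1000.0 / (sum(worst[:n]) / n)

# Mostly smooth ~60 FPS frames with a handful of long stutters mixed in.
frame_times = [16.7] * 980 + [40.0] * 18 + [120.0] * 2

avg_fps = 1000.0 / (sum(frame_times) / len(frame_times))
print(f"average FPS: {avg_fps:.1f}")                              # looks fine
print(f"1% low:   {percentile_low_fps(frame_times, 0.01):.1f}")   # much worse
print(f"0.1% low: {percentile_low_fps(frame_times, 0.001):.1f}")  # terrible
```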

    • Which is interesting, because historically, with the same (non-Intel) GPU and the same amount of dollars, you would get higher minimum frame rates with an Intel CPU but higher maximum frame rates with an AMD one (and more cores). So here Intel is, failing at minimum frame rates with their GPU...

      • by Luckyo ( 1726890 )

        It's not really interesting, because interactions between GPU and CPU are a whole different beast from the GPU itself handling the load.

        The most likely culprit is the drivers. It took both nvidia and ATI/AMD many years before theirs were up to snuff in gaming. The only real surprise is that intel didn't expect drivers to be a significant enough problem to put a lot of people on them ASAP, once it became clear they would be making discrete GPUs, to trawl through the most-played stuff on Steam for the last few years and make sure

        • by Guspaz ( 556486 )

          There's speculation that there are hardware issues with the chips that may be responsible for many of their woes, beyond what drivers can fix. They require Resizable BAR support from the motherboard, for example. Without it, their performance falls off a cliff to the point of being useless. I can't remember exactly, but I think it was something like a ~40% reduction in average framerate, and "terrible frametime consistency" turning into "massive stuttering problems". The problem is that because Intel's going to be competing in mos

          • by Luckyo ( 1726890 )

            Thing is, their discrete GPUs are just like their 12th gen CPUs. If you don't run win 11 (with resizable bar support and new scheduler that supports different types of cores), they utterly suck and have horrid problems.

            But this also shows it to be fundamentally a software problem. There may be something beyond that, and there are indeed rumors about it. But before those potential hardware problems, software problems are by far the worst aspect of intel's new hardware. Both in CPU and GPU space.

  • All these press junkets and releases are useless since no tech enthusiast ever takes the performance numbers that any manufacturer releases at face value. Intel just needs to release the d*mn cards ASAP at a "we'll ignore the sh*tty drivers" price point and let the good tech sites publish unbiased benchmarks.

    None of the info released by Intel is doing them any favors, since it's lower-middle-of-the-pack performance when the top performers get all the attention (a fact that AMD/ATI knows all too well).
