


Nvidia's RTX 5060 Review Debacle Should Be a Wake-Up Call (theverge.com)
Nvidia is facing backlash for allegedly manipulating the review process of its GeForce RTX 5060 GPU by withholding drivers, selectively granting early access to favorable reviewers, and pressuring media to present the card in a positive light. As The Verge's Sean Hollister writes, the debacle "should be a wake-up call for gamers and reviewers." Here's an excerpt from the report: Nvidia has gone too far. This week, the company reportedly attempted to delay, derail, and manipulate reviews of its $299 GeForce RTX 5060 graphics card, which would normally be its bestselling GPU of the generation. Nvidia has repeatedly and publicly said the budget 60-series cards are its most popular, and this year it reportedly tried to ensure it by withholding access and pressuring reviewers to paint them in the best light possible.
Nvidia might have wanted to prevent a repeat of 2023, when it launched this card's predecessor. Those reviews were harsh. The 4060 was called a "slap in the face to gamers" and a "wet fart of a GPU." I had guessed the 5060 was headed for the same fate after seeing how reviewers handled the 5080, which similarly showcased how little Nvidia's hardware has improved year over year and relies on software to make up the gaps. But Nvidia had other plans. Here are the tactics that Nvidia reportedly just used to throw us off the 5060's true scent, as individually described by GamersNexus, VideoCardz, Hardware Unboxed, GameStar.de, Digital Foundry, and more:
- Nvidia decided to launch its RTX 5060 on May 19th, when most reviewers would be at Computex in Taipei, Taiwan, rather than at their test beds at home.
- Even if reviewers already had a GPU in hand before then, Nvidia cut off most reviewers' ability to test the RTX 5060 before May 19th by refusing to provide drivers until the card went on sale. (Gaming GPUs don't really work without them.)
- And yet Nvidia allowed specific, cherry-picked reviewers to have early drivers anyhow if they agreed to a borderline unethical deal: they could only test five specific games, at 1080p resolution, with fixed graphics settings, against two weaker GPUs (the 3060 and 2060 Super) where the new card would be sure to win.
- In some cases, Nvidia threatened to withhold future access unless reviewers published apples-to-oranges benchmark charts showing how the RTX 5060's "fake frames" MFG tech can produce more frames than earlier GPUs without it.
Some reviewers apparently took Nvidia up on that proposition, leading to day-one "previews" where the charts looked positively stacked in the 5060's favor [...]. But the reality, according to reviews that have since hit the web, is that the RTX 5060 often fails to beat a four-year-old RTX 3060 Ti, frequently fails to beat a four-year-old 3070, and can sometimes get upstaged by Intel's cheaper $250 B580. And yet, the 5060's lackluster improvements are overshadowed by a juicier story: inexplicably, Nvidia decided to threaten GamersNexus' future access over its GPU coverage. Yes, the same GamersNexus that's developed a staunch reputation for defending consumers from predatory behavior, and just last month published a report on "GPU shrinkflation" that accused Nvidia of misleading marketing. Bad move! [...]
Nvidia is within its rights to withhold access, of course. Nvidia doesn't have to send out graphics cards or grant interviews. It'll only do it if it's good for business. But the unspoken covenant of product reviews is that the press, as a whole, gets a chance to warn the public if a movie, video game, or GPU is not worth their money. It works both ways: the media also gets the chance to warn that a product is so good you might want to line up in advance. That unspoken rule is what Nvidia is trampling here.
Same shit different year. (Score:5, Insightful)
A 3080 is superior to a 4070 and DEFINITELY a 4060... but NVidia doesn't enable half the goodies inside the 4XXX driver's DLSS set (despite the 3XXX series being fully capable of frame generation and other artificially "restricted" functions). The only reason the 4070 and 4060 look "better on paper" is the fact that the drivers have been sabotaged for the 3XXX and 2XXX.
And you can see this clearly when you enable FSR on a 3XXX and 2XXX card.
NVidia is going to piss away their fanbase at this rate.
Re: (Score:3)
They'll piss away their fan(atics) base, but they'll continue to maintain the dominant marketshare unless Intel continues to make big strides in their drivers and is willing to take a few more years of deficit spending for their GPU section.
Re:Same shit different year. (Score:5, Interesting)
The GamersNexus video "NVIDIA's Dirty Manipulation of Reviews" https://www.youtube.com/watch?... [youtube.com] (mentioned in TFA) is def worth watching.
Nvidia are on top of the world ... and instead of just doing what originally got them there... they're going the enshittification route. (in graphics cards, anyway). ... i heart marketing.
Re:Same shit different year. (Score:5, Insightful)
Why? Gaming graphics cards are a drop in the bucket for their revenues these days. They really don't care - they can drop all the GPUs on the market today and there would only be a slight blip on the revenue. Their main profit center is compute cards they sell for AI and other purposes.
That's why they don't care there's a GPU shortage, or they're basically handing GPUs to AMD. They have a loyal fanbase who will buy the 5090 GTX TI SUPER OC OMGSWTFBBQ edition for $10,000. (Heck, Asus I think is selling a gold plated "Dubai Edition" or something)
Re: (Score:3)
Exactly. NVidia has a long history of cheating. Everybody knows they cannot be trusted.
None of that matters as long as they dominate the GPGPU market. Gaming is a side hustle now.
Re: (Score:2)
NVidia is going to piss away their fanbase at this rate.
Huh? Why would AI companies care about this? Snide comment aside, NVIDIA has absolutely fucked over budget gamers. The fact that they are still fans at all is mind-blowing. NVIDIA has one gaming market left: the mid-to-high end, where they lack competition. The low-end gamers were pissed away a while back.
There is a solution (Score:1)
Just do the reviews after the cards and drivers are released, and buyers can wait until they see the reviews. I don't think Nvidia can prevent that.
Re: (Score:2)
People would like to have reviews on launch day, so they can know on the day the cards are available whether they are worth buying, because they tend to sell out and they want to know if they should try to get one.
If the manufacturer is going to fail to produce enough units for initial demand, they ought to at least make sure the devices get reviewed before release, instead of doing exactly the opposite and making sure they don't.
Re: There is a solution (Score:2)
If you know how to make people behave rationally, please let us know.
problem is (Score:1)
The "previews" described above were one issue: the only press coverage out there was using nVidia's exact test suite and basically parroting their cherry-picked bullshit numbers.
The related issue is that the sites that agreed to do previews could use the same drivers to prepare their own reviews in advance of launch, while nobody else had drivers, so that at launch they would have the only reviews out there. It's nice that Hardware Unboxed managed to dash out a review the same day of release [...]
Re: (Score:2)
That would end reviewers getting free shit.
oh puleeze (Score:2, Insightful)
It's not about reviewers "getting free shit." They need to get the cards (AND DRIVERS) in advance of the release date so that their reviews go live at a point in time when consumers actually care about the review. They also need access to technical support to get help when the buggy pre-release drivers don't work. The reviewers can easily afford to buy the stuff if they need to; they're running an actual business with employees and rent and all kinds of other costs. Having to buy a $100 case or a $500 GPU [...]
Re: (Score:2)
For big enough reviewers (those that are more than one-man shows), paying for cards if needed is not an issue. When a video generates between $1k and $100k, paying for the card is still profitable, and you keep the card to compare against later on (whereas some free stuff you actually have to return later).
Re: (Score:2)
That's not even considering someone like LTT, who could buy the card at retail, test it, scribble a sharpie signature on the fan shroud, and sell it for more than it was bought for - ignoring profit from the review itself.
Re: (Score:1)
"Buyers should wait instead of buying a product on release day".
The way I can tell the 5060 8 gig is absolute crap (Score:2)
AMD, though, cannot match Nvidia on production, so their cards aren't any more available. Now there is Intel's B580.
Re:The way I can tell the 5060 8 gig is absolute c (Score:4, Insightful)
You can also know it's crap because it's got 8GB on it. 12GB is a practical minimum today, and 16GB is the new 8GB anyway.
Nvidia clearly doesn't even need this market, and the only reason we really need them in it is to keep AMD from raising their prices. I don't suggest that it would make sense for them to stop making consumer GPUs (it's not a small amount of income or profit to be made there) but they're clearly acting like they don't need us, and it's a weird time for that since we've never needed them less — in particular in this low end of the market where they are pulling the review sample/driver availability shenanigans.
Re: (Score:3)
I don't suggest that it would make sense for them to stop making consumer GPUs (it's not a small amount of income or profit to be made there)
Looking at opportunity costs, each gaming GPU costs Nvidia thousands of dollars because that silicon could have been made into a data center chip that sells for 10-20x the price.
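The opportunity-cost claim above is easy to make concrete. A minimal back-of-envelope sketch, where all figures are hypothetical placeholders (the comment only asserts a 10-20x price ratio, not actual Nvidia economics):

```python
GAMING_ASP = 300    # assumed sale price of a budget gaming GPU, in dollars
DC_MULTIPLIER = 15  # parent claims data center chips sell for 10-20x the price

def opportunity_cost(gaming_price: float, multiplier: float) -> float:
    """Revenue forgone by selling a die as a gaming card instead of a
    data center part, assuming the same silicon could serve either."""
    return gaming_price * multiplier - gaming_price

cost = opportunity_cost(GAMING_ASP, DC_MULTIPLIER)
print(f"Forgone revenue per die: ${cost:,.0f}")  # → Forgone revenue per die: $4,200
```

Under those assumed numbers, every budget die sold "costs" thousands in forgone data center revenue, which is the parent's point; the next reply's objection is that this only holds if the gaming sale actually cannibalizes a data center sale.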
Re: (Score:2)
That's assuming that they are cannibalizing sales of those devices. They can only sell so many of them.
Re: (Score:2)
That's assuming that they are cannibalizing sales of those devices. They can only sell so many of them.
Only Nvidia knows for sure, but Jensen Huang has been known to say internally that data center demand is 2-3x current sales and that sales are mostly supply constrained (mostly due to CoWoS packaging). Even with increasing constraints on China sales, total sales continue to increase, which lends some credence to the large demand theory.
I think they would like to blow off consumer space (Score:2)
So they're doing just enough to keep AMD and Intel at bay, and that's it. They also need to keep the mobile-chipset GPU makers out of their market. Those are completely unthreatening right now, but if Nvidia completely abdicates the consumer gaming market, that could let them get a foothold into more high-end GPUs.
Basically it's classic anti-competitive [...]
Re: (Score:2)
Sure, as long as you purposely disable swap on your system as you recommend, everything should be fine. /s
Re: (Score:2)
OK bot.
8 gig isn't enough to run new games (Score:2)
So you dropped $300 to $400 on a video card and your games are barely playable.
Some of this isn't necessary; better optimization could fix it. But PS5 and [...]
Middle manager (Score:2)
Probably the ill-begotten scheme of some cockamamie fascist-wannabe middle manager, rather than a company-wide policy.
Re: (Score:3)
I don't know; after seeing how the CEO acts on stage during product reveals, I can believe the bullshitting and censoring has his full support, or was his idea.
Re: (Score:2)
It really warmed my heart to see the crowd's lackluster response to his Captain America bullshit. It was a collective "what?"
Re: (Score:2)
GamersNexus brought this up in their video (the one linked in the summary here), and according to their own contacts the decisions are coming from higher up. When they try to get info on it, NVidia tries to have a lower-level PR/marketing person handle answering any questions, essentially leaving upper management a way to escape any responsibility by not being on record.
Demonizing a hypothetical "cockamamie fascist wannabe middle manager" is dancing to NVidia upper-management's tune.
Re: (Score:2)
Except it's happening globally, which suggests it's coming from the top.
The Reviewers Shouldn't Be Surprised (Score:5, Insightful)
The reviewers do massive amounts of work to run all these cards through ridiculous amounts of tests. They REALLY put the work in... but almost universally they also refuse to give any realistic time to the big selling point of the last two generations of cards: Frame Generation. Instead, they focus almost entirely on raster performance. They also focus heavily on the most modern and demanding AAA games instead of the games that budget buyers play-- LAST YEAR'S AAA games.
Nobody buying an RTX 4060 or RTX 5060 gives a damn about 4K performance with Ultra settings and Max Ray Tracing with all DLSS turned off. That segment doesn't even own 4K monitors. It's a completely unrealistic scenario. But not only do they do those tests for shits and giggles, they FOCUS on them. Here are the tests that the GPU manufacturers want done with their budget tier of cards:
1a. What texture quality do I need to set [THIS GAME] to at 1080p to get 70 FPS so that frame gen can work really well?
1b. What does it look like before and after DLSS is enabled?
2a. What texture quality do I need to set [THIS GAME] to at 1440p to get 70 FPS so that frame gen can work really well?
2b. What does it look like before and after DLSS is enabled?
Because that's EXACTLY what the budget segment does. They buy the card they can afford and they figure out how to make it work best for them.
Of course, the reviewers won't do that. Instead, they'll put the Honda Civic on the drag strip and complain about its 0-60 time. They'll hook up a 2-ton trailer and complain about towing capacity. What they WON'T do is test the card how it's intended to be used and provide good buying advice within the budget range.
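The test the parent proposes (1a/2a above) is essentially a threshold search over quality presets: find the highest setting that still clears the frame-rate floor where frame gen works well. A minimal sketch; the preset names and FPS numbers are made up for illustration:

```python
def best_preset(fps_by_preset: dict[str, float], target: float = 70.0):
    """Return the highest-quality preset that still hits the target
    frame rate, or None if no preset does. Assumes the dict is ordered
    from highest quality to lowest."""
    for preset, fps in fps_by_preset.items():
        if fps >= target:
            return preset
    return None

# Hypothetical 1080p measurements for one game:
measured = {"ultra": 48.0, "high": 62.0, "medium": 74.0, "low": 95.0}
print(best_preset(measured))  # → medium
```

With numbers like these, the review's answer to question 1a would be "medium"; the visual comparison in 1b/2b would then be done at that preset.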
Re: (Score:1)
That's because frame gen is only any good if the raster performance is already good. If the card is running at 20fps, throwing in a bunch of extra frames makes it appear to be running more smoothly, but the actual experience is terrible. And if raster frames are high enough for a good experience, then you don't need frame gen. It's just not that useful in most scenarios.
Sorry - last post incomplete :/
The reviewers do tend to use high settings, but most use 1080p & 1440p.
They also can't know what everyone's setups are like, so they try to isolate performance to the GPU only.
Then they put it in a chart with other cards.
They're not trying to optimize a game for getting to 70fps. They're showing the cards' relative strength.
Re: (Score:2)
They also can't know what everyone's setups are like, so they try to isolate performance to the GPU only. Then they put it in a chart with other cards. They're not trying to optimize a game for getting to 70fps. They're showing the cards' relative strength.
And that's exactly the problem. They're reviewing what the card can do in extreme circumstances and basing their "buy/don't buy" advice on that. Those extreme circumstances, though, are in no way demonstrative of how the card will be used and thus should not be used as the basis for advice.
If you tested multiple commuter vehicles on a race track and critiqued them based on their ability to handle turns at 70mph, you would be providing zero benefit to those in the market for a commuter vehicle. And if [...]
Re: (Score:2)
That's because frame gen is only any good if the raster performance is already good.
Which is exactly why I gave the suggested methodology of...
What texture quality do I need to set [THIS GAME] to at 1080p to get 70 FPS so that frame gen can work really well?
Re: (Score:2)
1a. What texture quality do I need to set [THIS GAME] to at 1080p to get 70 FPS so that frame gen can work really well?
1b. What does it look like before and after DLSS is enabled?
As for frame generation, unless you're playing a game which is nothing more than an interactive movie, response time matters, and if the game is natively 20 fps and you're using frame generation to turn it into 80 fps, the game will be sluggish as all shit. ...
I think he means get it to 70 FPS natively so that it can frame-gen up to 200+, but once you get to 70 FPS native (esp. if the 1% lows don't suck), the casuals he talks about won't really care about the 200 fake FPS anyway.
Overall it's just sad [...]
Re: (Score:2)
unless you're playing a game which is nothing more than an interactive movie, response time matters
100% correct. Additionally, it matters subjectively to the person experiencing the response time. Most people buying budget cards won't notice the latency increase of going from 70fps on pure raster to 120fps with DLSS/FSR. They're going to be freaking out over how smooth everything is.
Re: (Score:2)
What are you talking about, dude? The issue is simple: modern games BARELY fit within 8 GB (and sometimes don't even fit into 8 GB), and when you enable the AI-based enhancement features (DLSS, frame generation) [...]
That's exactly what was said about the 8GB cards released LAST generation and people SOMEHOW continue to use 8GB cards to play modern video games. They simply reduce texture quality a little bit. Almost every single game can be played with 8GB of VRAM, you just have to adjust the texture quality and, most importantly, the people who are buying entry-level cards aren't concerned about seeing a couple more polygons in their environmental foliage.
Re: (Score:3)
instead of the games that budget buyers play-- LAST YEAR'S AAA games.
That is some seriously strange gatekeeping. Sorry, but there's no clear guidance on what players with certain graphics cards play, nor is there any true correlation between a AAA game's year and its performance requirements. Virtually every gamer I know varies their games and performance requirements wildly. And games seem to have no consistency in performance even relative to their visuals. Some games look phenomenal and run well. Others look poor and run like shit. Some modern games run better than last year's [...]
But I must say... (Score:4, Interesting)
Re: (Score:2)
I hope Intel survives their current CPU tribulations so they can stay in the game.
My next card won't be NVIDIA... (Score:3)
I have only owned NVIDIA cards since a GeForce4 MX, but when my current GTX 1660 Super gets replaced (which probably won't be until it fails, given the high price of GPUs in Australia), my next card won't be NVIDIA, due to all the crap they have pulled as of late.
Take your review with a grain of salt (Score:1)
The original article comments "But the unspoken covenant of product reviews is that the press, as a whole, gets a chance to warn the public if a movie, video game, or GPU is not worth their money".
It is very obvious that film reviews have not worked this way for a long time.
Simple solution (Score:5, Insightful)
The solution is simple: don't buy NVIDIA! The top hardware may be good, but today things are getting worse and very overpriced.
AMD has good cards, and Intel's new cards are also very good, especially for their cost. Both work great in Linux with open drivers, while nvidia is still on closed ones that are a pain to install and maintain and are full of bugs (not in games, but in X11/Wayland stuff).
The power of nvidia is their market share; drop it and games will start to work better on other cards too. With lower market share, nvidia will either lower prices or build better hardware.
TL;DR: don't buy nvidia, demand AMD or Intel GPUs
Re: (Score:2)
That only sucks relative to their own products. The problem is if you want top cards you are going NVIDIA regardless of what your exact performance requirements are. AMD has abandoned this segment, and Intel never played in it.
NVidia has joined the club (Score:2)
With the likes of Oracle and Broadcom.
Is that just an unlucky number for them? (Score:1)