Nvidia Announces a $299 RTX 4060 With the 4060 Ti Arriving May 24 For $399 (theverge.com)
Nvidia has officially announced its RTX 4060 family of GPUs. This includes the RTX 4060 Ti, which will debut next week on May 24th starting at $399, and -- perhaps the biggest news -- the RTX 4060, which will be available in July for just $299, $30 less than the RTX 3060's original retail price. A 16GB version of the RTX 4060 Ti is also due in July for $499. From a report: Nvidia's 60-class GPUs are the most popular among PC gamers on Steam, and the launch of the RTX 4060 family marks the first time we've seen Nvidia's latest RTX 40-series cards available under the $500 price point, let alone under $300. The $399 RTX 4060 Ti will ship on May 24th with just 8GB of VRAM, while a 16GB model is due in July priced at $499. There's an ongoing debate over the value of 8GB cards in the PC gaming community right now, particularly with the arrival of more demanding games that really push the limits of GPU memory even at 1080p (if you want all the max settings enabled, that is). It's a much bigger issue at 1440p and, of course, 4K resolutions, but Nvidia appears to be positioning its RTX 4060 Ti card for the 1080p market. [...] Specs-wise, the RTX 4060 Ti will be a 22 teraflop card with AV1 encoder support and more efficient energy usage. The total graphics power is 160 watts on both the RTX 4060 Ti 8GB and 16GB models, with Nvidia claiming the average gaming power usage will be around 140 watts. The RTX 3060 Ti had a total graphics power of 200 watts, and Nvidia says it uses 197 watts during games on average, so there are some impressive power efficiency improvements here.
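For context on those power figures, a quick back-of-the-envelope check in Python; the wattages come straight from the quoted specs, and the percentages are simply derived from them rather than being numbers Nvidia publishes:

# Percentage reduction implied by the quoted power figures.
figures = {
    "total graphics power": (160, 200),  # RTX 4060 Ti vs RTX 3060 Ti, watts
    "average gaming power": (140, 197),  # Nvidia's claimed averages, watts
}
for label, (new_w, old_w) in figures.items():
    reduction = 100 * (old_w - new_w) / old_w
    print(f"{label}: {new_w} W vs {old_w} W, about {reduction:.0f}% lower")

# Prints roughly 20% lower total graphics power and 29% lower average gaming power.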
Summary (Score:5, Insightful)
A heavily cut-down chip with a narrow memory bus, and "we have tons of cache to compensate". Plus a lot of VRAM to make it more marketable, because a bigger storage number is great even when the throughput is shit. Remember the old GeForce MX GPUs? Nothing new for Nvidia there.
Oh, and their argument for the low-grade chip is "use DLSS 3", the anti-feature for gamers on a gamer card. Who's this for, other than desperate people who need a sub-$500 GPU, the classic "high end" price range that now magically buys you only a mid-range card?
For those not in the know, DLSS 3 uses an ML model to take two frames generated by the game and guess a frame that sits in between them, to create the illusion of a smoother-running game. But this process adds latency, so the actual game running underneath feels a bit worse. So the main reason for wanting a higher frame rate, a game that is more responsive to your input, is made worse by turning DLSS 3 on. It's an anti-feature for gamers masquerading as a feature, because it lets Nvidia post huge FPS increases in promotional materials without mentioning that the inflated FPS number comes with a lower real FPS number and hence worse responsiveness.
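To make the latency argument concrete, here is a minimal toy model in Python of how a frame-interpolation pipeline delays real frames. It is an illustrative sketch of the mechanism described above under assumed pacing, not Nvidia's actual DLSS 3 implementation:

# Toy frame-generation pipeline: a real frame can only be shown once the NEXT
# real frame exists, because the interpolated frame needs both of them.
def present_order(real_frames):
    previous = None
    for frame in real_frames:
        if previous is not None:
            yield ("real", previous)  # the held-back real frame
            yield ("generated", f"interp({previous},{frame})")  # the fake in-between frame
        previous = frame
    if previous is not None:
        yield ("real", previous)  # the final real frame

for kind, frame in present_order(["F1", "F2", "F3"]):
    print(kind, frame)

# Output order: F1, interp(F1,F2), F2, interp(F2,F3), F3. Note that F1 is only
# displayed after F2 has finished rendering; that extra wait is the added input
# latency the comment above complains about.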
Re: Summary (Score:2)
Re: (Score:3)
The fun part: Nvidia's desperation to create artificial product segmentation over the last two generations has led to some absurd scenarios where xx60 cards outperform xx70 ones, and mobile products outperform desktop ones.
And no, it's not worth it. In the mobile space, Nvidia's product segmentation is such that most xx60 laptops cannot send the image directly to the laptop screen, whereas xx70 products can. The xx60s use a setup where the discrete GPU sends frames to the integrated one, which in turn sends them to the laptop screen
Re: (Score:2)
You say this like it's a problem. Nvidia is a company that needs to be profitable. It's cheaper to push video over the PCIe bus and have the integrated GPU hand it off, AND it prevents lower-end hardware from cannibalizing higher-end hardware sales too much. These segmentations are how companies exist. This is necessary.
If a 3060 were as good as a 3070, there would be no 3070. A 4060 really only needs to be as good as a 3070 to land in new 'mid grade' machines from Best Buy, and it needs to be somewhere
Re: (Score:2)
Say it with me, fanboy.
COMPANIES ARE NOT YOUR FRIENDS.
I'm on the side of the consumer. Because I am one.
Re: (Score:2)
Funny, last "AMD" GPU I had was ATI Radeon HD 4700, and it's a never again for me. Driver fuckery to make it work properly was just too much. Recently had to get two laptops for relatives, 3060 one for one that needed GPU power and a really small one that had integrated AMD graphics, because intel is absolute garbage tier in terms of performance per watt so important in that segment right now.
Within a month, Win11 borked the graphics drivers, rendering the machine unusable. I found a fix online (surprise, many p
Re: (Score:2)
AMD CPU and Nvidia GPU is the way to go. And I did go used. I'm cheap, as I've mentioned frequently, so I got a 1600AF on an ASRock board on eBay when my 990FX G1 Gaming board died. It came with a 1050 and a 256GB SK Hynix NVMe M.2 x4, which is now my ZFS mirror's SLOG. I paid... $240, plus another $30 to get it up from 16 to 32GB, all matching DIMMs albeit not too fast. I put my FX-8350 on my old backup non-M.2 board with its 32GB of DDR3 and got $140 for that plus the 1050. The GPU is still a 1070 as I'm not
Re: (Score:2)
I would've agreed with you, except for the recent brouhaha that GamersNexus published a piece on, which as a side effect revealed that the firmware side of AMD CPUs is also garbage. A lot of people with current-gen AMD CPUs started openly talking about general system instability with these things.
Essentially, AMD moved to AM5, a new socket for them, which obviously requires a lot of firmware rework. And they borked it. To be fair, Intel's first attempt at big.LITTLE in their previous gen was
Re: (Score:2)
I doubt you'll see the same people arguing both ways. The problem with the current frame interpolation is straightforward: if the game engine targets 30 fps, then the most recent complete 'good' frame is deliberately held back for 33 ms. The video pipeline has the most recent picture, but must refrain from displaying it in order to show a fake frame or two instead. Now, I think the obsession with input lag can be a bit silly at times, but 33 ms seems to be a bit up there.
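To put that 33 ms figure in perspective across render frame rates, a minimal sketch in Python, assuming the held-back real frame is delayed by one full frame interval (an assumption of this simple model, not a measured value):

# One frame interval at various render rates; under the assumption above, this
# is how long the real frame is held back.
for fps in (30, 60, 120):
    interval_ms = 1000.0 / fps
    print(f"{fps} fps render: about {interval_ms:.1f} ms of hold-back")

# 30 fps -> ~33.3 ms, 60 fps -> ~16.7 ms, 120 fps -> ~8.3 ms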
What *might* be more useful would be
Re: (Score:2)
I think your argument is about real vs. virtual frames, and about using DLSS 3's interpolation as a cop-out instead of raw hardware improvement.
I don't agree with the sentiment that they are "fake" frames, however. Give me all the fake frames in the world. If this were baked into a driver, people would call it revolutionary, but since it requires dev work to implement, it's fake.
idgi
Re: (Score:2)
It's a fake frame because it's not a frame generated by the game. As a result, the primary advantage of a high frame rate, a game that responds faster, is gone.
And in the case of DLSS 3, it's not just that this advantage is absent: using the feature increases input latency significantly. Hence it's an anti-feature, and that has nothing to do with "requiring dev work to implement". If you could do this with zero dev work, it would still be an anti-feature, and those would still be fake frames. That's just obj
Re: (Score:2)
For those not in the know, DLSS 3 uses an ML model to take two frames generated by the game and guess a frame that sits in between them, to create the illusion of a smoother-running game. But this process adds latency, so the actual game running underneath feels a bit worse.
DLSS adds about 40-60ms of latency. This is barely noticeable and irrelevant unless you're playing Fortnite in the next Olympics.
So the main reason for wanting a higher frame rate, a game that is more responsive to your input
Making the game more responsive to input is not the reason for a higher frame rate. Even at the lowest playable frame rates, the frame rate is not the limiting factor for latency. If I had the choice between a choppy low frame rate and roughly 50 ms of input latency, I'd choose the latter every time, without even asking what content I'm playing. That last part is important even for the
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Re:Summary (Score:4)
DLSS adds about 40-60ms of latency. This is barely noticeable and irrelevant unless you're playing Fortnite in the next Olympics.
Is this correct? If you're right that DLSS ADDS 40-60 ms of latency, on top of whatever existing latency there is, that's a lot for a game!
Back in the day, there was a huge difference between "lpb" (low ping bastards), who had like 15 ms pings, and other players on modem or DSL or whatnot, where ping times were 80 ms+.
Re: (Score:2)
https://www.youtube.com/watch?... [youtube.com]
Those numbers would be the total resulting latency, which is not even that bad compared to the native latency when none of the upscaling is used.
But sure, I would also opt for the DLSS performance option, which reduces the latency the most. In fast-paced games I really do not care that much about graphics fidelity
Re: (Score:2)
Most people will happily trade the visual quality improvements for input latency.
You must speak on behalf of gaming MORONS. The whole point of DLSS is to improve the appearance of higher-resolution games. No gamer is going to tolerate deliberately sluggish input latency just to have a prettier picture at a higher resolution. DLSS is not going to improve the movie-viewing experience.
But you go ahead and enjoy your pretty picture, and get killed by gamers playing at 1080p, who've turned DLSS off.
Re: (Score:2)
Re: (Score:2)
There are actually a ton of casual gamers who are quite happy with DLSS 3 1080p gameplay on affordable hardware. This isn't meant for high-end gamers. This market is FAR bigger than the high-end gaming crowd, and these are the products that prop up the R&D for a 5090 Ti Super Pro Whatever next year. As stated in the article, the xx60 series cards are the most popular on Steam, and from what I've seen the 4060 is a great upgrade for those 1600/2060 users.
I'm no longer a twitch gamer who needs 120 fps on
Re: (Score:2)
>There are actually a ton of casual gamers who are quite happy with DLSS 3 1080p gameplay on affordable hardware.
If "ton" equals "zero", you're right. The only hardware currently available that can run DLSS 3 is high end and halo products. There are no affordable DLSS 3 capable GPUs out there. Unless you're the kind of a guy who thinks that affordable toys are in the 600-700 USD range for one part out of many required for said toy to work.
Which makes one of you in the whole world.
Re: (Score:2)
You can play a semantic game here, but when you resort to that sort of argument you split reality: every single person but you sees you losing the argument, while you think you've p0wned the other side.
There are more gamers running DLSS on 2xxx and 3xxx right now than will ever buy an xx80 or xx90 GPU in all of time. Those same gamers will be exceptionally happy with a 4060 and DLSS 3. This may be an 'anti-feature' for the highest-end rigs, but it's a real, functional, and well-received feature for the majority
Re: (Score:2)
And now we've gone straight into lies. First, no one here is talking about "DLSS". The topic is DLSS 3. That is a very specific piece of technology that has nothing to do with the DLSS 2 that works on 2xxx and 3xxx cards. Neither 2xxx nor 3xxx support DLSS 3. DLSS 2 is a good feature for gamers; DLSS 3 is an anti-feature for gamers. The former makes game responsiveness objectively better, the latter makes it objectively worse.
Also, the 4060 is not out. As I mentioned above, the cheapest card that supports DLSS 3 th
Re: (Score:2)
Still stuck in this semantic argument. DLSS in all its iterations is a trade-off to produce higher frame rates, or similar frame rates with better visuals. The method behind them doesn't matter to the typical consumer at all. This isn't an argument about technical jargon; it's about your false claim that a feature whose previous versions are well received is somehow not a feature. I'm glad you know better than the global leader in GPUs. MOST gamers are not the gamers you are selecting. You're comparing tens of
Re: (Score:2)
Integrated upscaler vs fake image generator. Totally semantic. It's not like those are completely different technologies that have little to nothing in common.
>A small amount of latency added is nothing compared to getting to turn the visuals up a notch for the typical gamer.
"I admit I was lying anyway. Also, muh sinematic experience!"
I applaud your mindless fanboying. If nothing else, it reminds me that people who direct their religious impulse toward worshipping a company of their choice are real, deep
Re: (Score:2)
That's literally a semantic argument. You're trying to attack the terms used in the argument instead of the subject matter at hand. LITERALLY a semantic argument. Maybe you thought I was using that word as some general put-down, but I'm not; I'm being literal.
Maybe we just let the market decide, and see whether Nvidia throws DLSS out as a market failure or not.
Maybe gamers aren't the target (Score:5, Insightful)
Re: (Score:2)
This is something you buy because your old GPU died or you need to do a new build for your kid, i.e. you're out of options. Used GPU prices are pretty terrible until you get down to 1080/5700 XT territory, which might be too low-end for some (I'm running a 1080 now and I'm fine with it, but it's 6 years old and showing its age a bit).
Re: (Score:2)
I too was running a GTX 1080 for years and years. It's still a fine card, particularly if you don't care about RT or 4K.
I did wind up biting the bullet and getting a 7900 XTX though; it's still overpriced for sure, but $600 cheaper than the 4090 and ~$200 cheaper than the 4080. I don't care at all about RT, which as near as I can tell is the primary differentiator in performance between the 7900 XTX and the 4090.
Buying a mid-range card at this point just seems like a silly idea. Due to the paltry VRAM, you
Not worth it (Score:5, Insightful)
Honestly, this 4060 should have been branded the 4050 (and priced accordingly), and the 4070 moved down to the 4060 moniker (and price).
As it is, the performance boost over the 3060 will be pretty unimpressive. Aside from some pretty specific features like AV1 encoding or the power efficiency gains, there's little point in upgrading.
Re: (Score:2)
Honestly, this 4060 should have been branded the 4050 (and priced accordingly), and the 4070 moved down to the 4060 moniker (and price).
As it is, the performance boost over the 3060 will be pretty unimpressive. Aside from some pretty specific features like AV1 encoding or the power efficiency gains, there's little point in upgrading.
I don't think 3060 owners are the target for the 4060. Someone who just bought a new card last year is not in the market for a new card this year, unless they're among the rich folks who buy the xx90 each year. Yes, the 4060 is incremental compared to the 3060, and that's exactly as expected. The question is how many 1060 owners will decide to get the 4060, since there is more than an incremental difference, and the $300 price is arguably more important than all the ways the 4060 is inferior to t
Re: (Score:2)
You're right... NVIDIA seems to be focused on comparing the 4060 with the 2060 Super rather than with the 3060. Interposing a two-generation gap between your reference and the new card makes it look much better, I guess.
I use my system primarily for video editing, and the 12GB 3060 has been the only really affordable option with enough VRAM for editors such as DaVinci Resolve 18. No video editor in their right mind would purchase an 8GB card these days, and memory/bus bandwidth makes a *huge* difference in suc
Re: (Score:2)
It's the pricing that pisses me off. I just bought a reconditioned RTX 3060 for $330. I thought NVIDIA was going to give up on the low end, given their overpriced higher-end offerings.
Re: (Score:2)
Aside from some pretty specific features like AV1 encoding
I can't even see content creators upgrading just to get sped-up AV1 hardware encoding. For a cell phone, absolutely. But even if one can't afford an expensive RTX 4000 series card, their CPU should be able to crush AV1 encoding anyway.
Six-Year-Old GTX 1080 Also Has 8GB RAM (Score:4, Informative)
The GTX 1080 was a great card.
Re: (Score:1)
It's like saying an i7 from Intel's 7th generation that you bought for $350 has the same number of cores as a 13th-gen i3 priced at less than $100, so why go for something newer?
Given the price class of the GTX 1080 from six years ago, you should be comparing it with an RTX 4070, an RTX 4070 Ti, and perhaps soon an RX 7800 XT.
Re: (Score:2)
Re: (Score:2)
I still have a 1080. Granted, I haven't played any of the newest blockbuster games, but I seem to recall it handled The Witcher 3 at 4K just fine.
For someone who primarily plays at 1080p, how would this card not be enough? And it's like 8 years old.
Re: (Score:2)
I don't know why I would another video card
You accidentally a whole world. Yes, it's a meme from a decade ago, but then so is the GTX 1080 :-)
when it only has the same amount of RAM
But to your technical question: RAM is only a requirement and a limit for texture loading. If you think the GTX 1080 will perform the same as an RTX 4060, you're crazy. The RTX 3060 already outperformed many GTX 1080s and was equally matched in some cases, and this card will be faster than that.
But as to why you would do something like this? The answer is you wouldn't, not unless your GTX 1080 breaks and you are lim
Competing with last gen (Score:2)
Re: (Score:1)
Just getting higher performance isn't that difficult. Getting higher performance while also increasing the performance per watt is a different matter.
With today's RTX 4090 we already have GPUs that, on air cooling, need such a huge heat sink that they render a lot of smaller PC cases whi
DLSS4 Super Frame Generation (Score:2)
Re: (Score:2)
Re: (Score:2)
Why stop at 2? Or 4? Or 400? Let the AI on the card generate the whole game by just giving it a paragraph of English text? ChatGPT does run on GPUs, right?
Re: DLSS4 Super Frame Generation (Score:1)