Nvidia Announces a $299 RTX 4060 With the 4060 Ti Arriving May 24 For $399 (theverge.com)

Nvidia has officially announced its RTX 4060 family of GPUs. This includes the RTX 4060 Ti, which will debut next week on May 24th starting at $399, and -- perhaps the biggest news -- the RTX 4060, which will be available in July for just $299, $30 less than the RTX 3060's original retail price. A 16GB version of the RTX 4060 Ti is also due in July for $499. From a report: Nvidia's 60-class GPUs are the most popular among PC gamers on Steam, and the launch of the RTX 4060 family marks the first time we've seen Nvidia's latest RTX 40-series cards available under the $500 price point, let alone under $300. The $399 RTX 4060 Ti will ship on May 24th with just 8GB of VRAM, while a 16GB model is due in July priced at $499. There's an ongoing debate over the value of 8GB cards in the PC gaming community right now, particularly with the arrival of more demanding games that really push the limits of GPU memory even at 1080p (if you want all the max settings enabled, that is). It's a much bigger issue at 1440p and, of course, 4K resolutions, but Nvidia appears to be positioning its RTX 4060 Ti card for the 1080p market. [...] Specs-wise, the RTX 4060 Ti will be a 22 teraflop card with AV1 encoder support and more efficient energy usage. The total graphics power is 160 watts on both the RTX 4060 Ti 8GB and 16GB models, with Nvidia claiming the average gaming power usage will be around 140 watts. The RTX 3060 Ti had a total graphics power of 200 watts, and Nvidia says it uses 197 watts during games on average, so there are some impressive power efficiency improvements here.
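
A back-of-the-envelope check on those efficiency figures (using only the numbers quoted above, which are Nvidia's own claims rather than independent measurements) puts the drop in average gaming power at roughly 29 percent:

    # Quick arithmetic on the power figures quoted above (Nvidia's claims,
    # not independent measurements).
    tgp_4060ti, tgp_3060ti = 160, 200    # total graphics power, watts
    avg_4060ti, avg_3060ti = 140, 197    # claimed average gaming power, watts

    print(f"TGP reduction: {1 - tgp_4060ti / tgp_3060ti:.0%}")                    # ~20%
    print(f"Average gaming power reduction: {1 - avg_4060ti / avg_3060ti:.0%}")   # ~29%
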
  • Summary (Score:5, Insightful)

    by Luckyo ( 1726890 ) on Friday May 19, 2023 @09:51AM (#63534751)

    Super cut-down chip with a low memory bus width, "we have tons of cache to compensate". And a lot of VRAM to make it more marketable, because a bigger memory number looks great even when the throughput is shit. Remember the old GeForce MX GPUs? Nothing new for Nvidia there.

    Oh, and their argument for the low-grade chip is "use DLSS3". An anti-feature for gamers, on a gamer card. Who is this for, other than desperate people who need a sub-500 USD classic "high end" GPU, which is now magically a mid-range one in terms of price?

    For those not in the know, DLSS3 uses machine learning to take two frames generated by a game and guess a frame that would sit in between the two, to provide the illusion of a smoother-running game. But this process adds latency, so the actual game in the background runs a bit worse. So the main benefit of a higher frame rate, a game that responds faster to your input, is made worse by turning DLSS3 on. It's an anti-feature for gamers, masquerading as a feature because it lets Nvidia post huge FPS increases in promotional materials, without mentioning that the inflated FPS number comes with a lower real frame rate and hence worse responsiveness.
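
    A rough sketch of why interpolation costs latency (illustrative only; the function and numbers below are assumptions, not Nvidia's actual pipeline): to show a generated frame between frame N and frame N+1, the card has to hold back frame N+1 until the synthesized frame has been displayed, so the newest real frame reaches your eyes roughly one native frame time later than it otherwise would.

        # Illustrative sketch, not Nvidia's pipeline: holding back the newest
        # rendered frame to interpolate adds roughly one native frame time
        # of latency while doubling the displayed frame rate.
        def frame_generation_estimate(native_fps: float, enabled: bool) -> dict:
            frame_time_ms = 1000.0 / native_fps
            if not enabled:
                return {"displayed_fps": native_fps, "added_latency_ms": 0.0}
            return {"displayed_fps": native_fps * 2, "added_latency_ms": frame_time_ms}

        print(frame_generation_estimate(60, False))  # 60 fps shown, no extra latency
        print(frame_generation_estimate(60, True))   # 120 fps shown, ~16.7 ms extra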

    • I know that the __60 lines are inferior to the __70 and higher ones, but where does the Ti fit in? I've seen some Dell laptops listed with either the 3060 or 3060 Ti as an upgrade, but wondered if it was even worth it. My laptop is approaching three years old but has the RTX 2070 in it, which was supposed to be a lot better than the 2060 that was out at the time.
      • by Luckyo ( 1726890 )

        Fun part: Nvidia's desperation to create artificial product segmentation in the last two generations led to some absurd scenarios where xx60 cards outperform xx70 ones, and mobile products outperform desktop ones.

        And no, it's not worth it. In the mobile space, Nvidia's product segmentation is such that most xx60 laptops cannot properly send the image directly to the laptop screen, whereas xx70 products can. xx60s use a feature where the discrete GPU sends the frame to the integrated one, which in turn sends it to the laptop screen.

        • You say this like it's a problem. Nvidia is a company that needs to be profitable. It's cheaper to push video over the PCIe bus and have the integrated card hand it off, AND that prevents lower-end hardware from cannibalizing higher-end hardware sales too much. These segmentations are how companies exist. This is necessary.

          If a 3060 were as good as a 3070, there would be no 3070. A 4060 really only needs to be as good as a 3070 to land in new 'mid-grade' machines from Best Buy, and it needs to be somewhe

          • by Luckyo ( 1726890 )

            Say it with me, fanboy.

            COMPANIES ARE NOT YOUR FRIENDS.

            I'm on the side of the consumer. Because I am one.

    • I think your argument is about real vs. virtual frames and using DLSS3's interpolation as a cop-out for raw hardware innovation.

      I don't agree with the sentiment that they are "fake" frames, however. Give me all the fake frames in the world. If this were baked into a driver, people would call it revolutionary, but since it requires dev work to implement, suddenly it's "fake".

      idgi

      • by Luckyo ( 1726890 )

        It's a fake frame because it's not a frame generated by the game. As a result, the primary advantage of a high frame rate, a faster response rate from the game, is gone.

        And in the case of DLSS 3, it's not just that this advantage is absent. Using the feature increases input latency significantly. Hence, it's an anti-feature, and that has nothing to do with "requiring dev work to implement". If you could do this with zero dev work, it would still be an anti-feature, and those would still be fake frames. That's just obj

    • For those not in the know, DLSS3 uses machine learning to take two frames generated by a game and guess a frame that would sit in between the two, to provide the illusion of a smoother-running game. But this process adds latency, so the actual game in the background runs a bit worse.

      DLSS adds about 40-60ms of latency. This is barely noticeable and irrelevant unless you're playing Fortnite in the next Olympics.

      So the main benefit of a higher frame rate, a game that responds faster to your input

      Making the game more responsive to input is not the reason for a higher frame rate. At even the lowest playable framerates, the framerate is not the limiting factor for latency. If I had the choice between a choppy low framerate and roughly 50ms of input latency, I'd choose the latter every time without even asking what content I'm playing. That last part is important even for the
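
      For a sense of scale (plain frame-time arithmetic, not a measurement of any particular game, engine, or GPU), here is how much latency frame rate alone can account for:

        # Frame time at various frame rates: the most that raising the frame
        # rate can shave off per-frame latency is the difference between these.
        for fps in (30, 60, 120, 240):
            print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
        # 30 fps -> 33.3 ms, 240 fps -> 4.2 ms: at most ~29 ms of difference,
        # before engine, OS, and display latency are even counted.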

      • by Sebby ( 238625 )
        So where's that lawsuit you promised me, moron?
      • Wait, what? 40-60ms of lag is ridiculously noticeable. For a while, I was gigging with a MIDI-driven setup, and any latency over 5ms made the piano nearly unusable.
        • Agreed on both points, just posting to point out the obvious, which is that you don't actually need musical levels of timing precision for video games. It wouldn't hurt, for sure, but for the vast majority of games, does it matter as much as the motion blur you get from the combination of eye movements and today's sample-and-hold displays? In my opinion, no, not even close. For perfectly sharp motion we're going to need low persistence, and if we don't want to deal with strobing that's going
      • by Moridineas ( 213502 ) on Friday May 19, 2023 @01:43PM (#63535417) Journal

        DLSS adds about 40-60ms of latency. This is barely noticeable and irrelevant unless you're playing Fortnite in the next Olympics.

        Is this correct? If you're right that DLSS ADDS 40-60ms of latency, on top of whatever latency is already there, that's a lot for a game!

        Back in the day, there was a huge difference between "lpb" (low ping bastards) who had something like 15ms pings and other players on modem or DSL or whatnot, whose ping times were 80ms+.

        • by fazig ( 2909523 )
          The YouTube channel Hardware Unboxed did a piece on DLSS 2 vs DLSS 3 last year, where they also compared the added input latencies, which are not that bad.
          https://www.youtube.com/watch?... [youtube.com]

          Those numbers would be the total resulting latency, which is not even that bad compared to the native latency where no upscaling is used.
          But sure, I would also opt for the DLSS performance option, which reduces latency the most. In fast-paced games I really do not care that much about graphics fidelity.
      • Most people will happily trade the visual quality improvements for input latency.

        You must speak on behalf of gaming MORONS. The whole point of DLSS is to improve the appearance of higher-resolution games. No gamer is going to tolerate deliberately sluggish input latency just to have a prettier picture at a higher resolution. DLSS is not going to improve the movie-viewing experience.

        But you go ahead and enjoy your pretty picture, and get killed by gamers playing at 1080p, who've turned DLSS off.

    • There are actually a ton of casual gamers who are quite happy with DLSS3 1080p gameplay on affordable hardware. This isn't meant for high-end gamers. This market is FAR bigger than the high-end gaming crowd, and these are the products that prop up the R&D for a 5090 Ti Super Pro whatever next year. As stated in the article, the xx60 series cards are the most popular on Steam, and from what I've seen the 4060 is a great upgrade for those 1600/2060 users.

      I'm no longer a twitch gamer that needs 120fps on

      • by Luckyo ( 1726890 )

        >There are actually a ton of casual gamers who are quite happy with DLSS3 1080p gameplay on affordable hardware.

        If "ton" equals "zero", you're right. The only hardware currently available that can run DLSS 3 is high end and halo products. There are no affordable DLSS 3 capable GPUs out there. Unless you're the kind of a guy who thinks that affordable toys are in the 600-700 USD range for one part out of many required for said toy to work.

        Which makes one of you in the whole world.

        • You can play a semantic game here, but when you resort to that sort of argument, you split reality: every single person but you sees you losing the argument, while you think you've p0wned the other side.

          There are more gamers running DLSS on 2xxx and 3xxx right now than will ever buy an xx80 or xx90 GPU in all of time. Those same gamers will be exceptionally happy with a 4060 and DLSS3. This may be an 'anti-feature' to the highest-end rigs, but it's a real, functional, and well-received feature for the majority.

          • by Luckyo ( 1726890 )

            And now, we went straight into lies. First, no one here is talking about "DLSS". The topic is DLSS 3. That is a very specific piece of technology that has nothing to do with the DLSS 2 that works on 2xxx and 3xxx. Neither 2xxx nor 3xxx support DLSS 3. DLSS 2 is a good feature for gamers. DLSS 3 is an anti-feature for gamers. The former makes game responsiveness objectively better. The latter makes it objectively worse.

            Also, the 4060 is not out. As I mentioned above, the cheapest card that supports DLSS 3 th

            • Still with this semantic argument. DLSS in all iterations is a trade-off to produce higher frame rates, or similar frame rates with better visuals. The method behind them doesn't matter to the typical consumer at all. This isn't an argument about technical jargon; it's about your false claim that a feature whose previous versions are well received is somehow not a feature. I'm glad you know better than the global leader in GPUs. MOST gamers are not the gamers you are selecting. You're comparing tens of

              • by Luckyo ( 1726890 )

                Integrated upscaler vs fake image generator. Totally semantic. It's not like those are completely different technologies that have little to nothing in common.

                >A small amount of latency added is nothing compared to getting to turn the visuals up a notch for the typical gamer.
                "I admit I was lying anyway. Also, muh sinematic experience!"

                I applaud your mindless fanboying. If nothing else, it reminds me that people who direct their religious impulse toward worshipping a company of their choice are real, deep

                • That's literally a semantic argument. You're trying to attack the terms used in the argument instead of the subject matter at hand. A LITERALLY semantic argument. Maybe you thought I was using that word as some general put-down, but I'm not; I'm being literal.

                  Maybe we should just let the market decide, and see whether Nvidia throws DLSS out as a market failure or not.

  • by aldousd666 ( 640240 ) on Friday May 19, 2023 @10:13AM (#63534791) Journal
    Garage-band AI enthusiasts can run open-source quantized LLM models at decent speed on a mere 8GB card.
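
    For a rough sense of why 8GB is enough for this crowd (back-of-the-envelope arithmetic with assumed, illustrative numbers, not benchmarks of any specific model): a 7B-parameter model quantized to 4 bits needs about 3.5 GB just for weights, leaving headroom for the KV cache and activations on an 8 GB card.

        # Illustrative estimate only: weights = params * bits / 8, plus an
        # assumed flat overhead for KV cache and activations.
        def approx_vram_gb(params_billions: float, bits_per_weight: int,
                           overhead_gb: float = 1.5) -> float:
            weights_gb = params_billions * bits_per_weight / 8
            return weights_gb + overhead_gb

        print(approx_vram_gb(7, 4))   # ~5.0 GB: fits comfortably in 8 GB
        print(approx_vram_gb(13, 4))  # ~8.0 GB: tight, may need offloading
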
    • There's no way there are that many of those. Not enough for a mass-market GPU.

      This is something you buy because your old GPU died or you need to do a new build for your kid, i.e. you're out of options. Used GPU prices are pretty terrible until you get down to 1080/5700 XT territory, which might be too low-end for some (I'm running a 1080 now and I'm fine with it, but it's 6 years old and showing its age a bit).
      • I too was running a GTX 1080 for years and years. It's still a fine card, particularly if you don't care about RT or 4K.
        I did wind up biting the bullet and getting a 7900 XTX though; it's still overpriced for sure, but $600 cheaper than the 4090 and ~$200 cheaper than the 4080. I don't care at all about RT, which as near as I can tell is the primary performance differentiator between the 7900 XTX and the 4090.

        Buying a mid-range card at this point just seems like a silly idea. Due to the paltry VRAM, yo

  • Not worth it (Score:5, Insightful)

    by MBGMorden ( 803437 ) on Friday May 19, 2023 @10:13AM (#63534793)

    Honestly this 4060 should have been branded the 4050 (and priced accordingly) and the 4070 moved down to the 4060 moniker (and price).

    As it is, the performance boost over the 3060 will be pretty unimpressive. Aside from some pretty specific features like AV1 encoding or the power efficiency gains, there's little point in upgrading.

    • Honestly this 4060 should have been branded the 4050 (and priced accordingly) and the 4070 moved down to the 4060 moniker (and price).

      As it is, the performance boost over the 3060 will be pretty unimpressive. Aside from some pretty specific features like AV1 encoding or the power efficiency gains, there's little point in upgrading.

      I don't think the 3060 owners are the target for the 4060. Someone who just bought a new card last year is not in the market for a new card this year, unless they are the rich folks who buy the xx90 every year. Yes, the 4060 is incremental compared to the 3060, and that's exactly as expected. The question is how many of the 1060 owners will decide to get the 4060, since there is more than an incremental difference, and the $300 price is arguably more important than all the ways that the 4060 is inferior to t

      • You're right... NVIDIA seems to be focused on comparing the 4060 with the 2060 Super rather than with the 3060. Interposing a two-generation gap between your reference and the new card makes it look much better, I guess.

        I use my system primarily for video editing, and the 3060/12 has been the only really affordable option with enough VRAM for editors such as DaVinci Resolve 18. No video editor in their right mind would purchase an 8GB card these days, and memory/bus bandwidth makes a *huge* difference in suc

        • It's the pricing that pisses me off. I just bought a reconditioned RTX 3060 for $330. I just thought NVIDIA was going to give up on the low end, given their overpriced higher-end offerings.

    • Aside from some pretty specific features like AV1 encoding

      I can't even see content creators upgrading just to get sped-up AV1 hardware encoding. For a cell phone, absolutely. But even if one can't afford an expensive RTX 4000 series card, their CPU should be able to crush AV1 encoding anyway.

  • by BrendaEM ( 871664 ) on Friday May 19, 2023 @10:24AM (#63534813) Homepage
    I don't know why I would another video card, hoping to have it for even 4 years--when it only has the same amount of RAM--as the one I am replacing it with.
    The GTX 1080 was a great card.
    • by Anonymous Coward
      You are comparing a higher-end product of its time against a lower-end product of its time.

      It's like complaining that an i7 from Intel's 7th generation that you bought for $350 has the same number of cores as a 13th-gen i3 priced at less than $100, so why go for something newer?

      Given the price class of the GTX 1080 six years ago, you should be comparing it with an RTX 4070, an RTX 4070 Ti, and perhaps soon an RX 7800 XT.
      • AMD Polaris cards (the 470/480 models), which were mid-range cards of that same era, also had an 8 GB option. There are a growing number of titles where 8 GB isn't going to cut it anymore. The BVH tree used when you turn on ray tracing can eat up around a GB of VRAM in some titles, which makes it even more difficult to justify anything below 12 GB for someone who isn't upgrading on a regular basis.
    • by Hodr ( 219920 )

      I still have a 1080. Granted, I haven't played any of the newest blockbuster games, but I seem to recall it did The Witcher 3 at 4K just fine.

      For someone who primarily plays at 1080p, how would this card not be enough? And it's like 8 years old.

    • I don't know why I would another video card

      You accidentally a whole world. Yes it's a meme from a decade ago, but then so is the GTX 1080 :-)

      when it only has the same amount of RAM

      But to your technical question, RAM is only a requirement and a limit for texture loading. If you think the GTX 1080 will perform the same as an RTX 4060, then you're crazy. The RTX 3060 already outperformed the GTX 1080 in many cases and matched it in others, and this card will be faster than that.

      But as to why you would do something like this? The answer is you wouldn't. Not unless your GTX1080 breaks and you are lim

  • See, Nvidia has a problem competing with its previous versions; I would assume a used 3000 series would far outperform a low-end 4000 series. If Moore's law is dead, they are going to have a hard time competing with their current flagship 4090. So far they have been cutting production early, ensuring they have no stock of the previous generation, and then bringing out the next one with marginal gains at double the price. These cards are "cheap", but I would imagine something like 95% of the price of
    • by Anonymous Coward
      At worst they could just pack it with twice the "cores" by doubling the die size and then also driving power draw to something like 1000W, where the next time you get a cable/plug failure, it will likely start a house fire.

      Just getting higher performance isn't that difficult. Getting higher performance while also increasing the performance per watt is a different matter.

      With today's RTX 4090 we already have GPUs that on air cooling need such a huge heat sink that it renders a lot of smaller PC cases whi
  • Nvidia will have to release DLSS4 for the 50xx series that inserts two frames instead of one; that way they can still show an improvement over the 40xx series, DLSS3 vs DLSS4.
