
The RTX 3090 Ti is NVIDIA's New Flagship GPU (engadget.com) 78

At its CES press conference today, NVIDIA teased a new flagship GPU: the RTX 3090 Ti. It says more details will arrive soon, but handed out a few specs to tide its fans over until then. From a report: As a refresher, NVIDIA currently has the RTX 3090 at the top of its stack, with the RTX 3080 Ti close behind and the RTX 3080 as the mainstream flagship. All three are based on the same GA102 chip, with the number of active cores, clock speeds and memory configurations being the key differentiators. The RTX 3090 Ti will usurp the 3090 as the ultra high-end GPU outside of its creator line. Like the 3090, the 3090 Ti will have 24GB of GDDR6X memory, except it'll be running at 21Gbit/s, as opposed to the 19.5Gbit/s of the 3090's memory. NVIDIA also says the GPU is capable of calculating 40 shader teraflops, 78 RT teraflops and 320 tensor (AI) teraflops. That compares to the 3090's 35.6 shader teraflops, 69.5 RT teraflops and 285 tensor teraflops.
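
For context on where those throughput figures come from, FP32 "shader teraflops" is essentially CUDA cores × 2 FMA operations per clock × boost clock. The minimal sketch below reproduces the quoted numbers; the 3090's core count and boost clock are NVIDIA's published specs, while the 3090 Ti's were not announced at CES, so the figures used here (the rumored full GA102 configuration of 10,752 cores at roughly 1.86GHz) are assumptions.

```python
# Back-of-the-envelope check of the shader-TFLOPS figures quoted above.
# The RTX 3090 values are NVIDIA's published specs; the RTX 3090 Ti core
# count and boost clock were not announced at CES and are assumptions here.

def shader_tflops(cuda_cores: int, boost_ghz: float) -> float:
    """FP32 teraflops: cores * 2 ops per clock (fused multiply-add) * clock in GHz."""
    return cuda_cores * 2 * boost_ghz / 1000.0

print(f"RTX 3090   : {shader_tflops(10496, 1.695):.1f} shader TFLOPS")  # ~35.6
print(f"RTX 3090 Ti: {shader_tflops(10752, 1.86):.1f} shader TFLOPS")   # ~40.0 (assumed specs)
```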

Comments Filter:
  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Tuesday January 04, 2022 @01:39PM (#62142257)
    Comment removed based on user account deletion
    • by skam240 ( 789197 )

      Personally I think the SRP, at least on the regular 3080, is pretty reasonable; it's the same as the 2080 Super from the prior generation. The real problem is that Nvidia can't make enough, which encourages a scalpers' market.

      I know I would vastly prefer they focus on meeting the demand of their current cards rather than creating new ones with minor improvements.

      • by EvilSS ( 557649 )
        Even the 3070 is extremely capable. They also announced a $250 3050 which should be fine for a large majority of people who don't need to play every game on ultra on a 250hz 1440p display at 100+ FPS. During any normal time, it would be an awesome time to get a new card. Of course, right now, it sucks.
  • by Baconsmoke ( 6186954 ) on Tuesday January 04, 2022 @01:49PM (#62142273)
    I do some very heavy 3d rendering projects. So, I could really use the 3090 or the 3090ti, but it has been extraordinarily difficult to get my hands on a 3090 so I will attempt to get my hands on a ti version. And my issue is that I really need two or three if at all possible. I dream of the day that buying video cards at MSRP is easy again.
    • Not likely anytime soon. Miners have already discovered they can flash 3080s that supposedly had mining disabled with third-party 3090 firmware to "unlock" them. I'm sure any attempt to discourage mining use will be overcome, and the cards will be scooped up en masse to be wasted on crypto.
    • by tragedy ( 27079 )

      How does the price/performance of a new premium GPU actually stack up against cheaper, older GPUs these days? If you're playing a high-framerate video game, clearly you're not going to beat the performance of the latest and greatest, but large rendering projects have different priorities, which could also include details like power usage, etc. Interested in your take on the economics here.

      • by Baconsmoke ( 6186954 ) on Tuesday January 04, 2022 @03:05PM (#62142533)
        So, I can only speak for myself. The main issue is that this generation in particular has a very large number of CUDA cores, which is what is needed for Iray rendering. The 3080 Ti, 3090, and 3090 Ti models all have over 10,000 cores (which is nuts compared to older cards). They are also incredibly efficient compared to previous models. To put it into perspective, my single 3080 Ti runs circles around my old dual 2080 cards: a render that took those two cards around 22 minutes now takes about 9 minutes. Considering that I do stills and animations, it makes a really big difference in how quickly I can churn those out. Being able to get that down to 3 or 4 minutes per render by going to two 3090 Ti cards, with the added benefit of the 24GB of RAM, will be very much worth the price tag (a rough renders-per-day comparison is sketched after this thread).

        I run up against the 12GB limit of the 3080 Ti on a daily basis, and it sucks having to waste time reducing the quality of my textures and/or juggling the polygon count by reducing the complexity of the scenery or limiting how many characters I have on screen at once. That is very counterproductive for what I do.

        For me to get the performance of two 3090 Ti cards, I would have to purchase six 2080 Ti cards. I don't have a system that can handle six cards, and the pricing on used 2080 Ti cards is absolutely atrocious right now. Not to mention I can power two 3090 or 3090 Ti cards with my single 1600W power supply; with 2080 Ti cards I'd have to buy a second computer and run three cards in each, with each machine using a 1200W power supply. So in that scenario my utilities would probably be much higher. As far as utilities go, running complex renders 18-24 hours a day has increased my utility bill by about $125 a month. Fortunately the income I generate from my work more than makes that worth it. My utilities did decrease a little by going to a single card, but I'll lose that again as soon as I add a pair of 3090 or 3090 Ti cards.

        Sorry I typed so much. You probably only cared about one or two sentences of this, but I'm kind of stupid that way.
        • Is this mostly Blender or something else?
        • by tragedy ( 27079 )

          You probably only cared about one or two sentences of this, but I'm kind of stupid that way.

          No, that was an excellent post. Just what I wanted to know. So it looks like, as a workhorse for your purposes, it's definitely worth it.

        • So, a question, since I'm not in that field: do the Quadro cards work for you, are they not cost-efficient, or are those only for things like CAD?
          • The Quadros (actually they've dropped that moniker now) are effectively the same cards with just some slight differences. I have an A6000, which has slightly more CUDA cores, slightly slower core clocks, and double the amount of memory, but the memory itself is GDDR6 rather than the faster GDDR6X you get in the gaming cards.

            For my use case it's having a large amount of memory to handle very large scenes that boosts performance, but it's very dependent on your individual use case which card(s) work best for you.

          • The new "Quadro" line (A4000, A6000, etc) will very much do what is needed. However, they are very expensive for virtually no improvement in performance. They do provide a ton of RAM for sure, but for me I can buy two 3090ti cards which have plenty of RAM for my rendering needs and I can buy two of them for the price of a single comparable Quadro. If money was absolutely no object whatsoever, then I could justify the cost. However, since that isn't my reality I would far rather have two 3090ti cards which w
    • https://www.newegg.com/p/1FT-0... [newegg.com] In stock. You didn't look terribly hard. Or maybe you meant you can't get your hands on one for the original MSRP.
      • I just bought a BMW for cheaper than one of those cards. LoL
      • Yes I am well aware you can buy the cards at insanely inflated prices. As are 99.9% of the people who have been trying to buy these cards since their release. Like I mentioned in my actual comment, I am quite interested in getting the cards at MSRP as most sane people are. Now be nice and stop making really bad assumptions about people you don't know. It's not a good look and it doesn't make you nearly as clever as you think you're being.
        • Like I mentioned in my actual comment, I am quite interested in getting the cards at MSRP as most sane people are

          The ONLY place that is happening is Best Buy. Even EVGA has started selling theirs 20% above MSRP, direct from them. In short, purchasing at MSRP is practically impossible.

          Now be nice and stop making really bad assumptions about people you don't know. It's not a good look and it doesn't make you nearly as clever as you think you're being.

          How old are you? This is some weak talk I used to hear.
  • And like most GPUs these days, they will continue to be indistinguishable from unobtainium.
    • by EvilSS ( 557649 )
      Pretty much. I got lucky and managed to get a 3090FE at MSRP from Best Buy. Wanted the 3080 but I took what I could get. I could have instantly flipped it for a 100%+ profit. And honestly, I don't see it letting up anytime soon. Scalpers who normally deal in stuff like shoes now have a taste for GPUs, and miners are still going strong over a year into the current bubble. Some are now buying A2000 and A5000 cards to mine on! It's insane.
  • $1,500 ? No thanks (Score:5, Insightful)

    by bb_matt ( 5705262 ) on Tuesday January 04, 2022 @01:55PM (#62142299)

    I'm still happy with my now "ancient" 970 GTX - which is still fetching close to $200 on eBay, not much more than I paid for it years back.

    As for the RTX 3050 with a RRP of $250 - yeah, right, I'll believe that price when I see it!

    GPU mining continues to completely destroy this market for most gamers, with the chip shortage not helping either.
    Clearly, video card manufacturers remain uninterested in the problem, for the most part. I guess money talks, right?

    • Intel is coming out with their GPU, which is supposedly pretty competitive in the low-to-midrange, so hopefully they're going to have the manufacturing capacity to keep them in stock. Not sure how it would compare to my GTX 1070, but anything is better than nothing.

    • Holy crap, you are not kidding.

      I found my email confirmation from Newegg for my 970 GTX back in 2014. It was a splurge purchase for me at the time -- $350. I never thought I'd still be using it in 2021!

      I went on eBay, and there are cards almost identical to mine going for $300.

      3090 FTW3 edition is going for almost $3000.

      Freaking crazy.

      • by Moridineas ( 213502 ) on Tuesday January 04, 2022 @03:05PM (#62142531) Journal

        Scratch that, I never thought I would be using it in 2022!

        • :D - use it till it stops working, I reckon you'll get another 5 years out of it, easy.
          Perhaps by that time, cryptocurrency will be dead and the chip shortage over.

      • Holy crap, you are not kidding.

        I found my email confirmation from Newegg for my 970 GTX back in 2014. It was a splurge purchase for me at the time -- $350. I never thought I'd still be using it in 2021!

        I went on eBay, and there are cards almost identical to mine going for $300.

        3090 FTW3 edition is going for almost $3000.

        Freaking crazy.

        It kinda shows that for the vast majority of gamers, the "peak" gaming experience, in terms of video performance, was reached almost a decade ago.
        These are gamers who aren't after every tiny little FPS gain, who are probably not playing the latest triple A titles with crazy graphics - or who are fine with 30 to 40 fps.

        There's also the problem that playing "catch up" often requires a new motherboard, which could require new RAM and will almost certainly require a new PSU.
        So, if you are 7 or 8 years "out of date", the cost goes well beyond the card itself.

    • Same, I am using a pair of 950 AMP! cards and I would love to replace them but it would cost at least twice as much to get barely more performance...

    • by EvilSS ( 557649 )
      What? This card isn't going to be $1,500. What are you smoking? The 3090 is $1,500; this 3090 Ti will be way more than that!
    • Clearly, video card manufacturers remain uninterested in the problem, for the most part. I guess money talks, right?

      You mean the manufacturers which are playing a cat-and-mouse game to try to prevent their hardware from being used for mining? Those "uninterested" manufacturers?

      https://www.businessinsider.in... [businessinsider.in] NVIDIA nerfed the GPU for mining.
      https://www.pcgamer.com/t-rex-... [pcgamer.com] miners figured out how to avoid it.
      https://www.pcgamer.com/evga-r... [pcgamer.com] EVGA fucked over gamers even more.

  • It doesn't matter if you keep making new fancy GPUs if the people that buy all of them don't even use them for graphics.

    • by DarkOx ( 621550 )

      GPU is almost a misnomer at this point because they are used for a much wider set of applications than graphics now, even in common PC settings.

      However, I do think the card makers need to follow NVIDIA's lead and start to protect other market segments from the crypto miner bros. APUs (even AMD's) still don't deliver the gaming performance that high-end discrete products do, but they are rapidly becoming 'good enough' for a large segment of the gaming audience and have a distinct advantage in that you can actually order one.

  • by SciCom Luke ( 2739317 ) on Tuesday January 04, 2022 @02:17PM (#62142377)
    I guess they really squeezed it just to take the 6900XT from its pedestal.
  • you cannot buy...

  • I have a 3090, which I was "lucky" enough to pay $1,800 for two months after launch.

    What's even the point of the 3090 Ti? Oh noes, 256 more CUDA cores and a mad blazing 8 more tensor cores, yay. My life is over, end of an era.

    • by EvilSS ( 557649 )
      Yea Nvidia is slicing the 30 series pie pretty thin. 3080, 3080 ti, 3090, and now the 3090ti. Plus a rumored 3080 super in the mix as well. Between the 3080 and 3090ti, unless you really need that extra RAM, I'd bet there won't be more than a total 15% uplift in performance. 3080 to 3090 is what, about 10% on FE cards?
    • by drnb ( 2434720 ) on Tuesday January 04, 2022 @03:34PM (#62142639)

      NVidia Doesn't Support Wayland or Linux. So fuck them. Buy Radeon.

      I'm running Linux just fine on Nvidia. Perhaps you mean they have closed source drivers. Don't care, I use Linux for the *nix not the politics. Most Linux users are like that.

      • by Junta ( 36770 )

        I don't know about the 'Linux' comment, but Wayland isn't particularly well supported.

        However, this too is not that practical a concern, as Xorg continues to work and largely delivers the functionality people need. That's why nVidia has been able to be slow on that front: there's not a lot of practical demand to move to Wayland.

        For some distributions the binary driver can be a pain, as they consciously make it awkward to use third-party drivers, but for other distributions it's not a huge deal... except yo

        • by drnb ( 2434720 )

          Wayland isn't particularly well supported.

          "NVIDIA confirmed that the Sway Wayland compositor is working fine with their forthcoming driver supporting GBM"
          https://www.phoronix.com/scan.... [phoronix.com]

          "The Fedora 36 plan for Wayland by default with the NVIDIA proprietary driver stack is laid out on the Fedora Wiki for those interested in more of the technical details."
          https://www.phoronix.com/scan.... [phoronix.com]

    • I got a Radeon RX 5700 a couple years back when it came out. It was $350 USD I think. Newegg is selling them right now for $1500 USD.

  • I've been waiting for the better part of 5 years for a reasonably priced ($500ish) GPU that can run modern AAA games at 4k, which is pretty much the only thing you need a high end GPU for. My 5 year old GTX1080 still runs just about anything in 1080p with max settings, but never could run anything too demanding in 4k. Spending another $1k for that bump in resolution just isn't worth it.

    I remember when a 3 year old video card was woefully obsolete. That's far from the case today. Games just aren't advancing like they used to.

    • I remember when a 3 year old video card was woefully obsolete. That's far from the case today. Games just aren't advancing like they used to.

      Modern games sure are bloated, though. Now it's all about SSD upgrades instead of video cards.

    • I use CUDA-based GPUs constantly in factory machine vision equipment. We're only talking tens of units, but they hit a sweet spot besides games.

      I'm jonesing for a few Ax000 cards to power the AI training server though.

  • This is the first PCIe 5.0 video card [aroged.com], so you'll need to drop a lot more coin than the cost of just the card (on a completely new high end PCIe 5.0 motherboard, RDIMMs, and CPU at a minimum) in order to maximize its performance. Not sure why this rather important detail isn't mentioned in any of the articles cited.
    • by EvilSS ( 557649 )
      Way more than $1,500. That is the 3090 price. NV hasn't announced an MSRP for the 3090 Ti. My guess will be $1,900-$2,000. Nvidia tacked on $500 to the 3080 MSRP when they released the 3080 Ti. I expect about the same here. Of course, unless you get lucky getting an FE card at MSRP, you will pay way more than even that. Third-party cards are marked up significantly this generation, and on eBay, expect 100% markup over retail.
    • Not sure why this rather important detail isn't mentioned in any of the articles cited.

      That would be because if even PCI-E 3.0 is a bottleneck in your graphics pipeline, you're doing something very wrong - PCI-E 5.0's value [in even a high-end single-GPU gaming rig] is pure marketing horseshit - what matters is whether you have the VRAM bandwidth (and quantity) to run smoothly at your chosen rez (rough numbers are sketched below).
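
      To put rough numbers behind that claim, here is a small sketch comparing theoretical PCIe x16 link bandwidth per generation with the card's own VRAM bandwidth. The 3090 Ti's 384-bit bus width is an assumption (matching the 3090); all figures are theoretical peaks, not measured throughput:

```python
# Theoretical peak bandwidths: PCIe x16 link per generation vs. GDDR6X VRAM.
# Even PCIe 3.0 x16 is well over an order of magnitude slower than the card's
# own memory, which is why the slot generation is rarely the gaming bottleneck.

PCIE_X16_GB_S = {"PCIe 3.0 x16": 15.75, "PCIe 4.0 x16": 31.5, "PCIe 5.0 x16": 63.0}

def vram_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak VRAM bandwidth: bus width in bytes times per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

cards = {
    "RTX 3090 (384-bit, 19.5 Gb/s)":          vram_bandwidth_gb_s(384, 19.5),  # ~936 GB/s
    "RTX 3090 Ti (384-bit assumed, 21 Gb/s)": vram_bandwidth_gb_s(384, 21.0),  # ~1008 GB/s
}

for name, gb_s in {**PCIE_X16_GB_S, **cards}.items():
    print(f"{name:40s} {gb_s:7.1f} GB/s")
```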

    • "drop a lot more coin" - I already tossed all my coin to the witcher...
  • The announcement & launch of a new video card is like a weather report on Mars. Sounds interesting, but has no practical use in the new few years...

  • As stupid as all crypto coins are, they don't seem to be going away any time soon, and "miners" continue to hog all new video cards. I think it is time for Nvidia, AMD, or possibly a new player to make a dedicated mining card compatible with CUDA, OpenCL, etc. Get rid of all the graphics hardware and leave the cores and memory. Make it cheaper than the graphics cards.
    NVIDIA seems to have similar products but they look expensive and targeted at big data centers.
    I don't know, maybe it won't make a difference, si
