The RTX 3090 Ti is NVIDIA's New Flagship GPU (engadget.com) 78
At its CES press conference today, NVIDIA teased a new flagship GPU: the RTX 3090 Ti. It says more details will arrive soon, but handed out a few specs to tide its fans over until then. From a report: As a refresher, NVIDIA currently has the RTX 3090 at the top of its stack, with the RTX 3080 Ti close behind and the RTX 3080 as the mainstream flagship. All three are based on the same GA102 chip, with the number of active cores, clock speeds and memory configurations being the key differentiators. The RTX 3090 Ti will usurp the 3090 as the ultra high-end GPU outside of its creator line. Like the 3090, the 3090 Ti will have 24GB of GDDR6X memory, except it'll be running at 21Gbit/s, as opposed to the 19.5Gbit/s of the 3090's memory. NVIDIA also says the GPU is capable of calculating 40 shader teraflops, 78 RT teraflops and 320 tensor (AI) teraflops. That compares to the 3090's 35.6 shader teraflops, 69.5 RT teraflops and 285 tensor teraflops.
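The quoted shader-TFLOPS numbers can be roughly sanity-checked from core counts and clocks. The figures below are assumptions from public spec sheets, not from the article (3090: 10496 CUDA cores at ~1.70 GHz boost; 3090 Ti: 10752 cores at ~1.86 GHz); a minimal sketch:

```python
# Theoretical FP32 throughput: each CUDA core retires 2 ops per clock
# (a fused multiply-add counts as two floating-point operations).
# Core counts and boost clocks below are spec-sheet assumptions.

def shader_tflops(cuda_cores: int, boost_ghz: float) -> float:
    """Peak FP32 TFLOPS = cores * 2 ops/clock * clock (GHz) / 1000."""
    return cuda_cores * 2 * boost_ghz / 1000.0

print(f"RTX 3090:    {shader_tflops(10496, 1.70):.1f} TFLOPS")  # ~35.7, vs quoted 35.6
print(f"RTX 3090 Ti: {shader_tflops(10752, 1.86):.1f} TFLOPS")  # ~40.0, vs quoted 40
```

Both come out within rounding of NVIDIA's quoted numbers, which suggests the Ti's gain is mostly clock speed rather than core count.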
Comment removed (Score:5, Insightful)
Re: Who even cares anymore besides miners? (Score:5, Informative)
Umm, except they don't, and still use farms of graphics cards for other coins
Re: (Score:3)
2012 called, they want their uninformed and downright-incorrect GPU market assessment back.
Re: (Score:1)
2012 called, they want their uninformed and downright-incorrect GPU market assessment back.
It was true again around 2018/19. GPU mining was no longer profitable for the home miner, and you could find midrange and high-end GPUs easily.
It's about Ethereum, which abandons mining this year (Score:2)
Umm, miners don't even care about video cards anymore either. They've long since moved on to their own ASIC platforms that can run circles around these video cards for cryptographic hashing speed.
That's true for bitcoin but not Ethereum. Ethereum is driving the current GPU mining craze. Fortunately Ethereum should be switching from proof of work to proof of stake, so that will change and GPU mining will decline in popularity again, as it did a few years ago.
they've been saying that for years (Score:1)
Re: (Score:2)
The people deciding to switch to "proof of stake" are the people who currently own the money-making machines.
Not really, Devs and Miners are two different groups and the devs are quite enthusiastic about leaving mining behind.
The users will decide, not the miners. Brand name "Ethereum", the main branch, is going proof of stake. If miners create a legacy fork there is little reason for users to take a chance on it. The switch will happen; that is where the desired and expected functionality will be arriving.
Re: (Score:2)
> Ethereum should be switching from proof of work to proof of stake
Casper was originally scheduled for 2018 IIRC. It's far from a sure bet.
Re: (Score:2)
> Ethereum should be switching from proof of work to proof of stake
Casper was originally scheduled for 2018 IIRC. It's far from a sure bet.
If you read up you will find they did launch various prerequisites and believe the switch will take place this year.
Re: (Score:2)
Re: (Score:2)
Yeah, that's why there's plenty of high-end graphics cards to go around for everyone who wants one.
Oh, wait...
Re: (Score:2)
F Nvidia. They purposely allow mass buying and scalping of their cards to raise their prices.
While I share frustration in Nvidia not doing enough to prevent scalping, how does Nvidia benefit from this behavior? Nvidia still sells their cards at the standard wholesale price, while the scalpers are the ones making extra money because $500 cards are selling for $1000.
Re: (Score:2)
NVIDIA is basically selling everything they can produce. And now, even a year and a half later, they're actually selling parts kits for these cards above MSRP.
Whereas, in previous generations, the price would have come down with market saturation.
And, instead of using production to alleviate bottlenecks in availability, they're going to produce more "halo" cards that do nothing to alleviate availability problems.
Things like 3080's are still nearly unobtainium. There's people who're on various waiting l
Re: (Score:2)
Basically they're using production to produce halo cards instead of beefing up availability of non-halo products.
Oh no! Driver updates for AMD have given them wins! Quick! Produce another $1500 card that'll be scalped for $3000+!
Re: (Score:2)
NVIDIA is basically selling everything they can produce.
Yes we are currently in the midst of a supply chain crisis globally, everybody from their direct competitors to completely different industries like automotive are having this problem.
And now, even a year and a half later, they're actually selling parts kits for these cards above MSRP.
Yes it's called supply and demand. Supply is low - again due to the global crisis - and demand is high.
Whereas, in previous generations, the price would have come down with market saturation.
Yes but again this time it's different due to the global situation affecting everybody.
And, instead of using production to alleviate bottlenecks in availability
How exactly would they do that in the current climate? Certainly they are selling out of everything they can make so it's in their interes
Re: (Score:2)
Um, wasn't it pretty much the same prior to the shortage?
Wasn't what the same? When there wasn't a shortage there was availability and cards were readily available at MSRP.
Re: (Score:2)
You don't huh?
Never mind that the chip being used in the 3080 and 3090 are the same chip. GA102.
Re: (Score:2)
Re: (Score:2)
Personally I think the SRP, at least on the regular 3080, is pretty reasonable; it's the same as the 2080 Super from the prior generation. The real problem is that Nvidia can't make enough, which encourages a scalper's market.
I know I would vastly prefer they focus on meeting the demand of their current cards rather than creating new ones with minor improvements.
Re: (Score:2)
Fingers crossed for actual availability (Score:3)
Re: (Score:2)
Re: (Score:2)
How does the price/performance of a new premium GPU actually stack against cheaper, older GPUs these days? If you're playing a high framerate video game, clearly you're not going to beat the performance of the latest and greatest, but large rendering projects have different priorities. Which could also include details like power usage, etc. Interested in your take on the economics here.
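One crude way to frame the economics is dollars per shader TFLOP at launch MSRP. The MSRPs and TFLOPS figures below are assumptions from public spec sheets, and as the thread notes, street prices are often far above MSRP, which changes the picture entirely:

```python
# Dollars per FP32 TFLOP at launch MSRP. These figures are spec-sheet
# assumptions, not street prices; scalped prices would roughly double
# the newer cards' numbers.

cards = {
    "GTX 1080": (599, 8.9),    # (launch MSRP in USD, FP32 TFLOPS)
    "RTX 3080": (699, 29.8),
    "RTX 3090": (1499, 35.6),
}

for name, (price, tflops) in cards.items():
    print(f"{name}: ${price / tflops:.0f} per TFLOP")
```

At MSRP the 3080 is the clear value pick for raw compute; for rendering workloads the 3090's 24GB of VRAM is often the deciding factor instead, since a scene that doesn't fit in memory is slow regardless of TFLOPS.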
Re:Fingers crossed for actual availability (Score:5, Insightful)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
You probably only cared about one or two sentences of this, but I'm kind of stupid that way.
No, that was an excellent post. Just what I wanted to know. So it looks like, as a workhorse for your purposes it's definitely worth it.
Re: Fingers crossed for actual availability (Score:2)
Re: (Score:2)
The Quadros (actually they've dropped that moniker now) are effectively the same cards with just some slight differences. I have an A6000 which has slightly more CUDA cores, slightly slower core clocks, double the amount of memory but the memory itself is GDDR6 rather than the faster GDDR6X you get in the gaming cards.
For my use case it's having a large amount of memory to handle very large scenes which is what boosts performance but it's very dependent on your individual use case which card(s) work best fo
Re: (Score:2)
Re: (Score:2)
Re: Fingers crossed for actual availability (Score:2)
Re: (Score:2)
Re: (Score:2)
The ONLY place that is happening is Best Buy. Even EVGA has started selling theirs 20% above MSRP, direct from them. In short, purchasing at MSRP is practically impossible.
Now be nice and stop making really bad assumptions about people you don't know. It's not a good look and it doesn't make you nearly as clever as you think you're being.
How old are you? This is some weak talk I used to hea
You can look, but you can't touch (Score:2)
Re: (Score:2)
$1,500 ? No thanks (Score:5, Insightful)
I'm still happy with my now "ancient" 970 GTX - which are still fetching close to $200 on eBay, which isn't much more than I paid for it years back.
As for the RTX 3050 with a RRP of $250 - yeah, right, I'll believe that price when I see it!
GPU mining continues to completely destroy this market for most gamers, with the chip shortage not helping either.
Clearly, video card manufacturers remain uninterested in the problem, for the most part. I guess money talks, right?
Re: (Score:2)
Intel is coming out with their GPU which is supposedly pretty competitive on the low-midrange, so hopefully they're going to have the manufacturing capacity to keep them in stock. Not sure how it would compare to my GTX 1070 but anything is better than nothing.
Re: $1,500 ? No thanks (Score:1)
The only Intel card we have seen to date keeps up with a nicer model 1030... so I don't expect 1070 performance anytime soon, or at all, from the first-gen hardware.
Re: (Score:2)
Holy crap, you are not kidding.
I found my email confirmation from Newegg for my 970 GTX back in 2014. It was a splurge purchase for me at the time -- $350. I never thought I'd still be using it in 2021!
I went on eBay, and there are cards almost identical to mine going for $300.
3090 FTW3 edition is going for almost $3000.
Freaking crazy.
Re:$1,500 ? No thanks (Score:4, Funny)
Scratch that, I never thought I would be using it in 2022!
Re: (Score:2)
:D - use it till it stops working, I reckon you'll get another 5 years out of it, easy.
Perhaps by that time, cryptocurrency will be dead and the chip shortage over.
Re: (Score:2)
Holy crap, you are not kidding.
I found my email confirmation from Newegg for my 970 GTX back in 2014. It was a splurge purchase for me at the time -- $350. I never thought I'd still be using it in 2021!
I went on eBay, and there are cards almost identical to mine going for $300.
3090 FTW3 edition is going for almost $3000.
Freaking crazy.
It kinda shows that for the vast majority of gamers, the "peak" gaming experience, in terms of video performance, was reached almost a decade ago.
These are gamers who aren't after every tiny little FPS gain, who are probably not playing the latest triple A titles with crazy graphics - or who are fine with 30 to 40 fps.
There's also the problem that playing "catch up", often requires a new motherboard, which could require new RAM and will almost certainly require a new PSU.
So, if you are 7 or 8 years "out of
Re: $1,500 ? No thanks (Score:2)
Same, I am using a pair of 950 AMP! cards and I would love to replace them but it would cost at least twice as much to get barely more performance...
Re: (Score:2)
Re: (Score:2)
Clearly, video card manufacturers remain uninterested in the problem, for the most part. I guess money talks, right?
You mean the manufacturers which are playing a cat and mouse game to try and prevent their hardware being used for mining? Those "uninterested" manufacturers?
https://www.businessinsider.in... [businessinsider.in] NVIDIA nerfed the GPU for mining.
https://www.pcgamer.com/t-rex-... [pcgamer.com] miners figured out how to avoid it.
https://www.pcgamer.com/evga-r... [pcgamer.com] EVGA fucked over gamers even more.
But can you buy it or any others? (Score:2)
It doesn't matter if you keep making new fancy GPUs if the people that buy all of them don't even use them for graphics.
Re: (Score:2)
GPU is almost a misnomer at this point because they are used for a much wider set of applications than graphics now, even in common PC settings.
However I do think the card makers need to follow NVIDIA's lead and start to protect other market segments from the crypto miner bros. APUs (even AMD's) still don't deliver the gaming performance that high-end discrete products do, but they are rapidly becoming 'good enough' for a large segment of the gaming audience and have a distinct advantage in that you can ord
Re: But can you buy it or any others? (Score:2)
If APUs become normal someone will just invent a coin that is suited to mining on them.
Needed something to top the 6900XT (Score:3)
Another card (Score:2)
you cannot buy...
It's a joke at this point. (Score:2)
I have a 3090, which I was "lucky" enough to pay $1,800 for, 2 months after launch.
What's even the point of 3090 Ti, oh noes - 256 more cuda cores and a mad blazing 8 more tensor cores, yay - My life is over, end of an era.
Re: (Score:2)
NVidia Doesn't Support Wayland or Linux (Score:2)
So fuck them. Buy Radeon.
Running Linux just fine on Nvidia (Score:5, Insightful)
NVidia Doesn't Support Wayland or Linux. So fuck them. Buy Radeon.
I'm running Linux just fine on Nvidia. Perhaps you mean they have closed source drivers. Don't care, I use Linux for the *nix not the politics. Most Linux users are like that.
Re: (Score:2)
I don't know about the 'Linux' comment, but Wayland isn't particularly well supported.
However, this too is not that practical a concern, as Xorg continues to work and largely deliver the functionality people need, which is why nVidia has been able to be slow on that front since there's not a lot of practical demand to move to Wayland.
For some distributions the binary driver can be a pain as they consciously make it awkward to use third-party drivers, but for other distributions it's not a huge deal... except yo
Re: (Score:2)
Wayland isn't particularly well supported.
"NVIDIA confirmed that the Sway Wayland compositor is working fine with their forthcoming driver supporting GBM"
https://www.phoronix.com/scan.... [phoronix.com]
"The Fedora 36 plan for Wayland by default with the NVIDIA proprietary driver stack is laid out on the Fedora Wiki for those interested in more of the technical details."
https://www.phoronix.com/scan.... [phoronix.com]
Re: (Score:2)
I got a Radeon RX 5700 a couple years back when it came out. It was $350 USD I think. Newegg is selling them right now for $1500 USD.
Waiting for Godot... er 4k (Score:2)
I've been waiting for the better part of 5 years for a reasonably priced ($500ish) GPU that can run modern AAA games at 4k, which is pretty much the only thing you need a high end GPU for. My 5 year old GTX1080 still runs just about anything in 1080p with max settings, but never could run anything too demanding in 4k. Spending another $1k for that bump in resolution just isn't worth it.
I remember when a 3 year old video card was woefully obsolete. That's far from the case today. Games just aren't advancing
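The jump from 1080p to 4K is bigger than the names suggest, which is a plausible reason a GTX 1080 still holds up at 1080p but not 4K. The arithmetic is simple pixel counting:

```python
# Pixel throughput scales with total resolution, so a card comfortable
# at 1080p needs roughly this multiple of fill/shading capacity for 4K
# at the same frame rate (ignoring resolution-independent costs).

pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160

ratio = pixels_4k / pixels_1080p
print(f"4K pushes {ratio:.0f}x the pixels of 1080p per frame")
```

That 4x factor is why "runs anything at 1080p max settings" and "can't run demanding titles at 4K" are both true of the same five-year-old card.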
Re: (Score:1)
I remember when a 3 year old video card was woefully obsolete. That's far from the case today. Games just aren't advancing like they used to.
Modern games sure are bloated, though. Now it's all about SSD upgrades instead of video cards.
Re: (Score:2)
I use CUDA-based GPUs constantly in factory machine vision equipment. But we're talking 10s of units - but they hit a sweet spot besides games.
I'm jonesing for a few Ax000 cards to power the AI training server though.
PCIe 5.0 = $1,500 plus a new cutting edge PC (Score:2)
Re: (Score:2)
Re: PCIe 5.0 = $1,500 plus a new cutting edge PC (Score:1)
Not sure why this rather important detail isn't mentioned in any of the articles cited.
That would be because if even PCI-E 3.0 is a bottleneck in your graphics pipeline, you're doing something very wrong - PCI-E 5.0's value [in even a high-end single-GPU gaming rig] is pure marketing horseshit - what matters is whether you have the VRAM bandwidth (and quantity) to run smoothly at your chosen rez.
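The bandwidth gap the comment is pointing at can be made concrete. The PCIe x16 numbers below are the usual theoretical maxima per generation, and the VRAM figure follows from the 3090's spec-sheet memory speed and bus width (assumptions, not from the article):

```python
# PCIe x16 link bandwidth vs. the 3090's on-board VRAM bandwidth.
# VRAM bandwidth = pin speed (Gbit/s) * bus width (bits) / 8 bits-per-byte.

pcie_x16_gb_s = {"3.0": 15.75, "4.0": 31.5, "5.0": 63.0}  # theoretical max, GB/s
vram_gb_s = 19.5 * 384 / 8  # RTX 3090: 19.5 Gbit/s pins on a 384-bit bus = 936 GB/s

for gen, bw in pcie_x16_gb_s.items():
    print(f"PCIe {gen} x16: {bw:6.2f} GB/s  ({vram_gb_s / bw:.0f}x below VRAM bandwidth)")
```

Even PCIe 5.0 x16 is an order of magnitude below on-card memory bandwidth, which is why a well-behaved game that keeps its working set in VRAM barely notices the link generation.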
Re: PCIe 5.0 = $1,500 plus a new cutting edge PC (Score:2)
Yeah, right. (Score:2)
The announcement & launch of a new video card is like a weather report on Mars. Sounds interesting, but has no practical use in the next few years...
Just make mining cards already (Score:2)
NVIDIA seems to have similar products but they look expensive and targeted at big data centers.
I don't know, maybe it won't make a difference, si