Technology

Nvidia Says It's 'Unlaunching' the 12GB RTX 4080 After Backlash (theverge.com) 33

Nvidia is pausing the launch of its upcoming 12GB RTX 4080 graphics card. After originally unveiling the 12GB RTX 4080 last month alongside a much more powerful 16GB model, Nvidia now admits it messed up with the naming. From a report: "The RTX 4080 12GB is a fantastic graphics card, but it's not named right," says Nvidia in a blog post. "Having two GPUs with the 4080 designation is confusing." Nvidia is now pausing the launch of the 12GB RTX 4080 model but will still go ahead and launch the 16GB version on November 16th. Criticism had been building over Nvidia's decision to label the 12GB model as an RTX 4080, particularly when the 16GB model was so different.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • The 12GB 4080 was fine. Pricing it like it was a 4090 is not. The 16GB should be a 4080 Ti or "4090 Jr." The 12GB should be no more than $600 USD, the 16GB max $900, and the 4090 $1200 if they want to position in the current market, where you can buy a 6900 XT for $650 brand new that's faster than the 12GB and matches the 16GB in gaming.
    • by muh_freeze_peach ( 9622152 ) on Friday October 14, 2022 @12:25PM (#62966451)
      The 12GB 4080 should be a 4060 Ti or 4070, based on previous generations' performance. The performance gap between the 4090 and the 4080 16GB is huge.
      • Why not clear it up with a moniker like "Dual 4090" or 2x4090 for the 16gb? Make it easy to understand what is on the board? Two of the exact same chips vs one.
        • Nvidia even had the Titan brand they could've used!

          I think it's some sort of smart-ass attempt to push everything up and extract as much money out of people as possible. A $1600 GPU is pretty outrageous when you think about it. But if you put it next to a $1200 4080, it looks like great value!

          But then they're fucked: their next GPU down is the 4070. But you can't sell it for $450 when it's 80% of the 4080's performance, nobody would ever buy the latter. So let's also call it a 4080 and charge $899 instead.

          Ta-da! An

    • Re: (Score:3, Informative)

      by Rally-555 ( 8991885 )

      The specs on the now-defunct 12GB 4080 equal a 4070 if compared to past cards. This wasn't a mistake; it was a repositioning, so that in the future they can make a 4070 that's actually a rebadged 3080 Ti (because they have an oversupply of those chips ordered from TSMC, and the AIB vendors are pissed).

      They thought they could get away with up-badging.

      The current 4090 is really what would normally be a 4080 or 4080ti (~2x perf over previous gen card 3080ti).

      The current 4080 16gb is what would normally be a 4070 (i

    • by ewhenn ( 647989 ) on Friday October 14, 2022 @01:20PM (#62966655)
      While the card itself may be fine, calling it a 4080 isn't. The model number typically designates a chip configuration, and the memory suffix obviously indicates the amount of VRAM. In this case the graphics chips are different between the 4080 12GB and 4080 16GB. By calling it a 4080 and only changing the VRAM amount, one would deduce that the only difference is the amount of VRAM. This is not the case. Take a look at the 30 vs. 40 series.

      The 3080 Ti / 3080 12 GB are both GA102 chips. This makes sense, because the only change is the RAM.
      The 4080 12 GB is an AD104 chip; the 4080 16 GB is an AD103 chip.

      Despite carrying the same naming convention, they are very different chips.

      If links are your thing:
      https://www.techpowerup.com/29... [techpowerup.com]
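The parent's point can be sketched as a quick lookup. The die names below are the ones cited in the comment above (and in its TechPowerUp link); treat the mapping as illustrative:

```python
# GPU die behind each retail name, per the figures in the parent comment.
dies = {
    "RTX 3080 12GB": "GA102",
    "RTX 3080 Ti":   "GA102",   # same die, different memory config
    "RTX 4080 12GB": "AD104",
    "RTX 4080 16GB": "AD103",   # a different die despite the shared name
}

# Within the 30-series, a shared model number really did mean a shared chip...
assert dies["RTX 3080 12GB"] == dies["RTX 3080 Ti"]

# ...but the two "4080" cards were built on different silicon entirely.
assert dies["RTX 4080 12GB"] != dies["RTX 4080 16GB"]
```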
      • Intel has managed fine for years with multiple i7s, i9s, and i5s every generation. Forward-facing product names are not typically tied to such complicated underlying architecture.

      • The same thing happened with the 1060 6GB and 1060 3GB. Not only different memory configs, but different chips as well. No one batted an eye.
    • No, it really wasn't fine: it was a different GPU, different memory, different bandwidth. It was a completely different product. The pricing is just Nvidia, on top of all that, fucking over consumers.
    • by Jamlad ( 3436419 )
      I haven't bought an ATI card since my ~2000 All-in-Wonder 128 Pro (Rage 128) and the terrible driver experience I had. I had nothing but poor experiences with ATI boards prior.

      I admittedly suffered through the nVIDIA FX years, but this pricing nonsense has me reconsidering AMD again.

  • by xack ( 5304745 ) on Friday October 14, 2022 @12:25PM (#62966455)
    Or 4070 Ti? Or just 4070. Either way it costs $4080 once scalped.
  • Good to see them do the right thing here; they were completely different cards
    • by splutty ( 43475 )

      The fact they thought this was a good idea in the first place is all sorts of fucking stupid, though.

      So I'm not willing to give them any credit at this point.

    • Nvidia seems to have missed the fact that crypto winter is here and you can't mine Ethereum anymore with a GPU.

      If we aren't getting paid from upgrading our graphics cards we aren't going to pay much more for them.

      It was one thing to pay $1200 for a scalped 3080 when you were making $10-15 a day from it; you easily recouped the cost in less than six months. That subsidy is gone now for those who took it, and costs for consumers are up across the board. Nvidia doesn't want to believe it, I guess

  • Slashdot Popups (Score:5, Interesting)

    by dknj ( 441802 ) on Friday October 14, 2022 @01:00PM (#62966571) Journal

    After 22+ years, Slashdot has decided to add popup ads to subscribe to their newsletter. What could they possibly have in their newsletter that wouldn't be on an aggregated news site? Seems as though the emails on our accounts are not good enough and they want to collect more data.

    THIS is the last straw. I stayed through the various sales. I stayed through the horrible redesign. I stayed even though Unicode support is still missing. Forcing me to react to a component on your webpage? How 1999 of you. You're dead to me now, Slashdot.

  • Some factory is printing up a bunch of "GeForce 4070" stickers to put on these cards.

  • by Macdude ( 23507 ) on Friday October 14, 2022 @01:18PM (#62966647)

    The 4080 16 GB should have been named the 4070 and the 4080 12 GB should have been named the 4060.

    Nvidia just used the 4080 name so they could charge more and have people who don't follow things too closely overpay...

    • As long as the 4080 16GB performs better than the 3080, they are fine to call it a 4080, especially when they are in a similar power budget.
      • Re:deceptively named (Score:5, Informative)

        by thegarbz ( 1787294 ) on Friday October 14, 2022 @03:19PM (#62967047)

        No, they really aren't. Historically, the difference in core configuration between adjacent model numbers has been around 20-50%, often with a Ti slotted in between at a later date after launch. E.g. a 3090 has 10496 shader cores, a 3080 has 8704, a 3070 has 5888. The story is not much different in the RTX 20xx series or even the GTX 10xx series.

        Now we have a situation where the 4090's shader core count is 70% higher than the (top) 4080's. It's quite clear they positioned the 4080 far too high, and historically there would have been a product between these two cards. With such a huge gap, it would be the equivalent of Intel deleting the i5 from their lineup and renaming the i3 to i5 next generation.
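The core-count arithmetic in the comment above checks out. A quick sketch in Python (the 30-series counts are the ones cited above; the 4090 and 4080 16GB counts are the publicly announced spec figures, so treat them as approximate):

```python
# Shader-core counts: 30-series figures as cited in the parent comment,
# 40-series figures as publicly announced (approximate).
cores = {
    "RTX 3090": 10496,
    "RTX 3080": 8704,
    "RTX 3070": 5888,
    "RTX 4090": 16384,
    "RTX 4080 16GB": 9728,
}

def gap(a, b):
    """Percentage by which card a's shader-core count exceeds card b's."""
    return round(100 * (cores[a] / cores[b] - 1))

print(gap("RTX 3090", "RTX 3080"))       # ~21% -- a normal step down
print(gap("RTX 3080", "RTX 3070"))       # ~48% -- the upper end of the usual range
print(gap("RTX 4090", "RTX 4080 16GB"))  # ~68% -- the unusually wide gap
```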

  • I guess if too many chips are passing qualification for the 16GB version, then they really don't have enough derated parts for the 12GB. So instead of short-supplying everyone, just sell the 16GB and wait a while for 12GB supply to catch up - or something something name is wrong something.
  • by Khyber ( 864651 ) <techkitsune@gmail.com> on Friday October 14, 2022 @07:24PM (#62967635) Homepage Journal

    Plain and simple: you don't sell a card with 20% fewer cores, 25% less RAM, and a 192-bit memory bus instead of a 256-bit one, and still call it a 4080. It isn't even on the same performance level to warrant the numerical designation.

    They had to backpedal FAST before they got the fuck sued out of them.
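The bus-width point above translates directly into a bandwidth gap: peak memory bandwidth is bus width (in bytes) times per-pin data rate. A quick check in Python, using the published launch specs for the two cards (21 Gbps vs. 22.4 Gbps GDDR6X; treat the exact figures as approximate):

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
# Data rates below are the published launch specs for each card (approximate).
def bandwidth_gbs(bus_bits, data_rate_gbps):
    return bus_bits / 8 * data_rate_gbps

bw_12gb = bandwidth_gbs(192, 21.0)   # 4080 12GB: 504 GB/s
bw_16gb = bandwidth_gbs(256, 22.4)   # 4080 16GB: ~716.8 GB/s

print(f"12GB: {bw_12gb:.1f} GB/s, 16GB: {bw_16gb:.1f} GB/s")
print(f"deficit: {100 * (1 - bw_12gb / bw_16gb):.0f}%")  # ~30% less bandwidth
```

So on top of the core and VRAM deficits, the narrower bus alone leaves the 12GB card roughly 30% short on memory bandwidth.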

I use technology in order to hate it more properly. -- Nam June Paik
