AMD Announces Fiji-based Radeon R9 Fury X, 'Project Quantum', Radeon 300 Series

MojoKid writes: Today AMD announced new graphics solutions ranging from the bottom to the top ($99 on up to $649). First up is the new range of R7 300 Series cards, aimed squarely at gamers AMD says are typically running at 1080p. For gamers who want a little more power, there's the new R9 300 Series (think of them as R9 280s with higher clocks and 8GB of memory). Finally, AMD unveiled its Fiji graphics cards, which feature onboard High Bandwidth Memory (HBM) offering 3x the performance-per-watt of GDDR5. Fiji has 1.5x the performance-per-watt of the R9 290X and was built with a focus on 4K gaming. The chip itself features 4096 stream processors and comprises 8.9 billion transistors. It has a graphics core clock of 1050MHz and is rated at 8.6 TFLOPS. AMD says there will also be plenty of overhead for overclocking. AMD also took the opportunity to showcase its "Project Quantum," a small form-factor PC that manages to cram two Fiji GPUs inside. The processor, GPUs, and all other hardware are incorporated into the bottom of the chassis, while the cooling solution is built into the top of the case.
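The quoted 8.6 TFLOPS can be sanity-checked from the other numbers in the summary (a quick sketch; the factor of 2 assumes one fused multiply-add per stream processor per cycle, the usual convention for peak FP32 throughput):

```python
# Peak FP32 throughput = stream processors x core clock x 2
# (a fused multiply-add counts as two floating-point ops per cycle).
stream_processors = 4096
core_clock_hz = 1050e6  # 1050 MHz

peak_tflops = stream_processors * core_clock_hz * 2 / 1e12
print(f"{peak_tflops:.2f} TFLOPS")  # -> 8.60 TFLOPS
```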
  • by account_deleted ( 4530225 ) on Tuesday June 16, 2015 @08:01PM (#49926085)
    Comment removed based on user account deletion
    • The water cooled Fury X and the Nano both look to put less strain on the PCIe slot than the previous generation cards.
    • I've been using slots as handles since forever (no really, since the original IBM PC) and the only problems I've ever had with slots have been related to the hateful plastic retention mechanisms used for both AGP and PCI-E x16 slots. I had a Sapphire Radeon (talk about a nightmare combination) which broke one of my slots because the card edge was slightly long, too, but that wasn't the slot's fault.

      Once upon a time, full-length expansion cards were supported from their far end, because ISA slots were sloppy

      • I had a Sapphire Radeon (talk about a nightmare combination)

        What do you mean? (I just bought a Sapphire Radeon R7 260x; should I be worried?)

        • What do you mean? (I just bought a Sapphire Radeon R7 260x; should I be worried?)

          Sapphire cards are cheap, like Zotacs. AMD is bad at drivers. Cheap card with bad drivers equals suffer. YMMV, good luck!

          • by Anonymous Coward

            I've only ever owned Sapphire cards - and never had any problems with them. My friend's PowerColor HD4850, however, was another story...

            As for drivers - how many problems are caused by Nvidia bribing companies to only make games for their hardware/drivers and treat AMD users as second-class citizens? If you want a self-fulfilling prophecy, then that's how it happens. You understand why having AMD around is a good idea, yes? If AMD go, then get ready to spend serious £££/$$$$ on even mid-range cards.

            • You understand why having AMD around is a good idea, yes?

              Yes, I just don't know why anyone would buy an AMD video card on purpose. I mean, on a truly massive discount? Maybe. Included with a system I don't plan to use for gaming? Eh, OK. Mining bitcoins, not that that's a thing any more because of mining ASICs, but OK, once you get it working you don't really need to run updates all the time since the machine can be firewalled down pretty tight. But actually go out and buy one? That I don't get.

              FWIW I am still using a Phenom II X6 1045T. It is hopelessly outclassed.

              • I guess Intel might eventually get there. For midrange stuff at least. Their Iris Pro GPUs are already getting close to AMD's APUs.

                But it is quite possible that they keep it as a high-priced "laptop exclusive". Especially if AMD goes tits up.

          • by Anonymous Coward

            You're uninformed. The drivers are largely just fine unless you're trying to run CrossFire and a brand-new game.

          • Comment removed based on user account deletion
            • 15 yard penalty,

              This is sportsball! I know this!

              anecdote of a single purchase does not equal evidence,

              I personally know only one other person who bought a Sapphire card, and it sucked shit too. So I may be biased, but I'm going to stick with it.

              And please stop blaming AMD for the fact you got a bad ATI product when AMD didn't even own the company at the time your IGP was sold. AMD drivers are just as solid as Nvidia's,

              Oh my god, shut the fuck up. ATI's shitty video drivers have been crashing Windows for me since Mach32/Windows 3.1. It's not a single purchase, ATI just sucks ass.

              they wouldn't sell millions of cards if they weren't

              Just like McDonalds wouldn't sell millions of hamburgers if they weren't fucking amazing, right? Seriously, what is wrong with you? Did they drop you on your head three or four times?

    • by gstoddart ( 321705 ) on Tuesday June 16, 2015 @08:22PM (#49926173) Homepage

      Bah, I want to know if they've solved the refrobulation problem with the niblitz which was leading to the excess deuterium depletion in the fourth quarter at low revs.

      Honestly, as someone who stopped slavishly following hardware specs a very long time ago ... my eyes glazed over halfway through the summary.

      You guys and your wacky video cards. :-P

    • It looks like a strategy.

      Rightly or wrongly, AMD believes they don't get treated fairly by the trade press. Their reviews often sound like "Well it's OK I guess, but...". And the comments attached to review articles more often than not are "but the drivers..." "but the watts..." "but the hairworks..." "but the spaceheater..." "but the rebadge..." and so forth.

      This way at least they get to own the launch day to the extent possible.
    • To me, that case layout looks like it was designed for easily displaying and identifying the card in their presentations. Though there are a large number of small form factor cases with some kind of riser / 90 degree bend.
    • Re: (Score:2, Insightful)

      by hsa ( 598343 )
      Benchmarks, we don't need no stinking benchmarks!

      Seriously, benchmarks are only relevant for the high-end Fiji. It is the series you buy because you like AMD and have too much money. If you have $500-$700 set aside for a new GPU, some random benchmark is not going to change your mind. You would already have a Titan or GTX 980 Ti if you wanted Nvidia.

      Did you even look at the cards? I know clicking on the article is a big deal, but you should try it sometimes. HBM allows the Fiji series to be SMALL. I don't see...
      • by Cederic ( 9623 )

        Of course benchmarks are relevant. I don't give a fuck about size; running cooler is nice, but frankly I want to know whether dropping $200, $300, $897 on whichever of these cards will run the games I play at the resolution I use, or whether that same cash budget would buy higher performance from Nvidia.

        Running the benchmarks will help confirm that the cards and their drivers can cope with the vagaries of multiple configurations and potentially draw out certain less effective combinations.

        Benchmarks for the s

    • So it's a slashvertisement for the same site as every other hardware-related article, but in this case not a single review site has benchmarks. AMD is being a giant cocktease: first they flash a picture, then they give us some speeches and marketing slides, with the promise that they're oh so worth waiting for. Meanwhile nVidia is putting it out there saying if you want it, come get it. I don't think this "hard to get" strategy is working out to AMD's advantage; I'm sure a lot of people are tired of getting...

  • by 0111 1110 ( 518466 ) on Tuesday June 16, 2015 @08:13PM (#49926131)

    I don't understand why the marketing people are so intent on telling us what we must do with their products. This whole 'gaming at 4K' pitch seems like shooting themselves in the foot by excluding a huge segment of enthusiasts who are looking for any excuse to find a use for all that power. Why try to sell your top-of-the-line products only to people with 4K monitors? I realize that consoles and the overall cost of photorealistic graphics have somewhat reduced the need for high-end cards, but jeez. At least try to sell high-end products. Pathetic marketing strategy.

    • The summary states that they know most gamers game at 1080p, but it does make sense to show off your best 4K offerings as well.
      • What do you mean by '4k offerings'? That means nothing to me. Are you implying that the high end cards won't function at lower resolutions?

        • by batkiwi ( 137781 )

          If a $X GPU plays games at 1080p/60fps with all settings on max, there is no point to a $2X GPU unless you are going to run at a higher resolution.

          So the high-end cards will work at 1080p, but why would you buy one if the mid-range cards are just as good for your usage? (See the pixel arithmetic sketched after this sub-thread.)

          • by Ihlosi ( 895663 )
            If a $X GPU plays games at 1080p/60fps with all settings on max, there is no point to a $2X GPU unless you are going to run at a higher resolution.

            Power, noise, heat ...
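          For what it's worth, the resolution argument above is just pixel arithmetic (a rough sketch; it assumes the GPU workload scales with pixel count, which is only approximately true):

          ```python
          # 4K pushes four times the pixels of 1080p, so a card with no headroom
          # at 1080p/60fps has roughly a quarter of the fill rate it needs at 4K.
          pixels_1080p = 1920 * 1080  # 2,073,600
          pixels_4k = 3840 * 2160     # 8,294,400

          print(pixels_4k / pixels_1080p)  # -> 4.0
          ```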

        • by gstoddart ( 321705 ) on Tuesday June 16, 2015 @08:36PM (#49926245) Homepage

          What do you mean by '4k offerings'? That means nothing to me

          Umm ... products (offerings) that do 4K resolution?

          Are you implying that the high end cards won't function at lower resolutions?

          Nobody is saying that ... but if you're trying to do the marketing of your big shiny product, you do the penis waggling and show off the 4K resolution because it's the new hotness.

          People can already get good performance at 1080p, so why advertise it?

          • People can already get good performance at 1080p, so why advertise it?

            To sell products and make money. Because it's up to the client to determine what they use the cards for. Not the manufacturer. If anything you should be trying to suggest new applications for the device rather than excluding them.

            You don't say, "Don't buy this card if you don't have a 4k monitor because it will be useless. There is nothing you can do with it. No reason to own one. Just stick with your old card until you decide to buy a 4k monitor."

            If a marketing droid came up with that genius campaign for my company he'd be out on his ass. If you sell exotic sports cars do you really want to emphasize how their current car is 'good enough' since both cars can reach the speed limit quite easily? No.

            • You don't say, "Don't buy this card if you don't have a 4k monitor because it will be useless. There is nothing you can do with it. No reason to own one. Just stick with your old card until you decide to buy a 4k monitor."

              If a marketing droid came up with that genius campaign for my company he'd be out on his ass. If you sell exotic sports cars do you really want to emphasize how their current car is 'good enough' since both cars can reach the speed limit quite easily? No.

              Honestly, pick one.

              Either you want them to flog the latest and greatest, or you don't. You're complaining that they're pushing 4K, and then saying they should totally push 4K.

              • Either you want them to flog the latest and greatest, or you don't. You're complaining that they're pushing 4K, and then saying they should totally push 4K.

                I don't personally care what they do. If they want to be idiots and sell fewer products, that's their business. Do they sell 4K monitors or do they sell video cards? Maybe they should ask themselves that. They should push 4K *and* they should push non-4K applications. Both. If they have trouble finding a current game that can make use of their processing power, then they can write something themselves. Maybe a video encoding GPGPU app. Or a short game with photorealistic graphics. What you don't do is try to talk people out of buying your product.

        • Surely it's pretty obvious that they're implying the mid range cards can't do 4k in a usable fashion. If you want 4k, you want the 4k offerings.
          • If you want 4k, you want the 4k offerings.

            They are also implying that anyone with no immediate plans to buy a 4K monitor should not buy their high-end cards. They are telling a whole segment of potential customers not to buy their products. At least not their high-end flagship product. For someone with a video card that isn't that old, that means they won't be upgrading until/unless they buy a 4K monitor.

            It's very nice of them to be worried about me wasting my money on their products, but maybe they should let us worry about that. I don't need them to make that decision for me.

  • by Anonymous Coward

    The summary should say "(think of them as R9 290s with higher clocks and 8GB of memory)." Currently, it incorrectly says "R9 280s" instead of "R9 290s". That's a big performance difference between the 280 series and 290 series.

  • Who cares about a short video card unless it's also low-profile? That's what's needed to cram it into a tiny system.

    • by SeaFox ( 739806 )

      It can be short (length) and full height and fit fine in a cube chassis like a CoolerMaster Elite 110, or a low-profile case that uses a horizontal riser.

  • Minesweeper (Score:5, Interesting)

    by darkain ( 749283 ) on Tuesday June 16, 2015 @09:04PM (#49926349) Homepage

    But is it powerful enough to run the Windows 10 Minesweeper game?! http://wscont1.apps.microsoft.... [microsoft.com]

    Seriously, no joke. The Win10 versions of the games are horribly resource-hungry for fuck knows what reason. In the time it took just to load Minesweeper on the Win 10 tech preview, I loaded up a web browser, played an entire game of mines in it, closed the browser, came back, and it was STILL loading.

    I originally played Minesweeper in Windows 3.1 on a 386SX at 16MHz. I'm now on a 3GHz quad-core. On raw cycle processing power alone, that is literally 1,000 times the speed (and that's before accounting for enhancements to the architecture over the past 20 years). And yet the game struggles on modern hardware!? If this isn't the definition of bloat, I don't know what is!
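    The "1,000 times" figure holds up as an order-of-magnitude estimate (a quick check counting raw clock cycles only; two decades of per-cycle IPC gains make the real gap far larger):

    ```python
    # Raw cycle throughput: 386SX @ 16 MHz vs. a modern 3 GHz quad-core.
    old_clock_hz = 16e6
    new_clock_hz = 3e9
    cores = 4

    per_core = new_clock_hz / old_clock_hz  # 187.5x per core
    aggregate = per_core * cores            # 750x across four cores
    print(f"{per_core:.1f}x per core, {aggregate:.0f}x aggregate")
    ```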

    • by _xeno_ ( 155264 )

      Seriously, no joke. The Win10 versions of the games are horribly resource-hungry for fuck knows what reason.

      They are in Windows 8.1 as well. I tried playing Microsoft Sudoku on my Surface Pro 3, but - no joke - it forced the fan on and reduced the battery life to the point where I just gave up playing it.

      I'm not sure how Microsoft fucked up their Metro - er, "universal" - versions of their games, but they did.

      • by Anonymous Coward

        An educated guess:

        The games are written in .NET using the WPF graphical framework.

        This implies that the rendering of screen elements, which presumably consist of a lot of smooth vectors and gradients, is very CPU intensive. It may also be GPU intensive depending on how crappy the graphics driver is at translating the rendering requests into efficient hardware rendering operations.

        It also implies that the program code is somewhat inefficient, given that .NET (C#), like Java, encourages novice programmers to write inefficient code.

    • After a $50 CPU was able to run Office and your browser with room to spare, Intel and AMD needed more resource-hungry software to make people buy new CPUs, and what better than the most used piece of software in offices around the world, i.e. Minesweeper :).
      On a more serious note, they also used its popularity to get people accustomed to using the Windows Store (in Win 10 you must install it from there, or no Minesweeper or other Microsoft games for you).
  • by voss ( 52565 ) on Tuesday June 16, 2015 @09:25PM (#49926425)

    Apparently the Leap models with Project Quantum have been having problems with users inadvertently causing time-space distortions, including memory loss, with at least one user vanishing without a trace.

  • by Anonymous Coward

    And I say this as a lifelong nVidia card buyer (first card was a Canopus Spectra Riva TNT2, back in the day).

    • I abandoned Nvidia about 5 years ago after their support and drivers made me want to throw my $500 card against a wall. I sold it and switched to AMD (which hasn't been trouble-free either, but definitely better). I was just considering switching back to Nvidia for the lower TDP, but the HBM info that was released earlier this month stopped my purchase. I will wait to see some benchmarks from reputable sites before I decide whether to stay with AMD or jump ship.

      • by JustNiz ( 692889 )

        That's funny; I had so much frustration with AMD/ATI's Linux drivers that I pretty much threw an $800 laptop away, and before and since then have never had any problems at all with nVidia's drivers.
        It boggles my mind how anyone can believe AMD drivers are better/more stable than nVidia's, especially on Linux.

  • So instead of outsourcing to India, AMD outsources to Indians living on a couple of islands in the middle of the Pacific.

  • In February of 2012, I ordered a graphics card made on the TSMC 28nm process node, the Radeon HD7970. The cards hit the market on January 9, 2012. It has 4.3 billion transistors and a die size of 352 mm^2. It has 2048 GCN cores.

    In June of 2015, the Radeon Fury X is a graphics card made on the TSMC 28nm process node, with 8.9 billion transistors, and a die size likely to be somewhere around 600 mm^2 based on a quadratic fit regression analysis using the existing GCN 1.1 parts as data points. It has 4096 GCN cores.
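    The comment describes a quadratic fit; with only the two widely cited GCN 1.1 data points (Bonaire and Hawaii), a straight-line fit is the most a sketch can support, but it lands in the same ~600 mm^2 ballpark. Transistor and die-size figures below are approximate public numbers, used purely for illustration:

    ```python
    import numpy as np

    # Extrapolate die size from transistor count using the 28nm GCN 1.1 parts:
    # Bonaire (~2.08B transistors, ~160 mm^2), Hawaii (~6.2B, ~438 mm^2).
    transistors = np.array([2.08, 6.2])   # billions
    die_size = np.array([160.0, 438.0])   # mm^2

    slope, intercept = np.polyfit(transistors, die_size, 1)
    fiji_estimate = slope * 8.9 + intercept  # Fiji: 8.9B transistors
    print(f"estimated Fiji die size: {fiji_estimate:.0f} mm^2")  # -> ~620 mm^2
    ```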

    • To some extent, you are running into that problem everywhere in computing. If you had bought a processor in 2012, you would also be struggling for a reason to upgrade. Heck, my Q6600, which is now 8 years old, still suffices for 1080p gaming with contemporary titles. It could very well last a decade. Imagine trying to game with a 10-year-old processor in 2007. It seems that video cards are starting to hit the same wall, and die shrinks probably won't change things much; they haven't with CPUs.
    • by ponos ( 122721 )

      What is the likelihood that, in the past three years, they have made any significant innovations on the hardware front whatsoever, aside from stacking memory modules on top of one another?

      To me this looks like an attempt to continue to milk yesterday's fabrication processes and throw in a few minor bones (like improved VCE, new API support) while not really improving in areas that count, like power efficiency, performance per compute core, cost per compute core, and overall performance per dollar.

      They explicitly mentioned 50% more perf-per-watt with respect to the R9 290X. In the end, if you get the performance you want and reasonable power consumption, what do you care if it's made on 28nm or 22nm or whatever? Process technology is only relevant insofar as it enables those targets.
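      That 50% claim is at least plausible from peak throughput and board power alone (a ballpark sketch; the TDP figures are approximate and vary by source):

      ```python
      # Perf-per-watt ratio from peak FP32 TFLOPS and approximate board power.
      r9_290x_tflops, r9_290x_watts = 5.6, 290.0  # 2816 SPs @ ~1000 MHz
      fury_x_tflops, fury_x_watts = 8.6, 275.0    # 4096 SPs @ 1050 MHz

      ratio = (fury_x_tflops / fury_x_watts) / (r9_290x_tflops / r9_290x_watts)
      print(f"{ratio:.2f}x")  # -> ~1.6x, in line with AMD's ~1.5x claim
      ```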
