AMD Offers a Performance Boost, Over 20 New Features With Catalyst Omega Drivers

MojoKid writes: AMD just dropped its new Catalyst Omega driver package that is the culmination of six months of development work. AMD Catalyst Omega reportedly brings over 20 new features and a wealth of bug fixes to the table, along with performance increases on both discrete AMD Radeon GPUs and integrated AMD APUs. Some of the new functionality includes Virtual Super Resolution, or VSR. VSR is "game- and engine-agnostic" and renders content at up to 4K resolution, then displays it at a resolution that your monitor actually supports. AMD says VSR allows for increased image quality, similar in concept to Super Sampling Anti-Aliasing (SSAA). Another added perk of VSR is the ability to see more content on the screen at once. To take advantage of VSR, you'll need a Radeon R9 295X2, R9 290X, R9 290, or R9 285 discrete graphics card. Both single- and multi-GPU configurations are currently supported. VSR is essentially AMD's answer to NVIDIA's DSR, or Dynamic Super Resolution. In addition, AMD is claiming performance enhancements in a number of top titles with these new drivers. Reportedly, the gains range from as little as 6 percent in FIFA Online to as much as 29 percent in Batman: Arkham Origins when using an AMD 7000-Series APU, for example. On discrete GPUs, an AMD Radeon R9 290X's performance increases ranged from 8 percent in Grid 2 to roughly 16 percent in Bioshock Infinity.
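The "similar in concept to SSAA" claim boils down to rendering more pixels than the display has and then averaging them back down to the native resolution. Below is a minimal CPU-side sketch of that resolve step; the grayscale pixel format, the fixed 2x factor, and the plain box filter are illustrative assumptions, since the actual filter isn't documented here and the real work happens on the GPU.

```cpp
// Supersample-then-downscale, the core idea behind VSR/SSAA, sketched on the CPU.
// The image format, scale factor, and box filter are assumptions for illustration;
// the driver does the equivalent on the GPU with whatever filter AMD chose.
#include <iostream>
#include <vector>

struct Image {
    int w, h;
    std::vector<float> px;  // grayscale for brevity; one float per pixel
};

// Average each scale x scale block of the high-res render into one output pixel
// (a box filter, the simplest SSAA-style resolve).
Image downscale(const Image& hi, int scale) {
    Image out{hi.w / scale, hi.h / scale,
              std::vector<float>((hi.w / scale) * (hi.h / scale))};
    for (int y = 0; y < out.h; ++y) {
        for (int x = 0; x < out.w; ++x) {
            float sum = 0.0f;
            for (int sy = 0; sy < scale; ++sy)
                for (int sx = 0; sx < scale; ++sx)
                    sum += hi.px[(y * scale + sy) * hi.w + (x * scale + sx)];
            out.px[y * out.w + x] = sum / float(scale * scale);
        }
    }
    return out;
}

int main() {
    const int scale = 2;  // e.g. render internally at 3840x2160 for a 1920x1080 display
    Image hi{3840, 2160, std::vector<float>(3840 * 2160, 0.5f)};
    Image native = downscale(hi, scale);
    std::cout << "resolved " << hi.w << "x" << hi.h << " down to "
              << native.w << "x" << native.h << "\n";
}
```

The "more content on the screen" perk only applies to games whose UI and camera scale with the render resolution; fixed-pixel interfaces just get smaller, as several comments below point out.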
This discussion has been archived. No new comments can be posted.

  • Why are they comparing it to 13.12, which is a year-old driver package? Is it because it isn't actually that much quicker than 14.9 or 14.11?
    • by Anonymous Coward

      Why are they comparing it to 13.12, which is a year-old driver package?

      Is it because it isn't actually that much quicker than 14.9 or 14.11?

      I think their argument is that 13.12 was the launch driver for the GPU in the figure, so they're saying "look how much we've improved since launch!".

      On the one hand yeah, that's more bang for your buck than when you bought the card, but isn't this more to do with tweaking the driver specifically for the game in question than a case of general driver improvements across the board?

      • Re: (Score:2, Insightful)

        I always see it the other way 'round - "Look how rubbish our drivers were!"

        Of course, anyone but the absolute most stalwart AMD fan already knew their drivers were rubbish, so I guess this is an improvement.

        • by JustNiz ( 692889 )

          >> Of course, anyone but the absolute most stalwart AMD fan already knew their drivers were rubbish,

          I completely agree. However, it seems that AMD's customer base is mostly just a cult of drooling fanbois who won't ever agree with anything that is less than stellar praise of AMD and their products, or ever believe that any other manufacturer could make anything better.

          It's probably a self-fulfilling prophecy that most current AMD customers are like that, because anyone with some actual knowledge of

  • Another added perk of VSR is the ability to see more content on the screen at once.

    What is that supposed to mean?

    http://hothardware.com/gallery... [hothardware.com]

    ^ Wow, that blurry, dark, downscaled JPEG really shows off the difference, doesn't it?

    • Rendering at a higher resolution and then down-scaling without the game being aware of it is a pretty dreadful idea; you're just going to get tiny interfaces in most games or, as apparently pictured, a massive field of view that makes it harder to see smaller details. Microsoft's DirectX 12 (or was it 11?) for mobile devices allows you to render the game world at a higher or lower resolution and the interface at native, then merges them when displaying it; requires hardware support, apparently, but that seems l

      • And isn't the GPU doing WAY more work than it needs to, if it first renders everything at 4k and then scales it to some crap $120 1600x900 display?

        No thanks, I'll take the framerate increase of rendering in the resolution I'm actually displaying.

        • I am not even remotely an expert on the matter, but I believe the point is to render at a higher resolution and then reduce the need for AA and such.

          If you can make the game engine render the content at such a high resolution, then there's less need to do post-processing of the images to do things like smooth edges, as long as you have a way to efficiently scale down the image.

          For instance, the new Final Fantasy XIII PC port has no graphics options (an update to fix that is supposed to be out tomorrow). You can't even pick reso

    • by Kjella ( 173770 )

      I have a UHD monitor (3840x2160), and without exception the only games that change the view like that are terrible games with fixed UI elements that are x pixels wide, meaning that on a high-res monitor the actual action happens with tiny ants in the 800x600 center, with microbuttons and a lot of scenery. I decrease the resolution to get a "normal" gaming experience. Good games, on the other hand, look roughly the same on my 28" UHD screen as they looked on the 24" 1080p screen, only more detailed with the zo
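For what it's worth, the split-resolution approach mentioned a few comments up (render the 3D scene at one resolution, render the interface at native, then merge) roughly amounts to scaling the scene layer to the display size and compositing the UI over it, so pixel-sized widgets never shrink regardless of the scene resolution. Here is a rough sketch of that compositing step; the layer struct, nearest-neighbor scaling, and helper names are assumptions for illustration, not any particular API's mechanism.

```cpp
// Sketch of split-resolution rendering: the scene is rendered at one size and
// scaled to the display, while the UI is drawn at native size and blended on top.
// Layer format, nearest-neighbor scaling, and the chosen resolutions are all
// simplifications for illustration.
#include <cstddef>
#include <vector>

struct Rgba { float r, g, b, a; };

struct Layer {
    int w, h;
    std::vector<Rgba> px;
};

// Nearest-neighbor scale of the scene layer to the display size.
Layer scale_to(const Layer& src, int dw, int dh) {
    Layer dst{dw, dh, std::vector<Rgba>(dw * dh)};
    for (int y = 0; y < dh; ++y)
        for (int x = 0; x < dw; ++x)
            dst.px[y * dw + x] = src.px[(y * src.h / dh) * src.w + (x * src.w / dw)];
    return dst;
}

// Standard "over" blend of the native-resolution UI onto the scaled scene.
void composite_ui(Layer& frame, const Layer& ui) {
    for (std::size_t i = 0; i < frame.px.size(); ++i) {
        const Rgba& u = ui.px[i];
        Rgba& f = frame.px[i];
        f.r = u.r * u.a + f.r * (1.0f - u.a);
        f.g = u.g * u.a + f.g * (1.0f - u.a);
        f.b = u.b * u.a + f.b * (1.0f - u.a);
    }
}

int main() {
    Layer scene{1280, 720, std::vector<Rgba>(1280 * 720, Rgba{0.2f, 0.3f, 0.8f, 1.0f})};
    Layer ui{1920, 1080, std::vector<Rgba>(1920 * 1080, Rgba{0.0f, 0.0f, 0.0f, 0.0f})};
    Layer frame = scale_to(scene, 1920, 1080);  // scene rendered low, scaled up to native
    composite_ui(frame, ui);                    // UI composited at native 1920x1080
}
```

VSR, as described in the summary, applies no such split: the whole frame, UI included, is rendered at the higher resolution and scaled down, which is why fixed-pixel HUDs end up tiny, as the comments above complain.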

  • by Anonymous Coward on Wednesday December 10, 2014 @07:00AM (#48563107)

    I don't need any of those new fancy features.
    What I need is their OpenGL driver complying with the OpenGL specification. Whenever I do anything advanced, it reliably works on my Nvidia system and reliably needs annoying and performance-degrading workarounds on my AMD/ATI system. We're talking about stuff like simple branching in shader code causing the optimizer to emit returns, or unnecessarily having to feed in vertex data when the geometry could be deduced from gl_VertexID alone, or the Uniform Buffer Object layout specified in the shader not being preserved when using the binary shader format (means I have to recompile it every time), or atan in shaders yielding results that are half a degree off under some circumstances, or the builtin attributes not working if you use your own attributes with names that would alphanumerically be sorted before gl_*.

    Through the ATI support forum back then I once got in touch with a technician who looked up these things in the driver sources and confirmed a few bugs with me, but later on I only got automated responses stating that he was leaving the company, and then the forum was trashed and a new AMD forum put in its place.

    Yes. This is driving me nuts. On Nvidia, it all just works as it should.
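For readers unfamiliar with the gl_VertexID point above: the idea is that a draw call can generate its own geometry in the vertex shader with nothing bound but an empty VAO, e.g. the common full-screen-triangle trick. Below is a minimal sketch of that pattern, assuming an already-created OpenGL 3.3+ context with loaded entry points (GLEW here, but any loader works); the function name is made up and error checking is omitted.

```cpp
// Full-screen triangle generated entirely from gl_VertexID -- no vertex attributes,
// no vertex buffer. This is the kind of buffer-less draw the parent comment says
// needs workarounds on some drivers.
#include <GL/glew.h>  // assumes a current GL 3.3+ context and loaded entry points

static const char* kFullscreenVs = R"(#version 330 core
out vec2 uv;
void main() {
    // Three vertices at (0,0), (2,0), (0,2) in UV space cover the whole screen.
    vec2 pos = vec2((gl_VertexID << 1) & 2, gl_VertexID & 2);
    uv = pos;
    gl_Position = vec4(pos * 2.0 - 1.0, 0.0, 1.0);
})";

// 'program' is assumed to be built from kFullscreenVs plus any fragment shader.
void draw_fullscreen(GLuint program) {
    GLuint vao;
    glGenVertexArrays(1, &vao);  // core profile still requires a VAO, even an empty one
    glBindVertexArray(vao);
    glUseProgram(program);
    glDrawArrays(GL_TRIANGLES, 0, 3);  // gl_VertexID runs 0..2, no attribute data read
    glDeleteVertexArrays(1, &vao);
}
```

Per the spec this should work on any conforming GL 3.3 driver; the parent's complaint is that in practice it only reliably does on some of them.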

    • by Anonymous Coward

      What I need is their OpenGL driver complying with the OpenGL specification. Whenever I do anything advanced, it reliably works on my Nvidia system and reliably needs annoying and performance-degrading workarounds on my AMD/ATI system.

      AMD do not really want great OpenGL support in their drivers, because that would mean more competition for Mantle, most notably if/when it gets ported to non-Windows platforms where Direct3D is not natively available, and OpenGL is the de facto standard. It would be more beneficial for AMD if OpenGL dies because of the bad driver situation, and gets replaced by Mantle.

      • by Shinobi ( 19308 )

        And there's nothing new there. Back in the days of the Radeon 9700 Pro etc., ATI were deliberately doing their best to sink OpenGL, including working against everyone else on the ARB (thanks Eskil for the gossip back then :p )

        Their focus on DirectX and some of their own specific stuff back then was so extreme that the gaming cards could not run even SPECviewperf without crashing (if it even managed to start...), and even their pro cards had abysmal performance and, well, we could politely call it "erratic" functi

        • The FireGL brand took a HUGE step back when it was purchased by AMD / ATI. I remember when they were the best OpenGL performers you could buy...

          • by Shinobi ( 19308 )

            I remember that the card me and some others in my 3D class were drooling over was the DP Oxygen 402.

    • Yes, Nvidia is always better, but what about Intel then? How does Intel OpenGL support stack up against AMD?

      • by Anonymous Coward

        http://richg42.blogspot.com/2014/05/the-truth-on-opengl-driver-quality.html

    • by bigmo ( 181402 )

      A lot of it depends on what you consider correct. I work almost exclusively on amd platforms with opengl and am pretty happy over all with what I get. I have the reverse problem as you because supporting nVidia requires a lot of adjustment where amd and intel opengl work pretty much as is in my code. You can say that's because I'm doing it wrong and that nVidia has the proper implementation, but I think it's more that you get used to working with your own solutions and anything that requires additional w

  • Drop? (Score:4, Informative)

    by Buchenskjoll ( 762354 ) on Wednesday December 10, 2014 @07:54AM (#48563289)

    AMD just dropped its new Catalyst Omega driver package

    Is this a new meaning of the word drop that I was until now unaware of? To my ears it sounds like they're not releasing anything.

    • by Anonymous Coward

      >> AMD just dropped its new Catalyst Omega driver package

      > Is this a new meaning of the word drop that I was until now unaware of? To my ears it sounds like they're not releasing anything.

      An instance of dropping supplies or making a delivery, sometimes associated with delivery of supplies by parachute.

      (transitive, slang) To impart. "I drop knowledge wherever I go. Yo, I drop rhymes like nobody's business."

      (transitive, music, African American Vernacular) To release to the public.

      (intransitive, mu

    • It's common slang, at least in the US. I'm guessing it has roots in either the newspaper industry 'dropping' a new edition or in military supply drops.

      • I noticed it about ten+ years ago when rappers/hip-hop artists used it to mean that they released a new album. It didn't take long before hip-hop slang became just American slang and artists of every genre started using it as if saying "release" was too effete. Now everyone says drop. I thought it sounded odd when it was just a hip-hop thing and even more odd now that it has been thoroughly appropriated by those quite far removed from that culture.
    • I've not heard it used for a production release, but in QA-speak, a "code drop" is whenever a new build comes into the lab for its shakedown from development.

      I think I may have heard it used in this way in the hip hop circles, and it should remain there.

    • by Arykor ( 966623 )
      It is released. The last line in TFA pointed right to the driver download page, from which you just have to pick your platform. For example http://support.amd.com/en-us/d... [amd.com] shows Revision Number Omega (14.12)
    • Among the meanings of the verb "to drop" are both "to discontinue" and "to offload goods". This led to the word being used for both the start of provision and the end of provision.

  • by Anonymous Coward

    Driver updates are worth /. articles now? Really...? But they used a Greek letter...

  • Omega (Score:5, Funny)

    by TeknoHog ( 164938 ) on Wednesday December 10, 2014 @09:43AM (#48563687) Homepage Journal
    So it's their last driver release ever?
    • I'm wondering if it's a nod to a guy who used to make lots of custom drivers for graphics cards. It was the only way to upgrade my old Mobility card at the time, so I used them quite often. He still has a site up here:

      http://www.omegadrivers.net/ [omegadrivers.net]

  • I'm not sure they've released Bioshock Infinity yet.
  • Sorry. AMD's main problem is not their hardware. Their hardware ROCKS.

    The problem is, their driver packages are flaky, buggy, unstable pieces of shit. And, after being burned so many times by their crap, I won't trust them ever again.

  • Thankfully this also included a new Linux driver. The current one was many months old. Hopefully there are some good improvements! http://support.amd.com/en-us/k... [amd.com]
  • and cares about 16% better performance from Bioshock Infinite? I've got a GTX 660 in a 6 year old Athlon XP 6000 and it kicks that game in the fanny.

    What I want is stable drivers. I bought an nVidia because I still don't trust AMD after my last experience (admittedly from 3 years ago).
