
It's Official — AMD Will Retire the ATI Brand 324

J. Dzhugashvili writes "A little over four years have passed since AMD purchased ATI. In May of last year, AMD took the remains of the Canadian graphics company and melded them into a monolithic products group, which combined processors, graphics, and platforms. Now, AMD is about to take the next step: kill the ATI brand altogether. The company has officially announced the move, saying it plans to label its next generation of graphics cards 'AMD Radeon' and 'AMD FirePro,' with new logos to match. The move has a lot to do with the incoming arrival of products like Ontario and Llano, which will combine AMD processing and graphics in single slabs of silicon."
This discussion has been archived. No new comments can be posted.

  • Great news (Score:4, Interesting)

    by mangu ( 126918 ) on Monday August 30, 2010 @07:06AM (#33413680)

    "The move has a lot to do with the incoming arrival of products like Ontario and Llano, which will combine AMD processing and graphics in single slabs of silicon."

    Good. Getting rid of the PCI-e bus between CPU and GPU is one important step in getting massive parallelism to work well.

    Since we hit the 3 GHz barrier, where the speed of light itself becomes a limit, putting the processing elements physically closer is essential to get better performance. Now let's see them put 4 GB or so of fast RAM on the same chip.

    • Re: (Score:3, Informative)

      by sanosuke001 ( 640243 )
      Yeah, 3 GHz doesn't come close to the light speed barrier. I think the issue is more heat dissipation and electron bleed...

      Moving the GPU on-die will fix the latency associated with the PCIe bus, but not for the reasons you seem to believe.
      • Re:Great news (Score:5, Informative)

        by bertok ( 226922 ) on Monday August 30, 2010 @07:43AM (#33413868)

        Yeah, 3 GHz doesn't come close to the light speed barrier. I think the issue is more heat dissipation and electron bleed...

        Moving the GPU on-die will fix the latency associated with the PCIe bus, but not for the reasons you seem to believe.

        Want to bet?

        At 3 GHz, light moves just 7.2 cm [wolframalpha.com], given a typical upper range for the velocity factor of copper of 0.72. Silicon and fibre optics are usually worse, with a VF between 0.4 and 0.6, or between 4 and 6 cm per clock. That's barely enough to traverse a CPU die, let alone the motherboard. Moving parts physically closer together has a lot to do with the speed of light!
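The arithmetic above is easy to check. A minimal sketch (the velocity factors are the illustrative figures from the comment, not measured values):

```python
# Distance a signal propagates in one clock period, for a given
# clock frequency and velocity factor (fraction of c in the medium).
C = 299_792_458  # speed of light in a vacuum, m/s

def distance_per_cycle_cm(freq_hz, velocity_factor):
    """Centimeters traveled during one clock cycle."""
    return C * velocity_factor / freq_hz * 100

# 3 GHz with a copper velocity factor of 0.72: roughly 7.2 cm,
# matching the figure quoted above.
print(f"{distance_per_cycle_cm(3e9, 0.72):.1f} cm")
```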

        • Re:Great news (Score:5, Informative)

          by IndustrialComplex ( 975015 ) on Monday August 30, 2010 @08:19AM (#33414174)

          Want to bet?

          At 3 GHz, light moves just 7.2 cm, given a typical upper range for the velocity factor of copper of 0.72. Silicon and fibre optics are usually worse, with a VF between 0.4 and 0.6, or between 4 and 6 cm per clock. That's barely enough to traverse a CPU die, let alone the motherboard. Moving parts physically closer together has a lot to do with the speed of light!

          I really would mod this informative, since I was about to make a similar point. I think a lot of the confusion is that people hear the speed of light quoted in kilometers per second, and the brain files it away as inconsequential at scales measured in centimeters and much smaller.

          But when you realize that a speed of hundreds of millions of meters per second is being divided into segments that are fractions of billionths of a second, the speed of light manifests on a much more physically understandable scale.

          • Re:Great news (Score:5, Interesting)

            by mangu ( 126918 ) on Monday August 30, 2010 @08:52AM (#33414428)

            There's an anecdote that Admiral Grace Hopper [wikipedia.org] gave out "nanoseconds" as gifts:

            "Although she was an interesting and competent speaker, the most memorable part of these talks was her illustration of a nanosecond. She salvaged an obsolete Bell System 25 pair telephone cable, cut it to 11.8 inch (30 cm) lengths (which is the distance that light travels in one nanosecond) and handed out the individual wires to her listeners"

            I've also read about someone else giving out "picoseconds" in the form of tiny mustard seeds to illustrate how much the speed of light limits data processing.
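Hopper's props translate directly into numbers; a quick sketch:

```python
# How far light travels in a vacuum in one nanosecond and one
# picosecond -- the scales of Hopper's wires and the seeds.
C = 299_792_458  # m/s

nanosecond_cm = C * 1e-9 * 100    # length of one "nanosecond" wire, ~30 cm
picosecond_mm = C * 1e-12 * 1000  # a "picosecond" is only ~0.3 mm
print(f"1 ns = {nanosecond_cm:.1f} cm, 1 ps = {picosecond_mm:.2f} mm")
```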

        • Re: (Score:3, Informative)

          As one of my electrical engineering professors was fond of saying: "What is the speed of light? As far as you're concerned, it's nine inches per nanosecond."
        • ....
          Want to explain why light has to traverse a CPU in one clock cycle?
          Besides the fact that it takes multiple clock cycles to finish one calculation?
          Besides the fact that we have multiple CPUs doing multiple calculations per cycle?

          Who thinks of these things?

          Assuming you had some mystical cpu that completed execution of every instruction in one cycle, maybe you'd have a point.

        • Re: (Score:3, Informative)

          by Pigeon451 ( 958201 )

          Your calculation assumes light is traveling in a vacuum. The velocity of light is always slower in a medium than in a vacuum. Our computers use copper and silicon (and other materials), in which propagation is by electrons, not light. Anyways, light speed would be slower in fibre optics than in a vacuum.

          The propagation of electrons in copper is about 2/3 that of light speed in a vacuum, which on the time and length scales we're using in computers, is quite significant.

          • The propagation of electrons in copper is about 2/3 that of light speed in a vacuum, which on the time and length scales we're using in computers, is quite significant.

            This is wrong on two counts.

            (1) It has little to do with copper. The reduced propagation speed in a copper cable is due to the dielectric constant of the insulating material. If the copper cable were made of conductors in a vacuum, the propagation speed would be essentially the speed of light (although not quite, due to secondary factor
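The dielectric point can be made concrete: in a transmission line the velocity factor is set by the relative permittivity of the insulation, VF = 1/sqrt(eps_r). A sketch (the polyethylene figure is a typical textbook value):

```python
import math

def velocity_factor(eps_r):
    """Velocity factor of a transmission line: 1 / sqrt(relative permittivity)."""
    return 1 / math.sqrt(eps_r)

# Solid polyethylene insulation (eps_r ~ 2.25) gives the familiar
# "about 2/3 of c"; a vacuum dielectric (eps_r = 1) gives full c.
print(round(velocity_factor(2.25), 3), velocity_factor(1.0))
```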

      • Re: (Score:2, Informative)

        by Rockoon ( 1252108 )
        Overclockers have gone above 6 GHz here [tomshardware.com] and above 7 GHz here [geek24.com], and don't forget over 8 GHz here [softpedia.com].

        In each case, it's always about the heat.

        Pretty much all CPUs sold today (even "2.x GHz" chips) can go over 4 GHz with proper air cooling. The reason they don't sell 4 GHz+ chips is that chips have warranties and require a proper cooling setup in order not to fail at those speeds. Most important of course is the heat sink and CPU fan, which Intel and AMD do have some control over, but also of considerable importance is
      • by gmarsh ( 839707 )

        And latency is a bad thing, depending on the algorithm.

        It doesn't matter how fast the CPU or GPU is if the implementation spends 90% of its time stalled waiting for data to move to/from the GPU.
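That stall argument is Amdahl's law applied to bus transfers; a rough sketch (the 90% figure is the poster's hypothetical):

```python
# Amdahl-style bound: speeding up compute only helps the fraction
# of wall time not spent stalled on data transfers.
def effective_speedup(compute_speedup, stall_fraction):
    return 1 / (stall_fraction + (1 - stall_fraction) / compute_speedup)

# With 90% of wall time stalled on the bus, a 10x faster GPU
# improves overall throughput by only about 10%.
print(f"{effective_speedup(10, 0.9):.2f}x")
```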

    • by A12m0v ( 1315511 )

      Since we hit the 3 GHz barrier

      IBM says hello.

    • http://en.wikipedia.org/wiki/POWER7 [wikipedia.org]

      IBM has settled around 4.25 GHz now. Their original promise (which seems to be very expensive) was around 5+ GHz speeds.

      Don't get me wrong, that is a high-end/enterprise UNIX server chip; I'm not saying Apple should be shipping POWER7 now.

      If only they took the consumer desktop & portable CPU business seriously...

  • Are there any deeper changes to come behind the re-brand? ATI was involved in producing open-source drivers and specs for their GPUs. Will this name change carry some bad news about the current openness?

    • by Ironhandx ( 1762146 ) on Monday August 30, 2010 @07:13AM (#33413712)

      ATI really only started doing that after they were acquired by AMD so I wouldn't worry too much.

    • by ProppaT ( 557551 )

      I think the "deeper changes" are that AMD's prepping for their integrated CPU/GPU launch. It only makes sense. If they're gonna start merging chips, it would be awfully awkward to have two brand names AND a product name attached to a chip.

      I would imagine that better Linux drivers might come down the pipeline, though. These integrated approaches lend themselves nicely to Linux workstations, and they'd definitely lose out on a potential market if they completely ignored the issue.

      • by Qubit ( 100461 ) on Monday August 30, 2010 @07:49AM (#33413914) Homepage Journal

        ...AMD's prepping for their integrated CPU/GPU launch. ...
        I would imagine that better Linux drivers might come down the pipeline, though...they'd definitely lose out on a potential market if they completely ignored the issue.

        I'd go one step further and say that I think that AMD has an opportunity to highlight their hardware here.

        Intel's CPUs and integrated graphics have long had great support in the Linux kernel. Because Intel controls the tech, they can actually provide the correct and full source for the graphics drivers. The problem is that Intel integrated graphics aren't ever anything special.

        If AMD is seriously working on integrating their graphics cards and processors -- perhaps even onto the same die -- then they have an opportunity to provide a much more powerful, integrated hardware platform with fully open drivers. Intel can't compete with that kind of setup, especially as Nvidia appears to have an aversion to opening the source of their graphics card drivers.

        • Re: (Score:3, Informative)

          by hedwards ( 940851 )

          Intel's CPUs and integrated graphics have long had great support in the Linux kernel. Because Intel controls the tech, they can actually provide the correct and full source for the graphics drivers. The problem is that Intel integrated graphics aren't ever anything special.

          Methinks you're being a bit generous with Intel. I went with an Intel integrated chipset a number of years back because the alternatives weren't very well supported on FreeBSD, but the graphics weren't just unremarkable, they were bad. Sufficiently bad that I've stayed away from them ever since. Which for Intel is just dumb; I have a very hard time believing that Intel couldn't do any better than what they've been doing. Hopefully with AMD owning ATI that'll kick a bit of sand in Intel's collective f

    • Re: (Score:3, Informative)

      Other way around; AMD has always released specs and started releasing ATI specs after ATI was acquired. You may notice that http://www.x.org/docs/AMD/ [x.org] is lacking docs for the r200 and earlier; that's because AMD made the acquisition during the r400 era, and the docs for older chipsets were more or less lost forever at that point.

      Right now, the open-source drivers are called radeon, r300, r600, etc.; one developer committed his code as "amd" instead at one point. (It got changed to avoid end-user confusion.)

  • fglrx (Score:4, Insightful)

    by leathered ( 780018 ) on Monday August 30, 2010 @07:23AM (#33413746)

    ...can they retire that too? Please?

    • The name? Sure they can. To please you it will (continue to) be known as Catalyst. (http://support.amd.com/us/gpudownload/linux/Pages/radeon_linux.aspx)

    • Also amdcccle while we're at it. (Yes, that's the correct number of c's... I think).
    • Re:fglrx (Score:5, Informative)

      by MostAwesomeDude ( 980382 ) on Monday August 30, 2010 @08:29AM (#33414224) Homepage

      fglrx support for r500 and earlier (anything before the HD lines) is already delegated to the open-source drivers. We're working on getting r800 (redwood) support for acceleration together, and r600 support is getting better by the day.

  • They can give the AMD brand a big boost by associating it directly with the graphics cards - and it will probably mean that people buying an AMD graphics card will be more likely to buy an AMD processor to go with it.

  • by Anonymous Coward on Monday August 30, 2010 @07:32AM (#33413810)

    In May of last year, AMD took the remains of the Canadian graphics company and melded them into a monolithic products group, which combined processors, graphics, and platforms. Now, AMD is about to take the next step: kill the ATI brand altogether.

    Oh, please, J. Dzhugashvili, don't hold back. Tell us how you REALLY feel. What'd the rejected original form of this summary look like?

    In May of last year, the poor, innocent Canadian angels of technology, ATI, had their very remains tortured and raped by the evil, evil AMD, cruelly melded into a hideous abomination of a monolithic products group, creating an unholy, soulless combination of processors, graphics, and platforms. Now, the faceless anti-christ forces of AMD plan to take the next step in their plans to destroy all that is good in the world: Slaughter the angelic ATI brand altogether, laughing with sadistic glee as it begs for mercy in a futile appeal to the quickly-evaporating last shreds of AMD's humanity and compassion, ATI never having harmed a fly in its too-short, sad, sad life.

  • Red or green? (Score:4, Interesting)

    by cerelib ( 903469 ) on Monday August 30, 2010 @09:33AM (#33414868)
    Without the red ATI logo, will they continue to use red as the brand color of their graphics products? Or, will people now be choosing between AMD green and Nvidia green? It may sound superficial (because, by definition, it is), but rival groups always seem to have different colors. It makes for a nice mental distinction when looking at their products. My only guess is that it will probably look like the "AMD Vision" logo or might even be an extension of that branding.
  • About time. (Score:2, Funny)

    by Stavr0 ( 35032 )
    With all the driver trouble, I was beginning to think ATI stood for Always Trapping IRQLs
  • Video card article (Score:3, Insightful)

    by Spatial ( 1235392 ) on Monday August 30, 2010 @02:09PM (#33418278)

    * My [Nvidia/ATI] anecdote trumps your [Nvidia/ATI] anecdote. You are stupid for buying their products.

    * [Nvidia/ATI] has terrible drivers. You are stupid for buying their products.

    * [Nvidia/ATI] produced hardware with a design flaw 25 generations ago. I will never buy their hardware again.

    * Based on my comprehensive study of one graphics card, here is my 100% accurate assessment of the failure rate of every graphics card [Nvidia/ATI] produces. I will never buy their hardware again.

    * Here's an opinion I formed more than ten years ago. Presumably it's still relevant because technology moves so incredibly slowly. You are stupid for buying [Nvidia/ATI]'s products.
