It's Official — AMD Will Retire the ATI Brand
J. Dzhugashvili writes "A little over four years have passed since AMD purchased ATI. In May of last year, AMD took the remains of the Canadian graphics company and melded them into a monolithic products group, which combined processors, graphics, and platforms. Now, AMD is about to take the next step: kill the ATI brand altogether. The company has officially announced the move, saying it plans to label its next generation of graphics cards 'AMD Radeon' and 'AMD FirePro,' with new logos to match. The move has a lot to do with the impending arrival of products like Ontario and Llano, which will combine AMD processing and graphics in single slabs of silicon."
Great news (Score:4, Interesting)
Good. Getting rid of the PCI-e bus between CPU and GPU is one important step in getting massive parallelism to work well.
Since we hit the 3 GHz barrier, where the speed of light itself becomes a limit, putting the processing elements physically closer is essential to get better performance. Now let's see them put 4 GB or so of fast RAM on the same chip.
Re: (Score:3, Informative)
Moving the GPU on-die will fix the latency associated with the PCI-e bus, but not for the reasons you seem to believe
Re:Great news (Score:5, Informative)
Yeah, 3 GHz doesn't come close to the light-speed barrier. I think the issue is more one of heat dissipation and electron bleed...
Moving the GPU on-die will fix the latency associated with the PCI-e bus, but not for the reasons you seem to believe
Want to bet?
At 3 GHz, light moves just 7.2 cm [wolframalpha.com], given a typical upper range for the velocity factor of copper of 0.72. Silicon and fibre optics are usually worse, with a VF between 0.4 and 0.6, or between 4 and 6 cm per clock. That's barely enough to traverse a CPU die, let alone the motherboard. Moving parts physically closer together has a lot to do with the speed of light!
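For anyone who wants to check the numbers, here's a quick back-of-the-envelope sketch in Python. The velocity factors are just the rough figures quoted above, not measured values:

    # How far a signal gets during one clock period, for a given
    # clock rate and velocity factor (fraction of c in the medium).
    C = 299_792_458  # speed of light in a vacuum, m/s

    def cm_per_cycle(clock_hz, velocity_factor=1.0):
        return C * velocity_factor / clock_hz * 100

    print(cm_per_cycle(3e9))        # vacuum: ~10 cm per cycle at 3 GHz
    print(cm_per_cycle(3e9, 0.72))  # copper, VF ~0.72: ~7.2 cm
    print(cm_per_cycle(3e9, 0.5))   # silicon/fibre, VF 0.4-0.6: ~5 cm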
Re:Great news (Score:5, Informative)
Want to bet?
At 3 GHz, light moves just 7.2 cm, given a typical upper range for the velocity factor of copper of 0.72. Silicon and fibre optics are usually worse, with a VF between 0.4 and 0.6, or between 4 and 6 cm per clock. That's barely enough to traverse a CPU die, let alone the motherboard. Moving parts physically closer together has a lot to do with the speed of light!
I really would mod this informative, since I was about to make a similar point. I think a lot of the confusion is that people hear the speed of light quoted in kilometers per second, and the brain files it away as inconsequential for scales measured in centimeters and MUCH smaller.
But when you realize that a speed of hundreds of millions of meters per second is being divided into segments that are fractions of billionths of a second, the speed of light manifests on a much more physically understandable scale.
Re:Great news (Score:5, Interesting)
There's an anecdote that Admiral Grace Hopper [wikipedia.org] gave "nanoseconds" as gifts:
"Although she was an interesting and competent speaker, the most memorable part of these talks was her illustration of a nanosecond. She salvaged an obsolete Bell System 25 pair telephone cable, cut it to 11.8 inch (30 cm) lengths (which is the distance that light travels in one nanosecond) and handed out the individual wires to her listeners"
I've also read about someone else giving out "picoseconds" in the form of tiny mustard seeds to illustrate how much the speed of light limits data processing.
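The same two lines of Python make the gift sizes concrete (vacuum figures, so the copper wire is a slight approximation, as are the seeds):

    C = 299_792_458  # speed of light in a vacuum, m/s
    print(C * 1e-9 * 100)    # ~30 cm per nanosecond -- Hopper's wire length
    print(C * 1e-12 * 1000)  # ~0.3 mm per picosecond -- mustard-seed territory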
Re: (Score:3, Informative)
Re: (Score:2)
....
Want to explain why light has to traverse a CPU in one clock cycle?
Besides the fact that it takes multiple clock cycles to finish one calculation?
Besides the fact that we have multiple CPUs doing multiple calculations per cycle?
Who thinks of these things?
Assuming you had some mystical CPU that completed execution of every instruction in one cycle, maybe you'd have a point.
Re: (Score:3, Informative)
Your calculation assumes light is traveling in a vacuum. The velocity of light is always slower in a medium than in a vacuum. Our computers use copper and silicon (and other materials), in which propagation is by electrons, not light. Anyway, light speed would be slower in fibre optics than in a vacuum.
The propagation of electrons in copper is about 2/3 that of light speed in a vacuum, which on the time and length scales we're using in computers, is quite significant.
Re: (Score:2)
This is wrong on two counts.
(1) It has little to do with copper. The reduced propagation speed in a copper cable is due to the dielectric constant of the insulating material. If the copper cable were made of conductors in a vacuum, the propagation speed would be essentially the speed of light (although not quite, due to secondary factor
Re: (Score:2, Informative)
In each case, it's always about the heat.
Pretty much all CPUs sold today (even "2.x GHz" chips) can go over 4 GHz with proper air cooling. The reason they don't sell 4 GHz+ chips is because chips have warranties and require a proper cooling setup in order to not fail at those speeds. Most important of course is the heat sink and CPU fan, which Intel and AMD do have some control over, but also of considerable importance is
Re:Great news (Score:5, Informative)
Reading comprehension fail. Nobody said that you can't go above 3 GHz for the CPU; the point was that if a chip needs data from another chip five centimeters or more away, the data will not arrive in the same clock cycle [slashdot.org]
Re: (Score:2)
Reading comprehension fail.
Mostly writing comprehension fail. The first sentence was poorly written and prone to misunderstanding.
Re: (Score:2)
And latency is a bad thing, depending on the algorithm.
It doesn't matter how fast the CPU or GPU is if the implementation spends 90% of its time stalled, waiting for data to move to or from the GPU.
Re: (Score:2)
IBM says hello.
We? No, x86 vendors hit it; RISC is at 4.25 GHz (Score:3, Informative)
http://en.wikipedia.org/wiki/POWER7 [wikipedia.org]
IBM has settled around 4.25 GHz for now. Their original promise (which seems to be very expensive to deliver) was 5+ GHz speeds.
Don't get me wrong, that is a high-end/enterprise UNIX server chip; I don't say Apple should be shipping POWER7 now.
If only they took the consumer desktop and portable CPU business seriously...
Re:Great news (Score:5, Informative)
With a 3 GHz clock, a signal at the speed of light travels 10 cm during one clock cycle. This means that if a chip needs data from another and there's a distance of five centimeters or more between both chips, the data will not arrive in the same clock cycle.
Re:Great news (Score:5, Interesting)
So with current die sizes of about 146 mm^2, assuming the die is really square, we have a maximum (diagonal) length of about 1.7 cm. Sounds like we can go up to 9 GHz, at least if we are just using the speed of light in a vacuum.
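A sketch of that estimate, assuming (as the parent post seems to) that the limit is a round trip across the die diagonal within one clock cycle:

    import math

    C = 299_792_458  # speed of light in a vacuum, m/s

    edge_mm = math.sqrt(146)                    # ~12.1 mm per side
    diagonal_m = edge_mm * math.sqrt(2) / 1000  # ~1.7 cm corner to corner
    print(C / (2 * diagonal_m) / 1e9)           # ~8.8 GHz, i.e. the "9 GHz" above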
Re:Great news (Score:5, Informative)
Assuming the signals travel in a straight line. If you look at current motherboards and video cards, you'll notice that many of the copper traces are "wiggly", not straight. That is done in order to get bits in parallel buses to arrive at the same time, and conductor traces on the chips must be designed similarly; it's the longest distance that any of the bits must travel that limits the others.
Besides, there are capacitance and inductance effects to be considered. Transitions from one to zero and vice-versa aren't instantaneous and that must be taken into account.
One could say that 9 GHz would be the absolute physical limit for a 1.7 cm chip and the technical limit is somewhat lower than that.
For a set of chips on a board, the absolute physical limit is much lower, and that's the reason why on-chip cache memory has become so important lately.
Re: (Score:3, Informative)
An XFI-SFI interconnect runs up to 10.3 Gbps on a single serial link. It is double-pumped (a bit on each edge of the clock), so the clock rate is half that. This is the connection that links a 10 Gbps PHY to the transceiver module. You do have to keep the interconnects pretty short, though.
http://www.altera.com/technology/high_speed/protocols/10gb-ethernet-xfi-sfi/pro-xfi-sfi.html [altera.com]
XDR ram can transmit 8 bits per clock on a serial line: http://en.wikipedia.org/wiki/XDR_DRAM [wikipedia.org]
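For the clock-versus-line-rate arithmetic, a tiny sketch; the 3.2 Gbps XDR line rate is my own example figure for a common speed grade, not something stated above:

    # Clock rate a serial line needs, given bits moved per clock cycle.
    def clock_ghz(line_rate_gbps, bits_per_clock):
        return line_rate_gbps / bits_per_clock

    print(clock_ghz(10.3, 2))  # XFI/SFI, double-pumped: ~5.15 GHz clock
    print(clock_ghz(3.2, 8))   # XDR, 8 bits per clock: 0.4 GHz (400 MHz) clock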
Re: (Score:2)
It will also need to use instant quantum-FTL communication between hard drives, RAM, and the GPU to not make it a massively bottlenecked waste of performance.
Re: (Score:2)
exactly
Re: (Score:2)
That doesn't really matter that much; clock/data propagation stages in a data channel are nothing new. Intel's Pentium 4 had pipeline stages which only served to propagate the data/instruction one step further down the pipeline.
Sure, the added latency isn't all that nice, but one clock cycle to anywhere isn't needed anyway
It's not light speed (Score:4, Informative)
Re: (Score:2)
It wasn't the signal speed that became a limiting factor above 3 GHz, but transistor power leakage current, which sort of goes "to hell in a handbasket" above 3 GHz.
http://arstechnica.com/old/content/2004/06/prescott.ars/2 [arstechnica.com]
But that sort of explains why Moore's law went all multi-core after Intel gave up trying to make 4 GHz CPUs that didn't leak power all out the wazoo.
Re:It's not light speed (Score:4, Informative)
If the propagation speed of an electrical signal is 0.96c in an uninsulated chunk of copper and only 0.66c in a coaxial cable, what is it reduced to in an on-chip environment? On a computer bus? I have seen the figure 0.33c, but I can't find any primary source for this.
Let's assume the 0.33c for the moment, and consider what this means. A CPU contains some fairly large functional units that need to run synchronously, meaning that all transistors within the unit switch in step with a master clock signal. If this is to work, the propagation delay across the unit must be significantly less than 1/2 of a clock cycle. Taking the 0.33c figure as correct, and limiting delay to 1/4 of a clock cycle, the maximum size of a functional unit is about 8 mm. This is not far removed from the size of structures on modern CPU chips. You can make functional units accept larger delays (that's one application of pipelines), but this carries the price of complexity.
The point: power consumption is an important problem, but signal propagation is also very relevant. If 3 GHz isn't the limit from a signal propagation point of view, it is not so far away from that limit...
Here's a chart showing how the race to ever-faster processors came to a screeching halt [tomshardware.com] a few years ago.
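Plugging in the numbers from that paragraph (the 0.33c propagation speed and the quarter-cycle budget are the assumptions stated above, not established figures):

    C = 299_792_458  # speed of light in a vacuum, m/s

    def max_unit_mm(clock_hz, vf=0.33, cycle_fraction=0.25):
        # Largest span a signal can cross within the given fraction
        # of one clock period, at the given propagation speed.
        return C * vf * cycle_fraction / clock_hz * 1000

    print(max_unit_mm(3e9))  # ~8.2 mm at 3 GHz, matching the ~8 mm above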
Re:Great news (Score:5, Insightful)
We're not, but even if we were, that's the fundamental limit. Electricity traveling slower than this makes the problem worse.
You've clearly misunderstood his post, so adding insults just makes you look foolish.
No we wouldn't. If it can't be done in one clock cycle, it'll be done in two (or more). Who said anything about this limiting clock speed?
Anyway, at a higher clock speed, the problem becomes even more pronounced. With a 3.8 GHz clock, a signal at the speed of light only travels 7.9 cm during one clock cycle (but let's estimate about 6.5 cm for electricity).
Re: (Score:3, Insightful)
The same thing happened with math coprocessors. I once had an AMD Am386 chip with an Intel 80387 floating-point chip. With the 486 CPU series, Intel fully integrated the floating-point functions into the same chip as the CPU.
I don't think the market for separate graphics chips will last much longer. The only way to get more performance out of CPUs now is by adding cores, and it makes sense to let the CPU use GPU cores. Integrating graphics functions into the CPU seems inevitable by now.
Re:Great news (Score:4, Interesting)
Re: (Score:2)
Re: (Score:2)
I haven't seen any hint that AMD will drop their line of processors that do not have integrated graphics. So there is no limiting of consumers choice that I can see.
If you had RTFA, you would have noted that AMD is only able to have a level playing field to compete because the FTC has put a stop to Intel's unfair trade practices. [techreport.com]
Will this affect open source drivers (Score:5, Interesting)
Are there any deeper changes to come behind the re-brand? ATI has been involved in producing open source drivers and specs for their GPUs. Will this name change bring some bad news about the current openness?
Re:Will this affect open source drivers (Score:5, Informative)
ATI really only started doing that after they were acquired by AMD so I wouldn't worry too much.
Re: (Score:2)
I think that the "deeper changes" are that AMD's prepping for their integrated CPU/GPU launch. It only makes sense. If they're gonna start merging chips, it would be awfully awkward to have two brand names AND a product name attached to a chip.
I would imagine that better Linux drivers might come down the pipeline, though. These integrated approaches lend themselves nicely towards Linux workstations, and they'd definitely lose out on a potential market if they completely ignored the issue.
Opportunity knocking for AMD here... (Score:5, Insightful)
...AMD's prepping for their integrated CPU/GPU launch. ...
I would imagine that better Linux drivers might come down the pipeline, though...they'd definitely lose out on a potential market if they completely ignored the issue.
I'd go one step further and say that I think that AMD has an opportunity to highlight their hardware here.
Intel's CPUs and integrated graphics have long had great support in the Linux kernel. Because Intel controls the tech, they can actually provide the correct and full source for the graphics drivers. The problem is that Intel integrated graphics aren't ever anything special.
If AMD is seriously working on integrating their graphics cards and processors -- perhaps even onto the same die -- then they have an opportunity to provide a much more powerful, integrated hardware platform with fully-open drivers. Intel can't compete with that kind of setup, especially as Nvidia appears to have an aversion to opening the source to their graphics card drivers.
Re: (Score:3, Informative)
Intel's CPUs and integrated graphics have long had great support in the Linux kernel. Because Intel controls the tech, they can actually provide the correct and full source for the graphics drivers. The problem is that Intel integrated graphics aren't ever anything special.
Methinks you're being a bit generous with Intel. I went with an Intel integrated chipset a number of years back because the alternatives weren't very well supported on FreeBSD, but the graphics weren't just not special, they were bad. Sufficiently bad that I've stayed away from them ever since. Which for Intel is just dumb; I have a very hard time believing that Intel couldn't do any better than what they've been doing. Hopefully with AMD owning ATI that'll kick a bit of sand in Intel's collective f
Re: (Score:3, Insightful)
I'm not sure exactly what you're playing, but anything with decent graphics post-2002 is pretty much a snail fest on even the newest on-board Intel gfx.
Hell, I still get lag on my laptop's integrated Intel gfx in Baldur's Gate sometimes. That's what, 1998?
Re: (Score:3, Informative)
Other way around; AMD has always released specs and started releasing ATI specs after ATI was acquired. You may notice that http://www.x.org/docs/AMD/ [x.org] is lacking docs for the r200 and earlier; that's because AMD made the acquisition during the r400 era, and the docs for older chipsets were more or less lost forever at that point.
Right now, the open-source drivers are called radeon, r300, r600, etc.; one developer committed his code as "amd" instead at one point. (It got changed to avoid end-user confusion.)
fglrx (Score:4, Insightful)
...can they retire that too? Please?
Re: (Score:2)
The name? Sure they can. To please you it will (continue to) be known as Catalyst. (http://support.amd.com/us/gpudownload/linux/Pages/radeon_linux.aspx)
Re: (Score:2)
Re:fglrx (Score:5, Informative)
fglrx support for r500 and earlier (anything before the HD lines) is already delegated to the open-source drivers. We're working on getting r800 (redwood) support for acceleration together, and r600 support is getting better by the day.
Not terribly surprising (Score:2)
They can give the AMD brand a big boost by associating it directly with the graphics cards - and it will probably mean that people buying an AMD graphics card will be more likely to buy an AMD processor to go with it.
Re: (Score:3, Informative)
ATI graphics cards work just fine with Intel processors. I don't believe there's any move to stop them doing so when they rebrand.
Re: (Score:2)
Little bit of hate? (Score:5, Funny)
In May of last year, AMD took the remains of the Canadian graphics company and melded them into a monolithic products group, which combined processors, graphics, and platforms. Now, AMD is about to take the next step: kill the ATI brand altogether.
Oh, please, J. Dzhugashvili, don't hold back. Tell us how you REALLY feel. What'd the rejected original form of this summary look like?
In May of last year, the poor, innocent Canadian angels of technology, ATI, had their very remains tortured and raped by the evil, evil AMD, cruelly melded into a hideous abomination of a monolithic products group, creating an unholy, soulless combination of processors, graphics, and platforms. Now, the faceless anti-christ forces of AMD plan to take the next step in their plans to destroy all that is good in the world: Slaughter the angelic ATI brand altogether, laughing with sadistic glee as it begs for mercy in a futile appeal to the quickly-evaporating last shreds of AMD's humanity and compassion, ATI never having harmed a fly in its too-short, sad, sad life.
Red or green? (Score:4, Interesting)
About time. (Score:2, Funny)
Video card article (Score:3, Insightful)
* My [Nvidia/ATI] anecdote trumps your [Nvidia/ATI] anecdote. You are stupid for buying their products.
* [Nvidia/ATI] has terrible drivers. You are stupid for buying their products.
* [Nvidia/ATI] produced hardware with a design flaw 25 generations ago. I will never buy their hardware again.
* Based on my comprehensive study of one graphics card, here is my 100% accurate assessment of the failure rate of every graphics card [Nvidia/ATI] produces. I will never buy their hardware again.
* Here's an opinion I formed more than ten years ago. Presumably it's still relevant because technology moves so incredibly slowly. You are stupid for buying [Nvidia/ATI]'s products.
Re: (Score:3, Funny)
What confusion?
As you said, there are two physical CPUs, one from each manufacturer, in that computer. Where's the confusion?
Re:Let The Confustion Begin (Score:5, Insightful)
The confusion is that most regular people are only marginally aware of an AMD/Intel distinction, don't know what it means, and don't know ATI or nVidia at all.
Fixed that for you.
Re: (Score:3, Funny)
The confusion is that most regular people bla bla bla I want my banana bla bla.
Fixed that for you.
Re: (Score:2)
GP assumed that only integrated GPU/CPU units will be sold by AMD.
Re:Let The Confustion Begin (Score:4, Insightful)
I can't wait to sort out the confused people around me thinking there are two physical CPUs
I'd imagine that the only people who care to hear about the internals of your computer (if any) will be able to figure it out for themselves.
Re: (Score:2)
GPUs are being used more and more for non-graphics co-processing these days, so any people thinking of the machine as having two CPUs aren't that far off...
Re: (Score:2)
Re: (Score:3, Informative)
Classic example of not reading the article... (Score:5, Informative)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
You could RTFA. Oh, right. Carry on then.
Re: (Score:2)
Or you could RTFA (I know, new here etc...) and see the bit that says...
"The lower row omits the AMD logo, so PC makers shipping Intel-based systems will be able to avoid the oil-and-water combo of Intel and AMD branding, if they wish."
... and save yourself all that trouble.
Re: (Score:2)
You think people know what those stickers mean anyway?
The only thing that counts is the logo; people see the adverts on TV, then see the blue Intel logo, and something goes 'click' (in theory). An extra red AMD Radeon sticker is just background noise, something else they don't understand.
Re: (Score:2)
How is it insightful to make a comment that is obviously wrong to anyone that had RTFA, (or even looked at the pictures in the article, for that matter)? Oh, wait... it's /. Not even the mods do that.
To clarify (repeat what was mentioned in TFA): The graphics brands "Radeon" and "FirePro" will have new logos with the "AMD" branding, but they will also have logos with no "AMD" branding on them (just generic "graphics" instead), for use in Intel systems.
Re:That's retarded. (Score:4, Informative)
Re: (Score:3, Informative)
AMD was founded 16 years before ATI and was producing branded processors before ATI existed.
Re: (Score:2)
For some definitions of the word "surviving"... :-/
Re: (Score:3, Informative)
Matrox [wikipedia.org] is older.
Re: (Score:2)
Funny how fast Intel seemed to have dumped that "Completely annihilated by AMD using less than half of our R&D budget" badge that they were wearing for a couple of years after the Athlon64 was released.
WTF how is this offtopic? (Score:2)
I think ATI was a more reputable brand than AMD, which has had to carry the Defeated-by-Intel badge for years.
I think that ATI is one of the least reputable brands in PC hardware, every single ATI 3d accelerator I've ever owned has caused me some kind of problem. Retiring the ATI brand won't fool any geeks but it will fool the people they told to never buy ATI.
Re: (Score:3, Insightful)
>it will fool the people they told to never buy ATI.
who would be so irresponsible as to tell someone that?
Re: (Score:3, Insightful)
who would be so irresponsible as to tell someone that?
Friends don't let friends buy ATI. I will no longer attempt to help friends with ATI driver problems because usually the answer is "you're fucked" or "become a driver developer", which is the same thing. I can't remember the last time I had an ATI graphics solution with which I've had zero problems, because that has never happened, and I have used hardware from almost every generation of ATI graphics chips. Wait, that's not true, there was one combination I had no graphics problems with: Mach32 on NT 3.51. But
Re: (Score:2, Informative)
To counter your posts: I've never had a single driver problem or failure with ATI cards, even in Linux (several distributions across several major release revisions). To top it, they have always had superior image quality compared to Nvidia, regardless of the back-and-forth of performance lead.
Re:WTF how is this offtopic? (Score:4, Informative)
Re: (Score:3, Insightful)
I don't know what you are talking about. I have had just as many Nvidia problems as ATI in the past. Currently, I have no ATI driver problems.
You and the sibling poster are in the minority. This would be a nice application for a Slashdot poll, which would prove it. Vast numbers of slashdotters have reported ATI problems and nVidia solutions. ATI is constantly going backwards; on my R690M chipset (R2xx graphics) the ati driver causes display corruption, and r2xx is no longer supported by fglrx. Not long ago this was a currently shipping chipset, and yet ATI actually offered no working driver on any OS but Vista. (I am using Windows 7 now and Suspend/Resu
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
AMD is about to put its own 32nm process into production chips, so at the very least the very top end will not be Intel-only land anymore. The only question is whether or not AMD's new chips will continue the long standing trend of spanking Intel on the price/performance metric ("defeated-by-i
Re: (Score:2)
My laptop with a GeForce 310M does it just fine. Pushes out 1080p, too.
Re:Justice Department on vacation since 1980 (Score:4, Interesting)
Re: (Score:3, Interesting)
If this hadn't happened, how much longer would ATI have survived? They basically said FU to Linux and ignored it.
Are you seriously implying that ATI's rebound after being bought by AMD was because they started to provide Linux drivers?
Re: (Score:3, Interesting)
I'm hoping he's not saying that, but I think it IS clear that AMD has done a better job of managing ATI than ATI was doing itself. Improved Linux drivers are merely one tiny part of that.
The reality is, if AMD HAD been blocked from purchasing ATI and no one else did, they likely would have folded and we'd simply have nVidia as the (almost) sole provider of discrete graphics chips.
What really scares me though is that if AMD ever ends up folding, we revert to single-supplier situations for both CPUs and GP
Re: (Score:3, Informative)
Which manufacturing workers, exactly?
ATI does not have a plant. It's all TSMC and the other one whose name I forget.
Re: (Score:2)
They were too busy dragging their heels on the Sirius/XM merger at the behest of the NAB to notice.
Re: (Score:3, Insightful)
I don't understand. There were two major manufacturers of CPUs and two major manufacturers of GPUs before the merger, exactly the same number as after the merger. Where is your problem exactly?
I personally see problems elsewhere. One example is eBay, the online auction monopoly, being allowed to not only buy PayPal, but also disallow any other payment system...
Re: (Score:3, Informative)
Four companies became three. ATI, Nvidia, Intel, AMD. AMD bought ATI.
Intel has been trying to buy Nvidia for years, saying that they need to merge in order to "compete". Nvidia resists, but it'll happen eventually.
Re: (Score:3, Insightful)
Before AMD bought ATI:
- Intel
- AMD
- nVidia
- ATI
If AMD dies, ATI is still there. You have no choice but to buy Intel, but you can at least choose between nVidia and ATI.
After AMD bought ATI:
- Intel
- AMD+ATI
- nVidia
If AMD dies, ATI dies too. You have no choice but to buy Intel and nVidia.
I'm guessing a lot of slashdot users are too young to be able to remember that in the past years and decades, sometimes AMD was better than Intel, sometimes the other way around. Same thing goes for ATI and nVidia.
Just becaus
Re:Justice Department on vacation since 1980 (Score:4, Insightful)
Re:Justice Department on vacation since 1980 (Score:4, Funny)
AMD bought ATI. Four companies became three.
The lines between CPU and GPU will blur.
Fewer companies does not mean more competition. Less competition means we get fucked.
If you still need clarification, contact me offline and I'll explain it with charts.
Re: (Score:2)
Are we really OK when there are only two major manufacturers of processors and graphics hardware?
They are apparently OK with Intel making graphics hardware. Even though Intel's integrated graphics processors suck, they probably outsell both ATI and NVIDIA in the low-end consumer desktop and laptop markets.
Re: (Score:3, Funny)
Re: (Score:3, Insightful)
No, it wasn't a merger. It was a hostile takeover of the US government.
And it's been going on for decades. September, 2008 was just the closing party.
Re: (Score:2)
No, it wasn't a merger. It was a hostile takeover of the US government.
And it's been going on for decades. September, 2008 was just the closing party.
LOL! That's rich! Most of those Goldman Sachs executives got their Federal administrator positions as appointments by Obama. That's not a hostile takeover at all, it was by mutual agreement, and it wasn't closed in September, that was just the IPO.
Re: (Score:2)
Re: (Score:2)
Wow, what silliness is this.
First, drop the Bush Justice Department crap. Obama has been president for a while, and the UA/Continental merger is on his watch.
Second, what is your problem?
There are several GPU makers in the market right now. Via, Nvidia, Intel, and AMD all make GPUs for the PC x86 market.
You have three x86 CPU makers: Intel, AMD, and VIA.
So there are several GPU makers in this one product segment.
Now for your other issues.
Manufacturing jobs? ATI doesn't have any. They used FABs to make their ch
Re: (Score:2)
There are a lot more than three CPU makers.
There are just three x86 CPU makers.
Would I like more? That would probably be a good thing.
But here is the flip side...
Do you want less?
AKA, if AMD had not bought ATI, odds are pretty good that AMD would be less competitive than it is now.
If anything, the AMD/ATI merger is pro-competition, because it helped maintain AMD as a competitor to Intel.
ATI never made CPUs at all. So the end result is things not getting any worse.
Which is better than ATI and AMD going belly up.
Fr
Re: (Score:3, Informative)
Sure, that's why air fares doubled in many major routes last year. And foreign carriers raised prices even more because of the mergers going on over there.
And maybe you didn't hear now that American wants to buy Southwest so they can "compete" with the new United merger. If Southwest is out of the market, how much do you think fares will go up? T
Re: (Score:2)
Radeon outlasts ATI (Score:4, Funny)
It's interesting that the Radeon brand, or series at least, has outlived its creator. Who will be there to give away Radeon to its new life partner?
Something old (AMD), Something new (Radeon), Something borrowed (x86 architecture), Something blue (Intel?)
Re:Retired ati a long time ago.. (Score:4, Interesting)
Re: (Score:3, Funny)
Yes.