Technology

AMD Planning 1GHz CPUs

idan writes "This ZDnet article indicates that AMD is opening a fab that will produce 1GHz Athlon CPUs." I'm sure it's pure coincidence that AMD is making this announcement so soon after Intel announced their "real soon now" 1100 MHz "Athlon Killer". Do we get to call this one the "Athlon Killer Killer"?
  • An interesting point brought up in one of my courses here at school: when analyzing algorithms, you're not supposed to "consider" the machine they will be running on. If you think about it, that makes sense.

    A really fast machine running an n^2 algorithm is still running an n^2 algorithm. Once you reach a certain (often relatively small) data set size, you will SERIOUSLY notice how laggy the system is. Doubling the speed of the processor doesn't mean you can double the amount of data you give an n^2 sort before it takes longer than it used to; you only gain a fraction of that.

    People will continue to program the way they're used to. People will program in a manner that scales, if they need to scale! If they don't need to, then they won't (why bother wasting the extra time?). I don't know about most programmers, but for me programming is an ego trip. My goal is to get the slickest, smallest, fastest, most bug free piece of code out there. Now, I realize that there are many coders out there who don't think like that, but the thing is, they'd have coded using the n^2 algorithm anyway.

    ...and another diversion: in some cases an n^3 algorithm will outperform an n^2 algorithm (other examples can be made); you also have to consider the data set you use. If you KNOW you're going to use a data set smaller than "1" (one is a relative term here: the point where the two cost functions intersect, which may be a rather large number if one algorithm is measured in minutes and the other in seconds), then the n^3 algorithm would be the better choice - see the sketch below.

    I seriously doubt this will change the way code is written...
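
    For what it's worth, here's a rough sketch of that crossover (the cost constants are made up purely to illustrate; the real ones depend on the algorithms and the machine):

    public class Crossover {
        public static void main(String[] args) {
            double c2 = 50.0; // hypothetical per-step constant of the n^2 algorithm
            double c3 = 1.0;  // hypothetical per-step constant of the n^3 algorithm
            for (int n = 1; n <= 128; n *= 2) {
                double cost2 = c2 * n * n;     // total cost of the n^2 algorithm
                double cost3 = c3 * n * n * n; // total cost of the n^3 algorithm
                String winner = (cost3 < cost2) ? "n^3" : "n^2";
                System.out.println("n=" + n + "  n^2: " + cost2
                        + "  n^3: " + cost3 + "  winner: " + winner);
            }
            // The curves cross at n = c2/c3 = 50: below that, the "worse"
            // n^3 algorithm wins on actual cost.
        }
    }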
  • "The failure of the Athlon"? I know it's hard to find Athlon Mobos, but isn't it a bit too soon to call it a failure?
  • by LocalYokel ( 85558 ) on Tuesday October 19, 1999 @09:32PM (#1600192) Homepage Journal
    ... and this was also reported in The Register [theregister.co.uk]. Check the dates!

    (my apologies to The Register staff for 'deep linking'...)

  • I'm sure if you asked Intel, they would tell you they'd be perfectly happy if AMD would stop making faster chips.

    Or did you mean to write AMD?
  • ...as we need faster throughput. Face it, chipsets aren't as interesting to consumers. As long as consumers buy machines based on processor speed, manufacturers will continue to sell junk with fast processors. Just how useful is a 1GHz chip when the memory runs at 100MHz and the system is using an IDE disk controller?
  • some really bad bugs (by bad I mean something like the F00F bug)

    The F00F bug was really only serious on a server running untrusted code. While it's certainly a bug, it's not one that affected a large number of the people using that chip. Furthermore, once someone smart enough thought about it, it became easy to work around in software. The Pentium's FDIV bug is probably a better example of a bad bug.
  • IBM is selling Aptivas with 550MHz Athlon processors. The pricing seems competitive with a similarly clocked Pentium III. I went to Dull and the Cow Place... neither had any Athlon systems....
  • I'm willing to bet that if, tomorrow, AMD said they were working on a 2GHz CPU, Intel would announce a 3GHz CPU the next day. They're just playing one-upmanship, which is easy to do with vapor. But in SILICON, AMD still kicks Intel's butt today.

    "The number of suckers born each minute doubles every 18 months."
  • Has Intel been giving us their best? Hardly likely. Until now, no one has pushed them, so they've been able to deliver lower cost, lower performing processors as "state-of-the-art", simply because no one could show us, the consumers, otherwise. Now that AMD have finally "made it", Intel is being forced to bring out more advanced processors - the IA-32 (Willamette) launch being pushed forward nine months, for example.

    AMD - without a doubt - have been struggling to stay in the market. As a smaller, less well financed company, they've had no choice but to put forward their best just to keep up with the market - AMD chips were until recently renowned for not being very overclockable, because they already ran close to their maximum performance levels. With the release of the Athlon, they've finally caught up with, and surpassed, Intel - and now Intel is being forced to change tactics in order to compete with the people who could quite possibly steal the "PC clone performance kings" title from them. Much kudos to AMD for what they've managed to achieve.

    As an aside, I opened up an OLD PC the other day (I can't remember what it was, but I do know it was OLD), and a large number of the chips were labelled with both Intel AND AMD logos - together, on the same chips - kind of hard to comprehend considering how fiercely competitive Intel and AMD are now....
  • What about all the people who need something low power? It will drive prices down for other chips, but who needs a 1GHz CPU at home?

    Didn't you just answer your own question? Re-read those two lines.

    Also, how would the advent of 1GHz CPUs allow code bloat to increase at a quicker rate than it already is increasing?

    -A.P.
    --


    "One World, one Web, one Program" - Microsoft promotional ad

  • Actually, I'd say the Celeron is the perfect example of a company rushing a product out the door to "fill" a need.

    The first Celerons released had no L2 cache. They were DIRT SLOW. Each was nothing but a PII without the case or the L2 cache on the processor slot, designed to compete with AMD and Cyrix at the low end. Which it didn't (having no L2 cache really killed performance; not as badly for games, since those often miss in L2 anyway).

    Months later, the Intel guys re-released the Celeron with 128k of L2 cache running at chip speed. I don't think they even thought about the overclocking potential of the sucker, or imagined that its performance would approach/beat that of their "cash crop" processor (the PII).

    Intel didn't take its time; it just threw something out the door hoping to put a rival out of business.
  • If I were to plug one of them into my box right now, most of the cycles would be admittedly unused.

    Do you have unused cycles? What kind of person are you? Everybody should donate their unused cycles to distributed.net! :) ( http://www.distributed.net)


    --
  • Good to see the old pattern is still working like a champ:

    Announcement: New faster/better/bigger XYZ!

    Response 1: Do we really need faster/better/bigger? Most consumers don't need it! And I'm the first person ever to ask such a profound, socially-conscious question!

    Response 2: I bet that makes a lot of heat. Hyuk, hyuk!

    Response 3: Huh, I bet that would make a kickass Beowulf cluster.


    Repeat responses 1, 2, and 3 as necessary. Presto, automatic Slashdot conversation generator.

    MJP
  • I picked up some AMD stock 2 or 3 years ago, when it became clear that Cyrix was on its way out. Since then, AMD has gone from a "catch up and take a piece of the low end market" strategy to what we saw today: charging Intel head on. I'm still down about 25% on my investment. Cool CPUs don't make the stock price go up; earnings reports do.

    -Barry

    BTW: Did anyone else do a double take on "1.9 billion"?
  • Uhm.. that's also AMD's fault. IIRC, they're making the chipsets that the mobo companies need. And not only are Athlons in short supply, so are the chipsets!

    D'oh!
  • After checking out this story a little bit, I bought stock in AMD. A few reasons:

    1.) Right now AMD is ahead in the race. Pretty impressive for a company with something like a billionth (give or take) of Intel's market cap.

    2.) Even if Intel takes the lead with their "Athlon killer", AMD's production of the Athlon has shown that they can now compete with Intel as equals, sort of.

    3.) If Intel DOES rush the production of their 1100MHz chip, it will result in a production and distribution nightmare. I would bet on one or two recalls too. Conversely, AMD has almost flawlessly executed production for all of its recent releases. If THEY say they can rush their 1GHz, I tend to believe them.

    4.) Buying Mote or IBM won't make me money off the G4, which I'd still rather have than any of these. Despite the fact that I'm using a second rate Pentium laptop, I'm a Mac Man at heart.
  • We need it so that we can crack root in 10% of the time. duh. =]
  • It blows my mind that people publish such error filled articles. As many have pointed out, they blew the definition of a micron (I suspect they really intended to say a micron is a thousandth of a millimeter, since most people have a feel for a millimeter, and they were off by a factor of 1,000), but I have yet to see anyone point out that 0.18 micron technology does NOT mean that the transistors are now 0.18 microns apart! Far from it! All it means is that the finest lines they draw are 0.18 microns wide, as drawn. Usually, only the gate length is this size. I even saw a post that said the wires connecting the transistors were going to be 0.18 microns wide. That would make one of the slowest CPUs ever built, since the RC time constant would kill you! By the time you get to the metal layers, a wire is going to be WAY bigger than the gate width, to reduce the resistance. OK, enough of my rambling; I just wanted to point out another REALLY STUPID claim from an author who has obviously never dreamed of laying out an IC.

    (Disclaimer: I have worked for both AMD and Intel, although I am not currently employed by either)
  • I don't know about that CAR thing; I think the overall quality of certain factors in cars has declined greatly since the "Japanese invasion".

    Now we have a trend towards smaller, lighter cars, which offer much less protection in collisions. We have a trend towards front-wheel drive, which reduces your control at high speeds (really only matters to racers) and your acceleration off the line. We have a trend towards smaller engines packed with all kinds of electronic equipment and emissions systems, which helps the environment but reduces overall performance and reliability; when the performance is made up for with more gadgets, reliability suffers, and when more engineering goes into the reliability issue, you get a much higher cost - not only the original purchase price, but also maintenance.

    Car owners in the '60s and '70s had things much better. Or will you argue with the tens of thousands of VW enthusiasts who still drive their '60s and '70s era air-cooled, rear-engine, rear-wheel-drive cars, and get 20-30 miles per gallon?

    You can see the same trend sort of taking hold in the Intel product line: $4k Xeons with four-pound heatsinks versus $300 Celerons, which probably cost the same to manufacture (cache-memory considerations aside). A Celeron is like the front-wheel-drive econobox of CPUs. And if Intel has their way, the whole chip industry will become the same way.

    "The number of suckers born each minute doubles every 18 months."
  • by The Musician ( 65375 ) on Tuesday October 19, 1999 @08:44PM (#1600217) Homepage

    A micron is a 1,000th of a meter

    Last I checked we called that a millimeter. Can you even imagine a chip done in 0.18mm?

    --

  • FatSean mentioned that neither Gateway nor Dell sold Athlon based systems.

    I work indirectly (through an agency) for the one of those that rhymes with Hell, and that's true -- and none are to be expected in the next few months, either. :(

    Below is the text of the letter I sent to Michael Dell a little while ago; I believe it was eventually delivered, but I was amused to see that it was first returned by the mailer as having "permanent fatal errors" ...



    Subject:
    processor diversity vs. Intel dependence

    Date:
    October 12, 1999 5:44:40 PM EDT

    To:
    michael@dell.com

    Dear Michael:

    First of all, I own a (piddling) amount of Dell stock, but none in AMD, though that might soon change. I also work for an ad agency which does a lot of Dell work.
    [note: deleted the name of agency. tl]

    Now: As far as I know, Dell uses Intel chips in every computer it builds. If that is not true, then the rest of this message is based on false premises and you can stop reading.

    However, if Dell really uses no processors other than Intel's, I think the company is worth less to me (and you) than it would be if it also built systems with AMD chips, or even Cyrix chips.

    Dell was screwed as much as anyone by the sudden *un*release of the anticipated 820 / Camino chipset; that fact alone should be evidence enough that being in bed with a sole provider is chancy. Some other PC makers, though, have higher-end systems that would be unaffected, because they are based on the AMD Athlon.

    Dell finally preloads Linux (thank you!), at least on some systems, albeit at a premium. You wouldn't stick with a single hard drive manufacturer or memory supplier, so why do it with the driving point of your systems, the CPU?

    Cordially,

    Timothy Lord





    No response, so far ... ;)

    timothy
  • I don't want to be repetitive and say what so many have already said. Instead, this is simply a "confirmation" that yes, AMD did announce that an Athlon clocked at 1GHz, without supercooling, is to be fabricated this year. It's very old news. I do like seeing Intel trying to claim superiority by adding 100MHz to that number, when in fact a 1GHz Athlon will perform as well as or better than a 1.1GHz Intel (P3, whatever, etc.) CPU.
    The scariest thing about this is that I am still using 366MHz, and my cousin still has a Pentium 100!!! Sheesh, I'm just hoping for maybe 550MHz in my near future! People out there are raving about 1GHz; this is getting too silly. I'll say it again: our CPUs are all fast enough now, and the market should start concentrating on all the other things slowing us down (hard disks, busses, memory, etc.)
    I'm done.
  • Oops, yeah, I mixed them up, but Intel... AMD... the same point applies to both of them anyway. Sorry for the confusion.
  • by Mike Hicks ( 244 ) <hick0088@tc.umn.edu> on Tuesday October 19, 1999 @08:48PM (#1600224) Homepage Journal
    Reminds me of the BBS quote wars (remember?)

    > this
    > > is
    > > > getting
    > > > > out
    > > > > > of
    > > > > > > hand
    > > > > > > > !

    :-)
    --
  • Well, personally, I'd like a 2GHz chip, just to get Word fired up in a reasonable amount of time.

    What's a reasonable amount of time?

    Fast enough so that by the time it comes up, I can still remember why I started it.

    "The number of suckers born each minute doubles every 18 months."
  • Who is it that falls for these "announcements"? Of course AMD is working on a 1GHz chip; that's what chip manufacturers do.

    I can understand, a little anyway, why it makes sense for software manufacturers to promote vaporware. After all, they are trying to keep you from buying into a competitor's completely incompatible system. AMD and Intel, on the other hand, are making products that are essentially drop-in replacements for each other.

    Does the average consumer care where his wheat was grown? Heck no. Soon they won't care who made their processor either. It will all be about speed and price.
  • No, the original Celery was "crippled" intentionally, to make people feel good about paying $4k for a Xeon.

    Then when AMD made even more inroads, they thought: "shit, this low end (low margin) market SUCKS to sell to, but we gotta, for marketshare".

    So they did a quick fix: they crippled the Xeon by giving it less on-die cache and a form factor that doesn't fit any multiprocessor motherboards, and then we had the "new" Celeron. People quickly figured out that it was really a slightly crippled Xeon, and were able to overclock the shit out of it and crowbar it into multiprocessor motherboards (with the Socket 370 to Slot 2 conversions). It was a quick fix for their failure to segment the market with a cheaper-to-produce "original" Celeron. They couldn't execute a different design quickly enough, so they just took their lumps on the overclocking.

    Now that AMD threatens their midrange CPUs with the Athlon, I'm wondering what shoehorning Intel is going to have to do to compete, and whether consumers will end up with another win...

    Basically, it's Intel's fault in the first place for greedily trying to force this artificial market segmentation where none should exist. It's just a way of sucking money out of businesses who can afford it, without sacrificing the marketshare of the consumers/home users who can't.

    "The number of suckers born each minute doubles every 18 months."
  • by Anonymous Coward
    I'd like to write some cool TV ads for AMD. Maybe some lame ass dudes in bunny suits driving some weird, silly auto getting blown away by some hot sports car, with a guy looking back at the bunny suits, who shrugs and thinks "Freaks..." Seriously, it'd be REALLY nice to see arrogant Intel put in their place for once. AMD has worked hard and deserves recognition from the media, HW manufacturers (HELLO, ASUS??) and, ultimately, the consumers. GO AMD!
  • I forgot a few classics:

    Response 4: This is vaporware!

    Response 5: Oh my God I want one of those so badly!

    Response 6: This isn't news, I saw this before.


    If Slashdot had the ability to automatically moderate all such posts to 'Redundant', I'm willing to bet the signal-to-noise ratio would rise dramatically.

    MJP
  • Forem :)
  • As someone's sig states - Gates's law: every 18 months the speed of software halves. Combine this with what I think was Moore's law - every 18 months the speed of hardware doubles - and you get something which goes like this: any and all resources can and will be used... ....+1 byte
  • Everyone already knew this. It's been known that the Athlon scales *VERY* beautifully, and that once the Dresden Fab30 comes online, they'll easily ramp up to 1GHz... it's why Intel is scrambling. Kryotech has been saying they'll sell their SuperG 1GHz (cooled) Athlon by December, and Fab30 is due to be online by the first week of 2000. So where's the news?

    Now, all AMD needs to do is make a better CHIPSET (or VIA, whichever comes first), one that supports SMP and more than 512k cache. I'm a supercomputer/scientific researcher, and I write tight code no matter how fast the processor goes.
    What do I see in my future? Clusters of Athlons and Alphas for now, and multi-threaded hardware beyond that. Funny, unless EPIC really surprises me, Intel is nowhere in my future... hmmm..

    --ps, I still think AMD should buy the Alpha and its designers.

    PianoMan8
  • I'm always a little wary of this sort of stuff. Faster processors *do not* always mean greater productivity, or even speed. Coders will become even more slack as a result of this, I'm afraid.

    There is really something to be said for developing on a slow machine and spending a fair amount of time *optimising*... I hope the Linux kernel hackers don't get caught up in this and start bloating the kernel (ack!).

    Anyone who has seen a recent (1998-1999) Commodore 64 demo will know what I'm saying. You wouldn't believe what they do with a 1MHz processor these days, and it's all due to *optimising*...

    Now, running well optimised code on a 1GHz processor - well, that's something different :)
  • Okay, I found the above post mildly amusing, and as such I feel it should have gotten a rating of +1, Funny. I can understand that some people have different opinions.

    But who the fuck decided on "Interesting"? I'd choose "Troll" over "Interesting" for that article.

    Moderate me through the floor if you see fit :)

    "Binaries may die but source code lives forever"
    -- Unknown

    SkyHawk
    Andrew Fremantle
  • Has anyone noticed that when you use programs that let you create programs or web pages from templates, such as FrontPage (shudder), the code created is mostly useless babble? Because computers are becoming bigger and faster, software can be written really sloppily and people don't notice, because the machines are now fast enough that it doesn't make much difference. It would be interesting to see how much programming code has deteriorated since the days when people had to hack away at programs at MIT in the '60s.
  • That's because it's probably actually going to be 666.6666..., which rounds up to 667. Just like a 99.99999...MHz chip (a 33.333...MHz bus, tripled) was really sold as 100MHz.

    "The number of suckers born each minute doubles every 18 months."
  • class CPUs {
        var $popularity;
        var $speed; // in MHz

        function CPUs($speed) {
            $this->speed = $speed;
            $this->popularity = 1000;
        }
    }

    $intel = true;
    $pentium = new CPUs(1100);

    $amd = true;
    $athlon = new CPUs(1000);

    // Each round the Athlon gets faster and steals popularity from the Pentium.
    for ( ; $intel && $pentium->popularity != 0; $athlon->popularity++, $pentium->popularity--) {
        $athlon->speed += 100;
    }

    $intel = false;

    return ($winner = "amd");
  • Maybe it's for real!

    Maybe the chips are 1000 times larger! Maybe the new Athlon will be the size of your desktop! The one you put your monitor on! It would be hard to get a high yield that way, though. The wafers would probably have to be 1000 times larger, or about 18000 inches in diameter. What is that, 1500 feet? You could fit 9 wafers in the area of Hoboken, NJ. (But why would you want to?)
  • Oh please, they were all ripped off the old WWiV source in the beginning anyway, so why not just stick with it. ;)

    True, but then again there were a lot of WWIV spinoffs. Telegard 2.7i... wow the memories. :-)

    ... then Renegade ripped off Telegard, and all that good stuff.. God I miss those days!

    Andrew
  • ...you could fit 9 wafers in the area of Hoboken, NJ...

    Dude, I'm not going to check your math, but either you just made that up or you've got way too much time on your hands...

    --

  • Yeah, cuz that's what avengers do...they kill the killers.

    (just fitting the requirement that something be in the comment box)



  • Your statements regarding metal widths are not entirely accurate.

    I have worked with three different 0.18 micron processes, and the minimum metal width for the lower metals is between 0.26 and 0.28 microns. The minimum wire widths are process limits, not RC limits. A 0.18 wide wire would be about 50% more resistive than a 0.28 wide wire, but would have lower capacitance. If the fabs could make wires this narrow, they probably would.

  • No way - Forem predated WWIV.

    Gotta love hitting break - dropping to the CLI, modifying code and variables and typing...

    resume

    hehehe
  • Do I really need this? Well, AoE II really lags at times on my 64MB P2-266. And I won't even go into the performance of 3D games like Descent 3, Unreal Tourney, etc... So I'd say that I'd really like one! (Or at least more RAM; too bad it's so f****** expensive!)
  • I have also heard from informed sources that they have had *very* good results with the preliminary Athlon batches from the Dresden fab. One of the later reports claimed that AMD engineers had successfully run multiple CPUs in the 900MHz range, with nothing more than large heatsink-and-fan cooling units, from "rough" pre-production wafers.

    These comments are about a month and a half old and I've heard no more specifics. If this is an indication of production quantity yield, then I think AMD will finally get some of the rewards they have worked for so long and hard.

    p.s. For the folks who whine that people don't really need a 1GHz CPU: I still have a fully functional 486-33 with 4 megs of RAM and a 200 meg HD that was quite fast when it was new. I'll trade it to you for that P-III 500 that is way too fast for your applications. Then you can run your apps at a more relaxed pace. Watch the apps grow in size and complexity for a couple of CPU generations and you won't be confused about the need or desire for faster systems... ;)

  • You're right, to some extent. About as right as the guy a little farther down the stream whose post is titled "Race to the GHz, chasing the wrong carrot". Obviously the youngest programmers, educated on fast processors, will not necessarily be sensitive to algorithmic complexity, at least not experimentally. That's the human impact of such high core frequencies; BUT the performance of a computer is something more complex, and maybe some more thought (and more money) should be dedicated to the design of faster memory busses and memory chips.

    What's the point of running at 1GHz if an L2 cache miss stalls you for (wild guess) 20 cycles? Assume a program running on a 1GHz CPU is reading from memory sequentially, word by word (32-bit words), and causes an L2 miss every fourth access, with the latency stated above. What would the actual number of loads executed per second be? Three hits at 1 cycle plus one miss at 20 cycles is 23 cycles per 4 loads, so 1000E6*(4/23) = 174E6 loads per second. We may be better off doing some data prefetching to minimize the cost of a cache miss...
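    If you want to play with the numbers, here's that arithmetic as a tiny program (the 1-cycle hit and 20-cycle miss costs are my wild guesses from above, not measurements):

    public class MissRate {
        public static void main(String[] args) {
            double clockHz = 1e9;     // 1GHz core
            double hitCycles = 1.0;   // assumed cost of a load that hits in cache
            double missCycles = 20.0; // assumed stall for an L2 miss
            // Sequential 32-bit reads with one L2 miss every fourth load:
            double cyclesPerFourLoads = 3 * hitCycles + missCycles; // 23 cycles
            double loadsPerSec = clockHz * 4.0 / cyclesPerFourLoads;
            System.out.println("Effective loads/sec: " + loadsPerSec); // ~1.74E8
            // Perfect prefetching would push this back toward 1E9 loads/sec.
        }
    }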
  • I didn't mean failure in the sense that nobody was buying it, but the Athlon was talked about as if it was going to be a major competitor to the Pentium III. The Athlon failed to even come close to meeting this goal; that is what I meant by failure. The Athlon will certainly be around for a while, and people will continue to buy them -- I did not mean to imply anything to the contrary.
  • Read this [tomshardware.com] and THIS [tomshardware.com] and I'm sure you'll look at it a bit differently.


  • Overclocking an Athlon involves soldering the chip. No. Just... no. Tempting, but no way am I taking a soldering gun to one!
    He was quoting the ZD article, which claims that a micron is a 1,000th of a meter.

    Here's ZDnet: A micron is a 1,000th of a meter.

    Here's The Musician: Last I checked we called that a millimeter.

    See the thread now?

    --
  • ...And just when I was getting hopeful of someday getting a 666MHz 'InHell' chip, they're entering the next decade! Darnit, gotta do some overclocking on my Pentagram motherboard >;E
  • (Not to stray too far off-topic.) Yes, since processors get faster, coders don't need to optimize as much. But still, you fail to see the full potential of 1GHz. Not only will it be good for games (especially ones that can't offload work to a GPU), but also for servers, slow algorithms (namely compression), and the like. Who cares about optimization anyhow??? With faster CPUs comes sloppier code, but that means the programmers don't work as much, therefore (somewhat) lowering prices/increasing their pay per hour.
  • The C=64 has something going for it that PCs don't have - a consistent hardware configuration.

    When was the last time you opened a PC and saw the same CPU, same sound card, same I/O controller, same video card, same brand of floppy drive, etc.? When was the last time you looked at a hardware config and saw all the same types of devices on the same IRQs?

    You don't. In order to accommodate that sort of flexibility, stuff needs to be "abstracted" out so it doesn't depend on the same hardware, but rather on the same functionality.

    For example, all sound cards can play a sound. Now, the process of getting an SB16 to play a sound and getting an A3D board to play a sound is very different, but you can have a software abstraction layer that just says "play a sound" and calls some code that knows how to play it (usually called a driver, ooh goodie). And sound is played, if the driver doesn't suck ass. :)

    I always see people complaining about code bloat (usually referring to Microsoft products - and I can NOT understand how Word got to be so frigging big).

    These same people don't realize that you don't need to optimize 90% of the code in a product. You only need to optimize the parts the user waits on. Seriously, what's the point of optimizing a print routine (for example)? All of your time is spent waiting on the printer...

    These same people also don't realize that sometimes it's better to use the "slower" algorithm; not only is it easier to understand what the code is doing, but sometimes it's actually faster to use a bubblesort over a quicksort (try sorting a mostly sorted list with a quicksort, then with a good bubblesort, and tell me which one returns faster; there's a sketch of that experiment below).

    And to top that off, the process of optimization often leads to really weird looking code (and it's amusing to watch someone try to figure out what the hell you were smoking when you wrote it) that's hard to debug/fix/modify.

    ...sorry, just one of my frustrating rants I guess...
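
    Since I brought it up, here's a rough sketch of that sort experiment (assumptions: a naive quicksort with a first-element pivot, which hits its worst case on already-sorted input, versus a bubblesort that bails out as soon as a pass makes no swaps):

    public class NearlySorted {
        // Bubblesort with an early exit: close to O(n) on nearly sorted input.
        static void bubble(int[] a) {
            boolean swapped = true;
            for (int pass = 0; swapped; pass++) {
                swapped = false;
                for (int i = 0; i < a.length - 1 - pass; i++) {
                    if (a[i] > a[i + 1]) {
                        int t = a[i]; a[i] = a[i + 1]; a[i + 1] = t;
                        swapped = true;
                    }
                }
            }
        }

        // Naive quicksort, first element as pivot: O(n^2) on sorted input,
        // with recursion depth ~n (bump -Xss if you raise n much).
        static void quick(int[] a, int lo, int hi) {
            if (lo >= hi) return;
            int p = a[lo], i = lo, j = hi;
            while (i < j) {
                while (i < j && a[j] >= p) j--;
                a[i] = a[j];
                while (i < j && a[i] <= p) i++;
                a[j] = a[i];
            }
            a[i] = p;
            quick(a, lo, i - 1);
            quick(a, i + 1, hi);
        }

        public static void main(String[] args) {
            int n = 5000;
            int[] a = new int[n];
            for (int i = 0; i < n; i++) a[i] = i;        // sorted...
            int t = a[100]; a[100] = a[101]; a[101] = t; // ...except two
            t = a[4000]; a[4000] = a[4001]; a[4001] = t; // swapped pairs
            int[] b = (int[]) a.clone();

            long t0 = System.currentTimeMillis();
            bubble(a);
            long t1 = System.currentTimeMillis();
            quick(b, 0, n - 1);
            long t2 = System.currentTimeMillis();
            System.out.println("bubble: " + (t1 - t0) + "ms, quick: " + (t2 - t1) + "ms");
        }
    }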
  • There aren't many fabs that can manufacture at .18um using copper interconnects. IBM is one of the few, and their fabs are tied up making PPC chips for Motorola.

    The second reason is capacity. AMD's Austin fab is a pretty decent sized plant, but it can't make chips fast enough. They want more capacity, so they built a plant that can fulfill that...

    A third good reason is that they don't want to depend on third parties to make their chips. It'd really suck if, say, Intel bought out the fab that was making AMD's chips for them... :)
  • I need this. Some applications just aren't fast enough on a 300MHz processor. I have a recording studio where we record to a PC hard drive. Noise reduction on a 350MHz machine takes about 15-20 minutes per track. On a 1GHz machine, this would only take 5 minutes or so. Big difference. It could turn a 2 hour mixdown into about half an hour.
  • It's really a 286 Killer,Killer,Killer,Killer,Killer,Killer,Killer,Killer,Killer,Killer,Killer,Killer,Killer,Killer.
  • Not necessarily, although I do take my hat off to the developers of the ARM... quite a nice CPU from what I've read, but I've never had the chance to work with it.

    What I was talking about was multi-threaded hardware such as Tera (www.tera.com) is working on. Very good integer performance, almost linearly scalable. As for floating point, I don't know that much. But it's probably what's going to be the "next big thing" in supercomputers that will filter its way down to the micros...

    PianoMan -- Still waiting to invent the Vector Co-Processor. ;)
  • I could be wrong on this, but isn't the cache OFF-DIE for the Athlon? Or maybe I'm just really confused (has been known to happen ;).. Although that would make sense, as right now I can't find a motherboard that says how much cache it has on it... well then, I'll wait for the 2MB cache version ;)

    Also, you're right, I really don't need ATA-66 or AGP 4x... what I'm waiting for is multi-processor support. (And the money to afford it.)

    someone want to hire me? ;-)

    PM.

  • Microwave radiation is a health hazard.
    That's true, in sufficient (i.e. very, very big) doses. However, I'd like to point out that you receive far more microwave radiation from the Sun and lightning than from any man made source, unless you stand in front of radar or long distance comm antennas all day long. The next time you are at an airport, think about how much effect the ATC radar, which radiates microwave energy in the tens to hundreds of kilowatts, has on you. The fact that people who live near high power ATC and military radar sites have merely average rates of cancer should help ease your mind.

    The radiation produced by consumer devices like computers and cellular phones is so low compared to natural and other man made sources that you shouldn't even consider it. In fact, you get much, much, much more radiation (albeit at lower frequencies) from your monitor than from your CPU. Also, I'm not totally sure, but I think the FCC has strict requirements for radiation/interference from class C devices even in the microwave bands.

    The most common effect of long term low-level exposure is damage to the cornea of the eye.
    That depends on the frequency of the radiation. The eye is particularly sensitive to very short wavelength radiation (UV and above), and not so sensitive to microwave frequencies. Basically, the only effect microwave radiation has on any human tissue is heating. And at microwave frequencies you need extremely high field intensities (like in a high power waveguide or resonant cavity) to produce any measurable heating.
    And some studies have linked long term exposure to certain cancers--although the debate seems to be open.
    It's funny you mention this. I just read in the October issue of Scientific American that a biochemist from Lawrence Berkeley NL named Robert Liburdy intentionally faked the results he published in a landmark paper in 1992. His was one of the first (if not the first) scientific studies of the effects of low level (i.e. not enough to produce heating) electromagnetic fields on cells. It turns out he falsified his findings to show that the low level radiation affected his cell cultures. Other scientists trying to pinpoint cellular effects of low level radiation have found nothing. So despite the best efforts of many scientists to find justification for their paranoia, nobody has found any link between low level radiation and any form of cancer.

    Now, it is obvious that radiation at extremely high field intensities (e.g. in a microwave oven or standing in front of a big radar) as well as at very high frequencies (X-rays, gamma rays, cosmic rays) can cause cell damage. However, there is no reason to believe that low level radiation from everyday products like computers, TVs, cellular phones, etc. will do you any harm. I know lots of colleagues (and some of their mentors) who have spent practically their entire careers in radar/antenna test chambers absorbing low level microwave radiation with no ill effects.

  • I'm staying with my trusty ol' P200, at least until Transmeta out-does AMD and Intel (well, maybe not, but we can still pretend). Can I get on a waiting list for the very first Transmeta Linium (let's make up our own names!) CPUs yet?
  • I've got this dual Celeron 550 compressing MJPA video with both processors maxing out the local bus. With 1100 MIPS they'll do only 15 frames/sec of 640x480. The problem isn't the MIPS, but this 100MHz bus. We need at least a 300MHz bus before these CPUs become useful. Anyone else benchmarking video capture on their Athlons?
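
    (Back of the envelope, assuming roughly 16 bits per pixel going into the compressor: 640 x 480 x 2 bytes is about 0.6MB per frame, so even at 15 frames/sec that's ~9MB/s of raw pixel data crossing a bus that also has to carry all the reads, writes, and compressed output. The arithmetic makes it easy to believe the bus, not the CPUs, is the wall.)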
  • This chip has been out for a couple of months, and is way beyond anything Intel has to offer. It is seriously better than the PIII in all respects. You're saying that you DON'T mean failure in the sense that nobody is buying them. Then WHAT are you trying to say?
    There are three things AMD tried with the Athlon:
    1. Creating a chip that would better the latest Intel offering.
    - Check! Done... better in both floating point and integer operations.
    2. Creating a chip whose design would have a lot of headroom for future revisions and speed.
    - Check! Done... the design can supposedly go far over 2GHz, and is supposedly not that difficult to move over to 64-bit.
    3. Creating a chip that sells far better than Intel's (to make AMD profitable).
    - This one has a bit to go... the processor _IS_ better, but Intel has better yields, and motherboard manufacturers aren't keeping up.
  • If ZDNet were off by a factor of 1,000 on the die technology, maybe they were also off on the speed. Maybe AMD will be producing terahertz chips, or, more likely, only a 1MHz chip :(
  • This whole thing is getting out of hand. I liked that comment I saw yesterday; it went something like this:

    "Nice, but do i need to install it in my freezer?"
  • Oh please, they were all ripped off the old WWiV source in the beginning anyway, so why not just stick with it. ;)

    Sides, they're still around, which is more than I can say for a lot of the cloners. :)

  • You have opened a 286-er or 386-er. Up until somewhere around '93, AMD was manufacturing under licence. After that, it modified the 286 and 386 cores to improve performance (by about 5-10% over Intel) and boosted the frequencies compared to Intel. Then the war began as we know it.

    Actually, AMD has so far bested Intel sooner or later in every compatible category, the problem being that by that time Intel was delivering the "new, bigger and greater". Though quite often that was worse than the AMD top of the line for the old design. Compare a P5 at 60 with an AMD X5 at 166, for example ;-)

    Every time Intel actually took on a war with AMD for a compatible product, it lost. Examples are numerous: 386 vs Am386, 486 vs Enhanced Am486 and X5, P5 vs the K5-PR series, MMX vs K6, P6 (PPro, PII, PIII) vs K6-2/K6-3. So I guess it will NOT win this time. It will have to deploy the new latest and greatest (namely IA64) in order to win. And then the cycle will start anew. The interesting part being that now the timing GAP between them is much, much shorter.
  • Yes, they publish anything that's funny or profitable -- when they do that, however, they give a subtle warning. Any story with an image of Fuller's London Pride on the right is untrue. If you follow the link below that image, they tell you how much it costs to get your own story published in The Register, true or otherwise...
  • Even The Register, who first reported the 1.1GHz Willamette, noted that it is a rumour, and likely to be untrue. Besides, The Register had the 1GHz Athlon story before the Willamette one, so there is no way it would have been sparked by the Willamette "announcement".

    Even if the Willamette thing were true, I'm not sure it would be a good idea. Shipping a CPU nine months early probably means that it has not been tested very thoroughly, and will therefore contain a lot of bugs. No CPU is 100% bug free, but insufficient testing could mean that even some really bad bugs (by bad I mean something like the F00F bug) might slip through.
  • It's pretty typical for the CPU not to be the main bottleneck on a computer system, at least these days.

    Sun's SPARC provides nice evidence of this; they are selling lots of systems for high end database and web applications not because the SPARC architecture is vastly superior to its competitors, but because the rest of the system is fast.

    On a PC, the real "critical component" is the motherboard, as that tends to be a determinant of such things as:

    • The speed of the memory bus, and how much RAM can be added to the system;
    • IDE/SCSI controller(s), and their quality/speed;
    • In the old days, how many bytes of buffer you had on your UART was pretty significant; RS-232 has pretty much gotten maxed out since then...
    • The move from ISA and EISA and (less so) VESA to PCI was as much a signal of better performance in and of itself as the move from 80486 to Pentium...
    • I can't decide if AGP is actually a good thing; it makes it harder to build multiheaded systems...
    • These days, graphics cards have more RAM, and presumably more processing power, than one used to have on a 486 box for the main CPU. (These days, I have more cache on my CPU than I had disk space on my Atari 400... That's the most frightening ratio to compare...)
  • Athlon, the Immortal

    -k
  • Actually, W2K Beta 3 ran great on a 100MHz Pentium with 48MB RAM! No kidding. It was nearly as fast as NT4 and way faster than Win9x ever ran on that machine.

    I love it when something runs great that far below the official spec (what is it now? 200MHz PII/64MB RAM?).

  • A look at the CPU info center charts here [berkeley.edu] shows that the Alpha is still almost twice as fast as the Pentium III and Athlon at SPECint. Benchmarks mean very little, but it's such a shame this wonderful processor isn't getting the headlines, machine architecture and recognition it deserves.
  • Whoever said those days are over, eh? SOME OF US ARE STILL ADDICTED TO MUDs... I just wish they had more scripting programs for *nix.
  • What do I see in my future? Clusters of Athlons and Alpha's for now, and multi-threaded hardware beyond that. Funny, unless EPIC really surprises me, Intel is nowhere in my future...

    Not so fast there - put 2 + 2 together. Multi-threaded hardware, right? That means SMP on a chip, right? That means transistors/MIP matters. Well, as far as I know, the crown for the best transistors/MIP rating in the business goes to ARM - guess what Intel is heavily involved in [intel.com]?
  • And really, what does the average user need more than 640k of ram for?


    ^. .^
  • The trend for the last couple of years has been a return to large, gas guzzling SUVs. The air in Los Angeles was the cleanest in 50 years, but thanks in large part to massive sales of SUVs over the last couple of years, this year will have worse air.

    The Ford Expedition weighs over 3 tons and gets TWELVE miles per gallon.

    Blech.
  • AMD chips after the Super7 (the Athlon and future faster chips) run on a completely different bus than any Pentium ever has or likely will run on. There is absolutely no compatibility between the Alpha EV6 Slot A and Wintel Slot 1, and I don't know why it is so hard for people to realize this.
  • Is preinstalling Linux really that great an idea? Sure, open source and all, but isn't the great part of open source the freedom to choose? Does preinstalling give you freedom of choice? Of course not; vendors won't preinstall every OS under the sun. I would much rather vendors ship systems with NO OS and let users decide (vendors lost that logic long, long ago, back when there was no "common" OS like Windows).
  • Yes, as anyone who's read tomshardware.com's writeup has noticed, you can overclock an Athlon by desoldering some surface mount resistors and changing their relative locations. HOWEVER, I can't think of any way a motherboard could get around the clock and multiplier settings of the resistors on the chip (and I should know, I've designed a processor before). Anyone who tells you there is a motherboard that can do this is wrong. Anyway, the arrangement is similar to Intel's multiplier-locked chips (SY033 et al): you can still change it with some effort, but for the same reason you can't overclock those chips using just any motherboard, you can't OC an Athlon with motherboard tricks.
  • All the advertising for computers that I've seen focuses on the idea of getting more information faster, learning more, etc.

    Ads for any other kind of product all appeal to masculinity by featuring seductive females, fast cars, and loud music.

    Intel ads have little people in neon suits dancing around like idiots, with a little jingle at the end. I don't know how the hell this has been one of the most successful ad campaigns in recent history.

    Intel claims the P3 makes "the internet go faster," which is nothing short of a blatant lie. AMD needs to flaunt the one /real/ thing they have: SPEED. Show the Bunny People (tm) getting run over by a scantily clad female in a fast car or something.

    While Intel advertises the speed of the internet (which looks more like CAD in their commercials), AMD should be advertising the amazing performance, with shots of violent Q3 timedemos and gorgeous women.
  • Can anyone say "Quake skins"? Add a flight recorder and whamo! Instant ads and loads of fun.
  • by Tsian ( 70839 )
    Note that I would LOVE a 1GHz computer (and plan to buy around an 800MHz Athlon after Christmas to upgrade my P2-233), but in reality I don't need the power (I *want* it, but try explaining why I need it to someone who doesn't use a computer regularly, and thinks Doom is the best PC game out there)
  • OK, let me give you a different example to prove linux_penguin's point. I work with a group of very good software engineers; they all came from systems engineering, so they are all low level bit twiddlers by mentality; but they also all came from a 100% OO based system, so they truly know when to use an object and when not to. In the good old days they all wrote in C/C++, where they all produced really good code with great performance.

    Part of this group were early adopters of this thing called Java (back in the 1.0 days); the other part of the group didn't move over to Java until very recently, and started right out on IBM's JDK. The new guys are still good developers (they didn't rust while still working in C/C++), but they have no problem at all coding something like:

    String foo;
    ...
    if (foo.equals("")) {
    ...

    while the original Java people become violently ill at even seeing that code, and replace it with:

    String foo;
    ...
    if (foo.length() == 0) {
    ...

    There really isn't much difference, and from a purely OO standpoint the first is better. But those who remember the pain of running under 1.0 without a JIT won't waste even that small amount of overhead.

    When it comes to Java, if you want a fast product at the end of the release cycle: FORCE the developers to run in interpreted mode ONLY. (Let the test, performance and marketing groups use the JIT - just not the developers.)
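
    If you're curious how big that overhead really is on a given VM, a throwaway micro-benchmark along these lines will show it (crude - JIT warm-up and dead-code elimination can skew it - but it makes the point):

    public class EmptyCheck {
        public static void main(String[] args) {
            String foo = "some non-empty string";
            int n = 10000000;
            boolean sink = false; // consume results so the loops can't be discarded

            long t0 = System.currentTimeMillis();
            for (int i = 0; i < n; i++) sink ^= foo.equals("");
            long t1 = System.currentTimeMillis();
            for (int i = 0; i < n; i++) sink ^= (foo.length() == 0);
            long t2 = System.currentTimeMillis();

            System.out.println("equals(\"\"): " + (t1 - t0) + "ms, "
                    + "length()==0: " + (t2 - t1) + "ms (" + sink + ")");
        }
    }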
  • Actually.. Intel will label that one 667.. and I'm sure AMD will, too.

    Sad, but true
  • It has been observed that exponential-time algorithms (requiring, say, 2^n steps on an input of size n) can be executed in linear time by the following procedure:

    Wait 1.5*n years, while the speed of computers increases by a factor of 2^n. Then execute the algorithm in unit time.
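
    (To spell out the arithmetic: with speed doubling every 18 months, waiting 1.5*n years buys a speedup factor of 2^n, so the 2^n-step run then finishes in one original time unit. Total cost: 1.5*n years plus one unit - linear in n, provided you don't ask about the units of the constant factor.)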
  • With all the focus on these Gigahertz chips, does that mean that those of us with small budgets will finally be able to afford the 600Mhz chips due to drastic price reductions?
  • I think ZDNet just reported it *after* reporting the Intel announcement to give the impression that it was a defensive reaction by AMD.
  • A micron is a 1,000th of a meter
    Mental note: don't let ZDNet have anything to do with navigating Mars probes.
  • Uh oh -- If The Register [theregister.co.uk] reported on this, then it must be true.

    As a reminder -- a while back The Register reported that Apple was switching to Intel chips. That came true -- didn't it?

    -B
  • This is what proponents of capitalism envisioned when they promoted a free market. For a long time, Intel was able to do just about whatever they wanted and still maintain their stranglehold on the personal computing processor market. Now, AMD and other corporations like it are challenging Intel's dominance.

    In a lot of ways, it is like the time when Japanese cars quickly replaced those produced by complacent American car companies. Maybe now, the processor market will see a jump in quality and a dip in price.

    Then again, it's just as likely that we will see a rise in quality along with a corresponding rise in price. But hey, I guess supply and DEMAND is part of capitalism too.

  • More megahertz means more polygons, pure and simple. Floating point speed is the key limitation on what we can do in games these days, and it limits all aspects - from doing transforms for polys to nifty effects like lighting.

    The real point of offloading generic tasks like geometry and lighting onto the graphics board is to be able to do crazy, highly specific features on the CPU. Games will only get wilder and wilder. Great things are to come!

    ...

    Oh, you wanted to get work done? I can give you a 386 for free that will run LaTeX, lynx, mutt, gcc/g++, gdb, and everything else you need to be productive :)
  • Does the world really need 1GHz processors?

    Well, I know *I'd* like one, but in reality, what does the average user need with 1GHz (at least at the moment, anyway)?

    While I am sure game designers are leaping for joy at the excessively fast calculations these processors will do (and the resulting impact on gaming), we as yet don't *need* that power. After all, some of the best looking games coming out will run nicely on 300MHz.
    Obviously certain roles are perfect for these processors... servers spring to mind... but for the average home user, it really isn't needed (once again, however, I wouldn't complain about owning one ;)
  • Anyone who has seen a recent (1998-1999) Commodore 64 demo will know what I'm saying. You wouldn't believe what they do with a 1MHz processor these days, and it's all due to *optimising*...

    The C64 was released back in '82, wasn't it? It only took a few thousand skilled hackers 17 years to get to the point where the code is "optimized". And Windows still doesn't run on the thing.

    Well, good luck convincing anyone that it is wiser to spend 17 real-time years (and countless man-years) developing "optimized" software than to pay a premium for the extra CPU and RAM needed for the bloatware solution that takes 1 real-time year to develop...

    Sometimes brute force is all it takes to be the best.

  • by Keeper ( 56691 ) on Tuesday October 19, 1999 @09:05PM (#1600358)
    The rumour mill is feeding this post, so take it as you will.

    AMD has been working on their Germany plant for quite some time (the last couple of years). From the moment I heard of it, it was always AMD's goal to produce chips there in huge quantities using state of the art technology (being .18um and copper interconnects right now). They've been producing samples of K6-2/K6-3s using a .18um copper interconnect process for the last few months.

    Within the last few weeks, rumour had it that they had been producing sample K7s at the Dresden plant and sending stuff back to Austin for "verification" (i.e.: look over each nanometer [or whatever they do] to make sure everything is good).

    To me, this article seems to indicate that everything is looking good in the verification process, and they're confident enough to start ramping up to full production (or begin preparations to ramp up).

    Word is that soon after the 733MHz cuMine is released, AMD will drop prices (which I think they just did, actually...) and release a 750MHz version. This, incidentally, is still on the .25um process (I find it remarkable that they were able to get to 700 air cooled).

    Kryotech has systems running at 900MHz using current .25um chips; it is VERY reasonable to expect AMD to be able to produce 1GHz chips soon after bringing Dresden online.

    My 2c.
  • I remember the same things being said about the Pentium... who needs a processor that fast when there weren't even any 32-bit applications?

    If you build it, they will use it.

    In reality, I bet a lot of us have Celeron, PII, or possibly PIII processors. You non-Intel users, don't feel left out; you know what I mean, right? Anyway, I don't really *need* a Celeron, but it sure runs a lot nicer than my P200, ya know?

    In short, in order for progress to be made, you have to progress.

  • At the moment? No.

    If W2K goes RTM this year and is available by February 2000, as everybody expects? Lots of people will need 1 gigahertz chips -- in the same timeframe as the chips are expected to be out.
  • "Why in the world would the average person want a computer?"

    -Xerox circa 1977

    Don't stand in the path of progress, run in it!
  • I'll be the first in line to get a Linux box running @ 1GHz+. As companies race to fill that demand (for first silicon in high volume GHz+ CPUs), I'm reminded of a computing saying: MIPS = Meaningless Indication of Processing Speed.

    The Celeron is a perfect example of a company taking their time, optimizing design and fab, and turning out a cooler running/tighter chip (read: overclocker's delight). I just get the feeling that the GHz milestone is going to be so tempting that some companies will be rushing chips out the door before the work is done.

    The last time a company shipped their chips too early, we ended up with a 5 volt 60MHz Pentium that wasn't pin compatible with any other Pentium and couldn't actually do math. In light of these missteps, I hope AMD and Intel have their eye on quality first.

    When it comes to new technology, the early bird gets the worm, but the second mouse gets the cheese.

  • Actually, a micron is one millionth of a meter, or one thousandth of a millimeter, being a contraction of "micrometer", where micro is 1 * 10^-6 (that is, one times ten to the minus six).
  • I disagree. Having used Windows 2000 extensively over the past few months, on three different boxes (K6-3, Celeron 400-500, Pentium III 450), I can tell you that if anything, Windows 2000 runs much faster than a corresponding task in Windows 98. I have moved all of my office, graphics, web design, and page design software onto my NT NTFS partitions, and without fail every single one of the apps runs noticeably faster than on Windows 98. And a lot more stably, as well. This is on both AMD and Intel CPUs.

    As an aside, I think the Linux community should really be on the lookout for Windows 2000. It is extremely stable (1 BSOD, three boxes, four months) and, although I'm ashamed to admit this, it has me using Linux less. I think fluff like Active Directory that requires a Cray XMP just to run will still make Linux a viable alternative to Windows NT in the server market, but for workstations, watch out. This isn't flamebait - it's my honest opinion. Windows 2000 has a lot to offer the average home/business user. It just makes my computer run more peppy, and not many things do that these days.

"Experience has proved that some people indeed know everything." -- Russell Baker

Working...