AMD Planning 1GHz CPUs 200
idan writes "This ZDNet article indicates that AMD is opening a fab
that will produce 1GHz Athlon CPUs." I'm sure it's pure coincidence that AMD is making this announcement so soon after Intel announced their "real soon now" 1100MHz "Athlon Killer". Do we get to call this one the "Athlon Killer Killer"?
n^2 algorithms... (Score:2)
A really fast machine running an n^2 algorithm is still running an n^2 algorithm. Once you reach a certain (often relatively small) data size, you will SERIOUSLY notice how laggy the system is. Doubling the speed of the processor doesn't mean you can double the amount of data you give an n^2 sort before it takes longer than before; it's a mere fraction of that (about 1.4x, since the time grows with the square of the input).
People will continue to program the way they're used to. People will program to scale if they need to scale! If they don't need to, then they won't (why bother wasting the extra time?). I don't know about most programmers, but for me programming is an ego trip: my goal is to get the slickest, smallest, fastest, most bug-free piece of code out there. Now, I realize that there are many coders out there who don't think like that, but the thing is, they'd have coded the n^2 algorithm anyway.
...and another diversion: in some cases an n^3 algorithm will outperform an n^2 algorithm (other examples can be made); you also have to consider the data set you use. If you KNOW you're going to use a data set smaller than "1" (one is a relative term here: it's the point where the two cost functions intersect, and it may be a rather large number if one algorithm is measured in minutes and the other in seconds), then the n^3 algorithm would be the better choice.
I seriously doubt this will change the way code is written...
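The intersection point described above can be sketched with invented constants (the cost models and every number here are hypothetical, purely to show where an n^3 algorithm beats an n^2 one):

```python
# Hypothetical cost models: an n^2 algorithm with a large constant
# factor ("measured in minutes") vs an n^3 algorithm with a small
# one ("measured in seconds"). All constants are made up.
def cost_n2(n):
    return 1.0 * n ** 2

def cost_n3(n):
    return 0.001 * n ** 3

# The "1" where the two cost curves intersect: below it, n^3 is cheaper.
crossover = next(n for n in range(1, 10 ** 6) if cost_n3(n) >= cost_n2(n))
print(crossover)  # 1000
```

With these made-up constants the "better" n^2 algorithm only wins for inputs larger than 1000 elements; for anything smaller, the asymptotically worse choice is the faster one.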
Re:AMD Indifference? (Off Topic) (Score:1)
AMD was first to announce (Score:3)
(my apologies to The Register staff for 'deep linking'...)
Re:Intel does... (Score:2)
Or did you mean to write AMD?
We don't need them as urgently. (Score:2)
Re:Intel has *not* announced anything. (Score:2)
The F00F bug was only really serious on a server running untrusted code. While it's certainly a bug, it's not one that affects a large proportion of the people using the chip. Furthermore, once someone smart enough thought about it, it became easy to fix/work around in software. The Pentium's FDIV bug is probably a better example of a bad bug.
IBM Sells Athlon Systems! (Score:1)
Re:Markets (Score:1)
"The number of suckers born each minute doubles every 18 months."
Re:Markets (Score:2)
AMD, without a doubt. They've been struggling to stay in the market; as a smaller, less well financed company, they've had no choice but to put forward their best just to keep up. AMD chips have until recently been renowned for not being very overclockable, because they already ran close to their maximum performance levels. With the release of the Athlon, they've finally caught and surpassed Intel, and now Intel is being forced to change tactics in order to compete with the people who could quite possibly steal the "PC clone performance kings" title from them. Much kudos to AMD for what they've managed to achieve.
As an aside, I opened up an OLD PC the other day (I can't remember what it was, but I do know it was OLD) and a large number of the chips were labelled with Intel AND AMD logos - together, on the same chips - kind of hard to comprehend considering the way Intel and AMD are now so fiercely competitive....
Huh? (Score:2)
Didn't you just answer your own question? Re-read those two lines.
Also, how would the advent of 1GHz CPUs allow code bloat to increase at a quicker rate than it already is increasing?
-A.P.
--
"One World, one Web, one Program" - Microsoft promotional ad
Celery wasn't rushed?!? (Score:1)
The first Celerons released had no L2 cache. They were DIRT SLOW: nothing but a PII without the case or the L2 cache on the processor slot. It was designed to compete with AMD and Cyrix at the low end, which it didn't (having no L2 cache really killed performance; not as badly for games, which often miss in L2 anyway).
Months later, the Intel guys re-released the Celeron with 128K of L2 cache running at chip speed. I don't think they even considered the overclocking potential of the sucker, or thought that its performance would approach/beat that of their "cash crop" processor (the PII).
Intel didn't take its time; it just threw something out the door hoping to put a rival out of business.
Re:Do we really need this? (Score:1)
Do you have unused cycles? What kind of person are you? Everybody should donate their unused cycles to distributed.net!
--
The Slashdot Pattern (Score:2)
Announcement: New faster/better/bigger XYZ!
Response 1: Do we really need faster/better/bigger? Most consumers don't need it! And I'm the first person ever to ask such a profound, socially-conscious question!
Response 2: I bet that makes a lot of heat. Hyuk, hyuk!
Response 3: Huh, I bet that would make a kickass Beowulf cluster.
Repeat responses 1, 2, and 3 as necessary. Presto, automatic Slashdot conversation generator.
MJP
Re:AMD Indifference? (Off Topic) (Score:1)
-Barry
BTW: Did anyone else do a double take on "1.9 billion"?
Re:AMD Indifference? (Off Topic) (Score:1)
D'oh!
I vote AMD (Score:1)
1.) Right now AMD is ahead in the race. Pretty impressive for a company with something like a billionth (give or take) of Intel's market cap.
2.) Even if Intel takes the lead with their "Athlon killer", AMD's production of the Athlon has shown that they can now compete with Intel as equals, sort of.
3.) If Intel DOES rush production of their 1100MHz chip, it will result in a production and distribution nightmare. I would bet on one or two recalls, too. Conversely, AMD has executed production almost flawlessly for all of its recent releases. If THEY say they can rush their 1GHz part, I tend to believe them.
4.) Buying Moto or IBM won't make me money off the G4, which I'd still rather have than any of these. Despite the fact that I'm using a second-rate Pentium laptop, I'm a Mac Man at heart.
Re:Do we really need this? (Score:1)
Amazing genius at ZDNet (Score:1)
(Disclaimer: I have worked for both AMD and Intel, although I am not currently employed by either)
Re:The joys of capitalism (Score:1)
Now we have a trend towards smaller, lighter cars, which offer much less protection in collisions. Now we have a trend towards front-wheel-drive cars, which reduces your control at high speeds (really only matters to racers) and your acceleration off the line. Now we have a trend towards smaller engines packed with all kinds of electronic equipment and emissions systems, which helps the environment but reduces overall performance and reliability. When performance is made up for with more gadgets, reliability suffers; and when more engineering goes into the reliability issue, you get a much higher cost, not only in original purchase price but also in maintenance.
Car owners in the 60's and 70's had things much better. Or will you argue with the tens of thousands of VW enthusiasts who still drive their 60's and 70's era air-cooled rear-engine rear-wheel-drive cars, and get 20-30 miles per gallon?
You can see the same sort of trend taking hold in the Intel product line: $4k Xeons with four-pound heatsinks versus $300 Celerons, which probably cost the same to manufacture (cache-memory considerations aside). A Celeron is like the front-wheel-drive econobox of CPUs. And if Intel has their way, the whole chip industry will become the same way.
"The number of suckers born each minute doubles every 18 months."
ZDNet's intelligence... (Score:4)
A micron is a 1,000th of a meter
Last I checked we called that a millimeter. Can you even imagine a chip done in 0.18mm?
--
Re:IBM Sells Athlon Systems! (Score:2)
I work indirectly (through an agency) for one of these which rhymes with Hell, and that's true -- none to be expected in the next few months, either.
Below is the text of the letter I sent to Michael Dell a little while ago; I believe it was eventually delivered, but I was amused to see that it was first returned by the mailer with "permanent fatal errors".
Subject:
processor diversity vs. Intel dependence
Date:
October 12, 1999 5:44:40 PM EDT
To:
michael@dell.com
Dear Michael:
First of all, I own a (piddling) amount of Dell stock, but none in AMD, though that might soon change. I also work for an ad agency which does a lot of Dell work. [note: deleted the name of agency. tl]
Now: As far as I know, Dell uses Intel chips in every computer it builds. If that is not true, then the rest of this message is based on false premises and you can stop reading.
However, if Dell really uses no processors other than Intel, I think the company is worth less to me (and you) than it would be if it also built systems with AMD chips, or even Cyrix chips.
Dell was screwed as much as anyone by the sudden *un*release of the anticipated 820/Camino chipset; that fact alone should be evidence enough that being in bed with a sole provider is chancy. In the case of some other PC makers, though, some of their higher-end systems would be unaffected, because they are based on the AMD Athlon.
Dell finally preloads Linux (thank you!), at least on some systems, even at a premium. You wouldn't stick with a single hard drive manufacturer or memory supplier, so why do it with the driving point of your systems, the CPU?
Cordially,
Timothy Lord
No response, so far
timothy
Confirmation (Score:1)
The scariest thing about this is that I'm still using 366MHz, and my cousin still has a Pentium 100!!! Sheesh, I'm just hoping for maybe 550MHz in my near future! People out there are raving about 1GHz; this is getting too silly. I'll say it again: our CPUs are all fast enough now, and the market should start concentrating on all the other things slowing us down (hard disks, buses, memory, etc.)
I'm done.
Oops.. (Score:1)
Athlon Killer Killer (Score:3)
> this
> > is
> > > getting
> > > > out
> > > > > of
> > > > > > hand
> > > > > > > !
:-)
--
Re:Do we really need this? (Score:1)
What's a reasonable amount of time?
Fast enough so that by the time it comes up, I can still remember why I started it.
"The number of suckers born each minute doubles every 18 months."
Wake me up when I can actually buy one. (Score:2)
I can understand, a little anyway, why it makes sense for software manufacturers to promote vaporware. After all, they are trying to keep you from buying into their competitor's completely incompatible system. AMD and Intel, on the other hand, are making products that are essentially drop-in replacements for each other.
Does the average consumer care where his wheat was grown? Heck no. Soon they won't care who made their processor either. It will all be about speed and price.
Re:Celery wasn't rushed?!? (Score:1)
Then when AMD made even more inroads they thought - "shit, this low end (low margin) market SUCKS to sell to, but we gotta for marketshare".
So then they did a quick fix and crippled the Xeon, making one with less on-die cache and a form factor that doesn't fit any multiprocessor motherboards, and then we had the "new" Celeron. People quickly figured out that it was really a slightly crippled Xeon, and were able to overclock the shit out of it and crowbar it into multiprocessor motherboards (with the Socket 370 to Slot 1 conversions). It was a quick fix for their failure to segment the market with the cheaper-to-produce "original" Celeron. They couldn't execute a different design quickly enough, so they just took their lumps with the overclocking.
Now that AMD threatens their midrange CPUs with the Athlon, I'm wondering what shoehorning Intel is going to have to do to compete, and whether consumers will end up with another win...
Basically, it's Intel's fault in the first place for greedily trying to force this artificial market segmentation where none should exist. It's just a way of sucking money from businesses who can afford it, without sacrificing marketshare among the consumers/home users who can't.
"The number of suckers born each minute doubles every 18 months."
Need the competition! (Score:2)
Re:The Slashdot Pattern (Score:1)
Response 4: This is vaporware!
Response 5: Oh my God I want one of those so badly!
Response 6: This isn't news, I saw this before.
If Slashdot had the ability to automatically moderate all such posts to 'Redundant', I'm willing to bet the signal-to-noise ratio would rise dramatically.
MJP
Re: Telegard (Score:1)
Re:Huh? (Score:1)
Where's the News? (Score:2)
The Athlon scales *VERY* beautifully, and once the Dresden Fab 30 comes online, they'll easily ramp up to 1GHz; that's why Intel is scrambling. Kryotech has been saying they'll sell their SuperG 1GHz (cooled) Athlon by December, and Fab 30 is due to be online in the first week of 2000. So where's the news?
Now, all AMD needs to do is make a better CHIPSET (or VIA, whichever comes first), one that supports SMP and more than 512K of cache. I'm a supercomputer/scientific researcher, and I write tight code no matter how fast the processor goes.
What do I see in my future? Clusters of Athlons and Alphas for now, and multi-threaded hardware beyond that. Funny: unless EPIC really surprises me, Intel is nowhere in my future... hmmm..
--ps, I still think AMD should buy the Alpha and its designers.
PianoMan8
hmmm, I dunno (Score:2)
There is really something to be said for developing on a slow machine and spending a fair amount of time *optimising*... I hope the Linux kernel hackers don't get caught up in this and start bloating the kernel (ack!).
Anyone who has seen a recent (1998-1999) Commodore 64 demo will know what I'm saying. You wouldn't believe what they do with a 1MHz processor these days, and it's all due to *optimising*...
Now, running well-optimised code on a 1GHz processor... well, that's something different.
what *are* the moderators smoking this time? (Score:1)
But who the fuck decided on "Interesting"? I'd choose "Troll" over "Interesting" for that article.
Moderate me through the floor if you see fit
"Binaries may die but source code lives forever"
-- Unknown
SkyHawk
Andrew Fremantle
Sloppy Code (Score:2)
Re:Where's my sweet 666? (Score:1)
"The number of suckers born each minute doubles every 18 months."
For the geeks in all of us... (Score:1)
class CPUs {
    var $popularity;
    var $speed;
    function CPUs($speed) {
        $this->speed = $speed;
        $this->popularity = 1000;
    }
}
$intel = true;
$pentium = new CPUs("1100");
$amd = true;
$athlon = new CPUs("1000");
// ramp the Athlon until the Pentium's popularity runs out
for ( ; $intel && $pentium->popularity != 0; $athlon->popularity++, $pentium->popularity--) {
    $athlon->speed += 100;
}
$intel = false;
return ($winner = "amd");
Re:ZDNet's intelligence... (Score:1)
Maybe the chips are 1,000 times larger! Maybe the new Athlon will be the size of your desktop! The one you put your monitor on! It would be hard to get a high yield that way, though. The wafers would probably have to be 1,000 times larger, or about 18,000 inches in diameter. What is that, 1,500 feet? You could fit 9 wafers in the area of Hoboken, NJ. (But why would you want to?)
Re: Telegard (Score:1)
True, but then again there were a lot of WWIV spinoffs. Telegard 2.7i... wow the memories.
... then Renegade ripped off Telegard, and all that good stuff.. God I miss those days!
Andrew
Re:ZDNet's intelligence... (Score:1)
Dude, I'm not going to check your math, but either you just made that up or you've got way too much time on your hands...
--
Call it the "Athlon Avenger" (Score:1)
(just fitting the requirement that something be in the comment box)
Re:Amazing genius at ZDNet (Score:1)
Your statements regarding metal widths are not inaccurate. I have worked with three different 0.18 micron processes, and the minimum metal width for the lower metals is between 0.26 and 0.28 microns. The minimum wire widths are process limits, not RC limits. A 0.18-micron-wide wire would be about 50% more resistive than a 0.28-wide wire, but would have lower capacitance. If the fabs could make wires this narrow, they probably would.
Re: Telegard (Score:1)
Gotta love hitting break - dropping to the CLI, modifying code and variables and typing...
resume
hehehe
Re:Do we really need this? (Score:1)
Re:Sorta old news... (Score:2)
These comments are about a month and a half old and I've heard no more specifics. If this is an indication of production-quantity yield, then I think AMD will finally get some of the rewards they have worked so long and hard for.
p.s. For the folks who whine that people don't really need a 1GHz CPU: I still have a fully functional 486-33 with 4 megs of RAM and a 200-meg HD that was quite fast when it was new. I'll trade it to you for that P-III 500 that is way too fast for your applications. Then you can run your apps at a more relaxed pace. Watch the apps grow in size and complexity for a couple of CPU generations and you won't be confused about the need or desire for faster systems...
Memory latency and bandwidth are more important (Score:1)
What's the point of running at 1GHz if an L2 cache miss stalls you for (wild guess) 20 cycles? Assume a program running on a 1GHz CPU is reading from memory sequentially, word by word (32-bit words), and causes an L2 miss every four accesses with the latency stated above: that's three one-cycle hits plus one 20-cycle miss, or 23 cycles per four loads. The actual number of loads executed per second is then
1000E6*(4/23)=173E6. We may be better off doing some data prefetching to minimize the cost of a cache miss...
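The same arithmetic as a quick back-of-the-envelope script (the hit cost, miss rate, and stall cycles are all the post's guesses, not measured figures):

```python
# Effective sequential load rate on a 1 GHz CPU, using the parent
# post's assumed numbers (all guesses, not measured figures).
clock_hz  = 1_000_000_000  # 1 GHz core clock
hit_cost  = 1              # cycles per L2 hit (assumed)
miss_cost = 20             # cycles stalled on an L2 miss (the wild guess)
group     = 4              # one miss per four sequential 32-bit loads

cycles_per_group = (group - 1) * hit_cost + miss_cost  # 23 cycles per 4 loads
loads_per_sec = clock_hz * group // cycles_per_group
print(loads_per_sec)  # 173913043, i.e. about 173E6 loads/s
```

So under these assumptions the core spends most of its time stalled: barely a sixth of the nominal 1GHz load rate survives the memory system.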
Re:AMD Indifference? (Off Topic) (Score:1)
Mostly not... (Score:1)
Re:oh my GOD! (Score:1)
Correction to the unneeded correction (Score:1)
Here's ZDnet: A micron is a 1,000th of a meter.
Here's The Musician: Last I checked we called that a millimeter.
See the thread now?
--
Where's my sweet 666? (Score:1)
True, coding is getting sloppy (Score:1)
Re:hmmm, I dunno (Score:2)
When was the last time you opened a PC, saw the same cpu, same sound card, same I/O controller, same video card, same brand floppy drive, etc? When was the last time you looked at a hardware config and saw all of the same types of devices on the same IRQs?
You don't. In order to accommodate that sort of flexibility, stuff needs to be "abstracted" out so it doesn't depend on the same hardware, but rather on the same functionality.
For example, all sound cards can play a sound. Now, the process of getting an SB16 to play a sound and an A3D board to play a sound is very different, but you can have a software abstraction layer that just says "play a sound" and calls some code that knows how to do it (usually called a driver, ooh goodie). And sound is played, if the driver doesn't suck ass.
I always see people complaining about code bloat (usually referring to microsoft products -- and I can NOT understand how Word got to be so frigging big).
These same people don't realize that you don't need to optimize 90% of the code in a product; you only need to optimize the parts the user waits on. Seriously, what's the point of optimizing a print routine (for example)? All of your time is spent waiting on the printer...
These same people also don't realize that sometimes it's better to use the "slower" algorithm: not only is it easier to understand what the code is doing, but sometimes it's actually faster. Try sorting a mostly-sorted list with a quicksort, then with a good bubblesort, and tell me which one returns faster.
And to top that off, the process of optimization often leads to really weird-looking code (and it's amusing to watch someone try to figure out what the hell you were smoking when you wrote it) that's hard to debug/fix/modify.
...sorry, just one of my frustrating rants I guess...
Re:a *NEW* fab? why? (Score:1)
The second reason is capacity. AMD's Austin fab is a pretty decent-sized plant, but it can't make chips fast enough. They want more capacity, so they made a plant that can fulfill that...
A third good reason is that they don't depend on third parties to make the chips. It'd really suck if, say, Intel bought out the fab that was making AMD's chips for them... :)
Re:Do we really need this? (Score:1)
You got it all wrong!! (Score:1)
Re:Intel nowhere in the future? (Score:1)
What I was talking about was multi-threaded hardware such as Tera (www.tera.com) is working on: very good integer performance, almost linearly scalable. As for floating point, I don't know that much. But it's probably what's going to be the "next big thing" in supercomputers that will filter its way down to the micros...
PianoMan -- Still waiting to invent the Vector Co-Processor.
Re:Where's the News? (Score:1)
Also, you're right, I really don't need ATA-66 or AGP 4x... what I'm waiting for is multi-processor support (and the money to afford it).
someone want to hire me?
PM.
Re:MODERATORS--why was this moderated down? (Score:1)
The radiation produced by consumer devices like computers and cellular phones is so low compared to natural and other man-made sources that you shouldn't even consider it. In fact, you get much, much, much more radiation (albeit at lower frequencies) from your monitor than from your CPU. Also, I'm not totally sure, but I think the FCC has strict requirements for radiation/interference from Class B devices even in the microwave bands.
That depends on the frequency of the radiation. The eye is particularly sensitive to very short wavelength radiation (UV and above), and not so sensitive to microwave frequencies. Basically, the only effect microwave radiation has on any human tissue is heating, and at microwave frequencies you need extremely high field intensities (like in a high-power waveguide or resonant cavity) to produce any measurable heating.
It's funny you mentioned this. I just read in the October issue of Scientific American that a biochemist from Lawrence Berkeley NL named Robert Liburdy intentionally faked the results he published in a landmark paper in 1992. His paper was one of the first (or the first) scientific studies of the effects of low-level (ie. not enough to produce heating) electromagnetic fields on cells. It turns out he falsified his findings to show that the low-level radiation affected his cell cultures. Other scientists trying to pinpoint cellular effects of low-level radiation have found nothing. So despite the best efforts of many scientists to find justification for their paranoia, nobody has found any link between low-level radiation and any form of cancer.
Now, it is obvious that radiation at extremely high field intensities (e.g. in a microwave oven or standing in front of a big radar) as well as very high frequencies (X, gamma, cosmic rays) can cause cell damage. However, there is no reason to believe that low-level radiation from everyday products like computers, TVs, cellular phones, etc. will do you any harm. I know lots of colleagues (and some of their mentors) who have spent practically their entire careers in radar/antenna test chambers absorbing low-level microwave radiation with no ill effects.
Pentium Killer Killer Killer (Score:1)
Is 1Ghz fast enough? (Score:2)
What??? I think your sources are Intel themselves. (Score:1)
and is way beyond anything Intel has to offer. It is seriously better than the PIII in all respects. You're saying that you DON'T mean failure in the sense that nobody is buying them; then WHAT are you trying to say?
There are three things AMD tried with the Athlon:
1. Creating a chip that would better the latest Intel offering. -Check! Done... better in both floating point and integer operations.
2. Creating a chip whose design would have a lot of headroom for future revisions and speed. -Check! Done... the design can supposedly go far over 2GHz, and is supposedly not that difficult to move over to 64-bit.
3. Creating a chip that sells far better than Intel's (to make AMD profitable). -This one has a bit to go... The processor _IS_ better, but Intel has better yield, and motherboard manufacturers enough.
Re:ZDNet's intelligence... new terahertz chip? (Score:1)
And once again, what about the CPU cooler? (Score:1)
I liked that comment I saw yesterday; it went something like this:
"Nice, but do I need to install it in my freezer?"
Re: Telegard (Score:1)
'Sides, they're still around, which is more than I can say for a lot of the cloners.
Re:Markets (Score:1)
Actually, AMD has sooner or later bested Intel in every compatible category; the problem is that by that time Intel was delivering the "new, bigger and greater", though quite often it was worse than AMD's top of the line for the old design. Compare a P5 at 60 with an AMD X5 at 166, for example.
Every time Intel actually took on a war with AMD over a compatible product, it lost. Examples are numerous: 386 vs Am386, 486 vs Enhanced Am486 and X5, P5 vs the K5-PR series, MMX vs K6, P6 (PPro, PII, PIII) vs K6-2/K6-3. So I guess it will NOT win this time. It will have to deploy the new latest and greatest (namely IA-64) in order to win. And then the cycle will start anew. The interesting part is that now the timing GAP between them is much, much shorter.
Re:AMD was first to announce (Score:1)
Intel has *not* announced anything. (Score:2)
Even if the Willamette thing were true, I'm not sure it would be a good idea. Shipping a CPU nine months early probably means that it has not been tested very thoroughly, and will therefore contain a lot of bugs. No CPU is 100% bug-free, but insufficient testing could mean that even some really bad bugs (by bad I mean something like the F00F bug) might slip through.
Motherboards, The Meaty Vegetable (Score:3)
Sun's SPARC provides nice evidence of this; they are selling lots of systems for high end database and web applications not because the SPARC architecture is vastly superior to its competitors, but because the rest of the system is fast.
On a PC, the real "critical component" is the motherboard, as that tends to be a determinant of such things as:
Let's call it... (Score:1)
-k
Re:Do we really need this? - Not for running W2K (Score:1)
I love it when something runs great that far below the official spec (what is it now? 200MHz PII/64MB RAM?).
Alpha (Score:1)
Re:Athlon Killer Killer (Score:1)
Intel nowhere in the future? (Score:2)
Not so fast there: put 2 + 2 together. Multi-threaded hardware, right? That means SMP on a chip, right? That means transistors/MIPS matters. Well, as far as I know, the crown for the best transistors/MIPS rating in the business goes to ARM; guess what Intel is heavily involved in [intel.com]?
Re:Do we really need this? (Score:1)
^.
Re:The joys of capitalism (Score:1)
The Ford Expedition weighs over 3 tons and gets TWELVE miles per gallon.
blech.
Re:Wake me up when I can actually buy one. (Score:1)
Re:IBM Sells Athlon Systems! (Score:1)
Re:oh my GOD! (Score:1)
Advertising strategies (Score:1)
Ads for any other kind of product all appeal to masculinity by featuring seductive females, fast cars, and loud music.
Intel ads have little people in neon suits dancing around like idiots with a little jingle at the end. I don't know how the hell this has been one of the most successful ad campaigns in recent history.
Intel claims the P3 makes "the internet go faster," which is nothing short of a blatant lie. AMD needs to flaunt the one
While Intel advertises the speed of the internet (which looks more like CAD in their commercials), AMD should be advertising the amazing performance with shots of violent Q3 timedemos and gorgeous women.
forget the suits... (Score:1)
True (Score:1)
?better? example (Score:1)
Part of this group were early adopters of this thing called Java (back in the 1.0 days); the other part of the group didn't move over to Java until very recently and started right out on IBM's JDK. The new guys are still good developers (they didn't rust while still working in C/C++), but they have no problem at all coding something like:
String foo;
...
if (foo.equals("")) {
...
while the original java people become violently ill at even seeing that code and replace it with:
String foo;
...
if (foo.length() == 0) {
...
There really isn't much difference, and from a purely OO standpoint the first is better. But those who remember the pain of running under 1.0 without a JIT won't waste even that small amount of overhead.
When it comes to Java, if you want a fast product at the end of the release cycle, FORCE the developers to run in interpreted mode ONLY. (Let the test, performance, and marketing groups use the JIT; just not the developers.)
Re:Where's my sweet 666? (Score:1)
Sad, but true
Exponential algorithms execute in linear time (Score:2)
Wait 1.5*n years, while the speed of computers increases by a factor of 2^n. Then execute the algorithm in unit time.
1Ghz+ == Major markdowns on other fast chips? (Score:1)
ZD again... (Score:1)
Re:ZDNet's intelligence... (Score:2)
Mental note: don't let ZDNet have anything to do with navigating Mars probes.
Re:AMD was first to announce (Score:2)
As a reminder -- a while back The Register reported that Apple was switching to Intel chips. That came true -- didn't it?
-B
The joys of capitalism (Score:2)
In a lot of ways, it is like the time when Japanese cars quickly replaced those produced by complacent American car companies. Maybe now, the processor market will see a jump in quality and a dip in price.
Then again, it's just as likely that we will just see a rise in quality along with a corresponding rise in price. But hey, I guess supply and DEMAND is part of capitalism too.
Of course! (Score:2)
The real point of offloading generic tasks like geometry and lighting onto the graphics board is to be able to do crazy, highly specific features on the CPU. Games will only get wilder and wilder. Great things are to come!
...
Oh, you wanted to get work done? I can give you a 386 for free that will run LaTeX, lynx, mutt, gcc/g++, gdb, and everything else you need to be productive.
Do we really need this? (Score:2)
Well, I know *I'd* like one, but in reality, what does the average user need with 1GHz (at least at the moment, anyway)?
While I am sure game designers are leaping for joy at the excessively fast calculations these processors will do (and the resulting impact on gaming), we don't yet *need* that power. After all, some of the best-looking games coming out will run nicely on 300MHz.
Obviously certain roles are perfect for these processors... servers spring to mind... but for the average home user, it really isn't needed (once again, however, I wouldn't complain about owning one).
Re:hmmm, I dunno (Score:2)
wouldn't believe what they do with a 1MHz processor these days, and it's all due to *optimising*...
The C64 was released back in '82, wasn't it? It only took a few thousand skilled hackers some 17 years to get to the point where the code is "optimized".
And Windows still doesn't run on the thing.
Well, good luck convincing anyone that it is wiser to spend 15 real-time years (and countless man-years) developing "optimized" software than to pay a premium for the extra CPU and RAM needed for the bloatware solution that takes 1 real-time year to develop...
Sometimes brute force is all it takes to be the best.
Sorta old news... (Score:4)
AMD has been working on their Germany plant for quite some time (the last couple of years). From the moment I heard of it, it was always AMD's goal to produce chips in huge quantities using state-of-the-art technology (being
Within the last few weeks, rumour had it that they had been producing sample K7s at the Dresden plant and sending stuff back to Austin for "verification" (ie: look over each nanometer [or whatever they do] to make sure everything is good).
To me, this article seems to indicate that everything is looking good in the verification process, and they're confident enough to start ramping up to full production (or begin preparations to ramp up).
Word is that soon after the 733MHz CuMine part is released, AMD will drop prices (which I think they just did, actually...) and release a 750MHz version. This, incidentally, is still on the
Kryotech has systems running at 900MHz using current
My 2c.
Wow, I feel old... (Score:2)
If you build it, they will use it.
In reality, I bet a lot of us have Celerons or PIIs, possibly PIIIs. You non-Intel users, don't feel left out; you know what I mean, right? Anyway, I don't really *need* a Celeron, but it sure runs a lot nicer than my P200, ya know?
In short, in order for progress to be made, you have to progress.
Re:Do we really need this? (Score:2)
If W2K is RTM this year and available by February 2000, as everybody expects? Lots of people will need 1-gigahertz chips, in the same timeframe as they are expected to be out.
Re:Do we really need this? (Score:2)
-Xerox circa 1977
Don't stand in the path of progress, run in it!
Race to GHz . . . chasing the wrong carrot (Score:2)
The Celeron is a perfect example of a company taking their time, optimizing design and fab, and turning out a cooler-running/tighter chip (read: overclocker's delight). I just get the feeling that the GHz milestone is going to be so tempting that some companies will rush chips out the door before the work is done.
The last time a company shipped its chips too early, we ended up with a 5-volt 60MHz Pentium that wasn't pin-compatible with any later Pentium that could actually do math. In light of these missteps, I hope AMD and Intel have their eyes on quality first.
When it comes to new technology, the early bird gets the worm, but the second mouse gets the cheese.
Correction (Score:2)
Re:Do we really need this? (Score:2)