We're Not Prepared For the End of Moore's Law (technologyreview.com) 148
Gordon Moore's 1965 forecast that the number of components on an integrated circuit would double every year until it reached an astonishing 65,000 by 1975 is the greatest technological prediction of the last half-century. When it proved correct in 1975, he revised what has become known as Moore's Law to a doubling of transistors on a chip every two years. Since then, his prediction has defined the trajectory of technology and, in many ways, of progress itself. Moore's argument was an economic one. It was a beautiful bargain -- in theory, the more transistors you added to an integrated circuit, the cheaper each one got. Moore also saw that there was plenty of room for engineering advances to increase the number of transistors you could affordably and reliably put on a chip.
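The arithmetic behind that 65,000 figure is nothing more than compounded doubling. A minimal sketch -- the 1965 starting count of 64 components is an illustrative assumption, not a figure from the article:

    // Compounded doubling: Moore's 1965 extrapolation, assuming a
    // starting count of 64 components (illustrative, not from the article).
    #include <cstdio>

    int main() {
        long components = 64;                 // assumed 1965 count
        for (int year = 1965; year < 1975; ++year)
            components *= 2;                  // one doubling per year
        std::printf("Projected for 1975: %ld\n", components);  // 65536
    }

Ten doublings of 64 gives 65,536 -- effectively the "astonishing 65,000" Moore predicted.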
Almost every technology we care about, from smartphones to cheap laptops to GPS, is a direct reflection of Moore's prediction. It has also fueled today's breakthroughs in artificial intelligence and genetic medicine, by giving machine-learning techniques the ability to chew through massive amounts of data to find answers. But what happens when Moore's Law inevitably ends? Or what if, as some suspect, it has already died, and we are already running on the fumes of the greatest technology engine of our time?
Just a thought (Score:2)
Re: (Score:2)
Yeah, not only did I prepare, I'm already there, sipping lemonade.
Re: (Score:3)
Re: (Score:2)
Back in the '70s and early '80s, people could truthfully have said the same thing about their calculators, phones, electronic typewriters, pencils, and paper. Thing is, people won't know what good more calculation speed is until they figure out how to use it, and 99% of people won't figure out how to use it if it isn't available yet.
That's been true for a while (Score:5, Insightful)
Re: That's been true for a while (Score:2)
Re: (Score:2)
Re: That's been true for a while (Score:2)
Re: (Score:2)
Re: (Score:2)
There is definitely a very big need for more software to use more CPUs,
Not really seeing it. The only examples you give work just as well with pre-emptive threading as they do with multiple cores. Also, you seem unfamiliar with common I/O multiplexing paradigms like select() and epoll().
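For anyone unfamiliar, here is a minimal sketch of what that looks like with epoll (Linux-specific; error handling elided, and handle() is a hypothetical application callback, not a real API):

    // Single-threaded I/O multiplexing with epoll: one core services many
    // file descriptors without needing one thread or core per connection.
    #include <sys/epoll.h>

    void handle(int fd);  // hypothetical application-specific handler

    void event_loop(int listen_fd) {
        int ep = epoll_create1(0);
        epoll_event ev{};
        ev.events = EPOLLIN;
        ev.data.fd = listen_fd;
        epoll_ctl(ep, EPOLL_CTL_ADD, listen_fd, &ev);

        epoll_event ready[64];
        for (;;) {
            int n = epoll_wait(ep, ready, 64, -1);  // block until activity
            for (int i = 0; i < n; ++i)
                handle(ready[i].data.fd);           // no extra cores needed
        }
    }

Point being: throughput-oriented server code was scaling past "one task, one core" long before multi-core went mainstream.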
Re:That's been true for a while (Score:5, Insightful)
There is not a lot of preparing to do, in my opinion. However, for the past >10 years we have been told it is the end of Moore's law, and every time, Moore (whose law was based on an observed trend rather than a scientific principle) keeps on coming through.
We are still seeing Moore's law helping with 5G, phone SoCs, and more memory in smaller packages. However, things are getting to the point of asking how much Moore we really need. Do we need to double the speed of phone processors, or get more megapixels in our cameras, or more storage in our watches? The pressure for more (Moore) is possibly reducing on the applications side, but what do I know; I am not going to predict the end of Moore's law anytime soon. Didn't some guy say we can now close the patent office because everything that will be invented has been invented? It takes a great deal of arrogance to predict the end, and many have tried and fallen under the momentum of Moore.
Re: (Score:3)
nonBORG pointed out:
There is not a lot of preparing to do, in my opinion. However, for the past >10 years we have been told it is the end of Moore's law, and every time, Moore (whose law was based on an observed trend rather than a scientific principle) keeps on coming through.
Wish I had points to give you the +1 Insightful upmod this deserves.
I've been calling it Moore's Observed Trend since the mid-1990s, because "the number of transistors on a chip" will eventually, inevitably crash into the brick wall of physical limits (such as the minimum separation, in atom widths, between traces needed to prevent electron leakage from one to the next), and BANG! - there will go the densification of the neighborhood.
"Moore's Law" is still the sexier name, though, I have to admit.
Re: (Score:2)
I can't help but notice that your user ID is more than 5 million. You can't be very old. Do you even remember when cores used to get dramatically faster every year? Moore's law is indeed already effectively over. It was over 10 years ago.
Are there some gains still being eked out of die shrinks and optimization? Yes, but it is no longer following any sort of dramatic increase every year. At least not for CPUs. GPUs still have not hit their wall because they focus on embarrassingly parallel tasks and can just add more cores.
Re: (Score:2)
Re: (Score:2)
This isn't a good comparison when AMD floundered for close to an entire decade with very minimal gains, and it's only very recently that they've hit upon a tenable architecture again. Two data points don't prove a trend.
Also, Threadripper is significantly larger than a regular processor and is made of numerous separate dies, so it's already cheating in a way. (Eyeballing it, it appears to be 2-3x the die area of a standard quad-core CPU.)
Re: (Score:2)
Re: (Score:2)
and every time, Moore (whose law was based on an observed trend rather than a scientific principle) keeps on coming through.
Except it hasn't been true for over a decade!
In other words... (Score:3)
...how much Moore do we need?
Re: (Score:2)
"The fact that modern CPUs come with multiple cores is something that not a lot of code can properly utilize."
This is irrelevant. The vast majority of the really CPU-intensive tasks are highly parallelizable. Most of the things that you can't parallelize aren't CPU-limited, they're I/O-limited. This is especially true for things that lots of users do, like compression, decompression, video encoding/decoding, graphics filters, audio encoding, and yes, gaming. All of these have opportunities for parallel execution.
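To make that concrete, a minimal sketch of the embarrassingly parallel case (the workload here is a stand-in, not one of the examples above):

    // Splitting a CPU-bound reduction across hardware threads.
    #include <algorithm>
    #include <cstdint>
    #include <numeric>
    #include <thread>
    #include <vector>

    uint64_t parallel_sum_squares(const std::vector<uint32_t>& data) {
        unsigned n = std::max(1u, std::thread::hardware_concurrency());
        std::vector<uint64_t> partial(n, 0);
        std::vector<std::thread> workers;
        size_t chunk = data.size() / n + 1;

        for (unsigned t = 0; t < n; ++t)
            workers.emplace_back([&, t] {
                size_t begin = t * chunk;
                size_t end = std::min(data.size(), begin + chunk);
                for (size_t i = begin; i < end; ++i)   // each core owns a slice
                    partial[t] += uint64_t(data[i]) * data[i];
            });
        for (auto& w : workers) w.join();
        return std::accumulate(partial.begin(), partial.end(), uint64_t{0});
    }

Encoders and filters decompose the same way: independent blocks, frames, or tiles per core.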
Re: (Score:2)
Re: Just a thought (Score:2)
Maybe just downgrade it ... (Score:5, Funny)
Re:Maybe just downgrade it ... (Score:5, Insightful)
Re: (Score:2)
(discussing the title of the "Rules of Acquisition" book)
QUARK: Then why call them Rules?
GINT: Would you buy a book called "Suggestions of Acquisition"? Doesn't quite have the same ring to it, does it?
QUARK: You mean it was a marketing ploy?
GINT: Shh. A brilliant one. Rule of Acquisition two hundred and thirty nine. Never be afraid to mislabel a product.
Re: (Score:2)
Oh how I loved those quotes
Re: (Score:2)
That's not to say that Moore's contribution hasn't been extremely useful, but it should have been named something like "Moore's Trend" or "Moore's Observation".
Barry Manilow's Mandy Moore's Law & Order Special Victim's Unit of Measurement
Re: (Score:2)
We still think of CPUs and even GPUs as 2-dimensional devices. As thermal management and fab processes change, you'll see 3D chips.
The problem is, Intel screwed up when it started making multi-core machines with no software support and no mechanism for getting rid of dirty cache between and among cores. Blame them for bad design; the Oracle Sparc had it, but Oracle's purchase of Sun's assets was a shitshow in and of itself.
Will the density be renewed as we go 3D? Maybe. The x64/x86 designs waste a lot of space because o
Re: (Score:2)
Eventually the x64 ISA legacy is going to become an insurmountable problem. Modern processors are still pretending they are an 80386 in some ways.
Re: (Score:2)
Largely. The grafting-on of under-CPUs, and memory protections for virtualization (along with speculative-execution guess-aheads), have made both Intel and AMD CPUs recklessly insecure.
The AMD PSP core is a crime-- an insane design. The inability to correctly manage cache is another travesty, but the list is long.
Certainly it will take time for ARM fundamentals to catch up. But Samsung, Apple, and others are desperate to get away from Intel's oil-well-in-the-basement problems, and we all understand WHY.
Re: (Score:2)
I like what you wrote because it makes a point that I've been seeing on /. for the last 8-14 years. Everyone looks at Moore's Law in a flat plane (thinking 2D). Why? Can't it be 3D, in the sense that it might just look like some Star Trek data cube -- a processing cube?
We need a newer and different standard for a computer's ability to process data.
Maybe use a mathematical standard: how fast it hits 1 billion digits of pi (or a trillion), and then moves them from one internal point of memory to another.
Re: (Score:2)
In a 2D chip, modeling heat dispersion comes down to algorithms that are pretty light. 3D heat dispersion, noise, crosstalk, junction noise, and a myriad of other characteristics (like how you do QA when you can't see through layers) are all problems yet to be solved at production quantities.
555 timers and their logic are still around, embedded into the USART sections of systems-on-chips/SoCs. I'm guessing x86 logic and instruction sets will be around long after I'm dead, just as the logic in 4004s still is.
Re: (Score:2)
Re: (Score:2)
>>I'm not buying moore's law is effectively dead
That's a fair and valid point. I might have stated that it should be dead; that was not the point I wanted to express.
I think Moore's law is now used for goal setting.
What I wanted to get across is that we need newer standards to understand what a chip can do within its type, and newer goals to achieve.
Re: (Score:2)
When envisioning those designs, the nanotubes remind me of chimneys, as in Manchester, England, or Birmingham, Alabama -- various industrial cities, etc.
Little thought factories, with little instruction sets, data flowing in on the trains -- I mean buses... you see the analogy, I'm sure.
We humans used to live in largely 2D domains; then we started to be able to add floors, and now Hong Kong is all floors.
Outside of the box...
Re: (Score:2)
For years I've kept thinking the new chip designs should be in line with a tube-type shape. With the center hollow, you have airflow (or liquid flow) to take the heat away. Sounds horrible, I know, but I bet it might have some long-term advantages.
Re:Maybe just downgrade it ... (Score:5, Insightful)
The notion of "laws" has, by and large, been dropped by physicists; it's more a holdover from the era of classical physics, in no small part because there are sneaking suspicions that in the very early universe, or at extremely high energies, some of those laws might not apply, or at least not apply in quite the same way. Physical theories from that era are still referred to that way, but nowadays you don't see anyone creating new "laws" per se. For instance, we talk about the "laws of gravity", which are shorthand for the Newtonian laws of motion and gravitation, but by the late 19th century astronomers and cosmologists knew those laws didn't fully encompass observation, and ultimately Newton's mechanics became subsumed into General Relativity as a description of how, for the most part, objects behave at non-relativistic velocities.
Re: (Score:2)
Maybe no more "laws" (although, see: https://en.wikipedia.org/wiki/... [wikipedia.org]), but we certainly still have rules. See: https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:2)
That's all it ever was - it should never have been called a "law". In science, that word is reserved for limits that can never be broken. That's not to say that Moore's contribution hasn't been extremely useful, but it should have been named something like "Moore's Trend" or "Moore's Observation".
I guess someone should have a talk with Murphy and Betteridge and Sturgeon too. Or stop being an anal retentive pedant, one or the other... there's no possible way to confuse this with an actual law of nature unless you're a total cunt waffle.
Re: (Score:2)
>Murphy and Betteridge and Sturgeon
those can be considered laws because they are based on humor and not rigorous testing
Re: (Score:2)
Those laws of nature are the same thing: an observed relationship. A few of them, particularly the ones you learn in school, just turned out to hold a little more often than others. Still not really universal, though.
Re: (Score:2)
> "... In science, that word is reserved for limits that can never be broken ..." /rant
No, "Moore's Law" has been correctly used in this case, as it's a prediction, and in science that's really what laws are: mathematical equations that, given a set of information, you can use to predict the outcome. Laws are predictions, as nothing is absolute in science. https://www.youtube.com/watch?... [youtube.com]
Re: (Score:2)
*Pirate Voice* "They're more what you'd call Moore's Guidelines than an actual law"
Re: (Score:2)
*Pirate Voice* "They're more what you'd call Moore's Guidelines than an actual law"
Paraphrasing Eddie Izzard [wikipedia.org] from his show Dress to Kill [wikipedia.org]: Before he had worked it out, the Heimlich Maneuver was more of a gesture.
Re: (Score:2)
What? (Score:5, Insightful)
Re: (Score:3)
"In the meantime, developers will have to go back to actually coding efficiently and not just dumping terrible code into a 8 core processor."
Bingo, this can and will lead to greater gains than Moore's law. Of course you might have to have developers go back to developing and bring back competent admins... this is probably a good idea anyway since churning out buggy dev grade code directly into production with a handful of automated checks isn't a sustainable game plan for the long term.
Re: (Score:2)
Bill Gates called.....
Re: (Score:3)
Shhh... that is the scariest logic of all. All the suits will see are $$$'s everywhere, and in the meantime this crap is overtaking airline, energy, and rail-switching systems, and my favorite, entire hospital systems (I know, I'm helping deploy the crap). People do remember there were reasons admins slowed down dev upgrades and deployments? DevOps isn't about overcoming technical obstacles admins couldn't solve; it is about bypassing the controls admins put in place to protect the integrity, reliability, an
Re: (Score:2)
I'm pretty sure dumping terrible code into an 8 core processor is going to continue unabated.
Re:What? (Score:5, Insightful)
I should probably clarify what I'm talking about.
Writing terrible code is only marginally related to hardware performance. Sure, it helped it to flourish back when the modern mainstream software development culture was being founded, but if hardware stopped getting better today we would continue to see terrible, bloated software everywhere.
The biggest reason for terrible software is our software business culture. Software developers have to keep moving forward as quickly as possible in the short term both because that's the business culture and because software that gets out first tends to succeed regardless of how shitty it is (at least as long as it's not totally unusable). Once a piece of software has "won" market share in some space, it can get by with far less effort, only losing out after years to decades of terrible decisions and only if something consistently good is waiting in the wings to take its place. It additionally puts the business with the shitty but fast-to-market software in a position where it can root out competition while the competition is still weak, further strengthening its position. Thus, getting software to market fast has to take top priority for any successful business, and our software development culture has grown up around this primary force.
Bloat in particular is an easy quality hit to accept for development speed, since it tends to have less of a negative impact on adoption than other quality issues.
On top of that, the costs of bloat have almost always been paid by someone other than the party developing the software. With boxed software, the person buying the software pays the costs of bloat, not the person developing the software and making the sale. With cloud software, the person using the software online is paying for the client side of the bloat, which is often where most of the bloat is as a result of these forces.
Still, as long as the software is "good enough" and the bloat isn't so extreme as to make the software no longer "good enough", then bloat will not dissuade most consumers, and so it will continue, and our business culture will continue to amplify this problem.
Also, we're never going to reach an "end" to software development and finally have perfect software to put in place and call it a day. In addition to the constant changes to business requirements, software goes through cycles of fashion, leading to a never-ending treadmill of change. In addition to the internal fashion cycles that are likely more obvious to software developers (which could arguably be ended one day if the perfect solution came along), there's external fashion cycles driven by users (and designers). A piece of user-facing software that still works perfectly well can be seriously hurt by not keeping up with these external fashion cycles for long enough.
(Side note: assuming there is a perfect endpoint to software development, it would make this whole discussion irrelevant, since we'd be done with development once we wrote the perfect software and wouldn't have to go back and work on the perfect software. Diving into the perfect code would just be an intellectual exercise from there on out.)
As long as there's no end to software development, then we'll continue to see speed-to-market win out over quality. Even when things have topped out, those who can chase fashion trends the best will tend to be the most successful. After all, this is true for other mature technologies as well (clothing technology hasn't radically changed in decades and keeping ourselves clothed is trivial, yet people continue to spend money on new trendy clothing) except when regulation forces a higher level of quality (cars are not immune to fashion trends either, though the safety aspects of cars require them to conform more tightly around the current optimum).
It also doesn't help that there are so many people working in IT and development who probably shouldn't be there in the first place. It's relatively well-paying work and there's staggering demand for developers,
Re: (Score:2)
Most industries go through their exponential growth, followed by a maturing into much slower improvements, and sometimes getting replaced by something else entirely.
We still occasionally see better steel come out, despite the iron age being decidedly played out. Maybe ICs are on that path, and maybe it is not a terrible thing that computing power will plateau and some of our collective focus will move on to the next big thing (whatever that is).
Most folks have more computing power in their hands or on the
It has (Score:2)
I'd argue the last major breakthrough was the semiconductor. Everything since then has been incremental improvements to existing discoveries.
Re: (Score:2)
Some of the biology stuff is pretty cool.
What happens? We start writing better software (Score:4, Insightful)
More efficient, less bloated, & easier to follow (i.e., not a million methods in the way of getting to the essential function).
More seriously, I suspect that we'll continue to improve processes and innovate with new technologies - essentially keeping Moore's Law "True" (in the sense of getting twice the capabilities every 18 months or so).
Re: (Score:2)
Frankly this can't happen soon enough.
Though I suspect that some companies will simply start demanding essentially two computers in one, the first to be the actual brains, the second to run the UI.
Given how far GPUs have come, we might already be there.
Re: (Score:3)
Unfortunately, the trend seems to be heading more towards turning programming into a game of LEGO. You get building blocks and you put them together to eventually have something that looks kinda like what you intended. It's just that most of the blocks aren't quite the same shade, and they aren't nearly as standardized as LEGO blocks are, so they don't always fit well together.
Re: (Score:3)
I've been programming professionally for 35+ years - people have always done that, grabbing blocks and trying to get them to fit together. It's incredibly frustrating when you're trying to fix problems with their "code".
This is indicative of a bigger problem: the small number of people coding today who can actually program (which I think is about 25% of them).
Re: (Score:2)
Time to get back to basics: Write better software (Score:3)
Our current software stacks are incredibly bloated, compromising performance for ease of use.
If hardware is no longer magically getting better every year, then we need to again learn to squeeze more performance out of the hardware that's at hand.
Great minds think alike (Score:2)
The future will belong to those that write the best software - not just get something out there.
Moore's Law is long dead (Score:2)
Re: (Score:3)
Re: (Score:2)
Loosely, I think phones are about as powerful as a high-end desktop computer was a decade ago.
10 years ago would have been the start of the Core architecture, or more specifically Westmere [wikipedia.org]. So that could be a 10-core 3.5 GHz CPU, probably overclockable to 4.3 GHz. I highly doubt any current phone SoC can compete with that, but I'd love to see a benchmark.
I am still using an i7-4770K Haswell, launched in June of 2013, and it can play every modern game that I've tried on it without any problem at all. I would certainly like to see a benchmark of that against a Snapdragon 865. I would be very surprised if
Re: (Score:2)
Not that specific processor, and the comparison benchmarks are sadly limited, but interesting nonetheless: Snapdragon 8cx vs i5-8250U
https://www.windowscentral.com... [windowscentral.com]
Re: (Score:2)
The interesting thing is that intrinsic transistor performance HAS kept improving. FinFETs were a big leap in Ft compared to the essentially 2D structures used prior. Sadly, you still have to connect to them. Using the metric of Ft at the top metal (all the vias and associated parasitics included), the speed improvement of the transistor has been much more gradual. In fact, many of the new processes cannot run the transistors near their peak performance with an acceptable MTBF. A few novel papers have proposed basically p
Hope it happens soon! (Score:5, Informative)
There is waaaaay too much software that is written like absolute shit and eats an obscene amount of resources to do a simple job. Abstractions on top of abstractions and virtual machines on top of virtual machines have all led to abominations of programming like the Electron "platform". Maybe I'm just showing my age, but bragging that you cut your 130MB application down to 8.5MB is plain pathetic when proper applications capable of the same things are measured in KB.
The sooner people are confronted with limitations, the sooner someone will write better applications.
Re: (Score:2)
Re: (Score:2)
Companies don't like to hire programmers to optimize their code unless speed is something widely complained about by users, but yes, I think Electron apps are some of the absolute slowest. I tried some Electron-based HTML editor that was way too slow even in Linux. I think it was Atom. This is totally absurd for a text editor. Text editors ran just fine on my 486-33.
Re: (Score:2)
For that matter, text editors ran just fine on my first personal computer, a Tandy 286, and even on my first work computer, an IBM 8086. The main difference is that they were not WYSIWYG and only showed the display font.
Re: (Score:2)
Eroom's law (Moore's, backwards): software gets slower as hardware gets faster.
Here we go again.... (Score:5, Insightful)
I feel like I read this same thing over and over again, for at least 15 years.
Still, the price of a single transistor in an IC keeps going down more or less exactly as the law predicts. 8-core CPUs are now mainstream, soon 12. Gargantuan CPUs are built with as many as 50 billion transistors, and you can actually buy them without robbing a bank.
Please note that Moore's law says nothing about frequency or performance. Those are collaterals...
Re: (Score:2)
Please note that Moore's law says nothing about frequency or performance. Those are collaterals...
It also says nothing about how many cores are on a single chip. It is specifically about how many transistors are on a chip.
Re: (Score:2)
> more or less exactly as the law predicts
I mean, if you consider a margin of 25% "more or less exactly", then yes.
What does that even mean? (Score:2)
In what way is Moore's law impacting anyone other than PHBs and idiot journalists?
We'll use whatever capabilities are built into the chip, same as it ever was.
Death of Moore's Law (Score:5, Insightful)
What's amusing to me about Moore's Law is the never-ending string of failed predictions about its demise over the past several decades. I distinctly remember at 90nm there were plenty of pundits who thought we were at the absolute limit of physics, and predicted a hard wall then and there. Obviously, with modern chips now approaching 7nm or even 5nm, that death was called just a wee bit early.
In fairness, this article does admit it's more of a tapering-off than a "death". And I'm certainly not saying we won't ever hit either a hard or a practical (economic) limit. Physics tells us you can't shrink the dies forever, of course. And we've long since stopped seeing exponential single-core speed increases. But there may still be some techniques or even some other materials that will allow for the same sort of improvements, albeit in a slightly new direction.
Frankly, I'm not sure the death of Moore's Law would be a real tragedy. Personal computers and phones are already ridiculously overpowered for what people typically use them for, and plenty powerful even for some pretty CPU-intensive work. It might even be nice to buy a phone and expect it to last a decade, which is about how old my plenty-still-powerful PC is.
Re: (Score:2)
You are clearly younger than I am. I remember the exact same predictions when feature sizes were hitting one micron. And I'll bet some older engineers could say the same thing about ten micron feature sizes.
Obviously th
Re: (Score:2)
Arguably, Moore's Law already died 10 years ago. You can see that transistor counts have already slowed down: https://upload.wikimedia.org/w... [wikimedia.org]
If you're going on the original definition, transistors per IC, there's no fundamental wall. We can theoretically make the chips any size, but heat, signaling delays, production yield, etc., will fight you the whole time.
Doing so in a practical manner is harder. For the last few decades it's largely based on shrinking feature sizes.
The limit you mention at 90nm or
We have a long way to go before the jig is up (Score:2)
Number of transistors is one thing, but we are not going to hit a wall anytime soon:
1: We are using a brain-dead concept of a computer in the first place. A shift to a passive backplane architecture with compute boards for everyday machines will help everything, even if we can't get things going past a certain point.
2: Our biggest bottleneck right now is I/O. Figuring that one out will do a lot more for computing than a lot of other things.
3: Software is written like garbage. A lot doesn't even bother
Re: (Score:2)
It's really about cost (Score:2)
The feedback loop is cost-driven. People can afford more transistors because they cost less than they once did, so they buy more. It is not about any specific metric like transistor count, density, process, etc. Even if you hit a wall and stop investing in new technologies to enable the next node, cost can still continue to be driven down by other factors.
Moore's law to me is better reflected in the $60 SBC powerful enough to replace a desktop PC, or the 1TB SSD for less than $100.
Re: (Score:2)
I don't see it (Score:2)
A Moore-like law for batteries (Score:3)
We seem to be hovering around 120 $/kWh for cells and 150 $/kWh for packs. 75 $/kWh packs are total game changers for the transportation sector and for intermittent power sources like wind and solar. At that price it would be cheaper to build solar+wind+storage than to run fully paid-off natural gas power plants. kWh/kg might be competitive even in aviation.
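A back-of-envelope sketch of why the pack price dominates (the cycle life, depth of discharge, and efficiency below are illustrative assumptions, not figures from the comment above):

    // Spreading pack capital cost over every kWh the pack delivers in its life.
    #include <cstdio>

    int main() {
        double pack_cost  = 75.0;    // $/kWh of capacity (the "game changer" price)
        double cycles     = 3000.0;  // assumed full-cycle lifetime
        double dod        = 0.9;     // assumed usable depth of discharge
        double efficiency = 0.9;     // assumed round-trip efficiency

        double adder = pack_cost / (cycles * dod * efficiency);
        std::printf("Storage adder: ~$%.3f per kWh delivered\n", adder);  // ~$0.031
    }

At roughly three cents per delivered kWh under these assumptions, storage stops being the thing that breaks the solar+wind economics.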
Re: (Score:2)
other tricks (Score:3)
Engineers still have a few more rabbits to pull out of the physics hat: germanium can be clocked higher than silicon; trinary or quaternary arithmetic seems feasible on materials with larger band gaps; stacking increases gate count; and then there is quantum computing.
Not that all of those things will prove feasible, yet there is some hope for the future after shrinking feature size runs out.
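As a side note on the trinary idea, here is a sketch of balanced ternary (digits -1, 0, +1), the usual formulation for ternary hardware arithmetic -- purely illustrative, since the comment above doesn't specify an encoding:

    // Convert an integer to balanced ternary, least significant digit first.
    #include <cstdio>
    #include <vector>

    std::vector<int> to_balanced_ternary(int n) {
        std::vector<int> digits;
        while (n != 0) {
            int r = ((n % 3) + 3) % 3;   // remainder in {0, 1, 2}
            if (r == 2) r = -1;          // fold 2 into -1, carrying upward
            digits.push_back(r);
            n = (n - r) / 3;
        }
        return digits;
    }

    int main() {
        auto d = to_balanced_ternary(42);
        for (auto it = d.rbegin(); it != d.rend(); ++it)
            std::printf("%+d ", *it);    // prints: +1 -1 -1 -1 +0
        std::printf("\n");
    }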
Self-Configuring ASICs (Score:2)
https://hardware.slashdot.org/... [slashdot.org]
This is last century's programmable-ASIC great leap brought forward into the present. Whether it can be programmed by bots will determine if industry bites.
Aren't We, Though? (Score:2)
are we done with Corona? (Score:2)
Claims of Moore's Law death (Score:2)
seem to come back every two years, but it keeps going.
What does that even mean? (Score:2)
No other aspect of technology improves exponentially the way computers were doing, and yet we're all still alive.
Economics not Physics (Score:3)
Not every 2 years (Score:2)
not again (Score:2)
Every so many months, yet another new post about Moore's law coming to an end.
It's like the "year of the Linux desktop", but for CPUs.
Progress will not end (Score:2)
Moore's Law stopped translating into better performance more than a decade ago. Instead of using the vast growth of available transistors to give us faster, more complex cores, the CPU manufacturers simply started doubling the number of cores in a CPU for the same price, as if more cores automatically translated into more performance. And so we got exponential growth in cores without a real increase in performance for most real-world tasks. We
Re: (Score:2)
Well, at least that seems to be the implication.
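It does not, of course; the ceiling being described is Amdahl's law: if only a fraction p of a task can be parallelized, n cores give a speedup of 1 / ((1 - p) + p/n). A sketch with illustrative numbers:

    // Amdahl's law: the serial fraction caps the speedup from extra cores.
    #include <cstdio>

    double amdahl(double p, double n) { return 1.0 / ((1.0 - p) + p / n); }

    int main() {
        // Even a task that is 90% parallel tops out at 10x on infinite cores:
        for (double n : {2.0, 4.0, 8.0, 16.0, 1e9})
            std::printf("p=0.9, %g cores -> %.2fx\n", n, amdahl(0.9, n));
    }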
Re: (Score:2)
It pretty much was. Moore's law was regarded as a target by most of the industry. After the first years, the rate of improvement in processors was pretty much dictated by how much R&D money the big chip makers were willing to pour into it. Moore's law set a pace, and a lot of money was spent to maintain that pace.
Re: then double the chip area (Score:2)
Re: (Score:2)
widget versions for every language
You don't have fonts installed for your local language? No problem. We'll default to Mojibake.
Re: (Score:2)
Take a look at the actual numbers for the thickness of current technology including all the metal layers. Adding additional layers of active devices will in no way be the great boost you seem to think it will be. Those added layers mean more capacitance, more delay, and more heat. Flaw density will rise. Thermal problems, which already are an important limiting factor, will become much worse; diamond is only 5 times better than copper.
I understand your thinking that fiber optics gives you a workaround for s
Re: (Score:2)
Best comment evar. This alone will give us a boost in performance we couldn't dream about in the past. It will also weed the wannabes out of the profession.