Torvalds's Former Company Transmeta Acquired and Gone
desmondhaynes sends along a posting from the TechWatch blog detailing the sale of Transmeta (most recently discussed here). Linus moved ten time-zones west, from Finland to Santa Clara, CA, to join Transmeta in March 1997, before this community existed. Here is our discussion of the announcement of the Crusoe processor from 2000. Our earliest discussion of Transmeta was the 13th Slashdot story. "Transmeta, once a sparkling startup that set out to beat Intel and AMD in mobile computing, announced that it will be acquired by Novafora. The company's most famous employee, Linux inventor Linus Torvalds, kept the buzz and rumor mill about the company throughout its stealth phase alive and guaranteed a flashy technology announcement in early 2000. Almost nine years later Transmeta's journey is over." Update: 11/21 16:25 GMT by KD: It's not the 13th Slashdot story, only the 13th currently in the database. We lost the first 4 months at one point.
I refuse to believe.... (Score:3, Funny)
that something Linus worked on was a failure.
You mean he's human after all?
Oh the humanity.
Re: (Score:2)
Worked on?
More like he was hired to sit in an office and be their "star" power.
Re:I refuse to believe.... (Score:4, Informative)
Nothing could be further from the truth. Of the five major components of the Crusoe firmware -- the dynamic translator, interpreter, nucleus (mini-OS), virtual I/O, and out-of-line handlers ("microcode") -- Linus was the driving force, designer, and primary implementor of one (the interpreter). He eventually transitioned into an "advanced research" role, working on more "far out" projects.
You might find this link [uspto.gov] interesting.
Re: (Score:1)
Hey, at least it's not something Theo worked on. Then we'd see fireworks.
Re: (Score:1)
Very telling..... (Score:5, Insightful)
From the article:
Transmeta today announced that Novafora will acquire Transmeta and its assets for $255.6 million in cash.
Transmeta's cash, cash equivalents and short term investments at September 30, 2008 totaled $255.2 million.
So, the entire worth of the company's intellectual property was about $0.4M?
Layne
Re: (Score:2, Informative)
Probably offset against debt.
Re:Very telling..... (Score:5, Informative)
I'll just leave this [lightreading.com] here.
Re:Very telling..... (Score:4, Informative)
Re: (Score:1)
From the article:
Transmeta today announced that Novafora will acquire Transmeta and its assets for $255.6 million in cash.
Transmeta's cash, cash equivalents and short term investments at September 30, 2008 totaled $255.2 million.
So, the entire worth of the company's intellectual property was about $0.4M?
Layne
The excess $400,000 paid for Transmeta is not necessarily intellectual property. Under financial accounting, it is considered "goodwill."
Re: (Score:2)
It might be, if it can't be assigned to any other sort of intangible asset, but given that they aren't keeping the Transmeta name or anything like that and their main motive in buying the company is to use the technology in their own products, I would think it probably is intellectual property.
Re: (Score:1)
It might be, if it can't be assigned to any other sort of intangible asset, but given that they aren't keeping the Transmeta name or anything like that and their main motive in buying the company is to use the technology in their own products, I would think it probably is intellectual property.
From the Statement of Financial Accounting Standards No. 141 [fasb.org]: "The excess of the cost of an acquired entity over the net of the amounts assigned to assets acquired and liabilities assumed shall be recognized as an asset referred to as goodwill. An acquired intangible asset that does not meet the criteria in paragraph 39 shall be included in the amount recognized as goodwill."
Reading this myself, I see that I have made a mistake: the $400,000 figure is not necessarily goodwill either, as the article mentioned
Re: (Score:2)
That sounds about right. The adaptive compiler CPU idea was very intriguing (sort of like Hot Spot [sun.com] for x86 code) but nothing really useful seems to have come out of it.
I used to own a Crusoe-based laptop. It ran hot, and battery life was unimpressive. So where's the alleged benefit for this technology?
Re: (Score:1)
I have one lappy with a Transmeta CPU. Runs HORRIBLY slow, worse than the PII@233MHz I had at the time.
Battery life is nothing impressive either. Practically unusable machine.
Re: (Score:2, Informative)
What model did you have? I own a sony c1 picturebook with a first-gen crusoe. Very slow, but it got impressive battery life (at the time) and ran very cool. The entire unit had one fan about the size of a quarter which only ran when the cpu was maxed out. The rest of the time you could barely hear it idling. Of course the horrible hard drive (10x louder than the fan, slow, unreliable) more than made up for it...
And PII@233 sounds about right speed-wise.
I consider crusoe the perfect example of an idea t
Re: (Score:2)
Mine got stolen many years ago, but I think it was the same model as yours. I can't explain why yours works as designed and mine didn't, but my experience seems to be pretty typical.
I now own a Motion Computing tablet with a 1 GHz processor that runs very hot indeed. But it makes no noise at all. Or almost: if you put your ear right up to the air vent, you can just barely hear the fan. Can't hear the disk drive at all.
Got my sister a used Optiplex SX270 that's just as quiet.
Noise is primarily a matter of mec
Re: (Score:1)
I noticed that too... But not just intellectual property. Intellectual property and all of its assets!
"March 1997, before this community existed" (Score:2)
Whot?
Re: (Score:2)
Re: (Score:2, Informative)
Pretty sure they're talking about the Slashdot "community" -- Slashdot was founded in Sept 2007.
Re: (Score:2)
Re: (Score:1)
/. ?
THIS IS SPARTA!
Re: (Score:2)
Now just wait a minute. Just wait one minute here. Did we have some sort of temporal field anomaly? I could have sworn I was wasting time on Slashdot for years. Guess it's the Alzheimer's again. Or the coffee. Or maybe we can blame it on George Bush...
Re: (Score:2)
Ahaha, wow, I can't believe I did that. Yeah, that's supposed to be 1997. :p /headdesk
Re: (Score:3, Informative)
As with much on Slashdot, this is self-referential, i.e. "this community" = Slashdot, and if you take this frame of reference, "March 1997, before this community existed" is indeed correct [wikipedia.org].
Re:Anybody else think that... (Score:5, Insightful)
I'm not sure why that is ironic. Edison spent a lot of time failing. Ruth struck out a great many times.... this list can go on.
Now if he were a skydiver, that early failure might have put an end to the story, but still, no irony.
Re: (Score:2)
Re:Anybody else think that... (Score:5, Insightful)
How ironic.
Celebrate failures (Score:1)
Re: (Score:2)
Man, that's just so ironic.
:-P
Re:Anybody else think that... (Score:5, Funny)
Re: (Score:2)
maybe its meta (Score:1)
Re: (Score:2, Informative)
For anyone who's interested in Mr. Byrne slating the song here you go [youtube.com].
Comedians the world over must have kicked themselves when they first saw Ed's routine. A collective "D'oh! Why didn't I think of that!"
As Ed says, the only thing ironic about that song is that it was a song about irony written by someone who doesn't know what irony is.
Re: (Score:2)
Re: (Score:2)
While what you write is true to some extent, that doesn't make it a good thing or something to be supported. Evolution of a language over time is one thing; just plain getting it wrong and saying something you don't mean is another, and there comes a point where the errors are sufficiently misleading that you are no longer communicating effectively.
See also the current tendency, particularly from our friends in the US, to drop the word "not" and thus reverse the meaning of a sentence. I've seen businesses f
Define "wasted" (Score:5, Insightful)
If you count something as "wasted" just because it was a part of something that failed many years later, then virtually all of humanity's efforts are wasted in the long run.
E.g., what was the point of building cities and inventing civilization in Mesopotamia, since millennia later it fell to the Semitic populations, then to the Iranians (Indo-Europeans), and finally to the Arabs? Even Sumerian, the language of the first human civilization, soon was a dead language kept just for religious services and texts. (Much like what millennia later would happen to Latin.) Was Hammurabi's life wasted on working on that law code and construction and whatnot, since he worked for Babylon, which later got conquered by Assyria and today is just a bunch of ruins?
Was the life of every Roman that ever lived wasted, because their country would eventually implode and be conquered by a tribe as primitive as the Longobards?
Was Egypt all a big waste for that same reason?
Sometimes it makes sense to live in the present. It matters what you do now, not what will become of it in 10 years. What may make a difference in the long run is that you were one of the guys who tried and contributed a bit to the advancement of technology/culture/whatever, not whether you left some monumental legacy that will forever remain intact. Because if you're aiming for the latter, you might as well give up now, 'cause in the long run everything turns to dust.
Even the Great Lighthouse, or the Colossus of Rhodes, or whatever, eventually turned to little more than ruins or disappeared altogether. Was it a waste of someone's years to build them? Well, no, they served their purpose while they existed, _and_ more importantly humanity learned something new in the process. Even if it's how to stack a lot of bricks to build a f-ing huge lighthouse. The road to the mighty Gothic cathedrals of later ages, or to the Hagia Sophia, goes through such earlier achievements. Even if the grand monumental testament to someone's work is gone, their contribution to the species' knowledge lived on and accumulated.
Plus, in this case we're not even talking about some personal failure, but the failure of one company he worked for. Well, gee.
Re: (Score:1)
Re: (Score:2)
It's a point that needs to be driven home, and what's more, ought to be made in more books on human relationships.
Re: (Score:2, Informative)
Define "founding fathers".
Linus is a good programmer. There are several good programmers who could have written a kernel, especially the kind he wrote.
The GNU project was well underway when he started working on Linux, so he was not needed to found any revolution. Maybe adoption of free software would have been slower, but things would not be much worse without him.
news getting ahead of itself (Score:2)
The 13th Slashdot story? (Score:5, Funny)
Makes me feel old... oh wait I am. Crap.
Re: (Score:1)
They should have gone directly from story 12 to story 14. Being story 13 doomed Transmeta in the long run...
kinda sad (Score:2)
it's kinda sad. They tried. But the juggernauts ran them right over. Their technology was gee-whizzy and innovative. But they had a hard job getting anybody to buy into such a radical change.
Re: (Score:2)
"But they had a hard job getting anybody to buy into such a radical change."
They didn't offer any CPU/motherboard combos to leverage Linux community participation, so it is obvious they did not want that. Mobo/CPU combos would have gotten exposure that merely going B2B couldn't buy.
If your product is hardware your community can't buy, you cannot leverage their support very well.
Re: (Score:2)
Not only that, they kept the low-level VLIW (very long instruction word) interface to their chips a secret. I think, especially running Linux, that it would have given them a huge performance boost if you could run native VLIW-compiled code directly on the chip instead of going through the x86 emulation layer.
Re:kinda sad (Score:4, Informative)
But that was done on purpose, so they wouldn't hit the obvious wall that hurts all VLIW architectures: increasing IPC without changing the architecture, and without adding all the complex re-ordering logic seen in RISC-like superscalar processors. Once you get above one VLIW per clock, you have to throw the compiler's assumptions out the window, or you need to re-compile the code.
If you don't have to support the old architecture, you can change it to increase IPC without excessive overhead. This was the concept behind adding an interpreter layer between the chip and the OS. Of course, they didn't realize that they were trading one performance bugaboo for another: instead of making a bigger, more expensive chip, they sapped tons of performance doing x86 instruction translation and re-ordering in software. A lot of the time, their VLIW pipeline was only 50% filled.
Transmeta had the same problem Intel did with Itanium: with the exception of perfectly tailored code, the VLIW compiler couldn't keep processor resource utilization anywhere near 100%. Transmeta had one additional problem over Intel: their compiler had to work in REAL TIME, with a tiny 16 or 32MB buffer. It's no wonder they got toasted by the x86 market. Itanium, even with Intel backing, is on the way to a similar fate.
Re: (Score:3, Insightful)
But they had a hard job getting anybody to buy into such a radical change.
That's not too surprising, due to the disappointing fact that once their product finally hit the market, it wasn't significantly more efficient than its conventional competitors.
Re: (Score:2)
Re:kinda sad (Score:4, Informative)
Plus, working with small companies for such a vital part wasn't in Apple's interest. I think Apple learned its lesson working with Motorola. As big as it was, Motorola couldn't fulfill Apple's meager request for PowerPC chips, nor could it fund development of faster chips.
Re: (Score:2)
My company had two early Compaq tablet PCs, a TC1000 with a 1 GHz Transmeta CPU and a (by that time HP) TC1100 with a 1 GHz PIII. The PIII ran circles around the Transmeta (like, when you were waiting for it to turn your spoken words into text, it was 1-2 seconds versus 3-5) but the Transmeta didn't get significantly better battery life--both were good for about 4 hours in typical usage (which, at a conference, means taking notes, surfing, and playing FreeCell and Dots. [microsoft.com])
Did any of us seriously think it was going to work (Score:3, Interesting)
That a small startup could take on Intel in a serious way? Sure, you can make processors for some narrowly defined market that Intel might not be interested in pursuing. But at the time (this was before Pentium M and Centrino) Intel's mobile offerings were embarrassing, and Intel was hurting to push something out quickly that could solve the mobile problem. Even at that time laptops were considered the wave of the future, and I think we can safely assume that Intel and AMD both realized that the laptop market was only going to grow much larger.
Do you really jump in between Intel and AMD when they are both scrambling to come out with a solution first for a low power mobile chip with good performance? It didn't make sense to me then, and it doesn't make sense looking back on it.
Sorry to be so critical of Transmeta, but I really couldn't see them achieving anything more than Cyrix/VIA with the Crusoe architecture, as novel as it was.
The only thing that I thought might save them from the beating they received from Intel was the Efficeon. Having worked with product development for blades and modules, there are some serious power constraints in many of these products. And if you can get even a few more MIPS per Watt it can make the difference between being able to run an application or not. For application-oriented blades and modules (for example, Cisco NM, AIM and blades) the ability to have a little more oomph means you can offer more connections per blade or more features or do products that you could not do before. (afaik Cisco never used the Efficeon)
Re:Did any of us seriously think it was going to w (Score:1, Offtopic)
Re: (Score:2)
Yes. The web search/advertising market was very young; Yahoo!'s and MS's search engines sucked; their designs were fundamentally wrong for the direction the web was going; and they showed no indication that they were going to make any meaningful changes.
The CPU market was not young, Intel and AMD had decent products, and they were pouring resources into R&D.
Re: (Score:2)
Well, Yahoo was two students too. So it's not inconceivable now, is it?
Re: (Score:2)
Re:Did any of us seriously think it was going to w (Score:5, Informative)
That a small start up could take on Intel in a serious way?
Well, that wasn't what killed them. There are many stories of garage companies taking on the fat, lazy big boys and winning (Microsoft/Apple against IBM, for one).
What killed them was the Fundamentally Wrong Approach. They wanted to, in essence, make a "magic optimizer" that would take Intel instructions and convert them to run on a very simple, low-power device. The "magic optimizer" was left as an "exercise to the geniuses". The business plan for that consisted solely of hand waving. "Hey, we'll just hire smart people and let them figure it out."
Unfortunately, optimization is a notoriously difficult problem, and is really a subset of Strong A.I. No one programs in assembly language these days, so no one really understands how bad compilers are at producing code compared to human-optimized code. Computers are so fast and programmers are so expensive that we don't really care anymore.
Taking assembly and trying to translate/recompile it into another very-low-level assembly and do this on-the-fly without any time or performance penalty is a fool's game. It was never going to work. I could probably even dig up my posts on this subject way back when. :)
See also: VLIW processors, where the hardware guys fool themselves by saying, "the software guys will figure out how to compile to it."
No, not after the Pentium Pro (Score:5, Interesting)
RISC machines made sense before Intel figured out how to make x86 go faster than one instruction per clock. That happened with the Pentium Pro, which came out in 1995. (The Pentium II and III were basically the Pentium Pro architecture, shrunk down to a single die in a newer fab.) Transmeta didn't announce a product until 2000.
Before the Pentium Pro, RISC architectures seemed to be the way forward. The RISC designs could get down to one instruction per clock, and they weren't that hard to design, because all the hard cases were prohibited. I met the design team for one of the MIPS CPU parts, and it was about 15 people.
Intel took on the insanely hard problem of making a superscalar x86 CPU. All the awful things that can happen in x86 code had to be handled, and not only handled, handled fast. The internal complexity of the Pentium Pro/II/III is huge. It took a design team of 3000 people at peak to bring it off, and a huge transistor count in the CPU. Yet they did it. With that architecture, they could beat one instruction per clock, which blew away the whole rationale for nice, simple RISC machines. Transistors on the chip had become cheap enough that a CPU with 5.5 million transistors was commercially feasible.
Along with blowing away RISC, that technology blew away Transmeta. Transmeta had an OK idea, but they were five years too late.
Re: (Score:2)
x86 machine language is RISC, just compressed/encrypted.
Re: (Score:1)
You are technically right but I believe the grandparent was trying to get a funny moderation.
That's not what took the bloom off RISC (Score:3, Informative)
RISC machines made sense before Intel figured out to make x86 go faster than one instruction per clock.
That's not what made RISC fade into the background.
RISC was about tradeoffs: Do only very simple instructions and you can do them very fast with a small amount of logic (which makes you even faster). Then trade this for occasionally doing several instructions instead of one and you're still ahead.
The smaller machine also means you can move to the next, still faster, logic family while the yield is still
Re: (Score:2)
Are you saying that there are no superscalar RISC designs,
No. ... or that superscalar RISC chips don't count as RISC?
To some extent, yes.
While the name is in terms of the instruction set complexity, RISC is a package of design ideas, as I described. If a "superscalar RISC" machine buys its superscalar performance with increased gate count (and size, signal path length, and idle time percentage) it's starting to deviate from the definition and swing the tradeoffs in the direction of CISC. If it's done wel
Re: (Score:3, Insightful)
No. ... or that superscalar RISC chips don't count as RISC?
The problem is that going superscalar means enormous additional complexity. Pure RISC CPUs are simple; they're just executing the instructions as they come along. Going superscalar means translating the incoming instruction scheme into a different internal format using a different register system and pumping it through a set of pipelines, each doing different things, with a complex "retirement unit" at the end to deal with any conflicts after t
Re: (Score:2)
That's a good starting point, but you missed half the story.
The concept that RISC failed to stir into their soup is that latency is fundamentally asynchronous in general purpose computing. There are specialized floating-point kernels and such where latency can be successfully regimented into a synchronous model. The majority of customers with these requirements bought dedicated, specialized machines. It
Re: (Score:2)
Re: (Score:2)
See also: VLIW processors, where the hardware guys fool themselves by saying, "the software guys will figure out how to compile to it."
Q: How many software engineers does it take to change a lightbulb?
A: Isn't that a hardware problem?
Q: How many hardware engineers does it take to change a lightbulb?
A: Can't the software guys just code around it?
Re: (Score:2)
I will agree with you that the technology decisions, while interesting research, were perhaps not the best investment.
But my argument is that if you're a little guy and want to take on major players in a market, you need to attack their weak points long before the major players realize they have a weak point. When companies with huge resources are competing in an R&D-heavy sector, you don't want to jump in and try to compete in areas that the big players are aware of. They'll eat you for lunch.
Perhaps I'
Re: (Score:2)
But I have learned, rightly or wrongly, that being stealthy and going for the customers that are being ignored by more powerful players is a better strategy.
I agree with you there. I don't remember the exact quote, but I still remember shaking my head when Andreessen started shooting off his mouth taunting Microsoft back in the Netscape browser days. I think I posted at the time something to effect of, "A browser is not the most difficult piece of technology in the world. All he's doing is causing The Nav
Re: (Score:2)
Also, please back up your claim that a compiler generates worse code than a human. Provide example C code where your assembly is better than what gcc produces at O2.
Well, I was willing to maybe believe that you might know what you were talking about, until this... 'gcc' is a notoriously bad optimizer compared to commercial compilers (especially Intel's compiler). The advantage of 'gcc' is that it's common and portable, not that it's a good optimizer.
Compilers are actually extremely good. I'm pretty sure
Re: (Score:2)
I've programmed in assembly language. I still would not want to beat even GCC, unless I was competing in a scenario where I knew the compiler was unusually weak.
To beat a modern compiler, amongst other things, you have to be able to consider how the instructions will run in parallel and thus how to interleave them to keep the pipeline from stalling and memory access latencies hidden. You have to have memorized instruction sizes so you don't write crap like "movl $0, %eax" instead of "xorl %eax, %eax" and so m
Transmeta competed with Intel (Score:5, Informative)
and Intel ran them out of business like so many others.
Intel ran Cyrix and Centaur out of business, and both got bought out. Intel stopped NEC (remember the V20 CPU that replaced the 8088?), and almost ran VIA and AMD out of business.
Re: (Score:2, Insightful)
Re: (Score:2)
Re: (Score:3, Informative)
Re: (Score:2)
And is also owned by Intel and produced under the brand name XScale, though rights to the chips have also been sold to other companies.
Re: (Score:2)
Re: (Score:3, Informative)
Re: (Score:2)
please show me one single arm cpu for two thousand euros a piece.
there are lots of arm cpus being sold right now but they are dirt cheap.
Re: (Score:2)
Relevance?
Re: (Score:2)
uh, profit margins?
Re: (Score:2)
We were talking about whether ARM is successful; a product selling in such massive quantities is doubtless successful -- for it to be otherwise, they'd need to be selling at a price incapable of covering both fixed and marginal costs.
It's just business... (Score:1)
Re: (Score:3, Interesting)
i'm just curious why VIA hasn't been a major contender in the growing netbook & low power desktop market. haven't low power processors always been their specialty?
i think it'd be hard for any independent manufacturer to compete against AMD & Intel in the high-end market where the duopoly is firmly entrenched. however, many consumers are beginning to realize that they really don't need the latest quad core processor just to check e-mail and surf the web. i expect the trend towards low power desktops
Re: (Score:2)
i'm just curious why VIA hasn't been a major contender in the growing netbook & low power desktop market. haven't low power processors always been their specialty?
Linux drivers. Intel provides them, VIA does not, and Vista is not an option. In a panic they've started dumping out public specs and drivers in the last few weeks, now that the Atom is out and they're in danger of being made irrelevant.
-1, Uninformed (Score:2)
I'm just going to whack you with the information stick and leave you to synthesize clue.
1. The more copies of a model sold, the less each copy pays for engineering.
2. A new chip design costs six to eight figures (USD) to develop.
3. A new computer model costs a lot to develop and support, even if you're starting with a reference design from a chip vendor.
4. People in the first world, where such niche segmentation is most likely to fly, care about run-time. Energy efficiency is irrelevant.
5. People in the fi
Re: (Score:2)
lol. i'm hoping that post wasn't meant to be serious.
sorry, i prefer to write in organized, coherent paragraphs rather than disjointed lists with no logical structure.
in any case, assuming you're right and no one cares about energy efficiency, people still care about performance. an energy-efficient system will run coo
Um... (Score:1, Redundant)
Archives go back to December 31, 1997 [slashdot.org] but the site itself goes back to September. [wikipedia.org] So I don't think that was the real 13th story.
Re: (Score:2)
Well, if enough of them were dupes, it could have been the 13th story. :-)
No Comments (Score:1)
Catapostrophic (Score:2)
Linus Torvalds formerly owned a company.
Linus Torvalds' former company was acquired.
His former EMPLOYER, not "his" company... (Score:1)
Wasn't there enough room to say that in the subject field?
What killed Transmeta (Score:5, Informative)
What killed Transmeta was a few things:
Transmeta felt they were taking too many risks on the software side, and adopted a hyper-conservative culture on the hardware side. The result ended up being both late and below target. All the software optimizations in the world could not help push more operations down the pipe than it could actually perform.
As time went on, the cost of x86 decode and scheduling in hardware went down, and the cost of memory performance -- caching systems, and so on -- went up. The VLIW instruction set consumed more icache than the native x86 instruction set.
The best design in the world doesn't help if your fab partner doesn't deliver on their own design rules.
So where is Linus now? (Score:1)
This may be a stupid question, but, where does Linus work now?
-- thanks, Dave
Re: (Score:2, Funny)
"morong" at least he can spell correctly. :)
Re: (Score:2, Redundant)
I loved that too :D
Re: (Score:2, Funny)
Re: (Score:2)
"Maroon." Thus Spoke Bugs Bunny.
Re: (Score:1, Troll)
He said "kykes." Have we not been keeping up our ethnic slur vocabulary?