Ternary Computing Revisited 134
Black Acid writes: "American Scientist's Third Base was a nice introduction to the advantages of base 3 but didn't really explain ternary computing. Since 1995, Steve Grubb has maintained trinary.cc which covers many aspects of computing with base 3. Not only are the basic unary and binary gates enumerated, which I independently verified as being basic building blocks, but real-world circuits are described as well. Half and full adders, multiplexers and demultiplexers, counters, shift registers, and even the legendary flip-flap-flop are all covered with ternary algebra equations and schematics. Steve Grubb elegantly touches on the problems of interfacing to binary computers, although no schematics are given. Perhaps most impressive are the Transistor Models - schematics of the basic gates which can be built from cheap parts available at your local electronic component store."
on, off, ? (Score:4, Funny)
Re:on, off, ? (Score:1)
Re:on, off, ? (Score:1)
Is the bit dead? You'll never know till you open the hard drive!
Re:on, off, ? (Score:2, Informative)
Re:on, off, ? (Score:5, Funny)
Re:on, off, ? (Score:1)
No no, it's not a state of electricity, it's the rest state of all the Anonymous Cowards from around here.
Re:on, off, ? (Score:3, Informative)
break the shackles of your binary programming! (Score:5, Informative)
However, in stodgy old binary, the levels are typically something like 0 Volts (i.e. "off") and 5 Volts (or 3.5 Volts). A "typical" ternary system would add a negative voltage, like -5V (or -3.5V), since that's easier to detect reliably than an intermediate positive voltage value.
So to answer your question, yes a "third state of electricity" is used, one which was previously being ignored in binary circuits. Instead of on, off, and dim, think of positive, off, and negative.
Re:break the shackles of your binary programming! (Score:2)
This set of articles uses current-mode devices with "forwards", "backwards", and "off" as the current levels. Similar idea, and it makes some aspects of the design simpler.
Re:break the shackles of your binary programming! (Score:1)
If the third state is implemented as a negative voltage, then I suggest that the three states are named: ON, OFF and NO.
Because NO is the reverse of ON.
Re:break the shackles of your binary programming! (Score:1)
Re:on, off, ? (Score:1)
My interpretation of the Lucas switch: Dim, Flicker, Short (after which you let all of the smoke out of the wire, which causes it to quit working)
For those of you that have tinkered with British or Italian cars, you have dealt with this before (or will soon).
For those of you who haven't, Lucas is the company that supplied/supplies the British automotive industry with electrical components (switches, relays, etc.) The early design of their electrical connectors left much to be desired. Ironically, this same company produces some of the best brakes and other hydraulic components in the industry (Lucas-Girling).
Re:on, off, ? (Score:1)
The site is interesting; Mr. Grubb has definitely put some thought into it.
The one issue that I find problematic is the requirement for bipolar transistors to get a +V and a -V. CMOS uses little power compared to bipolar. A trinary chip would have to be based on bipolar technology which would require more power and be less dense. Also, when laying out a chip one now has to worry about two power supplies instead of one: scratch one layer of metal. Now it's harder to route and even less dense.
So there might be a use for trinary logic in chips that are already bipolar for some reason. A smidgen of trinary logic latched onto a mostly analog chip of some sort? I don't think they have a chance against current CMOS binary chips.
If a... (Score:2, Funny)
If a Bit is short for Binary Digit...
Does that mean that a Ternary digit is a 'Tart'? Do 8 Tarts make a 'Tight'?
We could be having MegaTights of Ram, and GigaTights (or even TeraTights) of disk!
Tony
Re:If a... (Score:1)
Oops.... Spelling...
Tight should be spelled 'Tyte'!
Re:If a... (Score:1)
That would allow for Megatits and Teratits, and let the nerds guffaw around the water cooler more often.
Re:If a... (Score:1)
a Ternary Digit would be a
Terd. (Turd?)
I knew it! Ternary computing is full of shit.
Re:If a... (Score:1)
Re:If a... (Score:2)
Actually, 'bit' is short for binary digit.
That would make the shortform for ternary digits TITS.
Which would then lead to TYTES, KILOTYTES, MEGATYTES, etc
Of course, it would be better to have something that doesn't sound so alike
How about TET and TEET? TEET being the single ternary digit and TET being a logical grouping of 3^3 = 27 TEETs? The only logic there being that it's easier to say MEGATET than MEGATEET.
Re:If a... (Score:1)
Does that mean that a Ternary digit is a 'Tart'?
I was always partial to 'trit' or 'trin'.

Just out of curiosity, what do any Mage players think of this? A dream come true?
Re:If a... (Score:2)
No, I think it's better called Trinary. Tertiary would be a more appropriate name if base 2 were instead called Secondary. If trinary is used, then you've got Trinary Digits, or trits.
However, the real question is when they get to base-4 computing, will they call it quits?
Trits? (Score:1)
Hmm.. at my fictional computer company (Zapitron -- an in-joke among my friends), we always called 'em trits, trytes (9 trits), and words (27 trits). But now that everybody's moving from those old 27-trit computers to the new-fangled 81-trit machines, "word" might get redefined.
BTW, one of the neat things about Zapitron computers is that unlike conventional computers with their software emulated null devices, Zapitron machines have hardware null devices, so that you can burst-read EOFs at billions of EOFs per second. Ain't innovation great?
Re:Trits? (Score:1)
So in trinary, you couldn't go in hex halves, you'd have to go by base-9 thirds. Would a third of a tryte perhaps then be a "katz" ? No, we want a diminutive, not a superlative... Hackneyed is too long... a "stale"? well, that does it for my thesaurus. Suggestions?
Re:Trits? (Score:1)
The trouble with... (Score:1)
So maybe a Ternary Bit would be a Trit.
and a Ternary Byte would be a Trite.
And a Nibble, of course...
Hobbes
0, 1, 2 ? (Score:1)
Just wondering.
Re:0, 1, 2 ? (Score:1)
in base 3 which has the digits 0,1 and 2.
Re:0, 1, 2 ? vs. -1, 0, 1 (Score:1)
The example given is 19, which in 0, 1, 2 is expressed as 201 [ (2* 3^2) + (0* 3^1) + (1* 3^0) ] = 18 + 0 + 1 = 19.
In -1, 0, 1 this is represented as 1t01 [ (1* 3^3) + (-1* 3^2) + (0* 3^1) + (1* 3^0) ] = 27 - 9 + 0 + 1 = 19.
Although it does often take more digits to represent the number via the second method, the balanced system does have many advantages. Without getting too deeply involved, the important one is that you have one system for both positive and negative numbers, thus eliminating the need for an explicit sign.
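For anyone who wants to play with this, here's a minimal sketch of the conversion into balanced ternary (the function name and the 't'-for-minus-one notation are mine, following the convention used above):

```python
def to_balanced_ternary(n):
    """Encode an integer with digits -1, 0, +1, printed as 't', '0', '1'."""
    if n == 0:
        return "0"
    digits = []
    while n != 0:
        r = n % 3              # remainder in {0, 1, 2}
        if r == 2:             # a 2 becomes -1 plus a carry into the next place
            r = -1
            n += 1
        digits.append({-1: "t", 0: "0", 1: "1"}[r])
        n //= 3
    return "".join(reversed(digits))

print(to_balanced_ternary(19))  # -> 1t01, i.e. 27 - 9 + 0 + 1
```

Note there's no separate sign handling anywhere: negative inputs just come out with a leading 't'.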
I hope this basic summary of that previous article helps.
Re:0, 1, 2 ? (Score:1)
The way these digits are represented in the computer is unimportant, and shouldn't be confused with the way we represent them on paper.
Re:0, 1, 2 ? (Score:1)
Duh! let me guess now...
Well, thanks. Actually, I'm quite familiar with number bases.
I was only wondering if anyone had any alternative thoughts on the subject - which one of the posters above did.
Re:0, 1, 2 ? (Score:1)
Didn't the ENIAC use denary to represent numerical values internally? Or was it another computer of that era?
Re:0, 1, 2 ? (Score:1)
For example: t010 (writing t for -1)
= [(-1) * 27] + [0 * 9] + [1 * 3] + [0 * 1] OR
= [(-1) * 3^3] + [0 * 3^2] + [1 * 3^1] + [0 * 3^0]
= [-27] + [0] + [3] + [0]
= -24
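Decoding goes the other way; a quick sketch (again writing 't' for -1, name is mine):

```python
def from_balanced_ternary(s):
    """Evaluate a balanced-ternary string, with 't' standing for -1."""
    value = 0
    for ch in s:
        value = value * 3 + {"t": -1, "0": 0, "1": 1}[ch]
    return value

print(from_balanced_ternary("t010"))  # (-1)*27 + 0*9 + 1*3 + 0*1 = -24
```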
"no such thing as two." (Score:1)
"It was just a dream, Bender. There's no such thing as two." -Fry
Why Tri why not just go Analog ? (Score:4, Interesting)
Digital computing gained popularity for many reasons: it was cost-effective to build and easy to program. With the state of current electronics this is no longer necessarily the case, but here we are.
Analog computing has many advantages over digital computing, especially in the AI arena, since there can never be a digital concept of infinity.
Rockets in the beginning were put into orbit using ANALOG computers, and there is a reason: accuracy to the nth degree.
I played around with analog computing in the '70s and early '80s - cool stuff, if more had been available; fact was, everyone was happy with their 8-bit PC.
Trinary computing sounds a little like taking something that was settled in the first place and resettling it again.
I mean, come on, isn't the goal of computing to have a supercomputer take control of our national defense grid when it becomes sentient?
Re:Why Tri why not just go Analog ? (Score:4, Informative)
This theory works great, except in the real world. (Score:2, Interesting)
Evidently we need to optimize some joint measure of a number's width (how many digits it has) and its depth (how many different symbols can occupy each digit position). An obvious strategy is to minimize the product of these two quantities. In other words, if r is the radix and w is the width in digits, we want to minimize rw while holding r^w constant.
This may be an "obvious" strategy, but is it a useful one? A modern computer typically contains hundreds of millions of digits in base two. According to this theory, the cost of a computer (ie, the value we are trying to minimize) is equal to the radix times the width. If this is true, we can reverse the radix and the width to get a system that has precisely the same cost: thus, a machine that stores one hundred million digits in base two costs the same as a machine that stores two digits in base one hundred million, because two times one hundred million equals one hundred million times two.
In practice, building an electronic computer capable of distinguishing between one hundred million distinct voltage levels is a practical impossibility. Early attempts to build machines that had just ten distinct voltage levels were abandoned, not because of any theoretical arguments about data density, but because these devices turned out to be extremely difficult to manufacture and notoriously unreliable in operation. A computer with one hundred million distinct voltage levels, if it could be built at all, would certainly cost several million dollars to construct, and it would probably require a special power supply and several pounds of electromagnetic shielding. It would certainly not "cost the same" as a typical desktop computer.
Even if we were to ignore the absurdity of the basic premise of the theory, and take for granted that the trinary computer is better than binary in some abstract way, there is still no compelling reason to switch. We have already invested billions of dollars into binary technology, and the benefits of that investment are undeniable. If you think companies like Sun and Apple have a hard time selling theoretically superior hardware in a market dominated by cheap PC clones, imagine how much harder it would be to introduce a computer that is so fundamentally incompatible that it does not even work with binary data. The dominance of the Windows platform proves that people don't want theoretical perfection: they want something that gets the job done, they want it to be cheap, and they want it now.
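For what it's worth, the r*w figure of merit quoted above is easy to tabulate. Under that (questionable) cost model the cheapest integer radix does come out as 3, with the continuous optimum at e ≈ 2.718 - a sketch:

```python
import math

N = 10.0 ** 6  # number of representable states held constant (arbitrary)

# cost = radix * width, where width is chosen so that radix**width == N
cost = {r: r * math.log(N) / math.log(r) for r in range(2, 11)}
for r in sorted(cost):
    print(f"radix {r:2d}: width {cost[r] / r:6.2f} digits, r*w = {cost[r]:7.2f}")

best = min(cost, key=cost.get)
print("cheapest integer radix under this model:", best)  # -> 3
```

None of which rescues the model from the manufacturing objections above, of course - it only shows where the "base 3 is optimal" claim comes from.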
Re:Why Tri why not just go Analog ? (Score:1)
My understanding of this is that e is optimal only if the cost of supporting more signal levels grows linearly with the number of signal levels represented. Somehow, I really don't think that this is the case. I'm more inclined to believe that there is a step increase between 2 and 3.
Re:Why Tri why not just go Analog ? (Score:3, Informative)
I dissected their inverter circuit in a different post -- in short, it won't work for the intermediate level, and in fact closely resembles a primitive ancestor of TTL binary.
This is not to say that higher bases are always and everywhere a bad idea in electronics, just that you need to be cautious when taking designs from someone who hopes someone else will build them... Transistors are becoming much cheaper than wires, and higher bases really save on wires. So does time-division multiplexing (e.g., sending the bits twice as fast on half as many wires), and at this point we better know how to do this, and can make it work more reliably at lower cost as compared to trinary. Eventually, multiplexing will hit some sort of practical speed limit, and then sending multi-level signals may be cost-effective. I just don't see any particular reason to stop at 3.
Re:Why Tri why not just go Analog ? (Score:1)
Their inverter seems more related to current tristate outputs (i.e. zero, one, off) than trinary, but you might want to check the RSFQ tristate logic [sunysb.edu]. Seems the links for more info are down at the moment, though.
Re:Why Tri why not just go Analog ? (Score:2)
Re:Why Tri why not just go Analog ? (Score:1)
(Circuit tricks that depend on transistor gain, but NOT equal gains, to set the switching thresholds and approximate a connection to ground are not, IMO, usable as a real circuit, even though a 'simulator' will work fine.)
The best I could come up with out of their circuits was 'rotate down' - voltage switched positive and negative output, and although it's theoretically a complementary emitter follower, zero is open (to a resistor) zero output - add a saturating transistor between the output bases, and you've got a tristate. But you are correct, this is not the inverter that I was looking at.
Meanwhile, the link I suggested is still applicable for naturally tristate circuitry - josephson junctions and magnetic flux work equally well for switching between positive, negative, and zero with no additional components.
Re:Why Tri why not just go Analog ? (Score:2)
In short, analog computers are fast and accurate for small calculations under controlled conditions. Digital computers are better for difficult calculations (Despite what you'd think, calculating orbits is not a difficult calculation by modern standards) and under arbitrary conditions, a digital signal is less prone to error.
Re:Why Tri why not just go Analog ? (Score:1)
It's *very* easy to do differential and integral equations in analog circuits. It's much harder to do them digitally.
Orbital and aerospace/ballistic calculations involve lots of integro-differential equations.
FWIW, there's a (relatively) cheap source of analog computers available still - analog synthesizers. They can be interlinked and interchanged with analog computers, and the demands of music make sure they're fairly accurate. Wouldn't trust the space shuttle to an MOTM synth, but then again, I'd just use a binary computer.
Re:Why Tri why not just go Analog ? (Score:1)
Was analog. They spent 10 years writing software for it, and by the time they had gotten guidance modules written for it, they could buy them from Westinghouse in digital binary for a hundredth of the cost.
Lordbyron
www.wylywade.com
Re:Why Tri why not just go Analog ? (Score:1)
How on earth do you create equipment to handle or even create this value of infinity?
Binary computer, touch CMOS chip, it dies
Analog computer, touch chip, you die
personally I would prefer a computer that I could kill and not the other way around.
Seriously though.
IANAEE (I Am Not an Electrical Engineer), but I don't see any way in which you could generate or store infinite voltages; infinite currents, however, could be manipulated if the computer were made out of superconductive materials.
With the current state of superconductor technology, the cooling rig for a machine like that would truly be a beast.
Adapting crypto to ternary computing (Score:4, Interesting)
One of the nice things about the Rijndael crypto algorithm is that, because of its "wide trail strategy" design, it is easy to adapt to different environments, including ternary computing.
I am sure that a variant of Rijndael which does everything in "trits" instead of "bits" would have the same security features as the current Rijndael algorithm. The only thing that would have to be re-invented is the sbox. The rest (changing the Galois field to a base-3 instead of a base-2 Galois field, and changing the MDS matrix used) could simply be adapted.
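As a toy illustration (my own names, and a deliberate simplification - real Rijndael key mixing is XOR over GF(2^8)): the trit analogue of XOR key mixing would be digit-wise addition mod 3, which unlike XOR is not its own inverse, so unmixing needs subtraction:

```python
def trit_add(a, b):
    """Digit-wise addition mod 3: the trit analogue of XOR key mixing."""
    return [(x + y) % 3 for x, y in zip(a, b)]

def trit_sub(a, b):
    """Inverse mixing; unlike XOR, mod-3 addition is not self-inverse."""
    return [(x - y) % 3 for x, y in zip(a, b)]

state = [1, 2, 0, 1]
key = [2, 2, 1, 0]
mixed = trit_add(state, key)
print(mixed)                          # -> [0, 1, 1, 1]
print(trit_sub(mixed, key) == state)  # -> True
```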
- Sam
Not really a problem. (Score:2)
Second, separate theory from implementation, please. Very few areas of information theory and cryptography are dependent on base-2. That'd be counter to the entire point of math, which is to think abstractly enough that the principles apply to any base. Your statement is sort of like saying "moving from base 10 to base 16 is hard, since we learn arithmetic in base 10 and we'd have to relearn all our arithmetic". It's just not the case. A + B = B + A, no matter what base you use. And likewise, the integer factorization problem and the discrete logarithm problem are damn hard no matter what base you use.
Implementations are highly dependent on binary systems, yeah--but that's only because we only have binary computers right now. As soon as someone comes up with a ternary computer, rest assured, Blowfish and 3DES and RSA and El Gamal and AES and all sorts of crypto goodness will be running on it in no time flat.
Think about this one for a moment. Computers are Turing machines. We write in Turing-complete languages.
But there's nothing in the definition of a Turing machine which requires that it be binary, trinary, or base radical two. The Turing machine doesn't care.
3rd base? or base 3? (Score:1, Offtopic)
- Where's the Chips !?! (Score:2, Informative)
Easy answer: Signal to Noise ratio renders ternary logic useless. Either it comes at a slower speed than binary logic or at higher power consumption.
In addition, the site design doesn't make it look very credible.
Re:- Where's the Chips !?! (Score:1)
Signal to Noise ratio renders ternary logic useless.
The signal-to-noise ratio is similar for both binary and this type of trinary system, if they are using similar hardware. The three states of the trinary system given here are 0 (-x V), 1 (0 V), and 2 (+x V). In other words, the circuitry in both cases still uses the same voltages, and thus the same resolution. The only addition is current direction, which is just a modification of normal binary logic circuits.
the site design doesnt make it look very credible
So if someone's site doesn't look fancy and professional, then their ideas are no good? Perhaps the guy who made the site was too busy coming up with good ideas to have time to make a fancy frames-and-flash site. I think Steve Grubb did a great job with the site. It's simple and direct, and he has tons of examples and tables to show you how the concept works. He also has a tutorial which brings you through the ideas gradually, culminating with the actual circuit designs so you can build your own versions of his ideas. What's not to like about it other than the fact that it is plain?
Re:- Where's the Chips !?! (Score:1)
The signal-to-noise ratio is similar for both binary and this type of trinary system, if they are using similar hardware. The three states of the trinary system given here are 0 (-x V), 1 (0 V), and 2 (+x V).
You said it. Binary logic requires 0V and +Vcc; ternary requires -Vcc, 0V and +Vcc. That's twice the voltage range. Hence the power usage (almost, due to a different switching scheme) quadruples for the same SNR and speed.
In addition, ternary logic will be much more sensitive to process variations. The logic will be A LOT more unreliable than binary logic, not to speak of initial production yield.
So if someone's site doesn't look fancy and professional, then their ideas are no good?
No - he has too much graphics. Using HTML properly (e.g. as _markup_ and not a _layout_ language) would have been fine. Even better would have been a TeXed PDF/PS paper.
Re:- Where's the Chips !?! (Score:2)
We are already dealing with circuits that have a voltage swing of 250mV. If I understand this site (which you have pointed out already as less than credible), then we could say 0 = 0v, 1 = 250mV and 2 = 500mV.
I don't think that ternary logic is useless. But its usefulness is limited. I think it will be most useful in high speed serial systems. Infiniband comes to mind. (Though it wouldn't be implemented there now.)
I don't agree with your assessment of slower than binary; however, it will cost more power, especially if the first generation of designs relied on more comparators.
Re:- Where's the Chips !?! (Score:1)
Exactly, that's four times the power requirement. (See above.) Not to speak of the problems of building proper comparators for this voltage range.
Re:- Where's the Chips !?! (Score:2)
Isn't that exactly what I said? "...however, it will cost more power. Especially if the first generation of designs relied on more comparators."
Re:- Where's the Chips !?! (Score:1)
Isn't this great? I agree with you! ;) (OK, sorry, I was too quick while reading.)
About the speed issue: the higher power requirements are generally dynamic power - hence it scales with clock speed. Therefore you would have to use a slower clock to achieve the same logic/power density as with binary logic. Since the power/area ratio is limited, you would have to go with a lower clock speed for the same manufacturing process.
An addition: I presume that implementing a ternary logic gate in CMOS will eat almost as many transistors as an equivalent gate for two binary bits. The only advantage I can see for ternary logic is having ~30% fewer interconnections. That's not a big one...
Since 1995? (Score:1)
Re:Since 1995? (Score:2, Informative)
-Steve grubb
Getting to third base... (Score:1)
Okay god that was bad
Dude. Totally. (Score:4, Funny)
"So I call one up and we agree to meet out at Woody's on 4th. And this PC never shows up. So I just keep drinking, waiting for it. I musta had too many, because all I know is that I woke up in this apartment on 84th with these three midgets screaming at me in Portuguese."
"Damn unreliable computers."
Something already use ternary signalling (Score:5, Informative)
Naturally, such systems get enhanced so they can send more data at the cost of a little harmonic purity. For example, they could get 50% more data through by using pairs of trits to send 3 binary bits. The 9th state would be used to prevent leaving the line at one voltage level too long. The real encodings are better behaved in the analog domain, and therefore more complex, but lookup tables for the trit-to-binary conversions take very little silicon.
For those who haven't memorized powers of three, if trinary logic, memory or signalling works better in some situation, 1 trit holds 1 bit, 2 trits hold 3 bits, 12 trits hold 19 bits, 31 trits hold 49 bits, etc...
Going the reverse is also very simple. If you have an algorithm that works better in trinary, store 1 trit in 2 bits, 3 trits in 5 bits, 5 trits in 8 bits, etc... You don't need special hardware.
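All of these sizes follow from log2(3) ≈ 1.585 bits of information per trit; a quick sketch that reproduces the figures above (helper names are mine):

```python
import math

LOG2_3 = math.log2(3)  # ~1.585 bits of information per trit

def bits_for_trits(w):
    """Smallest b with 2**b >= 3**w, i.e. storing w trits in b bits."""
    return math.ceil(w * LOG2_3)

def trits_for_bits(b):
    """Smallest w with 3**w >= 2**b, i.e. storing b bits in w trits."""
    return math.ceil(b / LOG2_3)

print([trits_for_bits(b) for b in (1, 3, 19, 49)])  # -> [1, 2, 12, 31]
print([bits_for_trits(w) for w in (1, 3, 5)])       # -> [2, 5, 8]
```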
Cyclic Groups (Score:1)
I have been trying to implement a Number Theoretic Transform based multiplication system, and not being able to choose a power of two as my modulus is causing a major speed loss. Even though switching to a ternary base (in hardware) would not lead to more than a linear speed difference, having low-level functions run two to eight times faster (and doing away with the ugly code used to reduce modulo a prime) is nice.
Trinary consumes more power, not less. (Score:1)
However, this is *no different* than power supplies of 0, +3v, and +6v. Shifting your voltage reference does *not* change the power consumed.
Trinary also does not change your noise margins. Noise margins are a function of the actual circuit, and in an ideal world the noise margins on the high and low sides would be at half the representation voltage. 2.5 volts for 5-volt TTL logic, for example. 1.5 volts for 6-volt trinary logic, for example. See the noise margins decrease? You would have to increase the voltage range to 10 volts to maintain the noise margins. And therefore you would increase the power consumed. And power rises as the *square* of the voltage.
So let's see. Assuming your gates are feeding 1k resistors, binary logic would dissipate either 0mW (for a 0) or 25mW (for a 1). Assuming a uniformly random distribution of bits, each bit would consume an average of 12.5mW.
For trinary, maintaining the noise margins, a 0 would dissipate 0mW, a 1 25mW, and a 2 100mW - an average of 41.7mW. Since each "trit" conveys about 3/2 as much information as a bit, the equivalent power consumption per bit for trinary logic is 27.8mW.
So trinary logic consumes more power per bit than binary logic.
Bummer.
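The parent's arithmetic checks out if you read the figures as milliwatts dissipated in the load resistor (P = V^2/R); a quick check (variable names are mine):

```python
R = 1000.0  # ohms, the 1k load from the parent post

def mw(volts):
    """Power dissipated in the load, in milliwatts (P = V^2 / R)."""
    return volts ** 2 / R * 1000

binary_avg = (mw(0) + mw(5)) / 2             # 12.5 mW average per bit
trinary_avg = (mw(0) + mw(5) + mw(10)) / 3   # ~41.7 mW average per trit
per_bit = trinary_avg / 1.5                  # parent's 3/2 information ratio
print(binary_avg, round(trinary_avg, 1), round(per_bit, 1))  # 12.5 41.7 27.8
```

Using the exact log2(3) ≈ 1.585 bits per trit instead of the 3/2 approximation gives about 26.3mW per bit - still more than double binary's 12.5mW, so the conclusion stands.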
Re:Trinary consumes more power, not less. (Score:1)
Show me a trinary Schottky (Score:3, Insightful)
Electrically, implementation is inevitably binary, at its core... electrical comparisons of boundary conditions. "Trinary" is just a minimal case of "analog", with all of the same disadvantages.
You want the same noise margins? You'll have to double your voltage. That means you're cutting your speed in half. So overall you're taking a loss, because at half speed you could have gotten two whole bits for your money instead of one lousy trit.
Not to mention the fact that you're using more power, switching between these trinary states due to the longer transition and detection times. Oh boy! Hotter chips! Bleah!
Re:Show me a trinary Schottky (Score:1)
True, but that's what clocks or asynchronous handshakes are for - because even with only two levels, there's a lot of 'Wrong' state in between.
Admittedly, it might not be possible to define a 'gray code' signalling scheme using trinary digits, but a trinary clock signal ( -1, 0, +1, 0, -1) etc, could have definite benefits - quad speed data rate, data in trits would be effectively a 6X interface...
Electrically, implementation is inevitably binary, at its core... electrical comparisons of boundary conditions. "Trinary" is just a minimal case of "analog", with all of the same disadvantages.
Say what?? Where'd this come from? Binary itself is still a minimal case of analog - just with a single breakpoint instead of two with trinary.
You want the same noise margins? You'll have to double your voltage. That means you're cutting your speed in half. So overall you're taking a loss, because at half speed you could have gotten two whole bits for your money instead of one lousy trit.
This depends very much on the implementation, but it is NOT necessarily cutting the speed in half. Assuming that the circuit has normal RC time constants, double the swing is NOT double the lag (it's about 1.4 times the lag). This might not be significant in clocked circuits, and at worst, the loss is about equal to the gain achieved by trits over bits.
Not to mention the fact that you're using more power, switching between these trinary states due to the longer transition and detection times.
Here you may be partly correct.
Assuming that there is a separate power supply connection for each signal state, the leakage in the drive transistors is probably going to be about double that of a normal binary output. The active power using a positive-zero-negative type signalling scheme should work out similar, since the voltage relative to ground won't be increased. The dynamic power would probably be comparable despite the larger swing since only 1/3 of the transitions involve the full possible swing while carrying 50% more information. And as I noted in the previous point, the transition (and detection) times are also comparable on a 'amount of data transferred' basis.
Oh boy! Hotter chips! Bleah!
All things considered, it looks like the heat-to-computing-power ratio is going to be similar for both. But if there truly are algorithms or applications that are more easily rendered using trits (and there may well be), then the advantage for them may go to the trinary logic.
There may also be some uses for trinary base computing where the storage of additional logic states is NOT an overhead. Quantum flux gates - which unfortunately can't amplify or fan out yet - can store digits as flux quanta; gates can be designed such that there is no overhead to such a device holding 2 quanta instead of one, and these chips will definitely NOT run hot. (Of course, cooling to superconducting temperatures may have its own problems.... For those interested, this is a link to the RSFQ lab [sunysb.edu] pages, and a link to an item on superconducting trinary circuits [sunysb.edu]. 100+GHz on 3.5um technology.)
Re:Show me a trinary Schottky (Score:1)
The whole point of what I was saying was that if you represent logic states with voltage levels you would be throwing away a quarter of your bandwidth using three states. Here's why:
To determine what state a 'trit' is in, TWO BINARY COMPARISONS must be made, one against each dead zone. Why, that sure looks like two whole bits-worth of information! But your trinary system doesn't allow one of the four states, and thus throws away what could have been a useful bit.
If you make those two comparisons simultaneously to keep the same bandwidth, then you'll need more voltage spread and higher currents (and an oven mitt to handle your circuit).
It just doesn't pay to throw out a perfectly good bit. There's ALWAYS a trade-off. If you think you're getting something for nothing, you're probably wrong.
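To put a number on the "quarter of your bandwidth" claim: two comparisons can distinguish 4 states (2 bits of capacity), while a trit actually carries log2(3) ≈ 1.585 bits, so roughly a fifth of the comparison capacity is thrown away - close to the quarter claimed above. A quick check:

```python
import math

comparison_capacity = 2.0        # two binary comparisons = 2 bits of capacity
trit_information = math.log2(3)  # ~1.585 bits actually carried by one trit
wasted = (comparison_capacity - trit_information) / comparison_capacity
print(round(wasted, 3))  # -> 0.208
```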
-Rick
ooh i can't wait to program this (Score:1)
Re:ooh i can't wait to program this (Score:1)
Just a few thoughts. (Score:3, Interesting)
Looking at the schematics, I see that it is based on an analog design style. The transistors are bipolar and there are plenty of resistors used for biasing. All in all, it looks more like an amplifier than a digital gate.
A previous poster commented on returning to analog computing. While there are several major problems with analog computing, I want to just mention a few.
High Implementation Cost
Currently, resistors are considered a somewhat "expensive" item in VLSI designs, since they use a lot of area and lead to static power dissipation. Using bipolars instead of MOSFETs is probably a mistake from a fabrication standpoint, but I don't see why the schematic couldn't be modified to take this into account.
In other words, any design using these gates would be big and power hungry. This isn't to say that a base-3 system is infeasible, only that this implementation doesn't map very well to existing technology.
I think a MOSFET-only implementation would be required before we can take base 3 really seriously. Maybe something using depletion-mode MOSFETs would work better.
No Component Architecture
Analog components are difficult to interconnect. Without going into too much detail, they don't just snap together like Legos; rather, each brick must be modified slightly depending on what it connects to.
The schematics shown also exhibit this problem; the author freely admits it. While I feel that this problem could be partially solved by automated tools, it is still a big hassle - and not just because I'm lazy. Many tools use O(n ln n) or even O(n^2) algorithms; increasing the time constant or adding unnecessary coupling means the tools can run for very long periods of time. A Xilinx FPGA synthesis run, which is comparatively simple, can already take several hours to complete. This hurts the design cycle time, since even small changes can require a full recompile totaling hours. No telling how long it would take to make a full microprocessor - many days, I am sure.
A true digital design, by contrast, does not exhibit this problem. Again, I don't feel that this problem is insurmountable. The problem here is all those resistors whose values must be changed. Remove them and you remove the problem as well.
Problem with interconnect
One more problem is that in most modern designs the design area is dominated by interconnect. Active areas (made of real transistors) are connected by routing channels, and the channels are getting to be quite large. Ternary logic doesn't exactly help with this, since we now have three power rails. This is at worst a second-order problem though, since it doesn't really increase the interconnection between any two components, just the interconnection between all the components.
Underdeveloped logic family
While on the topic of power rails, I can't help but wonder about the clock. It seems underutilized. What should a negative (-1) pulse on the clock do? Or the control lines on a flip-flop? Take a D-type flip-flop, for example. If load=1 loads a new value, and load=0 holds the old value, what does load=-1 do? Load an inverted value, perhaps? Until clocked flip-flop behavior is defined, I don't see any complete designs coming out. These ideas probably just need some more "brain time".
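One speculative reading of the load=-1 question above can be sketched in code. This is only my guess at a semantics (load=-1 stores the negated value, which is one natural choice in balanced ternary), not anything defined on the site:

```c
#include <assert.h>

/* Hypothetical ternary D flip-flop.  Trits are represented as -1, 0, +1.
 * load = 1  : store d (load new value)
 * load = 0  : hold the old value
 * load = -1 : speculative semantics -- store the inversion (negation) of d */
typedef struct {
    int q;  /* stored trit: -1, 0, or +1 */
} tflipflop;

void tff_clock(tflipflop *ff, int d, int load)
{
    if (load == 1)
        ff->q = d;       /* load new value */
    else if (load == -1)
        ff->q = -d;      /* speculative: load inverted value */
    /* load == 0: hold; do nothing */
}
```

Other semantics (e.g. load=-1 as "clear to 0") would be just as defensible; that's exactly the underdeveloped-logic-family point.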
Error Correction and Aesthetics
Lastly, let me say what I do like. The fundamental advantage of digital gates is that signals can be regenerated. In a five-volt system, if you have a 4.89-volt signal, it is probably supposed to be 5.00, so the gate boosts it up and passes it on. This means that errors do not propagate. This is the essence of digital design.
The ternary design style we see is not incompatible with this notion. In binary, the decision is made around the transistor's meta-stability point, typically 2.5V. This means that the fundamental decision is to determine if a signal is [ s>2.5V ] | [ 0V ] | [s
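The regeneration idea above carries over to ternary with two decision thresholds instead of one. A minimal sketch, assuming hypothetical +/-2.5V rails with thresholds at +/-1.25V (my numbers, not the site's):

```c
#include <assert.h>

/* Snap an analog level to the nearest nominal ternary level, so that
 * errors do not propagate from gate to gate.  Assumes +/-2.5V rails
 * and decision thresholds at +/-1.25V (illustrative values only). */
double ternary_regenerate(double v)
{
    if (v > 1.25)
        return 2.5;      /* closest to the + rail */
    if (v < -1.25)
        return -2.5;     /* closest to the - rail */
    return 0.0;          /* closest to the mid level */
}
```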
Re:Just a few thoughts. (Score:1)
The machines don't care. (Score:1)
Circuits more complicated? (Score:1)
Ternary "bit"?? (Score:1)
If a bit stands for "binary digit", then by analogy, shouldn't a "ternary bit" be called a tit?
"I am busily ignoring some thousand of implications I have determined to be irrelevant."
Intercal (Score:1)
PLEASE DO. Argh.
13*3/20*2 (Score:3, Informative)
Not Necessarily (Score:1)
It takes 2 operations to distinguish a ternary digit: v < 1 ? 0 : v < 2 ? 1 : 2, as opposed to binary: v < 1 ? 0 : 1
You're applying binary logic to ternary digits. I would foresee that the logic would be something like this: cmp(v) ? -1 : 0 : 1;
C's '?' operator inherently assumes binary decisions. So I think that if we were going to implement a trit-based computer, we would have to slightly change programming language structures.
Anyway, it won't happen, because people don't like radical changes. Besides, a lot of investment has already been made in bits...
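The comparison counts being argued about can be written down directly. A sketch in today's C, where v is an abstract signal level in [0, 3): one threshold test per binary digit, two per ternary digit.

```c
#include <assert.h>

/* Decoding a binary digit needs one comparison... */
int decode_bit(double v)  { return v < 1 ? 0 : 1; }

/* ...while decoding a ternary digit needs two (nested conditionals,
 * since C's ternary operator only branches two ways at a time). */
int decode_trit(double v) { return v < 1 ? 0 : (v < 2 ? 1 : 2); }
```

Of course, hardware with a native three-way comparator could do the trit decode in one operation, which is the parent's point about changing the language structures.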
Base e (Score:3, Interesting)
A friend of mine who's into digital pro audio looked into building logarithmic audio gear, but the recording industry went to 24-bit linear instead, which provides more headroom.
number of transistors (Score:3, Interesting)
The other functions will take a lot more real estate if realized in trinary, too. The full adder he has listed has 20 gates of varying complexity, which would take at least 2 transistors per gate, probably resistors as well considering his schematics. A binary/CMOS implementation can be done in about 30 transistors.
I Guess I Should Be Waiting For... (Score:1)
It's going to kick some wicked ass, especially with those 12-gigabyte multilayer CD-ROMS and fungus-based hard-drives.
The real benefit of trinary computation (Score:1)
However, there would be a real benefit in using trinary logic, as it can be used to represent true, false, and unknown. In many real situations a lot of extra logic is devoted to dealing with unknowns, because the binary system forces a decision. This would be cleaned up quite a bit in a trinary system. In fact, the ease of rounding using truncation is a nice little demonstration of this property.
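The true/false/unknown idea has a standard formalization (Kleene's three-valued logic). A sketch, using my own encoding of true=1, false=-1, unknown=0, under which AND is min, OR is max, and NOT is negation:

```c
#include <assert.h>

/* Kleene three-valued logic with true = 1, false = -1, unknown = 0.
 * AND is min, OR is max, NOT is negation.  Note the nice properties:
 * unknown AND false = false, unknown OR true = true, NOT unknown = unknown. */
int t_and(int a, int b) { return a < b ? a : b; }
int t_or(int a, int b)  { return a > b ? a : b; }
int t_not(int a)        { return -a; }
```

This is exactly the kind of case analysis that costs extra gates in binary (a separate "valid" bit per signal) but comes for free with a trit.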
What about using more than one base? (Score:2)
One of the professors in our department has been doing some heavy research into computations using more than one base. The idea goes like this:
Obviously this isn't a universal solution, but think about DSP hardware, where multiplications are expensive and needed all the time. Not to mention exponentiation for cryptography. Also, this brief explanation doesn't do justice to the full potential/applications of DBNS. A lot of work has gone into it.
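To give a flavor of the DBNS idea, here is my own rough sketch (not the group's algorithm): write n as a sum of terms of the form 2^a * 3^b, using a simple greedy rule that repeatedly takes the largest such term not exceeding the remainder. The published work uses far more sophisticated encodings.

```c
#include <assert.h>

/* Largest number of the form 2^a * 3^b that is <= n (n >= 1). */
static long largest_2_3_below(long n)
{
    long best = 1;
    for (long p3 = 1; p3 <= n; p3 *= 3)
        for (long t = p3; t <= n; t *= 2)
            if (t > best)
                best = t;
    return best;
}

/* Greedy double-base decomposition: n = out[0] + out[1] + ...
 * Returns the number of terms written into out[]. */
int dbns_greedy(long n, long out[], int max_terms)
{
    int count = 0;
    while (n > 0 && count < max_terms) {
        long t = largest_2_3_below(n);
        out[count++] = t;
        n -= t;
    }
    return count;
}
```

The payoff hinted at above: multiplying by a term 2^a * 3^b is just shifts in the two bases, and the representations tend to be very sparse.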
If you want to find out more about DBNS, there is a primer at www.rcim.ca/Research/Video_Rate/DBNS/ [www.rcim.ca], miscellaneous papers at people.atips.ca/~eskritt [atips.ca] and a collection of a few published papers at www.atips.ca/research [atips.ca]. Also, some older presentations are archived at wooster.hut.fi/geta/courses/graham/Applications/ [wooster.hut.fi].
Disclaimer: I'm the web guy for our research group at the U of Calgary. The guy who came up with DBNS is a professor here (Dr. V. Dimitrov [atips.ca]).
Impractical circuits (Score:3, Informative)
Their ternary inverter is simply a two-transistor inverting _analog_ amplifier running on +/-3V supplies. If the input is -, Q2 turns on, bringing the base of Q1 low, turning Q1 off, so R2 pulls the output (which isn't explicitly shown) to the + rail. If the input is +, Q2 is off, and apparently this circuit depends on leakage to then bias Q1 on. This brings the output almost to the - rail. So it would work as a binary inverter. It's not nearly as good as a
TI 7404 [ti.com] (see page 2). The major difference is that R2 was replaced by a transistor, which turns on for high. This speeds up the low-to-high transition, since you get the full output current of the transistor until the output node is charged up. It also saves power, because one
output transistor is always off and the other always on, so when not switching only leakage currents flow at the output. (This two transistor output is called a "totem pole", and CMOS similarly depends on transistor pairs, one always off so little current flows.) Two more intermediate transistors are added, to control the top transistor on the totem pole and to reduce the resistor count. (On-chip, resistors are not cheaper than transistors.) But if you used it as a binary circuit, trinary.cc's inverter is basically the stripped-down ancestor of the 7404 circuit.
As a trinary circuit, it also has to take a 0V input and output 0V. This inverter does not do this reliably. It probably could be made to work by adjusting the resistor values until 0.0V in gave 0.0V out, but warm or cool the transistors a few degrees, and the amplifier bias will shift so that the output swings to the + or - rail. When you are trying to put the mid-level through it, you are running it like an analog amplifier, and analog amplifiers are unstable without negative feedback.
Nor would adding a few transistors and a negative feedback loop to stabilize it make it work well enough. A trinary inverter should take an input that is not right at any logic level, decide which level is closest, and output the corresponding nominal voltage. For highs and lows (2 and 0), it does that, since it pins the output to the opposite rail. But even if you can be sure that 0.0V in = 0.0V out, with a circuit that is basically an analog amp, -0.1V in will give more than +0.1V out. So a chain of gates would allow the logic levels to get worse at each gate, until the mid-level became misinterpreted as + or -. To restore the mid-level would take a much more complicated circuit. I lay no claims to being a good designer at the transistor level, but I can't see any possibilities that are not nearly twice as complex as the corresponding binary circuits.
Re:Impractical circuits (Score:2, Informative)
Don't focus on whether the transistors are BJTs or how many resistors there are. When you put this on a chip, everything changes. It gets changed to CMOS, NMOS, or something. Resistors are gone. All the familiar 2NXXXX transistors are basically gone. You use a transistor library customized for the feature size and process and other odds and ends.
What is important is that the circuits are simple enough to put into a SPICE simulation, and they use parts that are already in your library, so they can be independently confirmed or improved. It's much more fun to play with things without having to drop all the way down to the foundations if all you want to do is study one aspect.
The transistor models are not ideal; I'll be the first to admit it. What my goal is really about is getting people thinking and sharing knowledge.
I would be more than happy to publish any improved transistor circuits that use parts that would be in a common SPICE library. Anyone who is sincerely interested, please contact me through the address listed at the website.
Cheers,
-Steve Grubb
Re:Impractical circuits (Score:2)
Just remember, in your spice modeling, try varying the transistor characteristics. It's not a real circuit until it can shrug off immense variations in the silicon. 10 degrees C temperature change can change beta by a factor of 2, for instance. Variation from wafer to wafer is even greater...
Re:Impractical circuits (Score:1)
Some circuits have already been physically built and tested - and at least one person feels that they lend themselves to tristate logic gates [sunysb.edu].
The basic principles are already in the category of proven technology - ever heard of a SQUID sensor?
Josephson junctions work equally well for either positive or negative currents - and so do magnetic flux quanta. (But this circuitry has to be the ultimate in low-power computing - you can't get much lower discrete amounts of energy than a single quantum of magnetic flux.)
Re:Zero effect to developers... (Score:2)
Maybe this "doesn't effect your day to day job", but it is an interesting computer science article. Apparently you want a site that's "software news for developers, stuff that matters only to a very narrow audience".
If you'd get your head out of your ass, you'd realize that most of technology comes from building on the past; that's why it doesn't usually make the news. Technology has always been marked by gradual refinements, occasionally interrupted by large leaps.
Re:Zero effect to developers... (Score:1)
And let's not forget all of those hours spent in Algebra/Geometry/Calculus doing truth tables and proofs based on Boolean logic. I'm too far in debt from my college education to just throw away all of that valuable knowledge!
greg
Base2 subset of Base3 shocker.... (Score:2)
I'd ask for a refund on that college education if I were you.
Re:Base2 subset of Base3 shocker.... (Score:1)
Let's take a small example in C:
long x = 42;
long y = 100;
long z = x | y;
In base 2, you can view this as
0101010
|1100100
---------
1101110
But what is the same sequence in base 3?
01120
| 10201
-------
?????
To me, that doesn't translate. Are you advocating that a ternary system would be able to represent numbers in base 2 when Boolean operators are present, but represent numbers in base 3 when ternary operators are present?
greg
Re:Base2 subset of Base3 shocker.... (Score:2)
01120
| 10201
-------
11221
It's a matter of simulating binary. (Score:1)
When you knew you were writing for ternary computers you wouldn't want to use a lot of binary operators, since they're not very efficient, but a ternary computer could definitely run programs in binary-based languages. Naturally, there are analogous digit operations for any base, and to take full advantage of the computer, you'd want a more appropriate programming language that lets you access them.
Re:Base2 subset of Base3 shocker.... (Score:3, Informative)
This works for base-2 just as well, as you see the operations are exactly the same.
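The point that the operation is the same in either base can be made concrete: "OR" generalizes to digitwise max, which reduces to ordinary bitwise OR when the digits are just 0 and 1. A sketch operating on digit arrays:

```c
#include <assert.h>

/* Digitwise "OR" as max, base-agnostic.  Digit arrays are stored
 * least-significant-digit first.  In base 2 this is exactly bitwise OR;
 * in base 3 it gives the tritwise result shown in the parent post. */
void digitwise_or(const int *a, const int *b, int *out, int ndigits)
{
    for (int i = 0; i < ndigits; i++)
        out[i] = a[i] > b[i] ? a[i] : b[i];
}
```

Running this on 42 and 100 in base 3 (01120 and 10201) gives 11221, matching the answer above.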
Re:Base2 subset of Base3 shocker.... (Score:1)
Re:Zero effect to developers... (Score:1)
I don't think the pursuit of binary computing needs any encouragement. The companies above spend billions on it, products that use it are everywhere, and schools around the world teach students to design binary computers. Binary computing is being refined and improved as fast as it can be, period.
And even if trinary computing goes nowhere, I think it should be explored, if for no other reason than principle. How else are you going to know if it's better? The world needs more variety, not less (check out this book [fatbrain.com] for why), and if you think otherwise I suggest you go work for Microsoft or McDonald's.
I, for one, am interested in alternative approaches not *necessarily* because I think they're better, but because without being exposed to them I won't know if my current assumptions are valid. For that reason *I WANT* to see this kind of thing on Slashdot.
Re:Zero effect to developers... (Score:1)
But that's exactly why I read Slashdot!
If I really want 'Stuff that matters' there are plenty of other places to go.
Re:Zero effect to developers... (Score:1)
If the data is all represented in a format other than binary, that means that it will be that way on disk as well as internally, (otherwise it seems pointless to have to switch it to binary when leaving the processor, non?).
Operating systems would have to be rewritten for this new architecture, and that isn't exactly an easy process, especially considering that we're not talking just Intel to Alpha here.
Re:Zero effect to developers... (Score:1)
It makes me laugh that you use these words with disdain, when most people -- or at least geeks (and probably you, if you'd admit it to yourself) thrive -- no, LIVE on this stuff. New things, quirky things and different things are, to my knowledge, subsets of the class called "interesting things..."
The desire to do, learn or be part of something new or odd is precisely what has powered the whole hacker culture from the beginning. Think about it for a minute...
And as for marketing -- You're on the right track. The theory behind marketing is that people like interesting things. But your logic is transductive; you seem to be condemning ALL new or different ideas.
I agree that building on what we have is important, but what if we realize that what we have has become a giant, messy, bloated crapfest, or just that there might be a better way to start? I'll take my BMW Z3 and you can just keep developing a better horse feed to make your carriage run faster.
Justin
Re:Zero effect to developers... (Score:2)
You are rather arrogant, aren't you? I care about this kind of stuff because I don't write applications. I build hardware. You don't see me bitching about MAME running on the Xbox. I could use your same "who cares" argument by saying, "No kidding, it's a PC. Let's stop wasting our time with that. It's 'interesting' but nothing new."
So I am sorry if it bothers you that Slashdot appeals to a broader audience than code monkeys (which apparently you are, and the other 99% of the people that read Slashdot are not).
This isn't "Stuff that matters" its "irrelevant stuff to pass sometime"
God, a statement like that just makes me boil.
This is yet another example of the desire for "new or different" over getting hands dirty and improving what is there
You don't even know what a trinary system is, do you? It's exactly about improving what is there already. By having 3 different states, we could significantly increase the rate at which we transmit serial data (an idea that just popped into my head). It's a lot faster to detect two distinct voltages than two that are very close.
Ah, no one cares about you anyway. Go back under your rock.
Re:History Repeating (Score:1)
Perhaps if you read my original post you wouldn't make such blatantly false comments.
On a further karma burning point, why was the original modded as redundant? Flamebait maybe, but redundant? No-one had mentioned the previous article when I posted - perhaps someone posted minutes before me but this just underlines the pointlessness of the redundant moderation.