Beyond Binary Computing?
daksis writes "Non-base-two computing is nothing new. But it is an idea that, for various reasons, never really caught on. Embedded.com is running an op-ed piece that asks if hardware and software engineers are ready to move to ternary or quaternary logic. A move to multi-valued logic provides more computational capability without the standard increase in die size or transistor count. Is the need to make do with current fabrication technology enough to drive the move to multi-valued logic? Or will Moore's law continue without the need for doing more with less silicon-based real estate?"
Truth Tables * n? (Score:5, Interesting)
a and b = ?
-----------
0 and 0 = 0
0 and 1 = 0
1 and 0 = 0
1 and 1 = 1
But how would you make an AND gate for a trinary system? Would it be like multiplication with signs?
a and b = ?
-----------
- and - = +
- and 0 = 0
- and + = -
0 and - = 0
0 and 0 = 0
0 and + = 0
+ and - = -
+ and 0 = 0
+ and + = +
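(A quick sketch in C, for the curious: if the three values are read as the balanced-ternary digits -1, 0 and +1, the table above is literally just multiplication of those digits. That's one possible choice for a ternary AND; min(a,b) is another common one, as discussed further down the thread.)
#include <stdio.h>
/* The table above, assuming the values are -1, 0, +1: "AND" as the
   plain product of the two digits (sign multiplication). */
static int tand(int a, int b) { return a * b; }
int main(void) {
    const int v[] = { -1, 0, +1 };
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            printf("%+d and %+d = %+d\n", v[i], v[j], tand(v[i], v[j]));
    return 0;
}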
And then quaternary... if it's just pairs of Boolean digits, no problem. It's just a four-input AND:
a and b = ?
-----------
0x and 0x = 0
0x and 1x = 0
1x and 0x = 0
x0 and x0 = 0
x0 and x1 = 0
x1 and x0 = 0
11 and 11 = 1
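(Sketch of that reading, in C: treat each quaternary digit as a pair of bits, 0..3. A bitwise AND of the pairs gives a quaternary-valued result; collapsing it to a single bit, as the table above does, is the four-input AND.)
#include <stdio.h>
/* Quaternary digits as bit pairs: AND is just bitwise AND of the
   pairs; "all four bits set" reduces it to one output bit. */
int main(void) {
    for (int a = 0; a < 4; a++)
        for (int b = 0; b < 4; b++) {
            int pair = a & b;               /* quaternary result, 0..3 */
            printf("%d%d and %d%d = %d%d (single-bit: %d)\n",
                   (a >> 1) & 1, a & 1, (b >> 1) & 1, b & 1,
                   (pair >> 1) & 1, pair & 1, pair == 3);
        }
    return 0;
}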
Or is the whole concept of an AND (OR, NAND, NOR, XOR) gate a relic of my Boolean thinking?
Re:Truth Tables * n? (Score:5, Interesting)
You're a relic, I'm afraid ;-)
Binary operations can be carried out by considering whatever values you have to be binary numbers, and working from there. Binary operations would probably have to be implemented like that somewhere, because they're quite useful...
Implementing binary operations using any base which isn't a power of two would, I suspect, be extremely painful...
But arithmetic and other operations wouldn't have to be based around binary logic; it seems like the circuits might get horribly difficult to reason about, but with decent computerised tools that's hardly a problem...
Re:Truth Tables * n? (Score:3, Interesting)
Yes, No, and Maybe?
And for four values:
Yes, No, Maybe, and error?
Much beyond that and you start thinking in coarse probabilities. Go very far in that direction, and I will argue that you should eliminate the yes and no values. (I.e., nothing is either certainly true or certainly false.)
There are probably other ways to parse the bit patterns. (I.e., I know there are several, but I don't remember them.) But that one makes sense to me without (much) transla
Re:Truth Tables * n? (Score:2)
True, true... it would be fun to play with a computer that had such things implemented quickly. Maybe come up with a version where you can run normal programs using fuzzy logic and see what happens ;-)
Re:Truth Tables * n? (Score:5, Funny)
Re:Truth Tables * n? (Score:2, Interesting)
However, the article was really talking about gaining extra memory capacity by using memory cells with more than two states. So then you would have some sort of mapping function - as described in the article - in which case, from the user/developer standpoint, nothing wo
Fuzzy Logic (Score:4, Interesting)
Heh.. I hate to break this to you, but your thinking is a bit behind the times as well...
Multivalued logic = Fuzzy logic
The most common AND and OR operations in Fuzzy Logic are min() and max(), which together form the basis of a De Morgan algebra (only the laws of non-contradiction and excluded middle [A AND NOT A = 0, A OR NOT A = 1] must be thrown out)
AND(A,B) = MIN(A,B)
OR(A,B) = MAX(A,B)
NOT(A) = 1-A
Generally, a ternary logic is composed of { 0, 0.5, 1 }, where each value is the "degree" of, or "belief" in, TRUE.
0 = FALSE
0.5 = UNKNOWN
1 = TRUE
Some of you may recognize this from SQL (yes, SQL does actually have a simple ternary fuzzy logic base).
The truth table ends up looking like this:
0 AND 0 = 0
0 AND 0.5 = 0
0 AND 1 = 0
0.5 AND 0.5 = 0.5
0.5 AND 1 = 0.5
1 AND 1 = 1
0 OR 0 = 0
0 OR 0.5 = 0.5
0 OR 1 = 1
0.5 OR 0.5 = 0.5
0.5 OR 1 = 1
1 OR 1 = 1
NOT 0 =1
NOT 0.5 = 0.5
NOT 1 = 0
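(A minimal sketch in C of those three operations; running it over { 0, 0.5, 1 } reproduces the table above.)
#include <stdio.h>
/* Fuzzy AND/OR/NOT as min/max/(1-x), as described above. */
static double f_and(double a, double b) { return a < b ? a : b; }
static double f_or (double a, double b) { return a > b ? a : b; }
static double f_not(double a)           { return 1.0 - a; }
int main(void) {
    const double v[] = { 0.0, 0.5, 1.0 };
    for (int i = 0; i < 3; i++) {
        for (int j = i; j < 3; j++)
            printf("%.1f AND %.1f = %.1f    %.1f OR %.1f = %.1f\n",
                   v[i], v[j], f_and(v[i], v[j]),
                   v[i], v[j], f_or(v[i], v[j]));
        printf("NOT %.1f = %.1f\n", v[i], f_not(v[i]));
    }
    return 0;
}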
If we move from ternary to any other precision, the rules stay the same and the table is easily derived (min, max, 1-). Generally, it is preferred to always have a 0.5 value, because UNKNOWN is actually a useful truth indicator. The next set after ternary that makes sense is not 4-valued logic (because it would exclude unknown), but instead 5. For instance:
0 = FALSE
1/4 = UNLIKELY
1/2 = UNKNOWN
3/4 = LIKELY
1 = TRUE
At this point, some truly interesting approximate reasoning is possible, although going to 15 values or (ideally) handling multivalue logic as analog until storage/retrieval would be much better. Approximate reasoning is one of the many things that fuzzy logic makes possible. Essentially it is the application of fuzzy logic to determining beliefs where certainty is not important (and in fact the lack of certainty is where the power of such a system comes from - being able to continue computing without full knowledge, only belief)...
The idea of analog signals flying around on a semiconductor instead of digital ones, yet time-discrete in the same way as digital signals, is quite interesting and could probably be done quite easily. Anyone have any ideas on how a min(A,B), max(A,B) and (1-A) operation might look on silicon?
Re:Truth Tables * n? (Score:5, Informative)
Try reading this [att.net] for a quick primer.
It won't happen all at once; it's a different paradigm and a definite learning curve, like the difference between imperative, functional and object-oriented programming. But it has definite advantages, beyond the Moore's Law tripe.
Re:Truth Tables * n? (Score:5, Interesting)
Tripe? Where do you get that from? Moore's observation about the exponential growth of transistor count is just a specific case of the more general Law of Accelerating Returns [kurzweilai.net].
Exponential growth isn't tripe -- it's a historical trend that hasn't been broken in thousands of years.
Ah, some pedantic semantic conflict (Score:5, Insightful)
(Exponential Growth = Unbreakable) => Tripe
I hate to add fuel to this sort of fire, but is Moore's "Law" a law, or an "observation"? They are not equivalent.
"...historical trend that hasn't been broken in thousands of years." - What codswallop. In a theoretically infinite universe this may be the case, but real life is never that simple. Exponential growth of velocity - diminishing returns as you approach the speed of light. Exponential population growth - always a ceiling....
I could go on and on - but I won't.
Q.
Re:Ah, some pedantic semantic conflict (Score:3, Interesting)
It's closer to an observation than a Law, since scientific laws are usually mathematical descriptions of our worldly observations. e.g. I can observe that an apple falls "faster and faster", but Newton's Laws describe that motion more precisely (and Einstein's theories go a step further by attempting to explain, rather than just describe, what he observed).
Exponential growth of velocity - diminishing returns as you approach the speed of light
I like the indeterminant ternary logic concept (Score:4, Informative)
Picture a system with:
1/3 power = 0
2/3 power = alpha
3/3 power = 1
Now consider the case of recursion where each iteration must be deferred until the one above returns - by using uncertain values instead, you may be able to perform a range of forward-possibility operations upon the as-yet indeterminate numbers.
When the higher-order recursion eventually (let's assume) returns a value that determines the alpha value, all that is required is to create a specific instance of the generalised results.
I like the concept - and it seems it could easily be integrated on the same die as a standard ternary chip.
Q.
Re:I like the indeterminant ternary logic concept (Score:4, Informative)
Consider a typical "fuzzy" logic problem of when to stop a car. You want to weigh variables like the speed of the car, the amount of force to apply, the distance to the stopping point, and other factors like the existence of pedestrians (generally, if there are people within 20 feet of your stopping point, like say at a bus stop near a stop sign, you'll want to slow down quicker at first but roll longer, to decrease the effect of a fluke accident).
Lots of variables. Lots of choice. Lots of probability to weigh. Having an extra option out of 3 does not help you. Having 64 bits to work with does.
I have a hard time coming up with problems in my line of software dev in which ternary or quaternary logic is any more useful than nested binary logic or some fun probability and calculus. Mostly because it's rare that I care about anything other than STOP or GO, ON or OFF. About the only time I do care is when I'm dealing with a database (YES, NO or NULL [no data])...all the rest of the time, alternative options are best handled with an enumerated type or a nice exception.
Anyway, it's all well and good to talk about ternary computing being 'faster' with less overhead, but it's never really going to take off. It will take at least an extra year to train engineers to use the new logic effectively and for them to learn the tricks... and in that year, binary computing will have doubled. And when you live in a world where most software isn't optimized anyway... waiting for a slightly faster logic system that 9/10 of programmers will merely treat as binary, because it's easier to understand and more comfortable, is a waste of everyone's resources.
Even the terminology is bad. True, false, alpha. Ugh. If P is sort of true, then kind of do q.
Re:Truth Tables * n? (Score:3, Interesting)
2) constants (Y=true, Y=false)
2) wire (Y=A, Y=B)
2) NOT (Y=notA, Y=notB)
2) AND/NAND (Y=AandB, Y=AnandB)
2) OR/NOR (Y=AorB, Y=AnorB)
2) EQ/XOR (Y=AeqB, Y=AxorB)
2) AND with an input inversion (Y=Aand(notB), Y=(notA)andB)
2) OR with an input inversion (Y=Aor(notB), Y=(notA)orB)
These are the 16 functions. Of course, the first two lines are skipped for obvious reasons and the last two lines are really combinations of the others plus
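(Where the 16 comes from, as a short C sketch: a two-input Boolean function is fixed by its four outputs, so there are 2^4 = 16 of them; function number f outputs bit (f >> (2a+b)) & 1.)
#include <stdio.h>
/* Enumerate all 16 two-input Boolean functions by truth-table index. */
int main(void) {
    for (int f = 0; f < 16; f++) {
        printf("f%-2d:", f);
        for (int a = 0; a <= 1; a++)
            for (int b = 0; b <= 1; b++)
                printf("  %d,%d->%d", a, b, (f >> (2 * a + b)) & 1);
        printf("\n");
    }
    return 0;
}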
How about true decimal? (Score:2)
On the other hand, 7-bit ASCII now needs three digits -- 300 values, wasting 172 values, more than half the value-space of a byte ("bydte"?).
Don't even begin to ask UNICODE to retool for this.
Re:Truth Tables * n? (Score:2, Funny)
Binary logic (Score:3, Interesting)
It's hard to break out of binary thought since the traditional AND/OR in computer science mimic the English language usage of these terms, but in reality one could create any logic table and assign it a name. The fact that AND/OR have clear English meanings confuses the issue when we try to apply them to ternary input; we might as we
Re:Binary logic (Score:5, Informative)
No, they'd just trade the engineering problem of packing more bits into one space for the problem of finding ways to unambiguously interpret a value.
See, the whole power of binary (pardon the pun) has always been its wonderful noise-suppression ability. Imagine a copper wire running 2 miles with either a 5V or a 0V signal on it. You can apply a simple analog device (say a BJT transistor amplifier) that utilizes an exponential function switching at some precisely known voltage (we'll call it 2.5 volts). Voltages above or below this threshold are amplified to either 5V or zero exponentially, such that only voltages within a small delta of the threshold voltage have any ambiguity.
Thus you can determine the likelihood of noise on a line, then put digital amplifiers every so often such that no amount of noise will be sufficient to raise or lower the voltage to the ambiguous region.
The same is true even on microscopic wires; fanning transistor outputs out to too many transistor inputs introduces noise on the wire. CPUs, not surprisingly, utilize "buffers", which are trivial two-transistor logic gates that pass the input through to the output. This cleans the signal just as the higher-powered digital amplifiers do.
While I'm not sure which particular technologies are being considered in this ternary/quaternary logic system, if it is nothing more than a sub-division of voltages, then it's doomed to failure for general processing, and possibly even simple memory storage. First of all, you introduce another whole region of voltage ambiguity. The only way to maintain your safety zone is to up the voltage or provide more amplification stages to guarantee a cleaner signal. But the trend has been to decrease, not increase, voltages (lower power consumption, smaller devices, whatever), and adding additional logical devices merely to interpret a signal means that your bit-density is going to suffer... exactly impeding the whole point.
Likewise for denser bit-storage: since the probability of bit-error necessarily increases (all else being equal), you're not as likely to achieve as small or as dense a physical digit. So unless you can at least achieve less than 1.5x logical-digit spatial expansion (due to error-compensating material), you haven't gained anything by going to a ternary system.
Lastly, the concept of >2-level computing already has a particular niche where its trade-offs are acceptable. Think of 56k modems, which incorporate hundreds of "values" for a single digit; they utilize a full 256 voltage levels for each anticipated time-slice. Of course the analog modem can't anticipate the exact sampling point where the analog phone line gets digitized (happening to transition at that point can be bad), and there is usually a tremendous amount of line noise. But what modems wind up having to do is group several time-slices together and produce a macro-digit with a boat-load of error-correcting pad-values. And that's not even enough; the entire packet is still likely to have misrepresented digits, so CRCing and thereby retransmission is often necessary.
All this effort is worth it because we physically realize extra bandwidth. But such a "probabilistic" solution (prone to bit-error) is unacceptable at the lowest level of computation. You can't get any less error prone than binary (given the above discussion), and mathematicians have shown that base e (2.718) is the optimal number to balance the complexity of the number of combinations against the number of digits in a given number. (Analogously demonstrated by considering an automated phone system where you have to wait to hear 10 possible choices per menu (the base 10), and you have to go through k menu levels to achieve what you want. The metric is the average wait-time using different bases, and mathematically the shortest wait time was the
Re:Binary logic (Score:5, Interesting)
Binary, being the lowest base that can represent any integer mathematics, is not a point on the continuum; it is a defining terminus of the continuum, and has many special properties. Termini (endpoints) often do, especially in one-ended ranges (e.g. base two is the lowest number of states, but in theory analog has an infinite number of states, and any real-world instantiation of an analog computer can only be an approximation). One example of an open-ended range where the sole endpoint has unique properties is the prime numbers (which, properly, must be positive integers): the lowest prime, 2, has so many unusual properties that it is often excluded or dealt with as a special case. (And, as has long been proven, there is no highest prime.)
This may sound trivial or like mealy-mouthed gibberish, so here's an example:
In every multi-state, binary-like computer, division is computationally 'harder' than multiplication -- except in base two!
Any algorithm for general division (by an arbitrary divisor) involves more multiplications (and then subtractions, according to the results of implicit trial-and-error subtraction [branch points]) than a corresponding extended ('long form') multiplication. The reason this does not occur in base two is that multiplication by the two binary digits, 1 and 0, is so trivial that it does not need to actually be performed - a compare and branch suffices, which corresponds to the compare and branch preceding the additions of a binary multiplication.
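(The base-two shortcut, sketched in C: each "digit multiplication" in a binary long multiplication is just a test-and-add, with no real multiply anywhere.)
#include <stdio.h>
/* Shift-and-add multiplication: multiplying by a binary digit is
   either "add the shifted multiplicand" or "add nothing". */
static unsigned long mul_shift_add(unsigned long a, unsigned long b) {
    unsigned long result = 0;
    while (b) {
        if (b & 1)
            result += a;   /* digit 1: add shifted copy */
        a <<= 1;           /* next place value */
        b >>= 1;
    }
    return result;
}
int main(void) {
    printf("%lu\n", mul_shift_add(42, 37));   /* prints 1554 */
    return 0;
}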
This is pretty special. While multiplication and division are inverse functions, fully generalized division is always 'harder' than generalized multiplication. This is quite unlike, say, subtraction, where a 'subtraction circuit' can be constructed to perform subtraction exactly as easily, and in roughly the same number of, say, transistors, as an adder.
Binary math has many special properties in group and number theory. We'd lose those in higher base math, and we wouldn't gain new properties to make up for that loss. Two, the low bound, is special.
Re:Truth Tables * n? (Score:5, Informative)
Correction (Score:2)
Also, in the binary case, AND and NOT (say) form a basis: any other function can be expressed in terms of them. I dunno if that holds for m-valued logics.
Re:Truth Tables * n? (Score:5, Interesting)
I think the whole point is not about changing the boolean logic, but merely changing the representation of numbers, such as considering a number as octal and thinking of the values 0..7 as different voltages. Building an adder of course requires new logic circuits, but no one will take away boolean logic from you.
Besides, there exist many non-binary logic ideas with AND/OR etc. operations (such as the ternary Lukasiewicz logic), even continuous logic (see, for instance, the lecture slides here [uni-karlsruhe.de] -- German unfortunately), but they are /not/ Boolean as they can not satisfy the Boolean axioms.
So, for you writing software, nothing changes really... but internally, numbers would be represented differently. (Of course, when switching a whole CPU to n-valued calculation, you still need a way to do simple Boolean calculations since that is needed for conditionals.)
Re:Truth Tables * n? (Score:3, Funny)
While drunk once in my youth, I came up with diaboolean algebra.
Instead of the boolean true/false, you have true, false, maybe and maybe not. The latter two are equal in interpretation (uncertainty), although with different values.
diaboolean 0 = boolean 00 = false
diaboolean 1/3 = boolean 01 = maybe not
diaboolean 2/3 = boolean 10 = maybe
diaboolean 1 = boolean 11 = true
Of course, NOT maybe gives mayb
The Star Trek chronicles... (Score:3, Funny)
Re:The Star Trek chronicles... (Score:2, Informative)
A "quad" does not refer to "quaternary," but is just a unit of storage space. It it not known how many bytes are in a quad.
It is common knowledge among trekkies that the term was invented specifically to avoid describing the data capacity of Star Trek's computers in 20th century terms. It was feared by technical consultant Mike Okuda that any such attempt would look fool
Re:The Star Trek chronicles... (Score:2)
<voice style="comic-book-guy">
Of course "quad" is defined...look it up in the TNG tech manual [amazon.com].
</voice>
(I'd look it up, but mine's at home. The implications of these statements on my having a life are an exercise left for the reader.)
Re:The Star Trek chronicles... (Score:2)
And that was very wise of him. Remember Max Headroom [techtv.com]? Max only occupied about 64 MB of RAM. Of course when that show was made, typical home computers had 64K of RAM. Supercomputers [ucar.edu] of the time had 64 MB of RAM.
Re:The Star Trek chronicles... (Score:2, Funny)
about that much storage we need
Qubits (Score:2, Interesting)
Re:Qubits (Score:3, Informative)
Er. That's q for quantum, not q for quaternary.
I hope that was a joke :-P
Re:Qubits (Score:3, Informative)
Qubits are bits which can be zero, one, or zero AND one both at the same point in time (although, in order to be measured, they must collapse down and become either zero or one).
Re: But you're partially onto something... (Score:2)
It strikes me that to make a statement like "Or will Moore's law continue without the need for doing more with less silicon-based real estate?", when the article really just pushes aside the other (to me, more likely) possibilities, is what makes this seem like an important article?
I'm not saying it's not, a
Trinary Computing (Score:5, Informative)
Re:Trinary Computing (Score:5, Interesting)
Representations of numbers in a particular base have two defining characteristics - the number of values that can occupy a digit (m), and the number of digits it takes to represent that value (n).
(Here's where the theory takes a leap, at least to me) The most efficient base (or simplest) base for performing computations is the one at which the m*n product is minimized. As an example, we'll take THE ANSWER, 42(base10).
THE ANSWER in base 16 has a result of 16*2=32
THE ANSWER in base 10 has a result of 10*2=20
THE ANSWER in base 8 has a result of 8*2=16
Here are the interesting cases, though:
THE ANSWER in base 2 (101010) has a result of 2*6=12
THE ANSWER in base 3 (1120) has a result of 3*4=12
THE ANSWER in base 4 (222) has a result of 4*3=12
For 42 the small bases tie, but as the numbers get larger, base 3 edges out both base 2 and base 4. IIRC, according to the article I was reading, the most efficient base by this measure is actually e (Euler's number, ~2.718), and 3 is the nearest integer.
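(A quick C sketch of the m*n measure, for anyone who wants to check the numbers; b/ln(b) is the asymptotic cost per order of magnitude, minimized at b = e, with 3 as the best integer.)
#include <stdio.h>
#include <math.h>
/* Radix economy: digits needed to write n in base b, times b. */
static int digits_in_base(unsigned n, int b) {
    int d = 0;
    do { d++; n /= b; } while (n > 0);
    return d;
}
int main(void) {
    const unsigned n = 42;   /* THE ANSWER, as above */
    for (int b = 2; b <= 16; b++)
        printf("base %2d: %d digits, m*n = %2d, b/ln(b) = %.3f\n",
               b, digits_in_base(n, b), b * digits_in_base(n, b),
               b / log(b));
    return 0;
}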
Re:Trinary Computing (Score:5, Informative)
Re:Trinary Computing (Score:4, Insightful)
Each additional "base" value takes more complex circuitry (base 2 being the simplest).
But for small values of the base, we need more "bits" to represent a given value. A single hex value can represent the same number as four binary values.
Those of us old enough to remember using octal notation probably remember wishing that 7, the largest available digit, were just a little closer to 9.
Binary (base 2) was adopted in the early days of computers because (1) electronically it was very easy to design circuits that either were saturated (max current) or cut-off (zero current), and (2) Boolean algebra had been around for 200 or so years, making the transition straightforward (although not easy).
It's been a long time since I took a semiconductor course, but I would think that a three-level logic circuit (using -1.5V, 0V, and +1.5V, for example) should be fairly straightforward today.
Yes, truth tables and such would become much more complicated, and De Morgan's theorem would be relegated to the scrap heap, but it would seem to be a way to continue to increase processing power once Moore's Law begins to poop out as feature sizes approach atomic dimensions.
Moore's Law itself could continue, just taking advantage of better technology to move to quad-bit, penta-bit, and so forth, computing.
In deference to those who might be easily offended, I have abstained from using the obvious acronym for a 3-state digit.
Sounds like a good idea. (Score:5, Interesting)
My mind is too used to binary
Sounds like a good idea.
Re:Sounds like a good idea. (Score:2)
Re:Sounds like a good idea. (Score:5, Interesting)
Ternary (Score:5, Informative)
Re:Ternary (Score:5, Funny)
Farewell to 'The joy of hex' (Score:2)
After 20 years in the biz, my brain is hard-coded for hex, I'd have to retire.
Ternary system is the way to go (Score:5, Insightful)
Re:Ternary system is the way to go (Score:5, Informative)
Third Base [americanscientist.org]
It's a good read, stuff I didn't know until I read your post and looked it up =)
~Berj
Re:Ternary system is the way to go (Score:5, Funny)
Third Base
There is just something funny about the concept of Slashdotters needing to follow a hyperlink in order to get to third base...
Mechanik
Re:Ternary system is the way to go (Score:2)
Re:Ternary system is the way to go (Score:3, Interesting)
Of course, if you are talking about processing information (CPU vs. RAM), ternary system is a little bit better, but the advantage ove
Re:Ternary system is the way to go (Score:3, Informative)
So I say that that reasoning does not prove that base 3 is 'more efficient'.
Btw what does 'more efficient' mean in this context: less power usage? lower cost? more girls?
Quaternary or ... (Score:3, Interesting)
I don't see why software would have to deal with quaternary logic at all.
Currently the assembler-to-hardware mapping is already an abstraction (microcode).
If only the main busses (address bus, data bus, and their modern counterparts) and elementary pieces like barrel shifters were quaternary, one could severely limit the number of lines (and thus transistors).
However, it could be that because of tolerance problems quaternary logic elements have to be larger, and thus don't yield the big benefit one would expect.
Not Moores Law (Score:2)
Take a look at the Itanium. It's not taking off because not enough people 'get' EPIC, and moving to that platform is a lot of work. The speed benefits for a full migration to Itanium are quite large, but nobody wants to hand-tune their millions of lines of code.
Its not a smart move at all (Score:3, Insightful)
Re:Its not a smart move at all (Score:3, Insightful)
Well, for ternary computing, you could do -3.3 V, 0V and 3.3V. So you don't actually need higher absolute voltage levels.
Re:Its not a smart move at all (Score:3, Interesting)
e.g.
* red, green, blue
* apples, lemons and bananas.
All perfectly unique, all one move away, no intermediate transitions. The trick is finding an electrical equivalent.
Another analogy is to imagine points in space - e.g. ternary could be a set of points in two-dimensional space, quaternary logic a set of points in three-dimensional space, etc.
Power (Score:5, Interesting)
If you go to multilevel logic (not just on/off) then you're necessarily going to have intermediate states which both conduct and have voltage across them, with the resulting dramatic increase in power. This is an acceptable tradeoff for charge-storage devices like memories but a non-starter for logic.
So, if you flip a coin (Score:3, Funny)
Hey. what about the T-shirts! (Score:2)
Not to mention changing the Slashdot Sigs.
No. For the excellent aforementioned reasons, I vote we stick with binary.
Pointless (Score:2)
Go for the Ultimate Matrix! (Score:2)
The point is that after the density of wires comes closer to its limits, the number of states per wire can increase the informational density, making devices more compact. It will not come without a price: you'll have to have more precise "readers" that can read more states (levels?) per wire. I think it will certain
Will quarternary cost twice as much as binary? (Score:2)
Re:Will quarternary cost twice as much as binary? (Score:2)
It's not that hard really... (Score:2, Interesting)
All that would have to happen is for a middle range to be established: 2-3V is "2".
But why stop there? You could have a base10 system by making further divisions. It depends on your ability to differentiate between the various voltage levels.
You could even have a system wherein a multi-volt circuit checked to see what kind of inp
Re:It's not that hard really... (Score:3, Informative)
To distinguish between more logic levels, you'd have to increase the voltage level, and power is proportional to the square of voltage.
Re:It's not that hard really... (Score:2)
Re:It's not that hard really... (Score:2)
It's commonly assumed that people are base-10... (Score:2)
But consider this:
- there is only one of you, or 2^0
- you have two parents, or 2^1
- you have four grandparents, 2^2
etc, etc.
Re:It's commonly assumed that people are base-10.. (Score:3, Funny)
Only if I dig them up.
Re:It's commonly assumed that people are base-10.. (Score:5, Insightful)
Maybe you only use your 10 fingers to count to 10, but any self-respecting geek will use those 10 fingers to count, in binary, up to 1023 by using both states of their fingers to represent a one or zero (up or down). A base-1 system on your fingers is just a waste of states. With some practice you can even handle the unusual states like 21 and 27 easily (I use my thumb as 2^0).
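(For the non-believers, a tiny C sketch of finger binary: ten fingers, each up or down, cover 0 through 1023. Which finger is 2^0 is just a convention.)
#include <stdio.h>
/* Print a 0..1023 value as ten finger positions, U = up, d = down. */
int main(void) {
    unsigned n = 21;
    printf("%u = ", n);
    for (int finger = 9; finger >= 0; finger--)
        putchar(((n >> finger) & 1) ? 'U' : 'd');
    putchar('\n');
    return 0;
}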
Survey ... (Score:5, Funny)
Do you think three-valued logic is a good idea?
Re:Survey (Zen & the Art of Motorcycle Mainte (Score:2)
1. Yes
2. No
3. Mu
I say bring it on. (Score:2, Interesting)
Also, I want to work on a computing system wherein *every* single datum has its own timestamp.
I've been experimenting with 64-bit processors in this regard - using the regular 32 bits for data, and the remaining 32 bits as a timestamp - and it has produced some interesting results. Treating *all* data as though it has a 'last modified' timestamp...
If computing systems can be designed to take Time into account with every single operation, it can make f
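(A minimal sketch of that packing in C, assuming one 64-bit word holds a 32-bit value in the high half and a 32-bit 'last modified' timestamp in the low half; the exact layout is my assumption, not the poster's design.)
#include <stdint.h>
#include <stdio.h>
#include <time.h>
/* 32 bits of data plus a 32-bit Unix timestamp, packed in 64 bits. */
typedef uint64_t stamped_t;
static stamped_t stamp(uint32_t value)    { return ((uint64_t)value << 32) | (uint32_t)time(NULL); }
static uint32_t  value_of(stamped_t s)    { return (uint32_t)(s >> 32); }
static uint32_t  modified_at(stamped_t s) { return (uint32_t)s; }
int main(void) {
    stamped_t s = stamp(12345);
    printf("value=%u last-modified=%u\n", value_of(s), modified_at(s));
    return 0;
}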
Ternary programming would rock! (Score:5, Funny)
#define TRUE 1
#define MAYBE 2
Re:Ternary programming would rock! (Score:2, Interesting)
Following this to its logical end (Score:3, Insightful)
Now I am confused.
For the love of god (Score:2, Funny)
There will be no more
Yes No Cancel
Now it will be
Yes No Maybe Cancel
Yes No Maybe Dont_Know Cancel
Yes No Yes(but I mean no) No (but I mean yes) Maybe Cancel......
Noise Margin (Score:4, Insightful)
Consider a voltage range of [0, 5] volts. In a base-3 number system, digit 0 would be [0, 1.67) volts, digit 1 would be (1.67, 3.33) volts, and digit 2 would be (3.33, 5] volts.
Now, for a binary number system, digit 0 is [0, 2.5) volts, and digit 1 is (2.5, 5] volts. Clearly, the noise margin of the binary number system is much better than the noise margin of the base-3 number system.
Now consider the voltages of modern digital circuits. Consider a voltage range of [0, 1.5] volts. In a base-3 number system, digit 0 would be [0, 0.5) volts, digit 1 would be (0.5, 1.0) volts, and digit 2 would be (1.0, 1.5] volts.
For a binary number system, digit 0 is [0, 0.75) volts, and digit 1 is (0.75, 1.5] volts. Again, the noise margin of the binary number system is much better than the noise margin of the base-3 number system.
In fact, the noise margin of the binary number system is consistently 50% better than the noise margin of the base-3 number system. The noise margin of the binary number system is always better than the noise margin of the base-n number system, where n > 2. For this reason, engineers have not built and will not build digital systems with any non-binary number system.
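(The 50% figure, computed directly in C: split a supply range V into b equal bands and the worst-case distance from a band centre to the nearest decision threshold is V/(2b), so binary's margin is 1.5x ternary's whatever the supply voltage.)
#include <stdio.h>
/* Noise margin for b equally spaced levels across a supply range V. */
int main(void) {
    const double V = 1.5;   /* volts, as in the example above */
    for (int b = 2; b <= 4; b++)
        printf("base %d: band width %.3f V, worst-case margin %.3f V\n",
               b, V / b, V / (2.0 * b));
    return 0;
}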
Re:Noise Margin (Score:4, Informative)
Besides, you're wrong. People have built digital systems with non-binary number systems. There are flash memory chips that use a 4-level voltage scheme to increase data capacity.
I don't think so (Score:2)
No exponential is forever, but we can delay 'Forever'
Engineering gain or loss? (Score:3, Insightful)
As we add more states (intermediate voltages) to the system, the difference between states becomes smaller, i.e. the gap between adjacent states in a ternary system is smaller than the gap between the two states in a binary system.
Hence a binary system can be made smaller and denser than a ternary system and still work.
We may gain in logic density but lose out in physical density.
various reasons? like physics? (Score:2, Insightful)
>idea that, for various reasons, never really caught on.
The various reasons are not so various: a binary system can be constructed in a much more stable fashion than a ternary or quaternary system. Everyone knows that a 1 is not always exactly 5V (or 3V). Having several values confuses the picture even more
"but we have progressed enough that trinary systems are much more stable now!"
No matter what level of stability someone can ge
Re:various reasons? like physics? (Score:3, Insightful)
* Red, Green and Blue
* Apples, lemons and bananas
All unique, easily identifiable, only one step between them. No intermediate transition required. Now we just have to find an electronic equivalent.
Just base 3 or 4? How about base pi, e, i, 1,... (Score:5, Interesting)
Moores Law changed ??? (Score:2)
I may be wrong, but hasn't most of the effect of Moore's Law come from improvements in chip design and manufacturing alone - i.e. the doubling of transistors on a chip or the doubling of the clock speed of a die?
If we were to double the clock speed or transistor count of a ternary computer, would that still give just a 2x performance increase, 8x, or somewhere in between? I can't even get my head around this enough to begin ...
Something I used to smirk at in CS 101 (Score:2)
I imagined that if a stable 10-state device could exist, programming would no longer need a mathematical priesthood.
I now think in my old age that someday a stable 10-state device will come. And it will be received with all the joy of wedding guests greeting a police raid. If you take the hex out of coding, how wi
Here, here! (Score:5, Funny)
Balanced Ternary, and Ternary circuits (Score:5, Interesting)
You can check out http://www.trinary.cc/Tutorial/Tutorial.htm [trinary.cc] for many examples of ternary circuits using ternary logic gates.
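(A small C sketch of the balanced-ternary representation those pages describe: digits are -1, 0, +1, printed here as -, 0, +.)
#include <stdio.h>
/* Convert a non-negative integer to balanced ternary. */
static void print_balanced_ternary(int n) {
    int digit[32], len = 0;
    if (n == 0) digit[len++] = 0;
    while (n > 0) {
        int r = n % 3;
        n /= 3;
        if (r == 2) { r = -1; n += 1; }   /* digit 2 becomes -1 plus a carry */
        digit[len++] = r;
    }
    for (int i = len - 1; i >= 0; i--)
        putchar(digit[i] < 0 ? '-' : (digit[i] ? '+' : '0'));
    putchar('\n');
}
int main(void) {
    print_balanced_ternary(42);   /* prints +---0, i.e. 81-27-9-3 = 42 */
    return 0;
}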
Sign Bit (Score:2)
Actually, you'd probably still have a sign tit... -1 for negative, +1 for positive, and 0 for imaginary.
Hehehehe... I said, 'tit'.
Problems with creating a 3 or more stable states (Score:2, Informative)
E-mail I wrote a long time ago about trinary (Score:2, Insightful)
From: Jeff Epler
To: steve@trinary.cc
Subject: Trinary adder efficiency
Do you know of any more efficient trinary adder designs? I've found an
online abstract of a paper that may have some, but I don't have access
to the paper itself:
http://www.computer.org/proceedings/ats/7129/71290387abs.htm
Also, do you know if a "balanced trinary" adder (-1, 0, 1 trit values)
is any simpler than your trinary (0, 1, 2) adder?
I also performed a simplistic comparison of the propo
Coverage (Score:2)
I also read a paper once explaining how 3 was the optimal number base when you consider the number of different symbols needed and the width of a string of those symbols needed to represent numbers. I even solved the equations myself coming to this conclusion. You actually find "e" (2.718281828...) as your answer, but the closest whole number is 3, not 2.
Unfortunately, I don't have a link and Google has failed me
Perfect for women (Score:5, Funny)
0 = No
1 = Yes
2 = No (But I mean yes)
3 = Yes (But I mean no)
Im gonna get innto trouble (Score:2)
Women = Fuzzy Logic.
Me = Sleepin on the Couch.
From a trinary computing tutorial... (Score:2, Funny)
Another poster provided trinary computing tutorial [trinary.cc]. On one of the pages for the introduction [trinary.cc], the author writes:
As if we didn't lack sufficient sexual jokes regarding current computer technology. Now we have to introduce "trits" into the fray. N
Toomuch heat? (Score:3, Interesting)
Already our processors are dissipating a serious amount of heat. This heat is developed only during the switching time.
Picture a cpu clock:
_|-|_|-|_|-|_
Haha, something like that. Anyway, the heat is only developed during the vertical bars of that clock. (Because the vertical bars aren't perfectly vertical in the real world, and P(heat)=VI. During the horizontal bars, only V or I is present, so no power, i.e. no heat.)
I don't exactly know how this ternary or quaternary computing works, but if it's forcing the transistor to work in stages between full off and full on (1 and 0), you will be dramatically increasing the heat output of your CPU.
Correct me if I'm wrong on this, but maybe we'll really need those diamond semiconductors to make this feasible for high-computing-power applications.
Base 3 or 4 logic is NOT smaller than base 2. (Score:5, Informative)
Base 3 or higher are a lose for implementing logic. Base 4 is useful in some kinds of memory, and this has been done by Intel since around 1980-81. Intel used a quaternary ROM (two bits per cell) for the microcode store of the 43203 Interface Processor, and (IIRC) for the 8087. More recently this technique has been used in flash memory.
logic circuits don't like trinery and beyond (Score:4, Informative)
Ternary useful for async? (Score:3, Interesting)
Basically you want the truth table to be, in order of precedence:
0 AND * = 0
n AND 1 = n
n AND n = n
1 AND 1 = 1
(OR is the reverse, swap 0 and 1)
NOT 1 = 0, NOT 0 = 1, NOT n = n
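(One way to code that up: with 0, n, 1 encoded as 0, 1, 2, the rules above come out as AND = min, OR = max, NOT x = 2 - x. The encoding is mine, but it reproduces every line of the table.)
#include <stdio.h>
/* Ternary AND/OR/NOT with an indeterminate value n between 0 and 1. */
static int t_and(int a, int b) { return a < b ? a : b; }
static int t_or (int a, int b) { return a > b ? a : b; }
static int t_not(int a)        { return 2 - a; }
int main(void) {
    const char sym[3] = { '0', 'n', '1' };
    for (int a = 0; a < 3; a++) {
        for (int b = 0; b < 3; b++)
            printf("%c AND %c = %c    %c OR %c = %c\n",
                   sym[a], sym[b], sym[t_and(a, b)],
                   sym[a], sym[b], sym[t_or(a, b)]);
        printf("NOT %c = %c\n", sym[a], sym[t_not(a)]);
    }
    return 0;
}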
Gates can be actually made to follow the right truth tables by diddling your substrate voltages in an otherwise fairly standard CMOS design; this has the effect of making your circuit twice as slow or quadrupling its power consumption though, which sucks. You also have to watch your noise thresholds here, transients can be nasty although they are unlikely to propagate far through a network of n's. This can be mostly fixed by further tinkering with thresholds, but then the leakage current becomes prohibitively high.
You can also just design extra logic with standard gates and watch your glitches very carefully.
There may be cleverer ways to do this, or the savings of asynch might be enough to make it useful anyway.
Aymara (Score:4, Interesting)
There is a great writeup of these people and their logic at:
http://www.aymara.org/biblio/igr/igr3.html
The article mentions that it is very difficult to impossible to express the logic of one culture in the language of another. Thus to understand better the inferences in Aymara logic, we have to resort to mathematics, which is sufficiently general to be understandable and translatable.
Important detail about truth tables (Score:4, Interesting)
A B out
0 0 1
0 1 1
1 0 0
1 1 1
Apply this to base 3 and you find that there are not just 2-operand operations but 1- and 3-operand ones as well. For one operand you can have rotate-down (0>2, 1>0, 2>1), shift-down (0>0, 1>0, 2>1), rotate-up (note that in binary, one-operand rotation happens to coincide with NOT), shift-up, and various arbitrary tables like the one above. For two operands you have NeitherBoth (00>0, 01>2, 02>1, 10>2, 11>1, 12>0, 20>1, 21>0, 22>2) and the arithmetic operators, plus a bunch of others with explanations I can't think of right now. For three operands there are thousands of possible truth tables, many with useful explanations, many many more arbitrary ones. Oh, and for both 2 and 3 operands you have multiple partial or complete counterparts to the traditional binary AND, OR and XOR that apply the same kinds of rules to the operands.
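(For the record, there are 3^3 = 27 one-operand ternary tables and 3^9 = 19,683 two-operand ones. A small C sketch of the named one-operand operators; shift-up is my guess by analogy with shift-down.)
#include <stdio.h>
/* One-operand ternary operators as maps on {0,1,2}. */
static const int rot_down[3]   = { 2, 0, 1 };  /* 0>2, 1>0, 2>1 */
static const int shift_down[3] = { 0, 0, 1 };  /* 0>0, 1>0, 2>1 */
static const int rot_up[3]     = { 1, 2, 0 };  /* inverse of rotate-down */
static const int shift_up[3]   = { 1, 2, 2 };  /* assumed: 0>1, 1>2, 2>2 */
int main(void) {
    for (int x = 0; x < 3; x++)
        printf("x=%d: rot-down=%d shift-down=%d rot-up=%d shift-up=%d\n",
               x, rot_down[x], shift_down[x], rot_up[x], shift_up[x]);
    return 0;
}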
Ob Futurama Quote (Score:3)
Associative processing (Score:5, Informative)
Say you wanted to add an 8 bit field - bits 0-7, to another, bits 8-15, and store the result in a 9 bit field, 16-24.
Search as follows (CC Field is Carry):-
Whew. You have added the LSBs of the fields together, in 6 operations. There are 8 more to go. However, you have done it for the entire array, which might be thousands of records. So there is a fixed processing time for parallel operations on all the data.
We still had to use two input lines to represent the Ternary value, but, remember, no address lines needed.
Content Addressable memory chips are also used for lookaside Cache memory in CPUs today.
Cheers, Andy!