First Digital Computer Dates Back To 1944

swcox writes "Security-restricted information on the first digital (semi-)programmable computer has been released. A brief story and links to blueprints can be found in an article by the UK's Daily Telegraph. And for more details: Colossal code of silence broken. Dr Donald Michie, at Edinburgh University, said: 'Some will be startled to know that by VE Day Britain had a machine room of some 10 high-speed electronic computers on three-shift operation round the clock.'"
  • Anyone else consider the possibility that this might, by some off chance, be an English propaganda lie directly related to the recent hyperlink patent news from British Telecom - a vain attempt to lend support and background to British technological ingenuity? Or am I just crazy?
  • by Chagrin ( 128939 ) on Saturday September 30, 2000 @06:18PM (#741963) Homepage
    The first digital computer was built by John Vincent Atanasoff and Clifford Berry at Iowa State University. This would have been between 1937 and 1942, and it has since been recognized in court as the first true digital computer.

    Read about it here [iastate.edu]. They've even completed a replica model of it (the original one was cannibalized because interest in computers at that time was so low :)

  • I believe that Babbage never completed the analytical engine.

    Something to do with the limits of engineering at the time.

  • Yes, but it can be argued that the work Colossus performed had a major impact on WW2 and certainly influenced the length of the war, and quite possibly decided the victor...

    So whilst it can be said that Colossus did not have an input into the history of Computers, it certainly had a major input into real HISTORY.

    I'm not sure about the impact of Colossus on later events, but whilst it may not have influenced computing, it is more than possible it heavily influenced British codebreaking development. If we hadn't had such a bad spying record with Blake, Philby etc., perhaps we'd have gained ground in the field of codebreaking...
  • Right you are. An interesting contrast between X and Bletchley is the sheer size of the Colossus machine [slashdot.org] and the little computing engines ("Bombes") the Poles invented and the Bletchley folks improved on. In Andrew Hodges's biography of Turing [amazon.com], there's a sad/amusing story of how the Brits struggled to build a dozen or so Bombes, only to see the Americans jump in and build them by the hundred.

    Hodges has a web site on which he uses the Bombe to argue [turing.org.uk] that Turing more or less invented the modern computer. Of course, Hodges is less than objective, since he sees Turing as a sort of poster boy for oppression of gay mathematicians.

    One interesting aspect of Hodges's book is the implication that Turing's suicide was actually a case of official murder. He doesn't say this outright -- either his evidence is too weak, or he's cautious about living in a country that can legally void due process.

    __________

  • Vikings did settle in North America (no, I'm not of Scandinavian descent).
  • It's actually WRNS (no E) but it IS pronounced wrens - like the wee birdie.
  • Yes, of course, it must be true because a federal judge ruled it so! The US court system is infallible. OJ Simpson didn't make a public mockery of it on television. Between 1/5 and 1/7 of those sentenced to death aren't found to be innocent after execution.

    There is enough reasonable doubt to ignore this ruling if you consider the facts. The most likely party who could contradict this ruling was the UK government. It's unlikely that they would be interested or even bothered to participate in such a court case. At the time, any of their potential evidence was still sealed and classified. This whole court case was about the promotion of the self-serving interests of a corporate entity (well, "corporate entity" might be the wrong legal term, but I hope the intention of the statement is clear).

    As a bad analogy: if a federal judge from the Russian Confederation ruled that the Soviet Union invented the first atomic bomb, would you care or even be mildly interested? Probably not, and the same goes for this ruling.

    Besides, who *really* cares where the first computer was invented? Judging by the number of jingoistic, ego-building, irrelevant posts, I would rather we didn't know. It really makes no difference to our lives. It's just turning into a pissing competition along nationalistic lines.
  • Not surprising that NSA has a fab, especially for me since I used to work there. The fab (known as the SPL, or special projects laboratory) is, quite understandably, actually used mostly for manufacturing in-house designed chips designed explicitly for cracking encryption via brute-force. It's quite decked out, with a single pour stable foundation, and clean room specifications similar to .25 micron industry standards last time I was there (last year). Costs major $$$, but one of these special purpose devices sure beats a bunch of general purpose devices, even parallelized. p.s. They also have their own supercomputer facility for just this purpose, but now this is going way OT
  • The United Kingdom gave the Enigma technology to France, for example. They used it routinely for diplomatic and military encryption until 1973. It makes you wonder what else is to be revealed that we don't yet know...
  • And the same man who axiomatized quantum theory, provided us with one possible (non-ZF) axiomatization of set theory, proved the spectral and ergodic theorems, originated game theory, developed fundamental economic equilibrium growth models and came up with too many mathematical theorems and papers to list. There's a nice biography by Norman Macrae. Unfortunately I forget the title...

    There's a nice anecdote about Von Neumann. He was asked the "brain teaser": two trains are on a collision course 1 mile apart. Both trains are going 10mph. There is a very scared bird flying 20mph back and forth between the two trains, turning around and flying in the opposite direction when it sees that it is about to be hit by one of them. How far does the bird fly before it goes splat? This is not a brain teaser if you note that the trains will collide in .05 hours and a bird going 20mph will fly 1 mile in that time. However some mathematicians can be tricked into treating the bird as a point which is going faster than the trains and can turn around instantly and, hence, makes an infinite number of reversals before the trains collide. One way to find the distance is to add up an infinite series for the distance for each trip back and forth. Mathematicians are predisposed to do this the hard way. When Von Neumann was asked this question he immediately came up with the right answer. When his friend asked him "So you know the trick?" he said "What trick? I just summed the infinite series."
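    For the curious, the arithmetic in the anecdote checks out both ways. A quick sketch (purely illustrative; the function and variable names are mine) computing the bird's distance by the easy route and by summing the infinite series of back-and-forth legs:

```python
# The easy way: the trains close a 1-mile gap at 10 + 10 = 20 mph,
# so they collide after 0.05 h; the bird flies 20 mph * 0.05 h = 1 mile.
easy = 20 * (1 / (10 + 10))

# The "hard way" von Neumann claimed to use: sum the distance of each
# back-and-forth leg until the gap between the trains vanishes.
def bird_series(gap=1.0, train=10.0, bird=20.0, legs=60):
    total = 0.0
    for _ in range(legs):
        t = gap / (bird + train)      # time for bird to meet the oncoming train
        total += bird * t             # distance flown on this leg
        gap -= 2 * train * t          # both trains advanced during the leg
    return total

print(easy)           # 1.0
print(bird_series())  # converges to 1.0 (each leg shrinks the gap by 3x)
```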

  • You're half right. The chip was in development when the first PIIs were brought out and AMD finally solved the floating point issue. Soon after, they released the K6-2s that had this fix. In that time frame they were in development on the K7, at which point they knew Intel was going to release a new chip, though no one knew what it was called or how it worked/what it could do. But AMD had to be prepared for it, which is why there were reports of the K8/9 when the K7 500 was first released.
    I don't believe they are, or the current models have ever been, referred to as the K8/9, however.
  • by Anonymous Coward
    I dunno... Next the Brits will be saying that U571 wasn't in fact U571 and there was no American involvement at that point in the war.

    Plainly ridiculous - we all know that America leads the world in everything. You've read Brave New World, haven't you?

  • Feynman also describes how they debugged the source code before the IBM machines arrived - every secretary in the place with a single task (add the two numbers you're given together, or double this number) and they passed cards back & forth. They reached very nearly the speed of the IBM by the time it arrived...

    Not to mention the multi-pass arrangement of cards, colour coded so they knew which was which - successive approximations to override errors!

    Mark


    Keeper of the Wedding Shenanigans Home Page

  • Because British material is only automatically declassified 50 years after initially being classified. Sometimes it's declassified sooner, but if it's considered too sensitive they wait until after 50 years, when it has to be revealed by law.
  • On October 19, 1973, US Federal Judge Earl R. Larson signed his decision following a lengthy court trial which declared the ENIAC patent of Mauchly and Eckert invalid and named Atanasoff the inventor of the electronic digital computer -- the Atanasoff-Berry Computer or the ABC.
    And why should we trust that US judge any more than the one who decided that credit for the invention of Saccharin should go to the owner of the laboratory and not to the actual lab worker?
  • First, it isn't digital,

    No it was digital, that's rather the point. Did you mean something else?

  • by Durinia ( 72612 ) on Saturday September 30, 2000 @06:19PM (#741979)
    ...that ENIAC was NOT the first American Electronic Digital Computer. That title is held by the Atanasoff-Berry Computer [iastate.edu], which was developed from 1937-1942 at Iowa State University. Its precedence was shown in court in 1973. In fact, it appeared that several of the ENIAC ideas were borrowed from the "ABC".

    It would be interesting to see how these machines all "fit together" in history, i.e. which ones were the first to develop each feature. As Atanasoff himself once put it:

    "I have always taken the position that there is enough credit for everyone in the invention and development of the electronic computer"

  • by Anonymous Coward

    "Atanasoff-Berry computer was the first digital computer. It was built by John Vincent Atanasoff and Clifford Berry at Iowa State University during 1937-42, and introduced the concepts of binary arithmetic, regenerative memory, and logic circuits."

    -- http://www.cs.iastate.edu/jva/jva-archive.shtml [iastate.edu]

  • i appreciate the feedback. tell me what you'd like to see.


    1. INTERACTIVE [mikegallay.com]
      1. ENTERTAINMENT

  • ...would it run Quake (or at least Wolfenstein 3d)?
  • by neilmjoh ( 156292 ) on Saturday September 30, 2000 @06:30PM (#741983) Homepage

    (from the Iowa State University Web Site [iastate.edu]):

    "The Atanasoff-Berry computer was the first digital computer. It was built by John Vincent Atanasoff and Clifford Berry at Iowa State University during 1937-42, and introduced the concepts of binary arithmetic, regenerative memory, and logic circuits.

    On October 19, 1973, US Federal Judge Earl R. Larson signed his decision following a lengthy court trial which declared the ENIAC patent of Mauchly and Eckert invalid and named Atanasoff the inventor of the electronic digital computer -- the Atanasoff-Berry Computer or the ABC.

    Clark Mollenhoff in his book, Atanasoff, Forgotten Father of the Computer, details the design and construction of the Atanasoff-Berry Computer with emphasis on the relationships of the individuals. Alice and Arthur Burks in their book, The First Electronic Computer: The Atanasoff Story, describe the design and construction of the ABC and provide a more technical perspective. Numerous articles provide additional information. In recognition of his achievement, Atanasoff was awarded the National Medal of Technology by President George Bush at the White House on November 13, 1990."

    The story as I remember told in one of the books is that the creator of the ENIAC visited ISU during the development of the ABC and "borrowed" many of the concepts of the ABC for the ENIAC.

    Unfortunately, Iowa State never fully realized what Atanasoff and Berry had developed.

    As a Chemistry Professor at ISU told me, "If they had, our toilet seats would be gold plated".

    Iowa State recently built a working model of the ABC to prove that it really did work.

    Check out this Web Site [iastate.edu] for more info.

    -Neil Johnson (Proud to be a Cyclone !)

  • by the time he got to the z80, though, things were sure cooking.

    --saint
    ----
  • by gwernol ( 167574 ) on Sunday October 01, 2000 @04:52AM (#741985)

    The first computer was designed by Charles Babbage in 1822.

    Babbage's analytical engine is not a true computer, in the modern sense. It isn't fully programmable because it is not Turing-complete. There is a large set of programs that you can run on your Intel box that the analytical engine couldn't compute.

    Although he never built one, a university did from his original 1822 designs in the early 90's, and it DOES work.

    I believe you are thinking of the Science Museum in London, which built a replica of Babbage's difference engine. The difference engine is a much simpler machine: a sophisticated mechanical calculator. It is certainly not a computer. The Science Museum are considering building a replica of the analytical engine, but haven't started work on it yet.

  • Didn't Michelangelo have the idea, if not the plans, for a mechanical adding machine?

    Yes, but an adding machine is not necessarily a computer. There were many mechanical and electro-mechanical calculators available before the first electronic computers. IBM had a flourishing business making such beasts long before WWII.

    I know the idea isn't anywhere near new. What it really comes down to is how we classify the word "computer".

    There is a precise and definitive answer to this question. Alan Turing's famous paper "On Computable Numbers" proposes the logical foundation for all computing machines. Take a look at this [turing.org.uk] page on Andrew Hodges' web site. If you want to dive into it, buy Hodges' excellent biography of Turing. Another great source of information on the mathematical basis for all computation is Douglas Hofstadter's tour de force book "Gödel, Escher, Bach". If you really want to understand computers you have to understand Turing's work.
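    To make "On Computable Numbers" concrete: a finite control, an unbounded tape, and a transition table are all a Turing machine needs. A toy sketch (my own names and example, not from any of the books above) of a machine that flips every bit on its tape:

```python
# A minimal Turing machine simulator: state, tape, head position, rules.
def run_tm(tape, rules, state="start", blank="_"):
    tape = dict(enumerate(tape))          # sparse tape, indexed by cell
    pos = 0
    while state != "halt":
        sym = tape.get(pos, blank)
        write, move, state = rules[(state, sym)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# A machine that inverts a binary string, cell by cell.
rules = {
    ("start", "0"): ("1", "R", "start"),  # flip 0 -> 1, move right
    ("start", "1"): ("0", "R", "start"),  # flip 1 -> 0, move right
    ("start", "_"): ("_", "R", "halt"),   # hit a blank: done
}
print(run_tm("1011", rules))  # -> 0100
```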

  • The ABC used vacuum tubes, not relays. Some of these tubes are on display in Durham Hall at Iowa State along with the memory drum. There is also a working replica of the computer, complete with vacuum tubes. The electromechanical part was the drums containing the programs - similar to a punch card, like a music box.
  • Colossus was not used to decipher Enigma codes.
    This had already been done by some Polish mathematicians, and later by Turing and his "Bombe" (an electromechanical deciphering device).

    Colossus was used to decipher encrypted teleprinter transmissions. The transmissions were encrypted with pseudorandom numbers. Due to a mistake during one transmission by the Germans (they sent the same message twice), the English were able to figure out the generator polynomial. Colossus was used to correlate encrypted transmissions with this generator polynomial.
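    The "same message twice" break relies on key-stream reuse. Modelling the additive cipher as a plain XOR stream (a simplification of the actual 5-bit Baudot addition; messages and names below are mine), a sketch of why such a "depth" cancels the key completely:

```python
# Two intercepts enciphered with the SAME key stream: XORing them
# together cancels the key, leaving only msg1 XOR msg2 -- which a
# cryptanalyst can tease apart into the two plaintexts by hand.
import random

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

key = bytes(random.randrange(256) for _ in range(32))  # stand-in key stream
msg1 = b"ATTACK AT DAWN ATTACK AT DAWN AT"
msg2 = b"ATTACK AT DAWN, ATTACK AT DAWN A"   # resent with a slight stagger

c1, c2 = xor(msg1, key), xor(msg2, key)
depth = xor(c1, c2)          # the key has vanished entirely
assert depth == xor(msg1, msg2)
```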
  • Oh well, I am not sure whether it was really an LFSR PRN generator, but at least something similar.
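    For reference, a minimal LFSR of the kind being gestured at here (a toy sketch: the Lorenz machine's wheels were not literally an LFSR, but any short-period key generator is attackable by the same correlation idea):

```python
# Fibonacci LFSR: shift right each step, feed the XOR of the tapped
# bits back in at the top. With the right taps an n-bit register
# cycles through all 2**n - 1 nonzero states before repeating.
def lfsr(seed: int, taps=(0, 2), nbits=3):
    """Yield pseudorandom bits from an nbits-wide shift register."""
    state = seed
    while True:
        bit = 0
        for t in taps:                        # feedback = XOR of tapped bits
            bit ^= (state >> t) & 1
        yield state & 1                       # output the low bit
        state = (state >> 1) | (bit << (nbits - 1))

gen = lfsr(seed=0b101)
stream = [next(gen) for _ in range(14)]
# a maximal 3-bit LFSR repeats with period 2**3 - 1 = 7
assert stream[:7] == stream[7:14]
```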
  • Wasn't Colossus declassified some time ago?
  • If I remember correctly from my 8th-grade Algebra book, Pascal had actually built an adding machine...

    Not sure, though. Anyone have any thoughts on other mathematical geniuses whose inventiveness has been overshadowed by their extensions of 1+1?
  • I keep seeing this recurring theme that the purpose of continued classification is probably due to the technology..

    Has it occurred to anyone else but me that perhaps these things remain classified in order to prevent the deduction of the methods they used to get those technologies? Perhaps they have some procedure that allows them to perform research somewhat more efficiently than anyone else, and that procedure is what they're keeping secret..

    Personally, I think this may be a case where correlation of declassified data is more of a security risk than the actual data itself..

  • by AFCArchvile ( 221494 ) on Saturday September 30, 2000 @05:48PM (#741993)
    Remember, even ENIAC was digital, but it wasn't binary (it was base-10). I don't quite remember which one was the first binary computer (was it UNIVAC?), but this didn't come along until about the mid-50's (if I'm right.)
  • Has anyone read this book by Neal Stephenson called Cryptonomicon? The last part of the book was all about this. The first computer apparently was supposed to be dedicated to decrypting Japanese codes.
  • by fm6 ( 162816 ) on Saturday September 30, 2000 @06:31PM (#741995) Homepage Journal
    The operator sent a 4,000-character message, after which the receiving station replied "didn't get that, please resend'. The operator obliged, but with the same start settings... these two streams of obscuring characters (or "key"), being exactly the same, cancelled each other out when the two intercepted transmissions were superimposed. But the operator had not fed the plain-text message into the enciphering machine at exactly the same point on the two occasions, displacing them by a 'stagger' of a couple of characters ... From this error, the code-breaker Brigadier John Tiltman was able to reconstruct the original message and deduce the key...

    This is similar to the Enigma story -- both codebreaking teams benefited from sloppy procedure by Axis radio operators. The German military suspected a leak, but took it on faith that their encryption was unbreakable. So instead of looking for flaws in their communication procedures, they went on a witch hunt for spies and traitors. There's a lesson in this -- I hope.

    By the way, I assume this is not the same Colossus that tried to subjugate humanity [imdb.com]?

    __________

  • by Anonymous Coward

    To find out how military and civilian computer stuff is related, use Google or Altavista to search for "COTS" sometime.

    If you're too lazy to do this yourself, COTS stands for Commercial Off The Shelf. Meaning you can usually get a COTS machine that's cheaper and more capable than the "secret" alternatives.

    Also, for the record, most of the New Big Machines at those "known government research facilities" consist of several thousand Intel processors all hooked together in cool ways. Use Google to search for "ASCI Red" if you want to know. Also, if you want to know how to program such a machine, use Google to search for "MPI".

    Sorry to burst your bubble, but the government isn't competent enough to build or keep secret anything that's significantly ahead of what's possible in the private sector.

  • hmmm.... Anyone here heard of Blaise Pascal?



    Did addition and subtraction. No batteries required, thanks to the convenient hand crank. 1642.

  • by EricEldred ( 175470 ) on Saturday September 30, 2000 @06:42PM (#741998) Homepage

    This new information indicates Colossus was the first electronic computer, and Colossus 2 the first programmable electronic computer, doesn't it?

    The Zuse machines and the ABC from Iowa were not really electronic, but electro-mechanical, like the Mark I at Harvard, using relays instead of vacuum tubes ("valves" in British parlance), according to the computer histories on the net I've seen. Even though many of these electro-mechanical devices used punched tape or card input, they were not necessarily programmable.

    According to these stories, Colossus had 1,500 vacuum tubes, while Colossus 2 had 2,500, and there were 10 of the latter by the end of the war (a year after Colossus 2's introduction)--an immense achievement.

    This also might confirm that Alan Turing really was the first computer programmer, as others have already indicated.

    And it reveals, interestingly, that cracking the Enigma code was not even the main purpose of all this effort; that was the other supersecret German military code.

    Maybe the next historical revelations will be about the computing power behind the atomic bomb, the first ones and later ones. Were the British computer experts allowed to play a role in this? What were the early Russian computers like?

  • That was painful to witness.
  • This'll probably break my running record of only scoring "one" posts, but...

    Schmoo!

    My step-father was Fire Control on a Navy battleship in the seventies, (Good way to stay out of Vietnam, I guess.) and he described to me the computer equipment they had on board:

    Magnetic-core memory...Loop of iron that could be assigned a binary value, but would lose that value as soon as that address was read...

    Schmoo memory...The equivalent of ROM. The loop of iron had another, permanent magnet attached to it, to restore its magnetic state as soon as the data was read.

    The RAM banks consisted of a fine network of wires basically (from what I gather) providing a coordinate system of access to the memory... however, this network was so complex that the manufacturing process had to be done by hand and suffered a 99% fail rate.

    The 'computers' they used were programmable, although they were basically a huge tree of electromechanical relays... gosh, there's just too much for me to remember to type.

    Now, how much of this is exaggeration I don't know...

  • by hawk ( 1151 ) <hawk@eyry.org> on Sunday October 01, 2000 @07:10AM (#742001) Journal
    And even before that, in the '30s, Dr. Atanasoff and his graduate student Berry gave their initials to the ABC at Iowa State, which was electronic and digital. Programmability left a lot to be desired, though :) [It solved sets of 17 equations in 17 unknowns].

    This was the computer that led to the invalidation of the ENIAC patents--many were for things already done on the ABC, which the ENIAC designers had actually examined.

    Two replicas of the ABC were built, one to tour, and one to live in the Smithsonian. One was fired up to solve a problem.

    hawk

    p.s. scrounge around http://www.iastate.edu to find lots of articles and pages on it.
  • but for the parts, which were turned into a bit of this, and a bit of that, iirc.

    Once the ABC solved the set of equations it was built to solve, everyone pretty much lost interest . . .

    hawk, harumphing at the idea that it was electromechanical--though memory was on rotating drums of capacitors, a breakthrough in and of itself . . .]
  • Not magnetic-core, no. However, the drums of capacitors were the first regenerative memory.
  • I think the history of computers is just as important as the current information and technology. It was only 10 years ago that people were still using separate controller cards for hard drives and floppy drives, bus mice and other such devices.
    How many people in the computer field really know about the history of computers? It may not be a requirement to use one, but I think in order to be considered an expert (A+, MCSE, etc.), maybe they should include a little history as well.

    Maybe there should be a hardware database, with pictures and specs, on the Internet. Or maybe even a separate database from the Internet - create another ARPANET (consult history books for that one too).

    [Side note: A+ does include history to a certain extent, but there were computers before Intel; and contrary to what my current A+ manual states, NexGen did not develop the K7 to compete with the Pentium II - AMD developed it, and it was to compete with the Pentium III]
  • John Vincent Atanasoff and Clifford Berry were the first. Just because their computer never took off for much other than solving equations doesn't mean it wasn't the first.
    Atanasoff Berry Computer [iastate.edu]
  • Yes, the general rule in the UK is the 30 Years' Rule, although this is sometimes extended to 50 or even longer, depending on context...

    But really, we have to remember that this was in wartime. Money really was no object. After WWII, Bletchley Park was entirely dismantled, and GCHQ moved on to other things. (Like discovering RSA -- but that's not like quantum computing: the first is insight, the latter engineering.) While they might make insightful breakthroughs, things like Colossus and Bletchley require massive funding (not to mention employee loyalty), and probably don't happen much in peacetime.

    This is why all the griping over the Enigma machine handouts makes so little sense: it really took a massive effort to break a well-organised Enigma network back then - several thousand people. Neither Britain nor anyone else was going to devote that kind of effort to one minor cypher used in friendly or post-colonial countries. Of course, with computers coming into wider use, it got much easier...

  • Colossus and Zuse's machine were built before the ABC was completed in 1942.


    ---
  • by Anonymous Coward

    How many people in the computer field really know about the history of computers? It may not be a requirement to use one, but I think in order to be considered an expert (A+, MCSE, etc.), maybe they should include a little history as well.

    I went to Bletchley a couple of years back. Kind of a pilgrimage - very interesting. The Colossus' paper tape reader ran at 5,000cps - 5k/sec is quite respectable when compared with a modern modem. The "Museum of Computing" is more like the contents of somebody's garage packed in cardboard boxes, but it's all the more fun for that.

    Have to agree that the history of computing is very important. It helps to know where you have come from. It is also interesting, coming from England and being used to the "Colossus was the first stored-program electronic computer" argument, to see the views of people in other countries.

    BTW, I really wish my university had taught its CS grads some contract law alongside straight CompSci. It's as important as the history...

  • What people have to realize about historical events is that they are only meaningful if they influence later events. While Canadians and Americans of Scandinavian descent like to bring up Leif Erikson as the first European to discover the New World, the simple fact is that his discovery (if indeed it occurred) was meaningless because it did not lead to anything. Columbus' discovery led, of course, to European colonization, and therefore it was significant.

    Similarly, Colossus did not lead to the evolution of today's computers, as it was classified in the early days of computing. Even the ABC is only significant because it may have influenced the design of the ENIAC, which was the ancestor of every computer now in use.
  • Imagine a Beowulf cluster of them

    I just closed my eyes, and saw a nuclear power plant, with cooling towers and everything.

  • Cryptonomicon is fiction, but with a lot of real people and events in it. Of course, the "historical" parts should be taken with a grain of salt. I don't think Turing was quite as swish as Stephenson portrays him...

    There's a long excerpt [sffworld.com] online.

    I have to throw in my favorite Alan Turing story. During the war, he was sent on a secret mission to the US. Since his work was very sensitive, he was ordered to take no documents on the trip. Turing being Turing, he interpreted this order quite literally -- he left behind his passport and identity papers. Imagine the reaction of an Immigration officer approached by a Brit just off the boat, with no proof of identity, claiming to be on a secret mission....

    __________

  • I heard they tried to publish this sometime back but Amazon.com got a patent on the concept and forced them to retract it.
  • I wrote my history thesis on an aspect of the Manhattan Project, and I remember looking through John von Neumann's papers at the Library of Congress. It looked like he was working on some interesting computer ideas while he was at Los Alamos. That wasn't the topic of my thesis, so I didn't really pay attention, except noting to myself that someday I want to look into it further.

    Anyone know more about what von Neumann was working on? That also would have been the early-to-mid 1940's.

    A quick glance at _The Making of the Atomic Bomb_ by Richard Rhodes reveals this:

    "Such work could not be done reliably by hand with desktop calculation machines. Fortunately the laboratory had already ordered IBM punchcard sorters to facilitate calculating the critical mass of odd-shaped bomb cores. The IBM equipment arrived early in April 1944 and the Theoretical Division immediately put it to good use running brute-force implosion numbers. Hydrodynamic problems, detailed and repetitious, were particularly adaptable to machine computation; the challenge apparently set von Neumann thinking about how such machines might be improved." (page 544)

    Thanks,
    ccg at aya dot yale dot edu
  • Here [ddj.com] is a Java bytecode implementation of the Z3. Dr. Dobb's Journal had an article on it in their Sept. 2000 issue.

  • Hmm... Atanasoff has/had the patent on electronic digital computing. I got dibs on the vacuum-tubeless binary photonic computing patent! If Rambus hasn't already beat me to it. ;)
  • I've heard the opposite in a biography of the creators of ENIAC: that Atanasoff stole ideas from ENIAC. There was a huge debate and a court case, and ENIAC lost.

    But all that means is that Atanasoff swung the court case, possibly with big bucks from Sperry.

    I'm not sure what the actual truth is. We all know how malleable the court system is when lots of money is involved, and this certainly wouldn't be the first time that the true inventors of something (the ENIAC guys:Presper Eckert and John Mauchly) were passed over in favor of the big ego and bigger checkbook.

    -Illserve
  • Cause the Master's message to the universe got thwarted by the 4th Doctor, but he fell and had to regenerate cause Adric and Tegan got caught and couldn't save him in time. Thanks for all your help, Brigadier.
  • ... the court case could be part of the reason it remained classified. People don't go to court to prove who did something first unless there's money involved too (meaning the technology had worth in the US.)

    There's really little point in advertising your level of technical advancement, since patents and secrecy are mutually exclusive. Better to be secretly ahead of others instead of flaunting it in their face (especially if you can't extort money out of the runners-up in your technology race).

    Also, invalidating the ENIAC patents would simply lead to more competition in the US. (certainly not in the UKs best interest.)

    --
  • I think the moral of this, and other posts on this article is pretty clear - there was no revolution where some guy 'invented' the computer, it was all a series of evolutions.

    Determining who created the first computer is like determining who was the first human. It simply depends on how widely you cast your net.

    Those giants you're standing on the shoulders of are awfully tall. Which is not to diminish the achievements of the people involved.
  • Shift the rotors and you had multiplication too. FatPhil (Who has stood |this| far from Colossus while it was running)
  • I've also had a chance to see Tony Sale's reconstruction of Colossus. He's published a little booklet with a description of the process used to figure out the design, and in that he says that all the blueprints and other technical information on Colossus was destroyed after the war; his reconstruction was put together from a handful of old photographs, a few scraps of paper, and interviews with a couple of the original developers (though I don't remember seeing the name of the fellow quoted in the story). After all that reverse engineering, this new report must have been a bittersweet moment for him -- had it been released a few years earlier, he would have had a much easier time of it (but perhaps not as much fun...)
  • Actually, I've seen the ABC pictures and replica, and it DOES use tubes. See this site [ameslab.gov] for some cool pictures of the replica they built, including actual vacuum tubes manufactured in the 40s!
  • The first computer was designed by Charles Babbage
    in 1822. Although he never built one, a university did build one from his original designs in the early 1990s, and it DOES work.

    This page has a nice summary:

    http://www.maxmon.com/1822ad.htm

  • Now I remember. They were also the first to use magnetic-core memory, weren't they?
  • at beta launch (nov. 1), members will be privy to all the content on the site. non-members will not. for now i'm just testing to see what percentage of users are comfortable simply signing up for a meaningless membership.

    the site will be a full on human solutions site; meaning, all the problems you've got with other folks - we'll help you take care of 'em - from revenge to lovelife, etc., and it will be launched two months after in a slashdot-style community where users will dictate the editorial, thereby creating a many-to-many community where individuals can be entertained and get the solutions to those pesky problems they need (facts within fiction context).


    1. INTERACTIVE [mikegallay.com]
      1. ENTERTAINMENT

  • ..also known as ABC [ameslab.gov], constructed in 1942 at Iowa State University. It's such a shame the ENIAC overshadowed this wonderful project. The machine had a precursor of today's dynamic RAM (it used charged capacitors to store digital information).

  • Actually, the ABC did use vacuum tubes and was digital, binary, and electronic. There is an example tube in Durham Hall at Iowa State. Unfortunately it is not an original; the only remaining artifact of the original ABC was one of the two memory drums, which basically worked like DRAM.

    The original ABC was dismantled by a grad student who needed the storage space. (I've heard that this grad student is the head of the physics dept. now.) They had no idea what it actually was until after WW2 was over (and ENIAC was known). This is why the ABC does not get the credit it deserves.

    A quote from http://www.cs.iastate.edu/jva/jva-archive.shtml

    Atanasoff-Berry computer was the first digital computer. It was built by John Vincent Atanasoff and Clifford Berry at Iowa State University during 1937-42, and introduced the concepts of binary arithmetic, regenerative memory, and logic circuits.

    There are many interesting links about the ABC on that site as well.
  • The cool thing about this is that although I always thought Babbage was a crackpot with a good idea, he was actually Lucasian Professor at Cambridge, holding the same position that Isaac Newton held and that Stephen Hawking holds now.
  • Richard P Feynman's autobiographies ("Surely You're Joking, Mr Feynman!" and "What Do You Care What Other People Think?") also discuss some of the IBM card calculators, and the things that were done to minimise errors and streamline the calculations done at Los Alamos, although more as anecdotes than in any real specifics.

    Both books, and especially the first one, are an entertaining read for anyone with even a slight physics interest too - I'd recommend them.
  • I almost agree: Konrad Zuse COULD have been the first, if the Germans had decided to fund his research. But, to be honest, the Z1 is not really a programmable machine, neither was the Z2 or the Z3. The Z1 was actually a purely mechanical device, correct me if I'm wrong.

    Nevertheless, Zuse deserves a lot of credit for his work! He was a true hero of the information age. Too bad he was on the wrong side, at that time.

  • My Grandmother worked as one of the WRNS (Women's Royal Naval Service), entering the encoded messages into the Enigma machines for decoding. I did a school project on it some years back.

    The Germans' code was an encrypted text message, encoded by a machine known as the Enigma Machine. The Enigma Machine consisted of a series of gears and wheels that would encode the message depending on the setting on the front of the machine.

    The English had (relatively) no trouble obtaining a machine, as someone simply stole one from the Germans and they copied it. What the computer was used for was to determine the setting needed on the machine to get a sensible message out. I can't actually remember how, but that's what it did.

    It was using this that gave the English forewarning of battles and let them determine the positions of the Germans.

    This is pretty old news, though; the books that I got all this from for my project were old at the time. As far as I know, the information was released thirty years after the event as per legislation.
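The gears-and-wheels encoding described above can be sketched in miniature. The toy below uses a single rotor that steps one position per keypress; the wiring string is the published Enigma rotor I permutation, but everything else (no reflector, no plugboard, no second or third rotor) is simplified for illustration, so this is not how the real machine traced a signal.

```python
import string

ALPHA = string.ascii_uppercase
WIRING = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"  # published Enigma rotor I wiring

def encode(text, start=0):
    """Toy single-rotor cipher: the rotor advances one step per letter,
    so the same plaintext letter encrypts differently at each position."""
    out = []
    for i, ch in enumerate(text):
        step = (start + i) % 26                   # rotor position for this keypress
        c = (ALPHA.index(ch) + step) % 26         # enter the rotor at an offset
        c = (ALPHA.index(WIRING[c]) - step) % 26  # through the wiring and back out
        out.append(ALPHA[c])
    return "".join(out)

def decode(text, start=0):
    """Inverse of encode: trace the wiring permutation backwards."""
    out = []
    for i, ch in enumerate(text):
        step = (start + i) % 26
        c = (ALPHA.index(ch) + step) % 26
        c = (WIRING.index(ALPHA[c]) - step) % 26
        out.append(ALPHA[c])
    return "".join(out)

print(encode("AA"))  # 'A' encrypts differently each time the rotor steps
```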
  • I did AI at Edinburgh, and had Michie as a lecturer a few times. I would have actually turned up and listened more frequently if I had known his past - as it is I spent 2 years down the pub, followed by 2 years programming in the bowels of the machine rooms - they actually gave undergraduates keys to the machine rooms full of Sun workstations so you could work all night if you preferred (which all the serious guys did - you get more resources at 3.00am).
  • Zuse was the first. However, his circuits were built using electro-mechanical devices. Why was he first?

    (1) his machines were BINARY. ENIAC was stupid enough to be decimal--this is *extremely* inefficient and accounts for why eniac was so damned large.

    (2) he had floating point (oh my god, we Americans didn't think of that for years).

    (3) his machine wasn't ELECTRONIC, but it was digital, if I know what digital is... Others are confused: since his machine wasn't ELECTRONIC, they assume that his machine was not the first computer. This is wrong, of course. His machines were as digital as digital can be, but his switches were electromechanical and not purely electronic. Though I'm not German (and very American - I have no German ancestry, in fact), I want to respect this German genius. He was freaking amazing, and I don't want people to steal his accomplishments. -patrick.

  • by Mr. Protocol ( 73424 ) on Saturday September 30, 2000 @08:58PM (#742045)
    Tony Sale, at the Museum of Cryptography at Bletchley Park, has reconstructed a running Colossus. At the time I visited, he had two out of its five parallel channels up and working (I believe all five are now complete). I asked him a couple of semi-intelligent questions, and he promptly grabbed me by the elbow and dragged me into the Colossus machine room. He stood me in the middle of the frame and pointed out the machine's various sections, then ran over to the side and turned it on around me.

    This was exciting on several levels, because it is very much a tube machine - lots and lots of tubes - and they all run at +400 volts plate voltage. I didn't make a whole lot of extraneous movements.

    Perhaps the most impressive thing about it, visually (besides all the glowing tubes) is the high-speed paper tape reader that runs a 5-level Baudot tape over and over again in a loop as the machine searches for correlations. The reader's made of machined aluminum (or aluminium, over there), and stands about six feet high. It reads 5,000 characters per second, and the impulses from the smaller feeder holes form the machine clock.

    This is an absolute don't-miss if you get to London. Bletchley Park is a fine day trip by train.
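The correlation search that the looping tape feeds can be sketched in miniature: slide a guessed wheel pattern past the intercepted bitstream and count agreements at each offset. This is a simplified illustration, not the actual double-delta method Colossus implemented, and the bit patterns below are made up for the example.

```python
def correlation_counts(tape_bits, wheel_pattern):
    """For each rotation of a guessed wheel pattern, count how many
    positions of the tape bitstream agree with it. A strong peak at
    one offset suggests the guess is right."""
    n = len(wheel_pattern)
    counts = []
    for offset in range(n):  # one count per wheel start position
        agree = sum(
            1 for i, t in enumerate(tape_bits)
            if t == wheel_pattern[(i + offset) % n]
        )
        counts.append(agree)
    return counts

# A wrong offset scores near chance; the right offset stands out.
tape = [1, 0, 1, 1, 0] * 20              # looped "tape" of 100 bits
wheel = [1, 0, 1, 1, 0]                  # guessed wheel, period 5
print(correlation_counts(tape, wheel))   # [100, 20, 60, 60, 20]
```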
  • by altman ( 2944 ) on Sunday October 01, 2000 @12:24AM (#742046) Homepage
    Actually, the first *stored program* computer was binary - the Manchester "Baby":

    http://www.computer50.org/

    This used CRT tubes to provide the RAM - refresh involved a photosensor on the surface of the CRT feeding back to the CRT itself. Very cute.

    Hugo
  • Interestingly, they couldn't then go on to use as much of the information as they might have wished, because if they had, the Germans would have known that their messages were being intercepted.

    It is believed, for example, that considerable advance warning was available about the invasion of Crete, but the Allied command were prevented from acting on it because that would have revealed that the German encryption had been cracked.

  • (1) his machines were BINARY. ENIAC was stupid enough to be decimal--this is *extremely* inefficient and accounts for why eniac was so damned large.

    Decimal is not "stupid". It used to be standard for business computers to use decimal arithmetic and scientific computers to use binary arithmetic. Later computers, notably the IBM 360 series, unified the two, supporting decimal and binary arithmetic on a single system.

  • This new information indicates Colossus was the first electronic computer, and Colossus 2 the first programmable electronic computer, doesn't it?

    Except that it isn't exactly new information; this was part of a British TV documentary first broadcast two years ago. Quite a bit of information on Station X has been known since the 1970s, and some people have known exactly what went on for nearly 60 years.

    This also might confirm that Alan Turing really was the first computer programmer, as others have already indicated.

    More likely Tommy Flowers: what engineer would build such a device without testing it?

    And it reveals, interestingly, that cracking the Enigma code was not even the main purpose of all this effort, but the other supersecret German military code.


    They could already crack the variations of Enigma in use, however they couldn't crack "fish".
  • What's more worrisome for me, though, is what they have classified that we don't know about. What sort of AI or Quantum Prime Factoring Machine do they have classified now that we'll be hearing about in 2055?

    Except if they do a good enough job of classifying something in the first place then it may stay hidden for a very long time.
  • A beowulf cluster of these. Really, really, hot, noisy, and big.
  • by Ellen Spertus ( 31819 ) on Saturday September 30, 2000 @05:57PM (#742062) Homepage
    Konrad Zuse built a programmable binary computer in 1941. It even had floating-point support! You can read about it in English [t-online.de] or German [tonline.de].
  • ...anyone who read Cryptonomicon knows that a Waterhouse invented it during WWII.

    wait, no! that was a joke!

    -Peter
  • Why was this project classified for 55 years?

    Did they think back then that computer technology wouldn't advance that much in that time so it might still be a security risk today?
    Is there a standard period for classification?

  • Then what do you consider binary-coded-decimal, or BCD?

    Base conversions are very complicated, and if it was easier to build a computer that was decimal all the way through than to divide and multiply by 10 (which is very computationally intensive) and subtract to get modulos... the instructions for this process can take a significant number of punch cards! Even the Motorola 680xx series (the not-too-distant ancestor of today's Palm devices and many embedded DSPs) included add, subtract, and conversion instructions to and from BCD to aid driving numeric displays.

    Actually, one of the main reasons for it being binary is probably that floating-point arithmetic can hardly be done at all in any base other than 2. And binary isn't the same as digital: base-10 is still digital. The difference is that in digital computers all arithmetic results are exact and reproducible, as opposed to analog computers, which add and subtract voltages, continuous quantities that cannot be represented exactly as digits (i.e., not digital).

    I'm sorry, it's late at night. I don't mean to be overly critical - this guy invented the first FPU, way ahead of his time - but if it didn't have conditional branching etc. you have to question if it really ran a program, per se. I just read this [best.com], a link I found elsewhere in this discussion. It goes into some detail about Zuse, you might like it.

    Fsck this hard drive! Although it probably won't work...
    foo = bar/*myPtr;

  • The UK government seems to be even slower than the US government at declassifying documents. There are laws in the US that specify declassification schedules, but there are plenty of exemptions for things like cryptographic related information. There is still stuff from World War II that hasn't been declassified.
  • by shockwaverider ( 78582 ) on Saturday September 30, 2000 @10:26PM (#742074)
    I utterly agree. Bletchley park is a remarkable place and well worth a visit. If you manage to get Tony Sale to give you a tour then so much the better as the man really knows his stuff.

    Bletchley Park is the location where computers were conceived, designed and first built. It is the place where Alan Turing started laying the foundations of the industry you now work in [OK, assumption], and until recently its cryptanalysis role was still a secret. That's remarkable considering the number of people who worked there in that capacity during the war.

    You really owe it to yourself to have a look. Apart from anything else it's tremendous fun!

    Example: the Computer Conservation Society's display. A room filled with old computers. Some powered on, some not. From the earliest VAX through to the latest "built-to-the-hilt" PC. All of them "hands-on" and running funstuff.

    Example of the "first soundcard": a radio tuned to the RF frequency of interference generated by a PDP minicomputer. A program running on the machine was designed to make the processor generate RF interference, by the use of tight loops etc.

    Look, the place doesn't cost much to visit, and it's fairly central within the UK. Just go, OK?

  • I'm sure part of it was simply momentum. That is, it was in the classified archives, why dig through them looking for stuff to declassify?

    Another part, though, is that after WWII, some British allies were given Enigma machines and some of those were in use until some time in the 70s. (of course, that doesn't explain the other 20+ years, but it gets you more than half way)

    And I'm sure when it first got classified, they had absolutely no idea how far computer technology would advance, or probably even if it would ever make it outside of secret government installations.

  • I bet it must have been cool to get the first copy of AOL Beta Platinum 0.4 on punch-card tape in the mail.

    But it got old quickly when they started getting them three times a week. Later the scientists decided to paper the wall with them.

    "If only they were shint and round," they imagined..."



  • First, it isn't digital, second this isn't new information, modulo the actual plans for the computers. Third, didja know the NSA has its own fabrication plant? :) Makes you wonder...

    --

  • No, Waterhouse invented memory using standing waves in mercury... shortly after RAMBUS patented it.
  • Cryptonomicon, while an entertaining book, is fiction.

    In a Forest Gump kind of way.
  • Around 1940 Bell was designing a binary computer that used RAM; it used capacitors for the RAM. They couldn't get the hardware to match the design, however.
  • didja know the NSA's budget (from what can be guessed by looking at "black money") is TWICE as large as NASA's? No wonder they have a fab. It is a wonder that they don't have Moonraker too :)
  • Much of this story can be found in The Code Book [slashdot.org] by Simon Singh, published last year.

  • This is the Enigma story - Colossus was used to break the Enigma code - unless you believe that U571 film :-)

    How many times does it need saying that Colossus was NOT used for cracking Enigma. Instead it was used for cracking a high level (used between Hitler and senior generals) teletype cypher, codenamed "fish" by the Station X people.
    The Enigma cypher was used throughout the German armed services and was sent as Morse.
  • Actually, it's probable that Tony Sale's work on the Colossus reconstruction is one of the main reasons why this work was declassified at all. They could hardly hold onto it now that a working reconstruction is publicly available.

    The Americans have wired a declassification schedule into the classification machinery; positive action has to be taken to keep things classified beyond their natural expiration date. I don't think the British have any such law, so they just tend to hang onto things long past any reasonable deadline.

    Tony ran into this time and again, skirting the edges of the Official Secrets Act in the course of rebuilding Colossus. The only reason he was able to complete it at all was because the folks who'd worked on this stuff in WW II were reaching the ends of their lives, and many of them decided to write their memoirs and the Official Secrets Act be damned. Rather than lock up a bunch of antique war heroes turned popular authors, the British government finally started to let this stuff go. Tony's work blew the lid off thereafter.
  • A few years ago there was a landmark action. The first files relating to UK covert foreign intelligence work (spying) EVER were declassified. They turned out to relate almost exclusively to the Napoleonic wars. I believe they are considering declassifying some Crimean and Boer war material sometime this century.
  • I imagine a big part of it was not the computer technology, which is indeed not really cutting edge any more, but the cryptanalytic methodology, which is much slower to change. Computers now can search 10^9 cases per second (say) instead of 10^3 then, but the analysis of operating procedure, message formats, procedural errors, etc. that gets you down to this many cases probably dates much less quickly.
  • by DrWiggy ( 143807 ) on Sunday October 01, 2000 @03:03AM (#742098)
    OK, firstly we need to make clear that the project occurred between 1937 and 1942. The machine was not built in 1937 - it wasn't completed until 1942. Seeing as the official figures on the kit down at Bletchley Park in the UK appear to put the first machine coming up in 1941-ish, the ABC was not the first computer. It was the first computer that could be talked about.

    Secondly, let's just clear up this nonsense about court cases proving it was the first computer. The argument was between the builders of ENIAC and the ABC. How likely do you think it is that the UK Government were going to walk into a court room and argue their part on this, especially as the project was still classified in the 1970s?

    You would be amazed at how much stuff is sitting around out there that is only now starting to get de-classified. For example, did you know that public key cryptography is now publicly acknowledged as actually having been "discovered" at GCHQ in the UK some significant time before RSA made it out into the big wide world. Just because two commercial entities "prove" in a court which one invented something first, doesn't mean to say that there isn't a western government that actually invented it first, but are keeping it under wraps.
  • The reason for the secrecy around Colossus is actually a lot more insidious than you'd think. After the war, the Allied powers provided enigma machines to their various colonies and satellite states, telling them that this encryption had proven uncrackable. Concealing the colossus, which made the enigma fairly easy to crack, was part of this.

    Probably the first "back door" ever installed, no?

  • Kind of makes you wonder just how far ahead of the commercial market the military is regarding computer technology today.

    Would be interesting if all our innovations from Intel and clan were really all developed by the government and leaked out. :)

    Seriously though, they may have computers more massively parallel than any known academic (or known government) research facility. It isn't too likely they are very far ahead in the speed of a single processor, if they even work on such a thing.
    -
