AI Technology

CBS and MIT's 1960 Documentary On AI Is a Gem (fastcompany.com) 47

FastCompany magazine editor and Slashdot reader harrymcc writes: On the night of October 26, 1960, CBS aired a special -- coproduced with MIT -- about an emerging field of technology called 'artificial intelligence.' It featured demos -- like a checkers-playing computer and one that wrote scripts for TV westerns -- along with sound bits from leading scientists on the question of whether machines would ever think. It was well reviewed at the time and then mostly forgotten. But it's available on YouTube, and surprisingly relevant to today's AI challenges, 59 years later.

CBS and MIT's 1960 Documentary On AI Is a Gem

  • by 93 Escort Wagon ( 326346 ) on Monday September 16, 2019 @04:51PM (#59200722)

    Is that 1/8 of a sound bite?

  • by PolygamousRanchKid ( 1290638 ) on Monday September 16, 2019 @04:55PM (#59200742)

    "The question of whether Machines Can Think... is about as relevant as the question of whether Submarines Can Swim"

    • The question is can humans think.

      I wonder what an alien species would answer.
      • Alien species: Define "think"!
      • The question is can humans think.

        From the state of things today, no.

      • by anegg ( 1390659 )

        The question is can humans think.

        I think we *think* humans think. But we need to determine whether "Humans compute" (and we can eventually uncover the algorithms that they use) or whether we can make "Machines think" (and "thinking" is somehow more than computing).

        If all of what humans do in their brains can be reduced to computation, then it is likely that we can make machines do it, too. Whether we'll want to is another question, especially if the machines are likely to do it better than humans.

        • Computing is simply the large subset of human thinking processes that can be carried out by a Universal Turing Machine.

          Except that, when formalized and automated, it can sometimes be carried out so much faster than a human brain can think that it seems qualitatively different.

          • by anegg ( 1390659 )
            I worded my statement poorly. Instead of "... determine whether 'Humans compute'" I should have said something more like "... determine whether what humans do can be reduced to only computation". In other words, is all of what humans do subject to reproduction by computation, or is there some element of what humans do that cannot be reduced to computation?
    • If Dijkstra said that, then Dijkstra is a retard.

      Intelligence is far more important and significant than specific utility. And no, submarines can't swim.

      • by backslashdot ( 95548 ) on Monday September 16, 2019 @06:41PM (#59201076)

        Considering it's clear that you are someone with zero in the way of achievements, let alone achievements in comparison to Dijkstra (whose contributions to computer science anyone should appreciate), you have no credibility when it comes to calling him a retard. Second, you obviously don't grasp the meaning of what he said.

      • If Dijkstra said that, then Dijkstra is a retard.

        Intelligence is far more important and significant than specific utility. And no, submarines can't swim.

        /sigh

      • Re: (Score:2, Insightful)

        by Anonymous Coward

        If Dijkstra said that, then Dijkstra is a retard.

        Intelligence is far more important and significant than specific utility. And no, submarines can't swim.

        "Whoosh." -Edsger W. Dijkstra

      • His point, which you clearly don't understand, is that submarines do what they do, and that thing is similar to what humans can do, but they are far better at it. Arguing about whether what computers that outperform humans do is "thinking" or not doesn't change the fact that they do what we can do, but better in important ways. They can't feel? ... Oh no! There goes my opportunity to have my computer refuse to talk to me for 3 days because I didn't notice the way it rearranged its bits!
      • If the quote is correct, then he used the word relevant, which isn't synonymous with important or significant. In other words, the question of whether a submarine swims isn't relevant to its operation; whether it "swims" is more philosophical in nature. Regardless of whether a computer is intelligent, it still computes.

  • by fattmatt ( 1042156 ) on Monday September 16, 2019 @05:07PM (#59200792)

    of an AR and AI blockchain cloud app on the Linux desktop

  • by BAReFO0t ( 6240524 ) on Monday September 16, 2019 @05:09PM (#59200800)
    Call us when you actually simulate neurons, instead of showing us a hand brush (a weight matrix multiplication) and telling us it will grow into a wild boar! A red plastic brush!

    I literally did better neuron simulations as a hobby in 1999! When I was 18!
    I was offered a well-paid job to do "AI" recently. I got all the "bibles" of the industry to read, so I'm at the cutting edge.
    I thought they were trolling me! As somebody who knows his share of actual neurology ... it was a complete joke! So many cut corners; so many feats of insane half-assery; plus *completely missing the point of neural nets* ... and then they wondered why their nets did not perform a hundredth as well as real ones. --.--

    Sorry, but the whole industry, like the HTML5/Web n.0/WhatWG one, is medically certified batshit insane.
    At least the part that isn't made up of hacks so clueless they do not even reach the level where insanity could be determined.
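
    To make the contrast concrete, here is a toy sketch (purely illustrative, not from the documentary, the thread, or any real framework; all names and parameter values are invented): a conventional artificial "neuron" boils down to a weighted sum pushed through a nonlinearity, while "actually simulating a neuron" means something closer to a spiking model, e.g. a crude leaky integrate-and-fire unit.

    # Illustrative only: a standard ANN unit vs. a crude leaky
    # integrate-and-fire (LIF) neuron. Parameters are made up, not fitted.
    import math

    def ann_neuron(inputs, weights, bias):
        # One unit of a conventional net: weighted sum plus a sigmoid squash.
        z = sum(x * w for x, w in zip(inputs, weights)) + bias
        return 1.0 / (1.0 + math.exp(-z))

    def lif_neuron(input_current, dt=1.0, tau=10.0, v_rest=-65.0,
                   v_reset=-65.0, v_threshold=-50.0, r=1.0):
        # Toy spiking model: the membrane voltage integrates the input current,
        # leaks back toward rest, and emits a spike when it crosses threshold.
        v = v_rest
        spikes = []
        for t, i_in in enumerate(input_current):
            v += (-(v - v_rest) + r * i_in) * (dt / tau)
            if v >= v_threshold:
                spikes.append(t)  # record the spike time
                v = v_reset       # reset after firing
        return spikes

    print(ann_neuron([0.2, 0.7], [0.5, -1.0], 0.1))  # a single number in (0, 1)
    print(lif_neuron([20.0] * 100))                  # a list of spike times

    Neither is a claim about what real neurons do; the point is only that the first is one row of a weight-matrix multiplication, while the second at least has state that evolves over time.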
    • it was a complete joke!

      Most of IT is a joke, not just AI. It takes roughly 3x longer in the typical org to develop small and medium internal CRUD apps than it did in the 90's, and everybody just says, "that's just the way it is. As long as they pay me, I don't care if it takes longer, not my dime."

      Some try to justify by saying deployment is cheaper with the web, saving on desktop support costs. But I'm not sure this trade-off is inherent. What Grand Law of Physics keeps the industry from improving desktop d

      • “... what you’ve just said is one of the most insanely idiotic things I have ever heard. At no point in your rambling, incoherent response were you even close to anything that could be considered a rational thought. Everyone in this room is now dumber for having listened to it. I award you no points, and may God have mercy on your soul.”

        • Seriously though? Comparing application dev of the 90's to application dev today?
          .
          When medical doctors first started "practicing" it was a two week course to get licensed.

          Shit changes brah.

          Jesus Christ.

          • by Tablizer ( 95088 )

            When medical doctors first started "practicing" it was a two week course to get licensed.

            But doctors know more and can do more now. New apps don't. Webifying them didn't make them any more useful or usable for the end users on average.

            CRUD de-evolved, I'm just the messenger. Sure, it may have evolved gills preparing for Waterworld, but the planet never flooded and dry gills tend to get infections. The future missed the target.

            • Some might argue that they have "discovered" far more unknowns than knowns.

              Point is, without using a structured, measured approach (which takes longer) to solving problems, more problems tend to crop up, whether you find them or not.

              • by Tablizer ( 95088 )

                Point is, without using a structured, measured approach (which takes longer) to solving problems, more problems tend to crop up, whether you find them or not.

                Most of the extra "problem solving" steps are work-arounds to the web's limitations (from a CRUD perspective), or people following trends they haven't investigated well for fear of being left behind.

                If you could make an argument along the lines of "in order to get benefits X, Y, and Z, we must live with annoyances A, B, and C", then maybe we can agree we

            • We have also gotten to the point where it is no longer feasible for one or two guys to code an entire project.

              Splitting up the duties adds overhead.

              • by Tablizer ( 95088 )

                We have also gotten to the point where it is no longer feasible for one or two guys to code an entire project. Splitting up the duties adds overhead.

                That's because our tools are too labor intensive, requiring specialization to split up and master the layers of suckage. It's a self-fulfilling need: "we need complex tools to better manage our complex tools", recursive suckage/bloat.

                I used to be "one or two guys to code an entire project", did it well, fast, and with relatively little code. The tools were gett

              • by Junta ( 36770 )

                We have also gotten to the point where it is no longer feasible for one or two guys to code an entire project.

                This is not particularly accurate. A handful of people can do more stuff more quickly than they ever could in the past.

                In some areas, the bar has risen (e.g. an Atari 2600 level game is going to lose out to a well-executed major game that requires a great deal more complex code and more challenging artwork). Of course, there are many small indie games made by a couple of people in a short period of time that have done relatively well.

                In some areas, this sentiment together with a deluge of unqualified yet 'certif

                • by Tablizer ( 95088 )

                  A handful of people can do more stuff more quickly than they ever could in the past.

                  As I originally mentioned, under the right conditions this is indeed true, but the "right conditions" are relatively rare in practice. If an org lets me choose/make/tune my own web stack, I can be quite productive. But orgs don't want to risk living with a roll-your-own stack, for understandable reasons: newcomers won't know the stack.

                  With the 90's CRUD IDE's, there were fewer ducks that had to line up right to be productive.

            • Webifying them didn't make them any more useful or usable for the end users on average.

              CRUD de-evolved, I'm just the messenger. Sure, it may have evolved gills preparing for Waterworld, but the planet never flooded and dry gills tend to get infections. The future missed the target.

              OK, more specific constructive criticism then. It's really more of a disagreement, though.

              In the 90's the commercial web was an infant. Today it is Hercules.

              To suggest that being able to utilize apps over the web has gimped them for today's internet is absurd.

              I submit that those gills are working and they are breathing HARD.

              • by Tablizer ( 95088 )

                To suggest that being able to utilize apps over the web has gimped them for today's internet is absurd.

                My intro said "small and medium internal CRUD apps". There's a reason I limited the scope of my criticism.

                For light data entry for masses of consumers, yes, the web is a godsend. But just because it's great for X does not mean it's great for Y.

                Web standards were designed for sharing mostly read-only documents. They've done that quite well (with some caveats). However, for write-heavy and data-heavy applications,

              • by Junta ( 36770 ) on Monday September 16, 2019 @08:16PM (#59201286)

                I would say that while capability is more advanced, the tooling to manage much of this is in many cases more tedious than it needs to be and in some ways a backslide from 90s UI design.

                Notably, it is now utterly trivial to create an exceedingly custom look and feel, with the ability to have any layout one could possibly imagine and to assign any behavior you like to any UI element. Want to draw a radio button UI that instead acts as a checkbox? Sure, why not.

                HIG guidelines are dead, and so rough "common sense" prevails. Admittedly, this is generally better in practice than it sounds. However, a lot of the tooling is in some ways more tedious than the paradigms of the 90s. In 90s desktop application development I cared vaguely about UI element positioning, but the UI toolkit largely made platform-appropriate decisions on the details, and those were consistent from application to application. In current web development, I'd better get ready to micromanage by tweaking CSS for some of the most trivial things. In exchange for easier access to customize a great deal more, the toolkits *force* more of these decisions to be explicitly made.

                • by Tablizer ( 95088 )

                  I will agree that we have more choices and potential control than in the 90's, but at a big cost to productivity and learning curves in my observation. And the 90's tools were getting better over time. At least until the web killed sales.

                  It kind of reminds me of the ACA (healthcare) debate among Democratic candidates. Single-payer would probably only cost about 60% as much based on observing other countries. But Americans prefer choice in providers. However, having this choice seems to be a big part of our

        • by Tablizer ( 95088 )

          Your criticism is not usable to me. I don't intentionally write badly or vaguely. If I say something specific that's wrong, then demonstrate it's wrong with logic and/or links. If I say something unclear, then ask for clarification, showing which word(s) trip you up by presenting multiple possible interpretations, for example, so that I can see where your interpretation is different from what I intended.

          Specific criticism I can fix. I can't fix general "it's all bad & incoherent" criticism. Most humans can't wo

    • We don't need to have an airplane flap its wings in order to fly. We don't need to perfectly simulate neurons to make a computer do useful processing that, in specific respects, does things we'd see as intelligent.
      • But you aren't simulating neurons AT ALL. What you are calling neural networks is NOTHING LIKE A BIOLOGICAL NEURAL NETWORK (a.k.a. a brain). You guys are like used car salesmen.

        • I'm not sure who the "you guys" are here; I don't work in machine learning. But more to the point, neural networks do act like biological networks in multiple important respects. Each individual unit has threshold behavior, and each individual unit can have its inputs reweighted based on how effective something was. The main difference between biological networks and artificial networks is that biological networks generally allow feedback loops. That's presumably important for things like modeling the most
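
          As a purely illustrative sketch of those two properties (threshold behavior and reweighting inputs by how effective the unit was), here is the classic perceptron update rule; the function names, learning rate, and AND-gate data are invented for the example, not taken from the thread or from any particular library.

          # Illustrative only: one unit with threshold behavior whose input
          # weights are adjusted according to how wrong its output was.
          def fire(inputs, weights, threshold):
              # Threshold behavior: output 1 only if the weighted sum of the
              # inputs reaches the threshold.
              return 1 if sum(x * w for x, w in zip(inputs, weights)) >= threshold else 0

          def train(samples, weights, threshold, lr=0.1, epochs=20):
              # Reweighting: nudge each weight in proportion to the error and
              # to the input that contributed to it (perceptron learning rule).
              for _ in range(epochs):
                  for inputs, target in samples:
                      error = target - fire(inputs, weights, threshold)
                      weights = [w + lr * error * x for w, x in zip(weights, inputs)]
              return weights

          # Learn a simple AND gate.
          samples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
          weights = train(samples, [0.0, 0.0], threshold=0.5)
          print([fire(x, weights, 0.5) for x, _ in samples])  # expect [0, 0, 0, 1]

          After training, the unit fires only when both inputs are on: the error term pushed the weights up until the weighted sum clears the fixed threshold on exactly that case, which is the "reweighted based on how effective something was" part in miniature.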
    • If you actually "knew neurology" you would be aware that what we don't know about it is literally almost everything about it.
    • AI is always five years away. Even back in 1960 they made that clear.

  • But they really should've gotten Rod Serling to host and narrate.
  • by swm ( 171547 ) <swmcd@world.std.com> on Monday September 16, 2019 @07:41PM (#59201206) Homepage

    It's been 60 years, and I don't know that we have much to show for it.

    Artificial stupidity
    The saga of Hugh Loebner and his search for an intelligent bot has almost everything: Sex, lawsuits and feuding computer scientists. There's only one thing missing: Smart machines.
    https://www.salon.com/2003/02/... [salon.com]

  • In the opening scene, both men are smoking on camera. One is smoking a cigarette, the scientist a pipe.

  • ...and kookery in the field of AI. I swear to God, if this same bullshit surrounded the internal combustion engine, we would not have many real auto mechanics, but we would have plenty of whack jobs who never picked up a wrench in their lives trying to make IC engines sound like something mystical and throwing in some religious bullshit for good measure.

      We need to tear away all of the bullshit surrounding AI. First step is to get rid of the term "AI".
