CBS and MIT's 1960 Documentary On AI Is a Gem (fastcompany.com) 47
FastCompany magazine editor and Slashdot reader harrymcc writes: On the night of October 26, 1960, CBS aired a special -- coproduced with MIT -- about an emerging field of technology called 'artificial intelligence.' It featured demos -- like a checkers-playing computer and one that wrote scripts for TV westerns -- along with sound bits from leading scientists on the question of whether machines would ever think. It was well reviewed at the time and then mostly forgotten. But it's available on YouTube, and surprisingly relevant to today's AI challenges, 59 years later.
"sound bits" (Score:5, Funny)
Is that 1/8 of a sound bite?
Re:"sound bits" (Score:4, Informative)
Re: (Score:1)
Back then hardware was expensive :-)
Edsger W. Dijkstra classic quote (Score:5, Insightful)
"The question of whether Machines Can Think... is about as relevant as the question of whether Submarines Can Swim"
Re: (Score:2)
I wonder what an alien species would answer.
We got the answer right now! (Score:1)
Re: Edsger W. Dijkstra classic quote (Score:2)
The question is can humans think.
From the state of things today, no.
Re: (Score:2)
The question is can humans think.
I think we *think* humans think. But we need to determine whether "Humans compute" (and we can eventually uncover the algorithms that they use), or we can make "Machines think" (and "thinking" is somehow more than computing).
If all of what humans do in their brains can be reduced to computation, then it is likely that we can make machines do it, too. Whether we'll want to is another question, especially if the machines are likely to do it better than humans.
Re: (Score:2)
Computing is simply the large subset of human thinking processes that can be carried out by a Universal Turing Machine.
Except that, when formalized and automated, it can sometimes be carried out so much faster than a human brain can think that it seems qualitatively different.
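To make "carried out by a Universal Turing Machine" concrete, here is a minimal sketch in Python of an ordinary single-tape Turing machine (a universal machine is just one that takes another machine's transition table as its input). The increment machine below is purely illustrative, not anything from the thread:

    # Minimal single-tape Turing machine simulator (illustrative sketch).
    # The transition table below implements binary increment: walk to the
    # rightmost bit, then flip trailing 1s to 0s until a 1 can be written.

    def run_tm(tape, transitions, state="right", blank="_"):
        tape = dict(enumerate(tape))  # sparse tape: position -> symbol
        pos = 0
        while state != "halt":
            symbol = tape.get(pos, blank)
            write, move, state = transitions[(state, symbol)]
            tape[pos] = write
            pos += 1 if move == "R" else -1
        cells = [tape.get(i, blank) for i in range(min(tape), max(tape) + 1)]
        return "".join(cells).strip(blank)

    # (state, read) -> (write, move, next_state)
    INCREMENT = {
        ("right", "0"): ("0", "R", "right"),  # scan right over the number
        ("right", "1"): ("1", "R", "right"),
        ("right", "_"): ("_", "L", "carry"),  # hit the end; go back, add 1
        ("carry", "1"): ("0", "L", "carry"),  # 1 + carry = 0, carry on
        ("carry", "0"): ("1", "L", "halt"),   # 0 + carry = 1, done
        ("carry", "_"): ("1", "L", "halt"),   # overflow: new leading 1
    }

    print(run_tm("1011", INCREMENT))  # -> 1100

Anything reducible to a table like this, however large, is "computing" in the parent's sense; the open question in the thread is whether all of human thinking is.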
Re: (Score:2)
Re: (Score:1)
If Dijkstra said that, then Dijkstra is a retard.
Intelligence is far more important and significant than specific utility. And no, submarines can't swim.
Re:Edsger W. Dijkstra classic quote (Score:5, Insightful)
First, considering it's clear that you are one of those with zero in the way of achievements, let alone achievements comparable to Dijkstra's (whose contributions to computer science anyone should appreciate), you have no credibility when it comes to calling him a retard. Second, you obviously don't grasp the meaning of what he said.
Re: (Score:2)
If Dijkstra said that, then Dijkstra is a retard.
Intelligence is far more important and significant than specific utility. And no, submarines can't swim.
/sigh
Re: (Score:2, Insightful)
If Dijkstra said that, then Dijkstra is a retard.
Intelligence is far more important and significant than specific utility. And no, submarines can't swim.
"Whoosh." -Edsger W. Dijkstra
Re: Edsger W. Dijkstra classic quote (Score:1)
Re: Edsger W. Dijkstra classic quote (Score:2)
If the quote is correct, then he used the word "relevant," which isn't synonymous with "important" or "significant." In other words, the question of whether a submarine swims isn't relevant to its operation; whether it "swims" is more philosophical in nature. Regardless of whether a computer is intelligent, it still computes.
this will be the year... (Score:5, Funny)
of an AR and AI blockchain cloud app on the Linux desktop
Try our extra big-ass iPHB tech fads! (Score:1)
Re: (Score:2)
You mean "algos"
Can this fad just die now? (Score:3, Interesting)
I literally did better neuron simulations as a hobby in 1999! When I was 18!
I was offered a well-paid job to do "AI" recently. I got all the "bibles" of the industry to read, so I'm at the cutting edge.
As somebody who knows his share of actual neurology, I thought they were trolling me!
Sorry, but the whole industry, like the HTML5/Web n.0/WhatWG one, is medically certified batshit insane.
At least the part that isn't hacks so clueless they don't even reach the level where insanity could be diagnosed.
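For calibration, the kind of hobbyist "neuron simulation" the parent is talking about is plausibly something like the leaky integrate-and-fire model below: a minimal, heavily simplified sketch with textbook constants, not a claim about any particular brain, but one that at least has membrane dynamics and spikes:

    # Leaky integrate-and-fire neuron: a textbook simplification, but it
    # has membrane dynamics and spikes. Constants are typical values.
    V_REST, V_RESET, V_THRESH = -70.0, -75.0, -55.0   # millivolts
    TAU_M = 20.0    # membrane time constant (ms)
    DT = 1.0        # integration timestep (ms)

    def simulate(input_current, steps=200):
        v = V_REST
        spike_times = []
        for t in range(steps):
            # Euler step: exponential leak toward rest + injected current
            v += DT * (-(v - V_REST) + input_current) / TAU_M
            if v >= V_THRESH:        # threshold crossing: fire and reset
                spike_times.append(t)
                v = V_RESET
        return spike_times

    print(simulate(input_current=20.0))   # a regular spike train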
Not just AI, IT is rotting from fads & confusion (Score:3)
Most of IT is a joke, not just AI. It takes roughly 3x longer in the typical org to develop small and medium internal CRUD apps than it did in the 90's, and everybody just says, "that's just the way it is. As long as they pay me, I don't care if it takes longer; not my dime."
Some try to justify it by saying deployment is cheaper with the web, saving on desktop support costs. But I'm not sure this trade-off is inherent. What Grand Law of Physics keeps the industry from improving desktop deployment?
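For readers outside this world: the domain core of a "small internal CRUD app" really is just create/read/update/delete over a table, as in this minimal sqlite3 sketch (the table and columns are invented for illustration). The parent's complaint is that the surrounding stack, not this core, is where the extra time goes:

    import sqlite3

    # Minimal core of an internal CRUD app: one table, four operations.
    # Schema is hypothetical, purely for illustration.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE employees "
               "(id INTEGER PRIMARY KEY, name TEXT, dept TEXT)")

    def create(name, dept):
        cur = db.execute("INSERT INTO employees (name, dept) VALUES (?, ?)",
                         (name, dept))
        return cur.lastrowid

    def read(emp_id):
        return db.execute("SELECT * FROM employees WHERE id = ?",
                          (emp_id,)).fetchone()

    def update(emp_id, dept):
        db.execute("UPDATE employees SET dept = ? WHERE id = ?",
                   (dept, emp_id))

    def delete(emp_id):
        db.execute("DELETE FROM employees WHERE id = ?", (emp_id,))

    emp = create("Ada", "Engineering")
    update(emp, "Research")
    print(read(emp))   # -> (1, 'Ada', 'Research')
    delete(emp)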
Re: (Score:2)
“... what you’ve just said is one of the most insanely idiotic things I have ever heard. At no point in your rambling, incoherent response were you even close to anything that could be considered a rational thought. Everyone in this room is now dumber for having listened to it. I award you no points, and may God have mercy on your soul.”
Re: (Score:2)
Seriously though? Comparing application dev of the 90's to application dev today?
When medical doctors first started "practicing," it was a two-week course to get licensed.
Shit changes brah.
Jesus Christ.
Re: (Score:2)
But doctors know more and can do more now. New apps don't. Webifying them didn't make them any more useful or usable for the end users on average.
CRUD de-evolved, I'm just the messenger. Sure, it may have evolved gills preparing for Waterworld, but the planet never flooded, and dry gills tend to get infections. The future missed the target.
Re: (Score:2)
Some might argue that they have "discovered" far more unknowns than knowns.
Point is, without using a structured, measured approach (which takes longer) to solving problems, more problems tend to crop up, whether you find them or not.
Re: (Score:1)
Most of the extra "problem solving" steps are work-arounds to the web's limitations (from a CRUD perspective), or people following trends they haven't investigated well for fear of being left behind.
If you could make an argument along the lines of "in order to get benefit X, Y, and Z; we must live with annoyance A, B, and C", then maybe we can agree we
Re: (Score:2)
We have also gotten to the point where it is no longer feasible for one or two guys to code an entire project.
Splitting up the duties adds overhead.
Re: (Score:1)
That's because our tools are too labor intensive, requiring specialization to split up and master the layers of suckage. It's a self-fulfilling need: "we need complex tools to better manage our complex tools", recursive suckage/bloat.
I used to be "one or two guys to code an entire project", did it well, fast, and with relatively little code. The tools were getting better over time.
Re: (Score:2)
We have also gotten to the point where it is no longer feasible for one or two guys to code an entire project.
This is not particularly accurate. A handful of people can do more stuff more quickly than they ever could in the past.
In some areas, the bar has risen (e.g. an Atari 2600 level game is going to lose out to a well-executed major game that requires a great deal more complex code and more challenging artwork). Of course, there are many small indie games made by a couple of people in a short period of time that have done relatively well.
In some areas, this sentiment together with a deluge of unqualified yet 'certif
Re: (Score:1)
As I originally mentioned, under the right conditions this is indeed true, but the "right conditions" are relatively rare in practice. If one lets me choose/make/tune my own web stack, I could be quite productive. But orgs don't want to risk living with a roll-your-own stack for understandable reasons: newcomers won't know the stack.
With the 90's CRUD IDEs, there were fewer ducks that had to line up right to be productive.
Re: (Score:2)
Webifying them didn't make them any more useful or usable for the end users on average.
CRUD de-evolved, I'm just the messenger. Sure, it may have evolved gills preparing for Waterworld, but the planet never flooded, and dry gills tend to get infections. The future missed the target.
OK, more specific constructive criticism then. It's really more of a disagreement though.
In the 90's the commercial web was an infant. Today it is Hercules.
To suggest that being able to utilize apps over the web has gimped them for today's internet is absurd.
I submit that those gills are working and they are breathing HARD.
Re: (Score:2)
My intro said "small and medium internal CRUD apps". There's a reason I limited the scope of my criticism.
For light data entry for masses of consumers, yes, the web is a godsend. But just because it's great for X does not mean it's great for Y.
Web standards were designed for sharing mostly read-only documents. It's done that quite well (with some caveats). However, for write-heavy and data-heavy applications, it's a poor fit.
Re:Not just AI, IT is rotting from fads & confusion (Score:4, Interesting)
I would say that while capability is more advanced, the tooling to manage much of it is in many cases needlessly tedious and in some ways a backslide from 90s UI design.
Notably, it is now utterly trivial to create an exceedingly custom look and feel, with the ability to have any layout one could possibly imagine and to assign any behavior you like to any UI element. Want to draw a radio-button UI that instead acts as a checkbox? Sure, why not.
HIG guidelines are dead, so rough "common sense" prevails. Admittedly, this generally works better in practice than it sounds. However, a lot of the tooling is in some ways more tedious than the paradigms of the 90s. In a 90s desktop application I cared vaguely about UI element positioning, but the UI toolkit largely made platform-appropriate decisions on the details, consistent from application to application. In current web development, I had better be ready to micromanage CSS for some of the most trivial things. In exchange for easier access to customize a great deal more, the toolkits *force* more of these decisions to be explicitly made.
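As a concrete example of a toolkit "making platform-appropriate decisions on the details": a minimal sketch using Tk, a 90s-era toolkit still bundled with Python. There is no stylesheet to tweak; the geometry manager and the platform theme pick the fonts, padding, and widget appearance:

    import tkinter as tk

    # A 90s-style toolkit form: no CSS, no per-element styling. The
    # geometry manager (grid) and platform theme decide the details.
    root = tk.Tk()
    root.title("New Record")

    for row, field in enumerate(("Name", "Department", "Phone")):
        tk.Label(root, text=field + ":").grid(row=row, column=0, sticky="e")
        tk.Entry(root).grid(row=row, column=1)

    tk.Button(root, text="Save",
              command=root.destroy).grid(row=3, column=1, sticky="e")
    root.mainloop()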
Re: (Score:1)
I will agree that we have more choices and potential control than in the 90's, but at a big cost to productivity and learning curves in my observation. And the 90's tools were getting better over time. At least until the web killed sales.
It kind of reminds me of the ACA (healthcare) debate among Democratic candidates. Single-payer would probably only cost about 60% as much, based on observing other countries. But Americans prefer choice in providers. However, having this choice seems to be a big part of our higher costs.
Re: (Score:1)
Your criticism is not usable to me. I don't intentionally write badly or vaguely. If I say something specific that's wrong, then demonstrate it's wrong with logic and/or links. If I say something unclear, then ask for clarification, showing which word(s) trip you up, for example by presenting multiple possible interpretations, so that I can see where your interpretation differs from what I intended.
Specific criticism I can fix. I can't fix general "it's all bad & incoherent" criticism. Most humans can't work with that.
Re: (Score:2)
Re: (Score:2)
But you aren't simulating neurons AT ALL. What you are calling neural networks is NOTHING LIKE A BIOLOGICAL NEURAL NETWORK (a.k.a. a brain). You guys are like used car salesmen.
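For comparison, the entire "neuron" in a typical artificial neural network is the weighted-sum-plus-nonlinearity sketch below (the weights and inputs are arbitrary). Set it against even the simplified integrate-and-fire model earlier in the thread, let alone a real cell, and the gap is plain:

    # The entire "neuron" of a typical artificial neural network: a
    # weighted sum pushed through a fixed nonlinearity. No membrane
    # dynamics, no spikes, no time, no chemistry.

    def relu(x):
        return max(0.0, x)

    def artificial_neuron(inputs, weights, bias):
        return relu(sum(i * w for i, w in zip(inputs, weights)) + bias)

    print(artificial_neuron([0.5, -1.0, 2.0], [0.1, 0.4, 0.3], bias=0.05))
    # -> roughly 0.3 (0.05 - 0.4 + 0.6, plus the 0.05 bias)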
Re: (Score:2)
Re: Can this fad just die now? (Score:2)
Re: (Score:1)
AI is always five years away. Even back in 1960 they made that clear.
Looks cool. (Score:1)
Re: It's not a "gem" (Score:1)
Artificial stupidity (Score:3)
It's been 60 years, and I don't know that we have much to show for it.
Artificial stupidity
The saga of Hugh Loebner and his search for an intelligent bot has almost everything: Sex, lawsuits and feuding computer scientists. There's only one thing missing: Smart machines.
https://www.salon.com/2003/02/... [salon.com]
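For context, most Loebner Prize entrants described in that piece were pattern-matchers in the ELIZA tradition. A toy sketch of the technique, with invented patterns, shows how little "intelligence" is involved:

    import re
    import random

    # Toy ELIZA-style responder: regex pattern -> canned reflections.
    # This is the basic technique behind many Loebner entrants; these
    # particular patterns are invented for illustration.
    RULES = [
        (r"\bi am (.*)", ["Why do you say you are {0}?",
                          "How long have you been {0}?"]),
        (r"\bi want (.*)", ["What would {0} get you?"]),
        (r"\b(?:machines?|computers?)\b", ["Do machines worry you?"]),
    ]

    def respond(text):
        text = text.lower()
        for pattern, replies in RULES:
            match = re.search(pattern, text)
            if match:
                return random.choice(replies).format(*match.groups())
        return "Tell me more."

    print(respond("I am afraid of machines"))
    # -> e.g. "Why do you say you are afraid of machines?"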
Re: Artificial stupidity (Score:1)
They're Both Smoking (Score:1)
In the opening scene, both men are smoking on camera: one a cigarette, the scientist a pipe.
Too much fluff, and bullshit... (Score:2)
...and kookery in the field of AI. I swear to God, if this same bullshit surrounded the internal combustion engine, we would not have many real auto mechanics, but we would have plenty of whack jobs who never picked up a wrench in their lives trying to make IC engines sound like something mystical and throwing in some religious bullshit for good measure.
We need to tear away all of the bullshit surrounding AI. First step is to get rid of the term "AI".