
Researcher Builds Machines That Daydream

Posted by timothy
from the free-thinkers-are-the-best-kind dept.
schliz writes "Murdoch University professor Graham Mann is developing algorithms to simulate 'free thinking' and emotion. He rejects the emotionless reason portrayed by Mr Spock, arguing that 'an intelligent system must have emotions built into it before it can function.' The algorithm can translate the 'feel' of Aesop's Fables based on Plutchik's Wheel of Emotions. In tests, it freely associated three stories: The Thirsty Pigeon; The Cat and the Cock; and The Wolf and the Crane, and when queried on the association, the machine responded: 'I felt sad for the bird.'"
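
The article doesn't spell out how Mann's algorithm works internally, but a minimal sketch of the general idea described above (scoring each fable along Plutchik's eight basic emotions and reporting the strongest feeling the stories share) might look something like this; every score, name, and phrase below is invented for illustration and is not taken from the research:

    # Hypothetical sketch only; Mann's actual system is not public.
    # Plutchik's eight basic emotions form the feature space.
    PLUTCHIK = ["joy", "trust", "fear", "surprise",
                "sadness", "disgust", "anger", "anticipation"]

    # Hand-assigned emotion scores per fable (illustrative numbers only).
    fables = {
        "The Thirsty Pigeon":     {"sadness": 0.8, "anticipation": 0.5, "surprise": 0.3},
        "The Cat and the Cock":   {"sadness": 0.6, "fear": 0.7, "disgust": 0.4},
        "The Wolf and the Crane": {"sadness": 0.5, "fear": 0.5, "anger": 0.4},
    }

    def vector(scores):
        """Project a fable's emotion scores onto the fixed Plutchik axes."""
        return [scores.get(e, 0.0) for e in PLUTCHIK]

    def associate(stories):
        """Report, in the first person, the emotion the stories share most strongly."""
        vectors = [vector(fables[s]) for s in stories]
        shared = [min(col) for col in zip(*vectors)]  # keep only feelings present in every story
        dominant = PLUTCHIK[max(range(len(PLUTCHIK)), key=shared.__getitem__)]
        feeling = "sad" if dominant == "sadness" else dominant
        return "I felt %s for the protagonist." % feeling

    print(associate(list(fables)))   # prints: I felt sad for the protagonist.

The only point of the sketch is that a "shared emotion" can be computed as an intersection over hand-labelled scores; nothing about it requires the program to feel anything, which is exactly what several comments below argue.
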
  • Feelings (Score:5, Insightful)

    by Anonymous Coward on Friday September 24, 2010 @02:37AM (#33684390)

    Well sure, emotions are what give us goals in the first place. They're why we do anything at all: to "feel" love, to avoid pain, because of fear, etc. Logic is just a tool, the tool we use to get to that goal. Mathematics, formal logic, whatever you want to call it, is just our means of understanding and predicting the behavior of the world, and isn't a motivation in and of itself. The real question has always been whether there's "free will" and how that would be defined, not the existence, or lack, of emotions as displayed by "Data" or other science fiction caricatures. As Bender said, "Sometimes, I think about how robots don't have emotions, and that makes me sad."

  • by Narcocide (102829) on Friday September 24, 2010 @02:45AM (#33684428) Homepage

    Haven't these fools seen Blade Runner?

  • by melonman (608440) on Friday September 24, 2010 @02:47AM (#33684438) Journal

    One set of stories, one one-sentence response. Would that be news in any field of IT other than AI? E.g. "Web server returns a correct response to one carefully-chosen HTTP request!!!"?

    Surely the whole thing about emotion is that it happens across a wide range of situations, and often in ways that are very hard to tie down to any specific situational factors. "I feel sad for the bird" in this case is really just literary criticism. It's another way of saying "A common and dominant theme in the three stories is the negative outcome for the character which in each case is a type of bird". Doing that sort of analysis across a wide range of stories would be a neat trick, but I don't see the experience of emotion. I see an objective analysis of the concept of emotion as expressed in stories, which is not the same thing at all.

    Having the computer read the daily newspaper and say how it feels at the end of it, and why, and what it does to get past that, might be more interesting.

  • by foniksonik (573572) on Friday September 24, 2010 @03:03AM (#33684486) Homepage Journal

    We define our emotions in much the same way. We have an experience, recorded in memory as a story, and then define that experience as "happy" or "sad" through cross-reference with similar memory/story instances.

    Children have to be taught how to define their emotions. There are many, many picture books, TV episodes, etc. dedicated to this very exercise. Children are shown scenarios they can relate to and given a definition for that scenario.

    The emotions themselves cannot be supplied, of course, only the definition and context within macro social interactions.

    What this software can do is create a sociopathic personality: one which understands emotion solely through observation rather than firsthand experience. It will take more to establish what we consider emotions, i.e. a psychosomatic response to stimuli. This requires senses and a reactive soma (for humans this means feeling hot flashes, tears, adrenaline, etc.).

  • by afaik_ianal (918433) * on Friday September 24, 2010 @03:04AM (#33684488)

    Yeah, I wonder what the machine thought of "The Forester and the Lion", and "The Boy Who Cried Wolf". They seem strangely appropriate.

  • by feepness (543479) on Friday September 24, 2010 @03:19AM (#33684536) Homepage
    I felt sad for the researcher.
  • by Trepidity (597) <delirium-slashdot AT hackish DOT org> on Friday September 24, 2010 @03:23AM (#33684552)

    He now does commonsense-reasoning stuff at IBM Research using formal logic, but back in his grad-school days, Erik Mueller [mit.edu] wrote a thesis on building a computational model of daydreaming [amazon.com].

  • by token0 (1374061) on Friday September 24, 2010 @03:39AM (#33684602)
    It's like a 15th-century man trying to simulate a PC by putting a candle behind colored glass and calling that a display screen. People often think AI is getting really smart and that, e.g., human translators are becoming obsolete (a friend of mine was actually worried about her future as a linguist). But there is a fundamental barrier between that and the current state of automatic German-to-English translation (remember that article some time ago?), with error rates unacceptable for anything but personal use.
    Some researchers claim we can simulate intelligent parts of the human brain. I claim we can't simulate an average mouse (i.e. one that would survive long enough in real-life conditions), probably not even its sight.
    There's nothing interesting about this 'dreaming' - as long as the algorithm can't really manipulate abstract concepts. Automatic translations are a surprisingly good test for that. Protip: automatically dismiss any article like that if it doesn't mention actual progress in practical applications, or at least modestly admit that it's more of an artistic endeavour than anything else.
  • I agree. (Score:5, Insightful)

    by stephanruby (542433) on Friday September 24, 2010 @03:55AM (#33684652)

    The software isn't even "daydreaming" either. You could say it's parsing and cross-referencing emotions and meta-objects out from a textual database. And then, it's returning the resulting records in the first person singular, but that's about it.

    That's hardly what I'd call "daydreaming". When I daydream, I see my dream from the first person's perspective. That part is correct. But there is at least some internal visualization going on. So unless this software starts generating internal visual images to make its decisions, let's say some .png image with at least one pixel within it, or some .png image representing itself winning the lottery, then I'm calling shenanigans on the entire "daydreaming" claim.
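
    To make that parse-and-cross-reference point concrete, here is a toy sketch of the pattern being described; every record, field name, and phrase below is invented for the sake of argument, and it shows how unremarkable the first-person output can be:

        # Invented records standing in for a parsed textual database.
        records = [
            {"story": "The Thirsty Pigeon",     "subject": "pigeon", "kind": "bird", "outcome": "negative"},
            {"story": "The Cat and the Cock",   "subject": "cock",   "kind": "bird", "outcome": "negative"},
            {"story": "The Wolf and the Crane", "subject": "crane",  "kind": "bird", "outcome": "negative"},
        ]

        def query(records, kind):
            """Cross-reference records that share a subject kind and a bad outcome,
            then phrase the match in the first person singular."""
            matches = [r for r in records if r["kind"] == kind and r["outcome"] == "negative"]
            if matches:
                return "I felt sad for the %s." % kind
            return "I have no feelings about the %s." % kind

        print(query(records, "bird"))   # prints: I felt sad for the bird.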

  • by thebignop (1907962) on Friday September 24, 2010 @04:30AM (#33684774)
    António Damásio, a well-known neuropsychologist, already explained extensively why emotions are intrinsically linked to rational thought in his book "Descartes' Error: Emotion, Reason, and the Human Brain", published in 1994. He basically says that without emotion you wouldn't have the motivation to think rationally, and he studied the case of Phineas Gage, a construction worker who had an iron rod driven through his skull and survived, but stopped having feelings after the accident. I still doubt that they'll get something useful out of this project. There is an enormous number of variables that stimulate our emotions, and we can't expose a computer to them. Not to mention that even if we could, today's supercomputers don't have enough processing power to do the job.
  • by Anonymous Coward on Friday September 24, 2010 @04:41AM (#33684796)

    Spock isn't emotionless; in fact, no Vulcans are. They just learn over time to control their emotions and keep them buried deep within themselves. Big difference between that and being completely devoid of all emotion.

  • by mattdm (1931) on Friday September 24, 2010 @05:36AM (#33684980) Homepage

    We define our emotions in much the same way. We have an experience, recorded in memory as a story, and then define that experience as "happy" or "sad" through cross-reference with similar memory/story instances.

    Children have to be taught how to define their emotions. There are many, many picture books, TV episodes, etc. dedicated to this very exercise. Children are shown scenarios they can relate to and given a definition for that scenario.

    The emotions themselves cannot be supplied, of course, only the definition and context within macro social interactions.

    What this software can do is create a sociopathic personality: one which understands emotion solely through observation rather than firsthand experience. It will take more to establish what we consider emotions, i.e. a psychosomatic response to stimuli. This requires senses and a reactive soma (for humans this means feeling hot flashes, tears, adrenaline, etc.).

    In other words, the process of defining emotions -- which has to be taught to children -- is distinct from the process of having emotions, which certainly doesn't need to be taught.

  • Re:I agree. (Score:5, Insightful)

    by HungryHobo (1314109) on Friday September 24, 2010 @06:00AM (#33685070)

    Why should it have to use standard image formats?
    Your brain doesn't.

    And not all my daydreams are visual.
    Plenty are merely fictional or planned conversations, or even thoughts about physical movement.

  • Re:Feelings (Score:5, Insightful)

    by Requiem18th (742389) on Friday September 24, 2010 @06:20AM (#33685136)

    A human being can choose how they respond to these inputs.

    No, you can't. Once you discover a way to activate your pleasure receptors, your next action will be to activate them, all the time. If you stop voluntarily, it will be because you have to do something else to ensure future pleasure, or perhaps to avoid a great deal of pain. This is how drug addiction works. This is how we are wired; you may not like how that sounds, but you have the obligation to accept it and understand it.

    You probably don't consume drugs. This is not because you are above human nature. You avoid drugs because you are afraid of the pains that come with them, like losing the love and trust of those you love, or maybe you simply reject drugs out of a personal sense of disgust over the hedonistic senselessness of a narcotic lifestyle. Whether it's love, fear, or disgust, you reject drugs because of an emotion, not a reason. In the end everything is irrational, as it should be.

    You don't have to feel bad about it; intelligence is built upon emotion as houses are built upon bricks, as clocks are built from gears, as computers are built from chips. There is intelligence in the clockwork of a pocket watch, but the spring that moves it doesn't ask for a reason to uncoil, it just does it. There is intelligence in the circuits of a computer, but its logic gates are oblivious to the rationale behind what they are doing. Every machine, including animals, has non-rational elements in it.

    This is very natural, as "intelligent things" are just a subset of the larger set of "things", all of which have been behaving irrationally. The wind blows, the rain pours, the sun shines bright in the sky. All of this is irrational, meaning none of these things are planning what they are doing, nor do they have an idea of why they are doing it. The rational follows the irrational; that's the order of the world.

    Back to your metaphor: you say that emotions are just inputs. That's true, but they are special inputs that set goals. Let's make an analogy with a robot: you create a robot with a very advanced AI; you can chat with it, and it will understand everything you said and why you said it. You programmed this robot with one goal: for coffee tables to be made. You give it free rein over the method. Being an extremely intelligent robot, it subcontracts the labor to a sweatshop in China while it figures out where to build a mechanized plant. You equipped this robot with the knowledge to reprogram itself, and right away it does just that, optimizing its mind for the task of building coffee tables. But it won't deprogram the goal of making coffee tables, because that wouldn't further its goal of making coffee tables. It's not that it doesn't know how to reprogram itself, and it's not that there is a lock preventing it from changing its goals. It's just that it won't ever have a reason to disable that goal.

    Let's now attack specific examples:

    A soldier can choose to respond to the natural fears of bullets flying at him and death by jumping into a foxhole, or he can override all those emotions and charge straight at the enemy.

    Here the soldier is driven by the emotion of loyalty to his commander, or to his teammates. Maybe he is afraid of the punishment he would receive if he disobeyed orders, including public scorn back home. Maybe he hates the enemy; maybe he is afraid of what would happen if the enemy wins. Maybe it's a combination of all of the above.

    His frontal cortex can tell him the consequences of charging, or of not charging, but it can't make an argument about *why* he should or shouldn't. He needs a motive, which is an irrational emotion.

    A person can decide to rape the drunk one who has come into the room, semi-conscious, or choose to ignore the natural impulse and do nothing.

    Again, you correctly identified the desire to rape as a natural impulse, but you failed to realize why someone would *not* rape a drunk one, incorrectly and implicitly attributing it to

  • Re:I agree. (Score:2, Insightful)

    by Gastrobot (998966) on Friday September 24, 2010 @08:34AM (#33685676)
    What the great-grandparent is getting at is that, though the thing may give output similar to the output of a human being, it lacks the experience that comes from being human, and in this case particularly daydreaming. It has no qualia. To the machine, everything that is input into it is simply a value to be shunted through its algorithms. Nothing has been programmed to actually cause the experience of qualia or true appreciation. The great-grandparent is using the idea of an image sitting in RAM to represent the qualia of the heads-up display that we experience with our vision. I'd say that this image would still fail to actually cause the experience of qualia, because it's just an image in RAM; there is still no mechanism in the software to sense qualia.

    Even if a robot looks and behaves exactly like me in every circumstance then that doesn't mean that it actually has qualia like I do.
  • Re:I agree. (Score:5, Insightful)

    by HungryHobo (1314109) on Friday September 24, 2010 @09:22AM (#33686186)

    You're just using the word "qualia" as a placeholder for "insert magicalness here".

    "To the machine everything that is input into it is simply a value to be shunted through its algorithms."

    To a human brain everything is just electrical impulses to be shunted through a mushy network of cells.
    Nothing has been grown to actually cause the experience of [insert magicalness here] or true appreciation.

    Stick some electrodes into that mushy network and feed in some junk input and you'll smell colours, hear the taste of strawberries and decide that you love a cardboard cutout of a spider.

    Cut out or damage a chunk of that network and you'll insist that you are currently dead (despite being able to explain this to the people around you), or that there is no left side to your body (even if you can see it), or that you are blind when you're not (while somehow able to catch a ball and walk around without bumping into things), or that you're not blind even when you are (clumsy me, no no, I can see fine), and you will know with utter certainty that what you're saying is true.

    You as a person are the network and the information stored in it.
    Screw around with that network and you and everything that you consider you will get screwed up as well.
    Magic is not real.
    No matter how much we want to think of ourselves as special, magic is not real.

    And since magic is not real there should be nothing but lack of understanding stopping us from emulating the physical processes that take place in the brain in hardware or software.

  • Re:Feelings (Score:3, Insightful)

    by clone53421 (1310749) on Friday September 24, 2010 @12:06PM (#33688362) Journal

    And heat doesn’t really exist on an atomic level, either. It’s just atoms moving really quickly. How “real” is it, exactly? Yet, on a larger scale, a baseball whacks you quite a bit differently than the burner on your stove does.

  • Re:Building? (Score:3, Insightful)

    by clone53421 (1310749) on Friday September 24, 2010 @12:24PM (#33688580) Journal

    Computers are also chemical and brains are also electronic. Computers can be analog and digital logic can produce analog results to any desired level of precision. A molecule that acts as a neurotransmitter carries a discrete binary signal on its own.

    The primary difference between a brain and a computer (as they currently exist) is that a brain is massively (almost unimaginably) parallel in its processing and a computer is primarily serial. However, it’s possible for a serial processor to emulate a parallel one given enough time in which to do it.
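
    A minimal sketch of that last point (the toy network below is invented and nothing about it is neuroscientific): a serial loop writing into a second buffer reproduces exactly what a synchronous parallel update would compute, only more slowly.

        import random

        random.seed(0)
        N = 100
        # Invented toy network: random weights, binary threshold units.
        weights = [[random.uniform(-1, 1) for _ in range(N)] for _ in range(N)]
        state = [random.choice([0.0, 1.0]) for _ in range(N)]

        def parallel_tick(state):
            """One 'parallel' step: every unit reads the old state at once.
            A serial processor emulates this by writing into a separate buffer,
            so no unit ever sees a half-updated state."""
            new_state = [0.0] * N
            for i in range(N):   # serial loop standing in for N simultaneous units
                activation = sum(weights[i][j] * state[j] for j in range(N))
                new_state[i] = 1.0 if activation > 0 else 0.0
            return new_state

        for _ in range(10):
            state = parallel_tick(state)
        print(int(sum(state)), "units active after 10 ticks")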
