AI Music Technology

Dartmouth Contests Showcase Computer-Generated Creativity

An anonymous reader writes: A series of contests at Dartmouth College will pit humans versus machines. Both will produce literature, poetry, and music, which will then be judged by humans who will try to determine which selections were computer-made. "Historically, often when we have advances in artificial intelligence, people will always say, 'Well, a computer couldn't paint a sunset,' or 'a computer couldn't write a beautiful love sonnet,' but could they? That's the question," said Dan Rockmore, director of the Neukom Institute for Computational Science at Dartmouth.
This discussion has been archived. No new comments can be posted.


  • by ArcadeMan ( 2766669 ) on Sunday July 05, 2015 @04:22PM (#50049107)

    The contest itself was created by a computer.

  • Computers certainly can do those things.

    A much better, and much more fundamental question is - "Would a computer ever WANT to paint a sunset, or write a sonnet?"

    People have struggled with their own motivations for some time. AI is just beginning to consider these factors.

    Can they? Almost certainly. Why would someone want to?

    • I can see a company making an AI to generate beautiful imagery that they can sell.

    • "Would a computer ever WANT to paint a sunset, or write a sonnet?"

      Most humans don't want to do these things, either. Even the people who do them don't do them because they WANT to; they do them because it is in their nature. "Want" has nothing to do with it.

      • by AK Marc ( 707885 )
        It's in your nature to eat, so you don't "want" to eat, you eat because it's in your nature.

        What the AI researchers don't consider is that humans are driven by millions of (mostly conflicting) wants. Hierarchies of needs cover a small subset of those most prominent. But we aren't programming in behavioral parameters, just "intelligence", and that's why we'll fail. Humans have the desire to be liked, and to please others. How do you program that into a computer?
        • Better question: How much does the desire to be liked and please others factor into the production of art?
      • So, what you're saying is, I play guitar because all humans have an instinctual impulse to play guitar?

        Because that doesn't really make sense.

  • by John Allsup ( 987 ) <slashdot.chalisque@net> on Sunday July 05, 2015 @04:33PM (#50049139) Homepage Journal
    A major feature of the Turing Test is that it is interactive: later lines of conversation are sent to the computer after earlier lines of conversation are known. A 'Turing Test' where someone is given a transcript and asked to decide who in the transcript is human or computer is a much weaker test. A more 'Turing Test'-like test would be one where I give the computer/human a brief to draw a sketch in, say, an hour, see the end result, then get to ask for a few more sketches. The kind of thing they are developing here is more like an algorithm to generate a single line of conversation given knowledge (fixed at algorithm-design time) of the previous lines in the conversation.

    Even so, computer generated art is something which should be explored, and then the art will be in finding new clever ways to use the computer as an artistic medium.
    • by PhilHibbs ( 4537 ) <snarks@gmail.com> on Sunday July 05, 2015 @05:09PM (#50049209) Journal

      It doesn't "misunderstand the Turing Test". It's a different test. No computer has yet passed the Turing Test, so doesn't it make sense to have other tests?

      Let's assume that the Turing Test is a good test for AI. It's debatable, but let's accept the premise. We don't have good AI yet, so what is the point in testing what we have against a test for good AI? Doesn't it make sense to aim for something with a lower bar, achieve that, and then tackle the tougher problem? When I was at school, we didn't set the high jump at olympic champion levels. We set it at a level that was a stretch for us but still achievable.

      • by AK Marc ( 707885 ) on Sunday July 05, 2015 @06:31PM (#50049551)

        Let's assume that the Turing Test is a good test for AI. It's debatable, but let's accept the premise. We don't have good AI yet, so what is the point in testing what we have against a test for good AI?

        The same reason we tested inferior chess programs against grand masters: so we could learn the weaknesses and improve upon them. Testing an AI improves the AI, just as testing a chess program led to improvements in the chess program.

        When I was at school, we didn't set the high jump at olympic champion levels.

        When I was at school, the pool was olympic length, and the high jump could be set at olympic heights, as well as lower ones. So you do what you can, and compare your failure to the desired levels. It's not just the pass-fail as given. But it lets you compare your failure to the ideal.

        Like lasting longer in chess against a grand master, or fooling more people in a Turing test (or lasting longer in the question sequence until the tester correctly identifies the AI).

        • > The same reason we tested inferior chess programs against grand masters. So we could learn the weaknesses, and improve upon them. So testing an AI improves the AI, like testing a chess program lead to improvements in the chess program.

          This is ridiculous. There is no indication that the 'Turing test' has done ANYTHING to improve AI and machine learning research so far. If you knew even the tiniest bit about machine learning and AI you wouldn't be saying this.

          • by AK Marc ( 707885 )
            Is there any indication that having your chess program lose to a grand master led to improvements in its chess performance?
            • Don't play dumb with me. I'm talking about aiming for success in the test, not simply losing in the test.

              • by AK Marc ( 707885 )

                Don't play dumb with me.

                Just trying to reply in a manner you understand.

                I'm talking about aiming for success in the test, not simply losing in the test.

                I'm talking about aiming for success as well. Losing leads to winning. It does for real intelligence, yet you assert the opposite for artificial intelligence.

                • No, that's not what I assert. It's ironic that you use passive-aggressive insults like "a manner you understand" yet you aren't even capable of understanding simple English.

                  I said, and I repeat, that the Turing test has done nothing to improve AI and machine learning research. And I'm specifically talking about aiming for success in the test. I'll repeat it again for good measure: Aiming for success in the Turing test has done nothing to improve AI and machine learning research. If you knew even the tiniest bit about machine learning and AI you wouldn't be saying this.

                  • by AK Marc ( 707885 )
                    So your aggressive-aggressive insults are better. I know more than you do, and you are wrong. The test is defined. It gives a goal. Having a goal has moved AI forward. If you knew even the tiniest bit about anything, you wouldn't be saying this.
        • by PhilHibbs ( 4537 )

          The same reason we tested inferior chess programs against grand masters.

          I'm pretty sure that the first early attempts at chess programs were mostly tested against college or chess club players. I don't remember any "1k ZX Chess vs Kasparov" events.

          • by AK Marc ( 707885 )
            Yes, you test against the worst player you expect to lose to. Any easier, and you won't learn. Any harder and you will be beaten so overwhelmingly that any one mistake would have been inconsequential to the overall result.
  • A series of contests at Dartmouth College will pit humans versus machines.

    Do you want Skynet? Because that's how you get Skynet.

  • not only no, but Hell No.
    • by mlts ( 1038732 ) on Sunday July 05, 2015 @05:17PM (#50049233)

      With the boilerplate novels and cookie-cutter movies being cranked out, I wonder if a computer would eventually be a better writer/artist than what we have now.

      One could even add music into the list as well, where a marketing person could click on an interface, randomly select what an album would have for songs, and the computer would create a band (name, color scheme), write the lyrics, compose the pieces, even pick people from YouTube who would become the band members. Or toss the physical band members, have an avatar like Miku and call it done. Pop music can just be relegated to a cronjob that fires off, weights lyrics on statistics gleaned from ad sites and news articles, makes the songs, mixes/masters the album, and spits out music for the music stores, no human effort needed in the creativity process.

      Similarly with movies. The computer would grab weighting on what social topics are being thought about when the movie is created (so the movie has some impact), create some characters, follow a meta-script to generate the dialog, generate terrain and scenery, render the scenes and CGI action, and out pops a blockbuster hit at the push of a button, no actors needed.
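The "pop music cronjob" above can be sketched in a few lines, tongue firmly in cheek. Every name, word list, and weighting here is invented for illustration; a real system would, as the comment suggests, pull its weights from trend data rather than hard-coded lists:

```python
import random

# All word lists below are made up for the sake of the joke; a real
# pipeline would weight them from ad-site and news-article statistics.
ADJECTIVES = ["Electric", "Neon", "Midnight", "Plastic"]
NOUNS = ["Hearts", "Echoes", "Horizons", "Machines"]
COLORS = ["magenta", "teal", "gold"]
LYRIC_FRAGMENTS = [
    "we light up the night",
    "never let me go",
    "dancing on our own",
    "hold on to the feeling",
]

def generate_band(seed=None):
    """Produce one disposable pop act: name, color scheme, and a chorus."""
    rng = random.Random(seed)
    name = f"{rng.choice(ADJECTIVES)} {rng.choice(NOUNS)}"
    color_scheme = rng.choice(COLORS)
    # A "chorus" is just four randomly chosen fragments strung together.
    chorus = [rng.choice(LYRIC_FRAGMENTS) for _ in range(4)]
    return {"name": name, "colors": color_scheme, "chorus": chorus}

if __name__ == "__main__":
    band = generate_band(seed=42)
    print(band["name"], f"({band['colors']})")
    for line in band["chorus"]:
        print("  ", line)
```

Hook it to a scheduler and the comment's cronjob is complete, for better or worse.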

      • by AK Marc ( 707885 )
        I can only see "dry wit" comedy improving. Many of the jokes would be something like an allusion. It's common for someone to quote Shakespeare as a joke, or to refer to the deeper meaning without spending time explaining. Often it's humorous because one is making fun of the trivial nature assigned to a deep thought. Someone trying to pick between a green and orange squishy drink, saying "to be or not to be" as if the selection were a life-and-death matter, would be funny (or at least an at
      • > I wonder if a computer would eventually be a better writer/artist than what we have now.

        In all likelihood they are, if we're talking about the crap that comes out of major studios. Have you ever seen what goes on when they're 'writing' shows and movies? It's basically a lab or industrial operation.

  • It has been over a week since SCOTUS gayed up marriage, now it is time to gay up computers.
  • by Anonymous Coward

    Real art, in its natural form, from humans anyway, comes from discovering new truths of the world around us and expressing them in a form that appeals to one or more of the five senses. Since computers can only understand what they know, and not infer new understandings, they cannot, and never will be able to, create real art.

    Example: A computer can recreate the Mona Lisa in a near infinite number of ways. It cannot, however, create the original Mona Lisa without there having been a Mona Lisa to create from.

    • by AK Marc ( 707885 )

      Since computers can only understand what they know, and not infer new understandings, they cannot, and never will be able to, create real art.

      So, when a computer can infer on new understandings, they will be able to create art? Your tone makes it seem impossible, but your premise outlines conditions under which they can do it. All we need is an inference engine.

    • by shoor ( 33382 )

      Real art, in its natural form, from humans anyway, comes from discovering new truths of the world

      I disagree with that. Science might lead to discovering new truths, but I don't think art typically does that. I think art is a kind of outlet for stuff simmering inside the mind. Whether that stuff is 'true' or not is almost irrelevant. The art may lead to self-discovery, which could be a kind of truth, but it doesn't necessarily do that. It could lead to self-deception instead.

      This project, as it mention

    • They cannot make shit up. I can make shit up.

      'Making shit up' still requires some form of understanding of the underlying concept (how would you know you're making shit up?). And understanding new things is still derived from either existing knowledge or observation and experiments (it does not have to be rigorous). As long as you have a stick for the agent to measure against and a proper problem definition, an artificial intelligence program could certainly 'create real art'. The problem is that we don't eve

    • by PhilHibbs ( 4537 )

      Since computers can only understand what they know, and not infer new understandings, they cannot, and never will be able to, create real art.

      That's an assertion ex nihilo. I could equally assert that they will be able to, and then we have a disagreement with neither of us presenting evidence to back ourselves up.

      Example: A computer can recreate the Mona Lisa in a near infinite number of ways. It cannot, however, create the original Mona Lisa without there having been a Mona Lisa to create from.

      Counterexample: A human can recreate the Mona Lisa in a near infinite number of ways. It cannot, however, create the original Mona Lisa without there having been a Mona Lisa to create from.
      See what I did there?

  • by FranTaylor ( 164577 ) on Sunday July 05, 2015 @05:30PM (#50049253)

    Dartmouth men in 2015 are still unable to discern intelligence in human females

    "When better women are made, Dartmouth men will make them"

  • There are humans at Dartmouth!?!? Who knew?
  • What makes this program any different from the countless others that build more or less plausible scripts and stories from a set of stereotypical building blocks?
  • "If you locked 1,000 code monkeys in a room with Teletypes, could they pump out some Shakespeare?"

    • Run AI through millions of generations of a genetic algorithm and they could pump out Shakespeare and only Shakespeare.
      • The real Shakespeare could only pump out Shakespeare too.

        • by Jeremi ( 14640 )

          The real Shakespeare could only pump out Shakespeare too.

          ... and the real Shakespeare was also the result of millions of generations of a genetic algorithm. It seems the experiment has already been done :)
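The "millions of generations of a genetic algorithm" quip above is essentially Dawkins's classic weasel program, which is small enough to sketch. This is a toy illustration, not a claim about how real evolutionary art systems work: the target string, population size, and mutation rate are all arbitrary choices:

```python
import random
import string

# The classic target line; any text over this alphabet would do.
TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = string.ascii_uppercase + " "
POP_SIZE = 100        # offspring bred per generation (arbitrary)
MUTATION_RATE = 0.05  # chance each character mutates (arbitrary)

def fitness(candidate):
    # Score: how many characters already match the target.
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(parent):
    # Each character independently has a small chance of being
    # replaced by a random character from the alphabet.
    return "".join(
        random.choice(ALPHABET) if random.random() < MUTATION_RATE else c
        for c in parent
    )

def evolve():
    # Start from a completely random string and apply
    # mutation + selection until the target is reached.
    parent = "".join(random.choice(ALPHABET) for _ in TARGET)
    generation = 0
    while parent != TARGET:
        offspring = [mutate(parent) for _ in range(POP_SIZE)]
        parent = max(offspring + [parent], key=fitness)
        generation += 1
    return generation

if __name__ == "__main__":
    print(f"Reached the target after {evolve()} generations")
```

The cumulative-selection trick is the whole point: random typing alone would never find the line, but keeping the fittest mutant each round converges in a few dozen generations, which is also why it can only ever "pump out" the Shakespeare you told it to aim for.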

  • This contest pits human artists against human machine programmers (who are also artists, but opinions on this may vary). One holds a paintbrush, another a violin, a third a computer.

    It is not the computer that generates the art works, just as the paint brush and the violin do not create the art.
    • If a human machine programmer can, say, put out ten musical albums of comparable quality to the one that a human recording artist can put out in the same time, that's still a win for having the best tool for the job.

  • Can we get a computer to create art? It is an interesting idea to see how close a computer can get to what we recognize as art. But even if it comes up with something good, there will still be people who will say "Computers cannot create art. By definition they just can't. If a computer has created it, it can't be art, full stop. No discussion."

    To get around this mental barrier, let me pose a different question. Suppose you were to make something like the little robots that are exploring Mars now, but th

    • It is going to be on a strange planet by itself. Do you want it to fear its own death?

      Yes, in the sense that a robot should take duration of its useful mission as an optimization parameter.

      To long for the companionship of its peers?

      Not in a lone mission, but yes in a mission where multiple robots must cooperate, such as one rescuing another from a crater.
